997 results for Query Optimization
Abstract:
A baculovirus-insect cell expression system potentially provides the means to produce prophylactic HIV-1 virus-like particle (VLP) vaccines inexpensively and in large quantities. However, the system must be optimized to maximize yields and increase process efficiency. In this study, we optimized the production of two novel, chimeric HIV-1 VLP vaccine candidates (GagRT and GagTN) in insect cells by monitoring the effects of four factors on VLP expression: insect cell line, cell density, multiplicity of infection (MOI), and infection time. Western blots, Gag p24 ELISA, and four-factorial ANOVA were used to determine the most favorable conditions for chimeric VLP production, as well as which factors affected VLP expression most significantly. Both VLP vaccine candidates favored similar optimal conditions, demonstrating higher yields of VLPs when produced in the Trichoplusia ni Pro insect cell line, at a cell density of 1 × 10⁶ cells/mL, and at an infection time of 96 h post infection. Cell density and infection time were major influencing factors, whereas MOI did not affect VLP expression significantly. This work provides a potentially valuable guideline for HIV-1 protein vaccine optimization, as well as for general optimization of a baculovirus-based expression system to produce complex recombinant proteins. © 2009 American Institute of Chemical Engineers.
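The four-factor screen described above can be organized programmatically. A minimal sketch, assuming a full-factorial design; the factor levels and the duplicate p24 readings below are invented for illustration (the study itself used western blots, ELISA, and four-factorial ANOVA):

```python
from itertools import product

# The four factors screened in the study; the levels shown are invented.
factors = {
    "cell_line": ["Sf9", "Tn Pro"],
    "density_cells_per_ml": [1e6, 2e6],
    "moi": [1, 5],
    "harvest_h_post_infection": [72, 96],
}

conditions = list(product(*factors.values()))  # the full 2^4 design

# Invented duplicate p24 ELISA readings; Tn Pro harvested at 96 h is
# deliberately biased upward to mirror the study's finding.
results = [(c, i % 7 + (10 if c[0] == "Tn Pro" and c[3] == 96 else 0))
           for i, c in enumerate(conditions * 2)]

def mean_yield(condition):
    """Average the replicate readings for one factor combination."""
    vals = [y for c, y in results if c == condition]
    return sum(vals) / len(vals)

best = max(conditions, key=mean_yield)
```

With real data, each condition's replicates would feed a four-way ANOVA rather than a simple mean comparison.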
Abstract:
Vehicular Ad-hoc Networks (VANETs) have different characteristics from other mobile ad-hoc networks: the vehicles, which act as both routers and clients, are highly dynamic and connected by unreliable radio links, which makes routing a complex problem. First, we propose CO-GPSR (Cooperative GPSR), an extension of the traditional GPSR (Greedy Perimeter Stateless Routing) that uses relay nodes to exploit radio path diversity in a vehicular network and increase routing performance. Next, we formulate a multi-objective decision-making problem to select optimal packet-relaying nodes and increase routing performance further, using cross-layer information in the optimization process. We evaluate routing performance comprehensively using realistic vehicular traces and a Nakagami fading propagation model optimized for highway VANET scenarios. Our results show that when multi-objective decision making is used for cross-layer optimization of routing, a 70% performance increase can be obtained for low vehicle densities on average, a twofold improvement over the single-criterion maximization approach.
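A weighted-sum scalarization is one simple way to fold several cross-layer criteria into a single relay score. A minimal sketch with invented candidate metrics (geographic progress, PHY-layer SNR, MAC-layer queue length); the paper's actual objective formulation and weights are not reproduced here:

```python
# Hypothetical candidate relays: (id, geographic progress in m,
# PHY-layer SNR in dB, MAC-layer queue length) — all values invented.
candidates = [
    ("v1", 120.0, 8.0, 3),
    ("v2", 200.0, 4.5, 1),
    ("v3", 150.0, 12.0, 6),
]

def normalize(vals):
    """Min-max scale a criterion to [0, 1] so the weights are comparable."""
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) if hi > lo else 1.0 for v in vals]

def select_relay(cands, w_progress=0.5, w_link=0.3, w_load=0.2):
    """Weighted-sum scalarization of three cross-layer criteria."""
    prog = normalize([c[1] for c in cands])   # more progress is better
    link = normalize([c[2] for c in cands])   # higher SNR is better
    load = normalize([-c[3] for c in cands])  # shorter queue is better
    scores = [w_progress * p + w_link * l + w_load * q
              for p, l, q in zip(prog, link, load)]
    return cands[scores.index(max(scores))][0]
```

A Pareto-based selection, as in true multi-objective decision making, would avoid committing to fixed weights.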
Abstract:
Distributed Genetic Algorithms (DGAs) designed for the Internet have to take its high communication cost into consideration. For island-model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies with the same degree of connectivity. The applicability of the method to real-world problems is demonstrated on a hard optimization problem in VLSI design.
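To make the island-model terminology concrete, here is a minimal one-max island GA with periodic best-individual migration over a static ring; the paper's contribution is precisely to replace such a static topology with an adaptively optimized one, which this sketch does not implement:

```python
import random

random.seed(1)

def fitness(ind):
    return sum(ind)            # one-max: count the 1-bits

def evolve(pop):
    """One generation of binary tournament selection plus bit-flip mutation."""
    new = []
    for _ in pop:
        a, b = random.sample(pop, 2)
        child = list(max(a, b, key=fitness))
        child[random.randrange(len(child))] ^= 1
        new.append(child)
    return new

# Four islands of ten 20-bit individuals, connected in a ring.
islands = [[[random.randint(0, 1) for _ in range(20)] for _ in range(10)]
           for _ in range(4)]

for gen in range(30):
    islands = [evolve(p) for p in islands]
    if gen % 5 == 4:           # migration interval: every 5 generations
        migrants = [max(p, key=fitness) for p in islands]
        for i, p in enumerate(islands):
            # replace the worst local individual with the ring neighbor's best
            p[p.index(min(p, key=fitness))] = migrants[i - 1]

best_fit = max(fitness(ind) for p in islands for ind in p)
```

An adaptive optimizer would rewire who sends migrants to whom based on observed solution quality versus communication cost.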
Abstract:
In this paper, we discuss the rostering of cabin crew at KLM. Generated schedules are easily disrupted by events such as the illness of an employee, so reserve crew must be kept 'on duty' to resolve such disruptions. A large reserve pool requires more employees, but too small a pool results in so-called secondary disruptions, which are particularly inconvenient for both the crew members and the planners. In this research we discuss several modifications of the reserve scheduling policy that have the potential to reduce the number of secondary disruptions and thereby improve the performance of the scheduling process.
Abstract:
The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task, with a forward-facing camera as the only sensor. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out in the ROS-Gazebo simulation system, and a large number of tests with a real quadcopter were conducted to evaluate it.
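The CE optimization loop itself is compact: sample candidates from a parametrized distribution, keep an elite fraction, refit the distribution to the elite, and repeat. A minimal sketch for a single controller scaling factor with an invented quadratic cost (the real cost came from ROS-Gazebo collision-avoidance runs, not a closed-form function):

```python
import random

random.seed(0)

def cost(k):
    """Toy stand-in for the collision-avoidance cost; minimum at k = 2.5."""
    return (k - 2.5) ** 2

def cross_entropy(mu=0.0, sigma=5.0, n=50, elite=10, iters=40):
    """Sample from N(mu, sigma), keep the elite, refit the Gaussian, repeat."""
    for _ in range(iters):
        samples = sorted((random.gauss(mu, sigma) for _ in range(n)), key=cost)
        best = samples[:elite]
        mu = sum(best) / elite
        sigma = (sum((x - mu) ** 2 for x in best) / elite) ** 0.5 + 1e-6
    return mu

k_opt = cross_entropy()
```

For several scaling factors, mu and sigma simply become vectors, one pair per controller input.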
Abstract:
As business process management technology matures, organisations acquire more and more business process models. The management of the resulting collections of process models poses real challenges. One of these challenges concerns model retrieval where support should be provided for the formulation and efficient execution of business process model queries. As queries based on only structural information cannot deal with all querying requirements in practice, there should be support for queries that require knowledge of process model semantics. In this paper we formally define a process model query language that is based on semantic relationships between tasks in process models and is independent of any particular process modelling notation.
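As a toy illustration of a semantics-aware query (not the paper's actual query language), consider asking whether one task can eventually follow another in a process model, independent of how the model was drawn:

```python
# Hypothetical process model as task-to-task "directly follows" edges.
edges = {
    "receive_order": ["check_credit"],
    "check_credit": ["ship_goods", "reject_order"],
    "ship_goods": ["send_invoice"],
}

def eventually_follows(model, a, b):
    """Semantic query: can task b occur after task a on some path?"""
    frontier, seen = [a], set()
    while frontier:
        t = frontier.pop()
        for nxt in model.get(t, []):
            if nxt == b:
                return True
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False
```

Because the query is over task relationships rather than diagram structure, any notation that can be compiled to such edges (BPMN, EPC, Petri nets) could be queried the same way.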
Abstract:
Building information modeling (BIM) is an emerging technology and process that provides rich and intelligent design information models of a facility, enabling enhanced communication, coordination, analysis, and quality control throughout all phases of a building project. Although there are many documented benefits of BIM for construction, identifying essential construction-specific information out of a BIM in an efficient and meaningful way is still a challenging task. This paper presents a framework that combines feature-based modeling and query processing to leverage BIM for construction. The feature-based modeling representation implemented enriches a BIM by representing construction-specific design features relevant to different construction management (CM) functions. The query processing implemented allows for increased flexibility to specify queries and rapidly generate the desired view from a given BIM according to the varied requirements of a specific practitioner or domain. Central to the framework is the formalization of construction domain knowledge in the form of a feature ontology and query specifications. The implementation of our framework enables the automatic extraction and querying of a wide range of design conditions that are relevant to construction practitioners. The validation studies conducted demonstrate that our approach is significantly more effective than existing solutions. The research described in this paper has the potential to improve the efficiency and effectiveness of decision-making processes in different CM functions.
Abstract:
Aims This research sought to determine optimal corn waste stream–based fermentation medium C and N sources and incubation time to maximize pigment production by an indigenous Indonesian Penicillium spp., as well as to assess pigment pH stability. Methods and Results A Penicillium spp. was isolated from Indonesian soil, identified as Penicillium resticulosum, and used to test the effects of carbon and nitrogen type and concentrations, medium pH, incubation period and furfural on biomass and pigment yield (PY) in a waste corncob hydrolysate basal medium. Maximum red PY (497.03 ± 55.13 mg l⁻¹) was obtained with a 21:1 C:N ratio, pH 5.5–6.0; yeast extract-, NH4NO3-, NaNO3-, MgSO4·7H2O-, xylose- or carboxymethylcellulose (CMC)-supplemented medium and 12 days (25°C, 60–70% relative humidity, dark) incubation. C source, C, N and furfural concentration, medium pH and incubation period all influenced biomass and PY. Pigment was stable from pH 2 to 9. Conclusions Penicillium resticulosum demonstrated microbial pH-stable-pigment production potential using a xylose- or CMC- and N-source-supplemented waste stream cellulose culture medium. Significance and Impact of the Study Corn-derived waste stream cellulose can be used as a culture medium for fungal pigment production. Such application provides a process for agricultural waste stream resource reuse for production of compounds in increasing demand.
Abstract:
PURPOSE: The purpose of this study was to examine the influence of three different high-intensity interval training (HIT) regimens on endurance performance in highly trained endurance athletes. METHODS: Before, and after 2 and 4 wk of training, 38 cyclists and triathletes (mean +/- SD; age = 25 +/- 6 yr; mass = 75 +/- 7 kg; VO(2peak) = 64.5 +/- 5.2 mL·kg⁻¹·min⁻¹) performed: 1) a progressive cycle test to measure peak oxygen consumption (VO(2peak)) and peak aerobic power output (PPO), 2) a time to exhaustion test (T(max)) at their VO(2peak) power output (P(max)), as well as 3) a 40-km time-trial (TT(40)). Subjects were matched and assigned to one of four training groups (G(1), N = 8, 8 x 60% T(max) at P(max), 1:2 work:recovery ratio; G(2), N = 9, 8 x 60% T(max) at P(max), recovery at 65% HR(max); G(3), N = 10, 12 x 30 s at 175% PPO, 4.5-min recovery; G(CON), N = 11). In addition to G(1), G(2), and G(3) performing HIT twice per week, all athletes maintained their regular low-intensity training throughout the experimental period. RESULTS: All HIT groups improved TT(40) performance (+4.4 to +5.8%) and PPO (+3.0 to +6.2%) significantly more than G(CON) (-0.9 to +1.1%; P < 0.05). Furthermore, G(1) (+5.4%) and G(2) (+8.1%) improved their VO(2peak) significantly more than G(CON) (+1.0%; P < 0.05). CONCLUSION: The present study has shown that when HIT incorporates P(max) as the interval intensity and 60% of T(max) as the interval duration, already highly trained cyclists can significantly improve their 40-km time trial performance. Moreover, the present data confirm prior research, in that repeated supramaximal HIT can significantly improve 40-km time trial performance.
Abstract:
Database security techniques are widely available. Among them, encryption is a well-established technology for protecting sensitive data. However, once encrypted, the data can no longer be easily queried: database performance depends on how the sensitive data are encrypted and on the search and retrieval approach implemented. In this paper we analyze database queries and data properties and propose a suitable mechanism for querying an encrypted database. We propose and analyze a new database encryption algorithm that uses a Bloom filter with the bucket index method. Finally, we demonstrate the superiority of the proposed algorithm through several experiments, which should be useful for database-encryption research and applications.
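A rough sketch of the general idea, using the simplest possible bucketization (the paper's actual algorithm and parameters are not reproduced here): each bucket of encrypted rows keeps a Bloom filter of the plaintext values it covers, so a point query only needs to fetch and decrypt buckets whose filter matches:

```python
import hashlib

class BloomFilter:
    """A basic Bloom filter backed by an integer bit vector."""
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes, self.bits = size, hashes, 0

    def _positions(self, item):
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        # May return a false positive, never a false negative.
        return all(self.bits >> p & 1 for p in self._positions(item))

# Bucket index over the "encrypted" rows; the bucketing function and the
# sample values are purely illustrative.
buckets = {b: BloomFilter() for b in range(4)}
def bucket_of(value):
    return value % 4
for v in [3, 7, 10, 14, 21]:
    buckets[bucket_of(v)].add(v)

# An equality query probes each bucket's filter instead of decrypting all rows.
candidates = [b for b, bf in buckets.items() if bf.might_contain(10)]
```

Only the candidate buckets would then be decrypted and scanned, trading a small false-positive rate for far less decryption work.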
Abstract:
Wireless networked control systems (WNCSs) have been widely used in the areas of manufacturing and industrial processing over the last few years. They provide real-time control with a unique characteristic: periodic traffic. These systems have a time-critical requirement, yet due to current wireless mechanisms, WNCS performance suffers from long time-varying delays, packet dropout, and inefficient channel utilization. Current wirelessly networked applications like WNCSs are designed on the basis of a layered architecture, whose features constrain the performance of these demanding applications. Numerous efforts have attempted to use cross-layer design (CLD) approaches to improve the performance of various networked applications. However, existing research rarely considers large-scale networks and congested network conditions in WNCSs, and there is a lack of discussion on how to apply CLD approaches in WNCSs. This thesis proposes a cross-layer design methodology to address the timeliness of periodic traffic and to improve the efficiency of channel utilization in WNCSs. The proposed CLD is built on the measurement of the underlying network condition, the classification of the network state, and the adjustment of the sampling period between sensors and controllers. This period adjustment respects the minimum allowable sampling period while maximizing control performance. Extensive simulations are conducted using the network simulator NS-2 to evaluate the performance of the proposed CLD, comparing communication with and without it. The results show that the proposed CLD is capable of fulfilling the timeliness requirement under congested network conditions, and is also able to improve the channel utilization efficiency and the proportion of effective data in WNCSs.
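The measure-classify-adjust loop described above can be sketched in a few lines; the thresholds, state names, and scaling factors below are invented for illustration, not those used in the thesis:

```python
def classify(delay_ms, loss_rate):
    """Map a measured network condition to a coarse congestion state."""
    if loss_rate > 0.05 or delay_ms > 100:
        return "congested"
    if loss_rate > 0.01 or delay_ms > 40:
        return "loaded"
    return "idle"

def adjust_period(current_ms, state, min_ms=10, max_ms=200):
    """Lengthen the sampling period under congestion, shorten it when idle."""
    factor = {"congested": 2.0, "loaded": 1.25, "idle": 0.8}[state]
    return min(max(current_ms * factor, min_ms), max_ms)

# One iteration of the loop: a sensor measuring 120 ms delay and 2% loss
# doubles its 50 ms sampling period to relieve the channel.
period = 50
period = adjust_period(period, classify(delay_ms=120, loss_rate=0.02))
```

Clamping to `min_ms` and `max_ms` mirrors the requirement that the period stay above the minimum allowable value while not degrading control performance.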
Abstract:
A user’s query is considered to be an imprecise description of their information need. Automatic query expansion is the process of reformulating the original query with the goal of improving retrieval effectiveness. Many successful query expansion techniques ignore information about the dependencies that exist between words in natural language. However, more recent approaches have demonstrated that by explicitly modeling associations between terms significant improvements in retrieval effectiveness can be achieved over those that ignore these dependencies. State-of-the-art dependency-based approaches have been shown to primarily model syntagmatic associations. Syntagmatic associations infer a likelihood that two terms co-occur more often than by chance. However, structural linguistics relies on both syntagmatic and paradigmatic associations to deduce the meaning of a word. Given the success of dependency-based approaches and the reliance on word meanings in the query formulation process, we argue that modeling both syntagmatic and paradigmatic information in the query expansion process will improve retrieval effectiveness. This article develops and evaluates a new query expansion technique that is based on a formal, corpus-based model of word meaning that models syntagmatic and paradigmatic associations. We demonstrate that when sufficient statistical information exists, as in the case of longer queries, including paradigmatic information alone provides significant improvements in retrieval effectiveness across a wide variety of data sets. More generally, when our new query expansion approach is applied to large-scale web retrieval it demonstrates significant improvements in retrieval effectiveness over a strong baseline system, based on a commercial search engine.
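A minimal sketch of syntagmatic expansion over pseudo-relevant documents, using raw document co-occurrence counts on an invented three-document corpus (real systems also filter stop words and weight the counts, and the article's paradigmatic modeling is not shown):

```python
from collections import Counter

# Invented pseudo-relevant documents for the query term "solar".
docs = [
    "solar panel efficiency improves with photovoltaic cell design",
    "photovoltaic cell output depends on solar irradiance",
    "battery storage pairs with solar panel installations",
]

def syntagmatic_expansions(query_term, documents, k=2):
    """Rank candidate expansion terms by co-occurrence with the query term."""
    cooc = Counter()
    for d in documents:
        terms = set(d.split())
        if query_term in terms:
            cooc.update(terms - {query_term})
    # Sort by count, then alphabetically, for a deterministic ranking.
    ranked = sorted(cooc.items(), key=lambda kv: (-kv[1], kv[0]))
    return [t for t, _ in ranked[:k]]

expanded = ["solar"] + syntagmatic_expansions("solar", docs)
```

Paradigmatic expansion would instead propose substitutable terms (e.g. terms appearing in similar contexts), which is what the article adds on top of counts like these.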
Abstract:
Many successful query expansion techniques ignore information about the term dependencies that exist within natural language. However, researchers have recently demonstrated that consistent and significant improvements in retrieval effectiveness can be achieved by explicitly modelling term dependencies within the query expansion process. This has created an increased interest in dependency-based models. State-of-the-art dependency-based approaches primarily model term associations known within structural linguistics as syntagmatic associations, which are formed when terms co-occur together more often than by chance. However, structural linguistics proposes that the meaning of a word is also dependent on its paradigmatic associations, which are formed between words that can substitute for each other without affecting the acceptability of a sentence. Given the reliance on word meanings when a user formulates their query, our approach takes the novel step of modelling both syntagmatic and paradigmatic associations within the query expansion process, based on the (pseudo-)relevant documents returned in web search. The results demonstrate that this approach can provide significant improvements in web retrieval effectiveness when compared to a strong benchmark retrieval system.
Abstract:
In the electricity market environment, coordinating the reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process for risk-benefit maximization. Against this background, the ATC must be optimally allocated and utilized within the relevant security constraints. First, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize the risk-benefits. The developed model is then solved by the fast non-dominated sorting genetic algorithm (NSGA-II), which decreases the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and algorithm.
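The core of NSGA-II is fast non-dominated sorting. A minimal sketch (quadratic, without the paper's or Deb's efficient bookkeeping) over invented two-objective points where both objectives are minimized, e.g. risk and negated benefit:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def non_dominated_fronts(points):
    """Peel off successive Pareto fronts: front 1 is undominated, and so on."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Invented (risk, -benefit) pairs standing in for multi-area ATC candidates.
pts = [(1, 5), (2, 3), (3, 4), (4, 2), (5, 6)]
fronts = non_dominated_fronts(pts)
```

The first front is the Pareto-optimal set of ATC allocations; NSGA-II then uses front rank plus crowding distance to drive selection.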