990 results for fermentation optimization
Abstract:
Vehicular ad-hoc networks (VANETs) have different characteristics from other mobile ad-hoc networks: the vehicles, which act as both routers and clients, move dynamically and are connected by unreliable radio links, making routing a complex problem. First, we propose CO-GPSR (Cooperative GPSR), an extension of traditional GPSR (Greedy Perimeter Stateless Routing) that uses relay nodes to exploit radio path diversity in a vehicular network and thereby increase routing performance. Next, we formulate a multi-objective decision-making problem to select optimal packet-relaying nodes and improve routing performance further, using cross-layer information in the optimization process. We evaluate routing performance comprehensively using realistic vehicular traces and a Nakagami fading propagation model tuned for highway VANET scenarios. Our results show that when multi-objective decision making is used for the cross-layer optimization of routing, a 70% performance improvement can be obtained on average at low vehicle densities, a twofold increase over the single-criterion maximization approach.
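As background, the greedy forwarding mode of GPSR that CO-GPSR builds on can be sketched as follows. The coordinate tuples and the helper name are illustrative, not the paper's implementation; the paper's relay selection replaces this single-criterion choice with a multi-objective one.

```python
import math

def greedy_next_hop(current, destination, neighbors):
    """Greedy mode of GPSR: forward to the neighbor geographically closest
    to the destination, but only if it is closer than the current node
    itself; otherwise a 'local maximum' is reached and GPSR falls back to
    perimeter mode (not shown here)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    best = min(neighbors, key=lambda n: dist(n, destination), default=None)
    if best is None or dist(best, destination) >= dist(current, destination):
        return None  # no greedy progress possible
    return best
```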
Abstract:
Distributed genetic algorithms (DGAs) designed for the Internet have to take its high communication cost into consideration. For island-model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration-topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies with the same degree of connectivity. The applicability of the method to real-world problems is demonstrated on a hard optimization problem in VLSI design.
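A static ring migration topology, the kind of fixed baseline an adaptive optimizer competes against, can be sketched as follows. Representing each island as a plain list of fitness values (minimization) is a deliberate simplification for illustration.

```python
def migrate_ring(islands):
    """Ring-topology migration for an island-model GA: each island sends a
    copy of its best (lowest-fitness) individual to the next island in the
    ring, replacing that island's worst individual.  A static sketch only;
    an adaptive topology optimizer would rewire these links at run time."""
    bests = [min(pop) for pop in islands]
    for i, pop in enumerate(islands):
        incoming = bests[(i - 1) % len(islands)]   # best of the predecessor
        pop[pop.index(max(pop))] = incoming        # replace local worst
    return islands
```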
Abstract:
In this paper, we discuss the rostering of cabin-crew attendants at KLM. Generated schedules are easily disrupted by events such as the illness of an employee, so reserve crew have to be kept on duty to resolve such disruptions. A large reserve crew requires more employees, but too small a reserve results in so-called secondary disruptions, which are particularly inconvenient for both the crew members and the planners. We discuss several modifications of the reserve scheduling policy that have the potential to reduce the number of secondary disruptions and thereby improve the performance of the scheduling process.
Abstract:
The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task; the only sensor used to accomplish this task was a forward-facing camera. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was performed in the ROS-Gazebo simulation system, and to evaluate it a large number of tests were carried out with a real quadcopter.
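The core CE loop for continuous parameter tuning can be sketched as follows, using a one-dimensional toy objective rather than the paper's fuzzy-controller scaling factors; the function name and default parameters are illustrative.

```python
import random
import statistics

def cross_entropy_minimize(f, mu, sigma, n_samples=50, elite_frac=0.2, iters=60):
    """Minimal Cross-Entropy optimization sketch for a 1-D objective:
    repeatedly sample candidates from a Gaussian, keep the elite fraction
    with the lowest objective values, and refit the Gaussian to the elites
    so the sampling distribution concentrates around good solutions."""
    n_elite = max(2, int(n_samples * elite_frac))
    for _ in range(iters):
        samples = [random.gauss(mu, sigma) for _ in range(n_samples)]
        elites = sorted(samples, key=f)[:n_elite]
        mu = statistics.fmean(elites)
        sigma = statistics.stdev(elites) + 1e-9  # avoid collapse to zero width
    return mu
```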
Abstract:
PURPOSE: The purpose of this study was to examine the influence of three different high-intensity interval training (HIT) regimens on endurance performance in highly trained endurance athletes. METHODS: Before, and after 2 and 4 wk of training, 38 cyclists and triathletes (mean +/- SD; age = 25 +/- 6 yr; mass = 75 +/- 7 kg; VO(2peak) = 64.5 +/- 5.2 mL x kg(-1) x min(-1)) performed: 1) a progressive cycle test to measure peak oxygen consumption (VO(2peak)) and peak aerobic power output (PPO), 2) a time-to-exhaustion test (T(max)) at their VO(2peak) power output (P(max)), and 3) a 40-km time trial (TT(40)). Subjects were matched and assigned to one of four training groups (G(1), N = 8, 8 x 60% T(max) at P(max), 1:2 work:recovery ratio; G(2), N = 9, 8 x 60% T(max) at P(max), recovery at 65% HR(max); G(3), N = 10, 12 x 30 s at 175% PPO, 4.5-min recovery; G(CON), N = 11). In addition to G(1), G(2), and G(3) performing HIT twice per week, all athletes maintained their regular low-intensity training throughout the experimental period. RESULTS: All HIT groups improved TT(40) performance (+4.4 to +5.8%) and PPO (+3.0 to +6.2%) significantly more than G(CON) (-0.9 to +1.1%; P < 0.05). Furthermore, G(1) (+5.4%) and G(2) (+8.1%) improved their VO(2peak) significantly more than G(CON) (+1.0%; P < 0.05). CONCLUSION: The present study has shown that when HIT incorporates P(max) as the interval intensity and 60% of T(max) as the interval duration, already highly trained cyclists can significantly improve their 40-km time-trial performance. Moreover, the present data confirm prior research in that repeated supramaximal HIT can significantly improve 40-km time-trial performance.
Abstract:
Wireless networked control systems (WNCSs) have been widely used in manufacturing and industrial processing over the last few years. They provide real-time control with a unique characteristic, periodic traffic, and have time-critical requirements. With current wireless mechanisms, WNCS performance suffers from long time-varying delays, packet dropout, and inefficient channel utilization. Current wirelessly networked applications such as WNCSs are designed on a layered architecture, whose features constrain the performance of these demanding applications. Numerous efforts have attempted to use cross-layer design (CLD) approaches to improve the performance of various networked applications, but existing research rarely considers large-scale networks and congested network conditions in WNCSs, and there is little discussion of how to apply CLD approaches in WNCSs. This thesis proposes a cross-layer design methodology to address the timeliness of periodic traffic and to improve the efficiency of channel utilization in WNCSs. The proposed CLD is built on measurement of the underlying network condition, classification of the network state, and adjustment of the sampling period between sensors and controllers. This period adjustment maintains the minimum allowable sampling period while maximizing control performance. Extensive simulations are conducted with the network simulator NS-2 to evaluate the performance of the proposed CLD, comparing communication with and without it. The results show that the proposed CLD fulfils the timeliness requirement under congested network conditions and also improves channel utilization efficiency and the proportion of effective data in WNCSs.
Abstract:
In the electricity market environment, coordinating the reliability and economics of a power system is of great significance in determining the available transfer capability (ATC). In addition, the risks associated with uncertainties should be properly addressed in the ATC determination process for risk-benefit maximization. Against this background, the ATC must be optimally allocated and utilized within the relevant security constraints. First, non-sequential Monte Carlo simulation is employed to derive the probability density distribution of the ATC of designated areas, incorporating uncertainty factors. Second, on that basis, a multi-objective optimization model is formulated to determine the multi-area ATC so as to maximize the risk-benefits. The model is then solved with the fast non-dominated sorting genetic algorithm (NSGA-II), which decreases the risk caused by uncertainties while coordinating the ATCs of different areas. Finally, the IEEE 118-bus test system is used to demonstrate the essential features of the developed model and the employed algorithm.
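The dominance check and first-front extraction at the heart of NSGA-II's fast non-dominated sorting can be sketched as follows, on generic objective vectors rather than the paper's ATC model; the full algorithm additionally ranks later fronts and applies crowding distance.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """The first (non-dominated) front: every point that no other point
    dominates.  NSGA-II ranks the population by peeling off such fronts."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```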
Abstract:
Multi-objective optimization of a benchmark cogeneration system, the CGAM cogeneration system, has been performed. The optimization considers thermoeconomic and environmental aspects simultaneously; the environmental objective function is defined and expressed in cost terms. A Multi-Objective Particle Swarm Optimization (MOPSO) algorithm, one of the most suitable techniques in this class of search algorithms, is used to find the set of Pareto-optimal solutions with respect to the objective functions. An example of fuzzy decision making with the aid of the Bellman-Zadeh approach is presented, and a final optimal solution is introduced.
Abstract:
This thesis presents a multi-criteria optimisation study of group replacement schedules for water pipelines, a capital-intensive and service-critical decision. A new mathematical model was developed that minimises total replacement costs while maintaining a satisfactory level of service. The research outcomes are expected to enrich the body of knowledge of multi-criteria decision optimisation where group scheduling is required. The model has the potential to optimise replacement planning for other types of linear asset networks, with bottom-line benefits for end users and communities. The results of a real case study show that the new model can effectively reduce total costs and service interruptions.
Abstract:
The wide applicability of correlation analysis inspired the development of this paper, in which a new correlated modified particle swarm optimization (COM-PSO) is developed. A Correlation Adjustment algorithm is proposed to restore the correlation between the considered variables of all particles at each iteration. It is shown that the best solution, the mean and standard deviation of the solutions over multiple runs, and the convergence speed are improved when the correlation between the variables is increased, although for some rotated benchmark functions the contrary results are obtained. Moreover, the best solution and the mean and standard deviation of the solutions improve as the number of correlated variables of the benchmark functions increases. The results of the simulations and the convergence performance are compared with the original PSO, and the improvement of the results, the convergence speed, and the ability of the proposed COM-PSO to simulate correlated phenomena are discussed in light of the experimental results.
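For reference, the canonical single-particle PSO update that COM-PSO modifies can be sketched as follows; the correlation-adjustment step itself is omitted, and the coefficient values are conventional defaults, not the paper's settings.

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO update for a single particle in 1-D: the new
    velocity combines inertia (w) with stochastic attraction toward the
    particle's personal best (c1) and the swarm's global best (c2)."""
    r1, r2 = random.random(), random.random()
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v
```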
Abstract:
The K-means algorithm is one of the most popular clustering techniques. Nevertheless, its performance depends strongly on the initial cluster centers, and it may converge to local minima. This paper proposes a hybrid evolutionary-programming-based clustering algorithm, called PSO-SA, combining particle swarm optimization (PSO) and simulated annealing (SA). The basic idea is to search around the global solution with SA and to increase the information exchange among particles using a mutation operator to escape local optima. Three datasets, Iris, Wisconsin Breast Cancer, and Ripley's Glass, are used to show the effectiveness of the proposed clustering algorithm in providing optimal clusters. The simulation results show that the PSO-SA clustering algorithm not only produces better solutions but also converges more quickly than the K-means, PSO, and SA algorithms.
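The SA ingredient of PSO-SA rests on the Metropolis acceptance rule, which lets the search occasionally accept worse solutions and so escape local optima. A generic sketch, not the paper's exact cooling schedule:

```python
import math
import random

def sa_accept(delta, temperature):
    """Metropolis acceptance rule used in simulated annealing: always accept
    improvements (delta <= 0 for minimization), and accept a worsening move
    with probability exp(-delta / T), which shrinks as T cools."""
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```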
Abstract:
This paper presents a new hybrid evolutionary algorithm based on Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) for daily Volt/Var control in distribution systems including Distributed Generators (DGs). Due to the small X/R ratio and radial configuration of distribution systems, DGs have a strong impact on this problem. Since DGs are owned by independent power producers or private owners, a price-based methodology is proposed as a proper signal to encourage DG owners to generate active power. The daily Volt/Var control is in general a nonlinear optimization problem, so an efficient hybrid evolutionary method based on PSO and ACO, called HPSO, is proposed to determine the active power values of DGs, the reactive power values of capacitors, and the tap positions of transformers for the next day. The feasibility of the proposed algorithm is demonstrated on the IEEE 34-bus distribution feeder and compared with methods based on the original PSO, ACO, and GA algorithms.
Abstract:
This paper presents a new algorithm based on honey-bee mating optimization (HBMO) to estimate harmonic state variables in distribution networks including distributed generators (DGs). The proposed algorithm estimates both the amplitude and the phase of each harmonic by minimizing the error between the values measured by phasor measurement units (PMUs) and the values computed from the estimated parameters during the estimation process. Simulation results on two distribution test systems demonstrate that the proposed distribution harmonic state estimation (DHSE) algorithm is markedly faster and more accurate than conventional algorithms such as weighted least squares (WLS), the genetic algorithm (GA), and tabu search (TS).
Abstract:
This paper presents an efficient algorithm for multi-objective distribution feeder reconfiguration based on a Modified Honey Bee Mating Optimization (MHBMO) approach. The main objectives of distribution feeder reconfiguration (DFR) are to minimize the real power loss and the deviation of node voltages. Because these objectives are different and incommensurable, the problem is difficult to solve with conventional approaches that optimize a single objective, so a metaheuristic algorithm is applied. The full algorithm and the objective functions are described, and simulation results on a 32-bus distribution system show the high accuracy and optimization capability of the proposed algorithm in power loss minimization.
Abstract:
A long query provides more useful hints for retrieving relevant documents, but it is also likely to introduce noise that degrades retrieval performance. To mitigate this adverse effect, it is important to reduce noisy terms and to introduce and boost additional relevant terms. This paper presents a comprehensive framework, called the Aspect Hidden Markov Model (AHMM), which integrates query reduction and expansion for retrieval with long queries. It optimizes the probability distribution of query terms by exploiting intra-query term dependencies as well as the relationships between query terms and words observed in relevance-feedback documents. Empirical evaluation on three large-scale TREC collections demonstrates that our approach, which is fully automatic, achieves salient improvements over various strong baselines and reaches performance comparable to a state-of-the-art method based on the user's interactive query-term reduction and expansion.