947 results for Bio-inspired optimization techniques


Relevance: 100.00%

Abstract:

Robotics is a field that presents a large number of problems because it depends on many disciplines, devices, technologies and tasks. Its expansion from perfectly controlled industrial environments toward open and dynamic environments raises many new challenges, such as household or professional service robots. Enabling the rapid, low-cost development of robotic systems whose code is reusable, maintainable in the medium and long term, and robust requires novel approaches that provide generic models and software paradigms capable of solving these problems. For this purpose, in this paper we propose a model based on multi-agent systems, inspired by the human nervous system, that transfers the control characteristics of the biological system while taking advantage of the best properties of distributed software systems.
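The abstract does not spell out the agent architecture, but a minimal sketch of a nervous-system-inspired control loop might look as follows; the agent roles and names (ReflexAgent, DeliberativeAgent, Coordinator) are purely illustrative assumptions. The organizing idea is that fast low-level agents pre-empt slower deliberative ones, as spinal reflexes pre-empt cortical planning.

```python
# Illustrative nervous-system-inspired agent hierarchy (hypothetical names).

class ReflexAgent:
    """Low-level agent, analogous to a spinal reflex: reacts immediately."""
    def react(self, sensor_value, threshold=0.8):
        # Trigger an avoidance action without consulting higher layers.
        return "stop" if sensor_value > threshold else None

class DeliberativeAgent:
    """High-level agent, analogous to cortical planning: slower, global."""
    def plan(self, goal, world_state):
        # Placeholder planner; a real system would search or replan here.
        return [f"move_toward({goal})"]

class Coordinator:
    """Routes control: reflexes pre-empt deliberation, as in biology."""
    def __init__(self):
        self.reflex = ReflexAgent()
        self.deliberative = DeliberativeAgent()

    def step(self, sensor_value, goal, world_state):
        reflex_action = self.reflex.react(sensor_value)
        if reflex_action is not None:
            return [reflex_action]          # fast path wins
        return self.deliberative.plan(goal, world_state)

# Example: an obstacle reading of 0.9 triggers the reflex path.
print(Coordinator().step(0.9, "charging_dock", {}))   # ['stop']
```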

Relevance: 100.00%

Abstract:

This multidisciplinary study concerns the optimal design of processes with a view to both maximizing profit and minimizing environmental impacts. This can be achieved by combining traditional chemical process design methods, measurements of environmental impacts and advanced mathematical optimization techniques. More to the point, this paper presents a hybrid simulation-multiobjective optimization approach that simultaneously optimizes the production cost and minimizes the associated environmental impacts of isobutane alkylation. This approach also makes it possible to obtain the flowsheet configurations and process variables needed to manufacture isooctane in a way that satisfies the above-stated double aim. The problem is formulated as a Generalized Disjunctive Programming problem and solved using state-of-the-art logic-based algorithms. It is shown, starting from existing alternatives for the process, that it is possible to systematically generate a superstructure that includes alternatives not previously considered. The optimal solution, in the form of a Pareto curve, includes different structural alternatives from which the most suitable design can be selected. To evaluate the environmental impact, Life Cycle Assessment based on two different indicators is employed: Ecoindicator 99 and Global Warming Potential.
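To illustrate how a Pareto curve between cost and environmental impact can be traced, here is a minimal epsilon-constraint sketch; the cost and GWP functions are made-up stand-ins, not the paper's alkylation flowsheet model, which is solved with logic-based GDP algorithms rather than a grid sweep.

```python
# Toy epsilon-constraint sweep tracing a Pareto curve between production
# cost and Global Warming Potential (GWP). Both functions are illustrative.
import numpy as np

def cost(x):      # hypothetical cost: cheaper at high conversion x
    return 100.0 / (0.1 + x)

def gwp(x):       # hypothetical impact: emissions grow with conversion
    return 50.0 * x**1.5

x_grid = np.linspace(0.05, 1.0, 500)
pareto = []
for eps in np.linspace(gwp(x_grid).min(), gwp(x_grid).max(), 10):
    feasible = x_grid[gwp(x_grid) <= eps]          # epsilon-constraint on GWP
    if feasible.size:
        best = feasible[np.argmin(cost(feasible))] # min cost s.t. GWP bound
        pareto.append((cost(best), gwp(best)))

for c, g in pareto:
    print(f"cost={c:8.2f}  GWP={g:7.2f}")
```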

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

Power systems are large-scale nonlinear systems of high complexity. Various optimization techniques and expert systems have been used in power system planning. However, there are always factors that cannot be quantified, modeled, or even expressed by expert systems. Moreover, such planning problems are often large-scale optimization problems; although computational algorithms capable of handling high-dimensional problems exist, their computational costs remain very high. To address these problems, this paper explores the efficiency and effectiveness of combining mathematical algorithms with human intelligence. It has been observed that humans can join the decision-making process through cognitive feedback. Based on cognitive feedback and the genetic algorithm, a new algorithm called the cognitive genetic algorithm is presented. This algorithm can clarify and extract human cognition. As an important application of the cognitive genetic algorithm, a practical decision method for power distribution system planning is proposed. Using this decision method, optimal results that satisfy human expertise can be obtained while the limitations of human experts are minimized.
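As a sketch of the general idea, the toy genetic algorithm below blends a computed objective with a stand-in for interactive human ratings; the encoding, weighting and feedback function are illustrative assumptions, not the paper's cognitive genetic algorithm.

```python
# Minimal GA whose fitness mixes a computed objective with a (simulated)
# human cognitive rating. All problem details are illustrative.
import random

def objective(plan):                # e.g., negated cost of a planning option
    return -sum(plan)

def human_rating(plan):             # stand-in for interactive expert feedback
    return 1.0 if plan[0] == 1 else 0.0   # "expert prefers" plans with gene 0 set

def fitness(plan, w_human=0.5):
    return (1 - w_human) * objective(plan) + w_human * 10 * human_rating(plan)

def cognitive_ga(n_genes=8, pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_genes)
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < 0.1:                 # bit-flip mutation
                i = random.randrange(n_genes)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(cognitive_ga())
```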

Relevance: 100.00%

Abstract:

Effectively using heterogeneous, distributed information has attracted much research in recent years. Current web services technologies have been used successfully in some non-data-intensive distributed prototype systems; however, most of them do not work well in data-intensive environments. This paper provides an infrastructure layer for effectively delivering spatial information services over the Internet using web services in a data-intensive environment. We extensively investigate and analyze the overhead of web services in this setting and propose new optimization techniques that greatly increase the system's efficiency. Our experiments show that these techniques are well suited to data-intensive environments. Finally, we discuss the requirements these techniques place on the deployment of web services over the Internet.

Relevance: 100.00%

Abstract:

This paper explores the use of the optimization procedures in SAS/OR software with application to contemporary logistics distribution network design using an integrated multiple criteria decision making approach. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and goal programming (GP), considers both quantitative and qualitative factors. In the integrated approach, AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, a GP model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. To facilitate the use of the integrated approach by SAS users, an ORMCDM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear programming models based on the selected GP model. An example is given to illustrate how one could use the code to design the logistics distribution network.
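Although the paper's implementation is a SAS macro, the AHP step it describes is standard and can be sketched compactly. In the sketch below, the comparison matrix, costs and budget are illustrative, and the greedy selection merely stands in for the full goal-programming model.

```python
# AHP priorities from a pairwise-comparison matrix via its principal
# eigenvector, then a toy resource-limited warehouse selection.
import numpy as np

# Saaty-scale pairwise comparisons of three hypothetical warehouses.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                        # AHP priorities, summing to 1

# Consistency ratio check (random index RI = 0.58 for n = 3).
n = A.shape[0]
CI = (eigvals.real.max() - n) / (n - 1)
print("priorities:", np.round(w, 3), " CR:", round(CI / 0.58, 3))

# Greedy stand-in for the GP model: pick warehouses by priority
# without exceeding a budget.
costs, budget = [40, 25, 20], 50
chosen = []
for i in np.argsort(-w):
    if costs[i] <= budget:
        chosen.append(int(i))
        budget -= costs[i]
print("selected warehouses:", chosen)
```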

Relevance: 100.00%

Abstract:

Purpose – The purpose of this research is to develop a holistic approach to maximizing the customer service level while minimizing the logistics cost, using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike prevalent optimization techniques, this paper proposes an integrated approach that considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments.

Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, built on the existing literature and validated with an example case. The model integrates a newly developed fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem.

Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously.

Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Given the need to generalize the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further validate the research output.

Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of an optimal transshipment network.

Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
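The fuzzy-AHP ingredient can be illustrated in a few lines: pairwise judgements become triangular fuzzy numbers (l, m, u) that capture uncertainty and are then defuzzified and prioritized. The criteria and values below are illustrative, and centroid defuzzification with geometric-mean prioritization is one common recipe, not necessarily the paper's exact FAHP.

```python
# Triangular-fuzzy pairwise comparisons -> crisp criterion weights.
import numpy as np

# Fuzzy comparison of three hypothetical criteria: cost, speed, reliability.
F = [[(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
     [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
     [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)]]

def centroid(t):                       # defuzzify a triangular number
    l, m, u = t
    return (l + m + u) / 3.0

C = np.array([[centroid(t) for t in row] for row in F])
# Geometric-mean prioritization (a common crisp-AHP step after defuzzifying).
gm = np.prod(C, axis=1) ** (1.0 / C.shape[0])
weights = gm / gm.sum()
print("criterion weights:", np.round(weights, 3))
```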

Relevance: 100.00%

Abstract:

Accelerated probabilistic modeling algorithms based on the stochastic local search (SLS) technique are considered. A general algorithm scheme and a specific combinatorial optimization method using the "golden section" rule (the GS-method) are given. Convergence rates are derived using Markov chains. An overview of current combinatorial optimization techniques is presented.
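The GS-method applies the "golden section" rule inside a combinatorial stochastic local search; for background, the classical continuous golden-section search it is named after looks like this.

```python
# Golden-section search: minimize a unimodal 1-D function on [a, b].
import math

def golden_section(f, a, b, tol=1e-6):
    phi = (math.sqrt(5) - 1) / 2          # ~0.618, golden ratio conjugate
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                   # minimum lies in [a, d]
            b, d = d, c
            c = b - phi * (b - a)
        else:                             # minimum lies in [c, b]
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

print(golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0))  # ≈ 2.0
```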

Relevance: 100.00%

Abstract:

AMS subject classification: 68Q22, 90C90

Relevance: 100.00%

Abstract:

Differential evolution is an optimisation technique that has been successfully employed in various applications. In this paper, we apply differential evolution to the problem of extracting the optimal colours of a colour map for quantised images. The choice of entries in the colour map is crucial for the resulting image quality, as it forms a look-up table that is used for all pixels in the image. We show that differential evolution can be effectively employed as a method for deriving the entries in the map. In order to optimise the image quality, our differential evolution approach is combined with a local search method that is guaranteed to find the locally optimal colour map. This hybrid approach is shown to outperform various commonly used colour quantisation algorithms on a set of standard images.
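A minimal DE/rand/1/bin sketch conveys the core of differential evolution. It is shown on a generic continuous objective, since the actual colour-map fitness would score a candidate palette against the source image.

```python
# Differential evolution (DE/rand/1/bin) for a continuous objective.
import numpy as np

def de(objective, bounds, pop_size=20, F=0.8, CR=0.9, iters=200, rng=None):
    rng = rng or np.random.default_rng(0)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # Three distinct donors, none equal to the target vector i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)      # rand/1 mutation
            cross = rng.random(dim) < CR                   # binomial crossover
            cross[rng.integers(dim)] = True                # keep >=1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = objective(trial)
            if f_trial < fit[i]:                           # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

best, val = de(lambda x: np.sum(x**2), bounds=[(-5, 5)] * 3)
print(best, val)
```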

Relevance: 100.00%

Abstract:

This work is directed towards optimizing the radiation pattern of smart antennas using genetic algorithms. A structure for smart antennas based on Space Division Multiple Access (SDMA) is proposed, composed of adaptive antennas, each of which has adjustable weight elements for the amplitudes and phases of signals. The corresponding radiation pattern formula, suitable for numerical optimization techniques, is deduced. Genetic algorithms are applied to search for the best phase-amplitude weights, or phase-only weights, with which the optimal radiation pattern can be achieved.

One highlight of this work is the proposed optimal radiation pattern concept and its implementation by genetic algorithms. The results show that genetic algorithms are effective for the true Signal-Interference-Ratio (SIR) design of smart antennas: not only can nulls be placed in the directions of the interfering signals, but main lobes can simultaneously be formed in the directions of the desired signals. The optimal radiation pattern of a smart antenna possessing SDMA ability has been achieved.

The second highlight is the weight search by genetic algorithms for the optimal radiation pattern design of antennas with more than one interfering signal. The regular criterion for determining which chromosome should be kept for the next iteration is modified so as to improve the performance of the genetic algorithm. The results show that the modified criterion can speed up the iteration and guarantee its convergence.

In addition, a comparison between phase-amplitude perturbations and phase-only perturbations for the radiation pattern design of smart antennas is carried out, and the effects of the parameters used by the genetic algorithm on the optimal radiation pattern design are investigated. Valuable results are obtained.
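The radiation-pattern formula the weights act on is the standard array factor, AF(θ) = Σₙ wₙ exp(j·2π/λ·n·d·sin θ). The sketch below evaluates it for a uniform linear array with phase-only steering weights; the element count, spacing and steering angle are illustrative, and a GA would search the weight vector w rather than fix it as done here.

```python
# Array factor of an N-element uniform linear array with complex weights.
import numpy as np

N, d_over_lambda = 8, 0.5
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)

# Phase-only weights steering the main lobe toward 20 degrees (illustrative).
steer = np.deg2rad(20)
n = np.arange(N)
w = np.exp(-1j * 2 * np.pi * d_over_lambda * n * np.sin(steer))

af = np.abs(np.exp(1j * 2 * np.pi * d_over_lambda *
                   np.outer(np.sin(theta), n)) @ w)
af_db = 20 * np.log10(af / af.max())
print("peak at %.1f deg" % np.rad2deg(theta[af_db.argmax()]))  # ≈ 20.0
```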

Relevance: 100.00%

Abstract:

3D geographic information systems (GIS) are data- and computation-intensive in nature, while Internet users are usually equipped with low-end personal computers and network connections of limited bandwidth. Data reduction and performance optimization techniques are therefore of critical importance in quality of service (QoS) management for online 3D GIS. In this research, QoS management issues in distributed 3D GIS presentation were studied to develop 3D TerraFly, an interactive 3D GIS that supports high-quality online terrain visualization and navigation.

To tackle the QoS management challenges, a multi-resolution rendering model, adaptive level of detail (LOD) control and mesh simplification algorithms were proposed to effectively reduce terrain model complexity. The rendering model is adaptively decomposed into sub-regions of up to three detail levels according to viewing distance and other dynamic quality measurements. The mesh simplification algorithm was designed as a hybrid that combines edge straightening and quad-tree compression to reduce mesh complexity by removing geometrically redundant vertices; its main advantage is that grid meshes can be processed directly in parallel without triangulation overhead. Algorithms facilitating remote access and distributed processing of volumetric GIS data, such as data replication, directory service, request scheduling, predictive data retrieval and caching, were also proposed.

A prototype of 3D TerraFly implemented in this research demonstrates the effectiveness of the proposed QoS management framework in handling interactive online 3D GIS. The system implementation details and future directions of this research are also addressed in this thesis.
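The adaptive LOD rule can be stated in a few lines: each sub-region picks one of up to three detail levels from its viewing distance (possibly adjusted by other quality measurements). The thresholds below are illustrative assumptions, not the system's tuned values.

```python
# Distance-based level-of-detail selection for terrain sub-regions.

def lod_for_region(view_distance_m, near=500.0, far=2000.0):
    """Return 0 (full detail), 1 (medium) or 2 (coarse) for a sub-region."""
    if view_distance_m < near:
        return 0
    if view_distance_m < far:
        return 1
    return 2

# The region grid is re-evaluated per frame; closer tiles get denser meshes.
for dist in (120.0, 900.0, 5000.0):
    print(dist, "->", lod_for_region(dist))
```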

Relevance: 100.00%

Abstract:

Today, modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools, and hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved by accelerating the elements that incur performance overheads.

The concepts in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications such as performance, energy consumption, and resource costs are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated against system requirements. (4) A system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs.

Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique. The system reaches a 2.8X performance improvement and saves 31.84% energy consumption with the Bus-IP design; the co-processor design achieves 7.9X performance and saves 75.85% energy consumption.
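How a hotspot's share of runtime bounds the whole-system gain follows from Amdahl's law; the sketch below uses illustrative fractions, not the paper's measured H.264 profile.

```python
# Amdahl's law: overall speedup when only the hotspot is accelerated.

def amdahl(hot_fraction, hot_speedup):
    """hot_fraction: share of runtime spent in the accelerated hotspot."""
    return 1.0 / ((1.0 - hot_fraction) + hot_fraction / hot_speedup)

# If a profiled hotspot is 70% of runtime and the FPGA accelerator runs it
# 20x faster, the whole system speeds up by roughly 3x.
print(round(amdahl(0.70, 20.0), 2))   # ≈ 2.99
```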

Relevance: 100.00%

Abstract:

Developing new nanomaterials, switches and nanoscale machines that respond to small, specific temperature variations should prove valuable to a multitude of fields working in nanotechnology. A further objective is to convince the reader that DNA-based nanotechnologies offer enormous possibilities for real-time temperature monitoring at the nanoscale. In the Results section, we exploit the properties of DNA to create versatile, robust and easy-to-use thermometers. Using a series of new strategies inspired by nature, we are able to create DNA nanothermometers capable of measuring temperatures from 25 to 95°C with a precision of <0.1°C. By creating new multimeric DNA complexes, we develop ultrasensitive thermometers that can increase their fluorescence 20-fold over an interval of 7°C. By combining several DNA strands with different dynamic ranges, we can build thermometers showing a linear phase transition over 50°C. Finally, the response speed, precision and reversibility of the developed thermometers are illustrated by monitoring the temperature inside a single well of a qPCR instrument. In conclusion, potential applications of such nanothermometers in synthetic biology, cellular thermal imaging, DNA nanomachines and controlled delivery are considered.
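The midpoint and sharpness of such thermometers are commonly described by a two-state (van 't Hoff) melting model, in which the folded fraction, and hence the fluorescence signal, is a sigmoid in temperature; a larger transition enthalpy gives a steeper, more "ultrasensitive" response, while mixing strands with staggered Tm values widens the linear range. The parameter values below are illustrative assumptions.

```python
# Two-state melting model: folded fraction vs. temperature.
import numpy as np

R = 8.314            # gas constant, J/(mol*K)

def folded_fraction(T_celsius, Tm_celsius=50.0, dH=-200e3):
    """dH < 0 for folding; at T = Tm half the population is folded."""
    T, Tm = T_celsius + 273.15, Tm_celsius + 273.15
    K = np.exp(-dH / R * (1.0 / T - 1.0 / Tm))   # two-state equilibrium
    return K / (1.0 + K)

for t in (30, 45, 50, 55, 70):
    print(t, "C ->", round(float(folded_fraction(t)), 3))
```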

Relevance: 100.00%

Abstract:

The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in US Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. Chapters 1 and 2 present general background information relevant to the development of probabilistic models for predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.
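The survival-model link underlying this line of work is that the probability of decompression sickness over an exposure is one minus the survival of the integrated instantaneous risk (hazard) r(t), i.e. P(DCS) = 1 - exp(-∫ r(t) dt). The hazard form in the sketch below is a generic placeholder, not one of the dissertation's fitted definitions.

```python
# Probability of DCS from an integrated instantaneous-risk (hazard) function.
import numpy as np

def p_dcs(hazard, t, dt):
    """P(DCS) = 1 - exp(-integral of hazard over the exposure)."""
    return 1.0 - np.exp(-np.sum(hazard(t)) * dt)

# Placeholder hazard: risk accrues while a (toy) supersaturation persists
# and decays after ascent.
def toy_hazard(t):
    supersaturation = np.maximum(0.0, 1.0 - 0.02 * t)
    return 1e-3 * supersaturation

t = np.arange(0.0, 60.0, 0.1)        # minutes after surfacing
print("P(DCS) ~", round(p_dcs(toy_hazard, t, 0.1), 4))
```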

The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique, and model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those previously published. Results indicated that the best performance came from a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.

We developed ten pharmacokinetic compartmental models that include explicit delay mechanics, to determine whether predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that, for many of our models, symptoms are often reported after risk accumulation begins, we hypothesized that the inclusion of delays might improve correlation between the model predictions and observed data. Model selection techniques identified two models as having the best overall performance, but both comparison with the best-performing model without delay and model selection against our best previously identified no-delay pharmacokinetic model indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.

Our final investigation explored parameter bounding techniques to identify parameter regions within which statistical model failure cannot occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we successfully identify regions where model failure will not occur and locate the boundaries of those regions using a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.