913 results for Multi-objective analysis
Abstract:
This thesis chronicles the design and implementation of an Internet/Intranet and database-based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.
Abstract:
An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, emergency managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and how to allocate commodities to each POD. Previous research has addressed several of these issues, but not while incorporating the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic based on simulated annealing has been developed. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA) algorithm, in which the initial solution and the cooling rate were determined via a design of experiments.
The experiment showed that the initial temperature (T0) is irrelevant, but that the temperature reduction factor (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions efficiently. Finally, an illustrative example shows that the meta-model is practical.
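The MOSA heuristic described in this abstract can be illustrated with a minimal sketch. The toy bi-objective function, the neighborhood move, the acceptance rule (average worsening across objectives), and all parameter values below are illustrative assumptions, not the thesis's actual formulation; only the roles of the initial temperature T0 and the geometric cooling factor δ follow the abstract.

```python
import math
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mosa(objectives, neighbor, x0, t0=1.0, delta=0.99, iters=2000, seed=0):
    """Multi-objective simulated annealing sketch: keep an archive of
    nondominated solutions; accept worsening moves with a probability
    that decays as the temperature cools by the factor delta."""
    rng = random.Random(seed)
    x, fx = x0, objectives(x0)
    archive = [(x, fx)]
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = objectives(y)
        # "Energy" difference: average worsening across the objectives.
        dE = sum(max(0.0, fy[i] - fx[i]) for i in range(len(fx))) / len(fx)
        if dE == 0.0 or rng.random() < math.exp(-dE / t):
            x, fx = y, fy
            if not any(dominates(g, fy) for _, g in archive):
                archive = [(s, g) for s, g in archive if not dominates(fy, g)]
                archive.append((y, fy))
        t *= delta  # very gradual cooling, as the abstract's experiment recommends
    return archive

# Toy bi-objective problem: minimize (x^2, (x - 2)^2) over the reals.
f = lambda x: (x * x, (x - 2.0) ** 2)
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
front = mosa(f, step, x0=5.0)
```

The archive returned is an approximation of the Pareto front; the design-of-experiments tuning of T0 and δ mentioned in the abstract would sit outside this loop.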
Abstract:
The purpose of this thesis was to identify the optimal design parameters for a jet nozzle that attains a high local maximum shear stress while maximizing the average shear stress on the floor of a fluid-filled system. This research examined how geometric parameters of a jet nozzle, such as the nozzle's angle, height, and orifice, influence the shear stress created on the bottom surface of a tank. Simulations were run using a Computational Fluid Dynamics (CFD) software package to determine shear stress values for a parameterized geometric domain including the jet nozzle. A response surface was created based on the shear stress values obtained from 112 simulated designs. Multi-objective optimization software used the response surface to generate designs with the best combination of parameters to achieve maximum shear stress and maximum average shear stress. The optimal configuration of parameters achieved larger shear stress values than a commercially available design.
Abstract:
Knowledge of cell electronics has led to their integration into medicine, either by physically interfacing electronic devices with biological systems or by using electronics for both detection and characterization of biological materials. In this dissertation, an electrical impedance sensor (EIS) was used to measure electrode surface impedance changes in cell samples in order to assess the human and environmental toxicity of nanoscale materials in 2D and 3D cell culture models. The impedimetric response of human lung fibroblasts and rainbow trout gill epithelial cells exposed to various nanomaterials was tested to determine the materials' kinetic effects on the cells and to demonstrate the biosensor's ability to monitor nanotoxicity in real time. Further, the EIS allowed rapid, real-time, multi-sample analysis, creating a versatile, noninvasive tool able to provide quantitative information about alterations in cellular function. We then extended the unique capabilities of the EIS to real-time analysis of cancer cell responses to externally applied, low-intensity alternating electric fields at different intermediate frequencies. Decreases in the growth profiles of ovarian and breast cancer cells were observed with the application of 200 kHz and 100 kHz fields, respectively, indicating specific inhibitory effects on dividing cells in culture, in contrast to the non-cancerous HUVECs and mammary epithelial cells. We then sought to enhance the effects of the electric field by altering the cancer cells' electronegative membrane properties with HER2 antibody functionalized nanoparticles. An Annexin V/EthD-III assay and zeta potential measurements were performed to determine the cell death mechanism, indicating apoptosis and a decrease in zeta potential with the incorporation of the nanoparticles.
With more negatively charged HER2-AuNPs attached to the cancer cell membrane, the decrease in membrane potential would thus leave the cells more vulnerable to the detrimental effects of the applied electric field due to the decrease in surface charge. Therefore, by altering the cell membrane potential, one could possibly control the fate of the cell. This whole cell-based biosensor will enhance our understanding of the responsiveness of cancer cells to electric field therapy and demonstrate potential therapeutic opportunities for electric field therapy in the treatment of cancer.
Abstract:
Mapping vegetation patterns over large extents using remote sensing methods requires field sample collections for two different purposes: (1) the establishment of plant association classification systems from samples of relative abundance estimates; and (2) training for supervised image classification and accuracy assessment of maps derived from satellite data. One challenge for both procedures is establishing confidence in the results and enabling analysis across multiple spatial scales. Continuous data sets that enable cross-scale studies are very time-consuming and expensive to acquire, and such extensive field sampling can be invasive. The use of high-resolution aerial photography (hrAP) offers an alternative to extensive, invasive field sampling and can provide large-volume, spatially continuous reference information that can meet the challenges of confidence building and multi-scale analysis.
Abstract:
Many classical as well as modern optimization techniques exist. One modern method, belonging to the field of swarm intelligence, is ant colony optimization. This relatively new approach to optimization uses artificial ants and is inspired by the way real ants search for food. In this thesis, a novel ant colony optimization technique for continuous domains was developed. The goal was to improve computing time and robustness compared to other optimization algorithms. Optimization function spaces can have extreme topologies and are therefore difficult to optimize. The proposed method effectively searched the domain and solved difficult single-objective optimization problems. The developed algorithm was run on numerous classic test cases for both single- and multi-objective problems. The results demonstrate that the method is robust and stable, and that the number of objective function evaluations is comparable to other optimization algorithms.
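As a rough illustration of ant colony optimization for continuous domains, the sketch below follows the general Gaussian-kernel scheme popularized as ACO-R by Socha and Dorigo; the thesis's algorithm may differ substantially, and the guide-selection rule and every parameter value here are illustrative assumptions.

```python
import random

def aco_continuous(f, bounds, ants=10, archive_size=10, xi=0.85, iters=200, seed=1):
    """Gaussian-kernel ant colony optimization sketch for continuous domains.
    Each ant samples near a solution from a ranked archive; the Gaussian
    spread per dimension is set by the average distance from the guide to
    the other archived solutions, so sampling narrows as the colony converges."""
    rng = random.Random(seed)
    lo, hi = bounds
    dim = len(lo)
    archive = [[rng.uniform(lo[d], hi[d]) for d in range(dim)]
               for _ in range(archive_size)]
    archive.sort(key=f)  # best solution first
    for _ in range(iters):
        new = []
        for _ in range(ants):
            # Bias guide choice toward the best-ranked archive members.
            guide = archive[min(int(abs(rng.gauss(0, 2))), archive_size - 1)]
            x = []
            for d in range(dim):
                sigma = xi * sum(abs(s[d] - guide[d]) for s in archive) / (archive_size - 1)
                v = rng.gauss(guide[d], sigma + 1e-12)
                x.append(min(max(v, lo[d]), hi[d]))  # clamp to the search box
            new.append(x)
        archive = sorted(archive + new, key=f)[:archive_size]  # keep the best
    return archive[0]

# Sphere function in 2-D: global minimum at the origin.
best = aco_continuous(lambda x: sum(v * v for v in x), ([-5.0, -5.0], [5.0, 5.0]))
```

Counting calls to `f` in this loop is how one would compare objective-function evaluations against other optimizers, as the abstract describes.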
Abstract:
Thirty-seven deep-sea sediment cores from the Arabian Sea were studied geochemically (49 major and trace elements) for four time slices during the Holocene and the last glacial, and in one high-sedimentation-rate core (century-scale resolution), to detect tracers of past variations in the intensity of the atmospheric monsoon circulation and its hydrographic expression in the ocean surface. This geochemical multi-tracer approach, coupled with additional information on the grain size composition of the clastic fraction and the bulk carbonate and biogenic opal contents, makes it possible to characterize the sedimentological regime in detail. Sediments characterized by a specific elemental composition (enrichment) originated from the following sources: river suspensions from the Tapti and Narbada, draining the Indian Deccan traps (Ti, Sr); Indus sediments and dust from Rajasthan and Pakistan (Rb, Cs); dust from Iran and the Persian Gulf (Al, Cr); dust from central Arabia (Mg); dust from East Africa and the Red Sea (Zr/Hf, Ti/Al). Corg, Cd, Zn, Ba, Pb, U, and the HREE are associated with the intensity of upwelling in the western Arabian Sea, but only those patterns that are consistently reproduced by all of these elements can be directly linked with the intensity of the southwest monsoon. Relying on information from a single element can be misleading, as each element is affected by various processes other than upwelling intensity and the nutrient content of surface water alone. The application of the geochemical multi-tracer approach indicates that the intensity of the southwest monsoon was low during the LGM, declined to a minimum from 15,000-13,000 ¹⁴C yr BP, intensified slightly at the end of this interval, was almost stable during the Bölling, Alleröd, and the Younger Dryas, but then intensified in two abrupt successions at the end of the Younger Dryas (9,900 ¹⁴C yr BP) and especially in a second event during the early Holocene (8,800 ¹⁴C yr BP).
Dust discharge by northwesterly winds from Arabia exhibited a similar evolution, but followed an opposite course: high during the LGM, with two primary sources, the central Arabian desert and the dry Persian Gulf region. Dust discharge from both regions reached a pronounced maximum at 15,000-13,000 ¹⁴C yr BP. At the end of this interval, however, the dust plumes from the Persian Gulf area ceased dramatically, whereas dust discharge from central Arabia decreased only slightly. Dust discharge from East Africa and the Red Sea increased synchronously with the two major events of southwest monsoon intensification as recorded in the nutrient content of surface waters. In addition to the tracers of past dust flux and surface water nutrient content, the geochemical multi-tracer approach provides information on the history of deep-sea ventilation (Mo, S), which was much lower during the last glacial maximum than during the Holocene. The multi-tracer approach, i.e. a few sedimentological parameters plus a set of geochemical tracers widely available from various multi-element analysis techniques, is a highly applicable technique for studying the complex sedimentation patterns of an ocean basin and, specifically in the case of the Arabian Sea, can even reveal the seasonal structure of climate change.
Abstract:
A uniform chronology for foraminifera-based sea surface temperature records has been established for more than 120 sediment cores obtained from the equatorial and eastern Atlantic up to the Arctic Ocean. The chronostratigraphy of the last 30,000 years is mainly based on published δ¹⁸O records and ¹⁴C ages from accelerator mass spectrometry, converted into calendar-year ages. The high-precision age control provides the database necessary for the uniform reconstruction of the climate interval of the Last Glacial Maximum within the GLAMAP-2000 project.
Abstract:
An important problem faced by the oil industry is distributing multiple oil products through pipelines. Distribution is done in a network composed of refineries (source nodes), storage parks (intermediate nodes), and terminals (demand nodes) interconnected by a set of pipelines transporting oil and derivatives between adjacent areas. Constraints related to storage limits, delivery time, source availability, and sending and receiving limits, among others, must be satisfied. Some researchers treat this problem from a discrete viewpoint in which the flow in the network is seen as the sending of batches. Usually, there is no separation device between batches of different products, and the losses due to interfaces may be significant. Minimizing delivery time is a typical objective adopted by engineers when scheduling product shipments in pipeline networks. However, the costs incurred due to losses at interfaces cannot be disregarded. The cost also depends on pumping expenses, which are mostly due to the cost of electricity. Since industrial electricity tariffs vary over the day, pumping at different time periods has different costs. This work presents an experimental investigation of computational methods designed to deal with the problem of distributing oil derivatives in networks while considering three minimization objectives simultaneously: delivery time, losses due to interfaces, and electricity cost. The problem is NP-hard and is addressed with hybrid evolutionary algorithms. The hybridizations are mainly focused on Transgenetic Algorithms and classical multi-objective evolutionary algorithm architectures such as MOEA/D, NSGA2, and SPEA2. Three architectures, named MOTA/D, NSTA, and SPETA, are applied to the problem. An experimental study compares the algorithms on thirty test cases. To analyse the results, Pareto-compliant quality indicators are used, and the significance of the results is evaluated with non-parametric statistical tests.
Abstract:
Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for those problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of using an unlimited archive is the guarantee that all nondominated solutions generated during the process will be saved. However, due to the large number of solutions that can be generated, maintaining an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is to use a limited archive. The problem that emerges from this situation is the need to discard nondominated solutions when the archive is full. Several techniques have been proposed to handle this problem, but investigations show that none of them reliably prevents the deterioration of the archives. This work investigates a technique to be used together with ideas previously proposed in the literature to deal with limited archives. The technique consists of keeping discarded solutions in a secondary archive and periodically recycling these solutions, bringing them back into the optimization. Three methods of recycling are presented. In order to verify whether these ideas are capable of improving the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D, and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated based on statistical tests.
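The recycling idea described in this abstract can be sketched as a bounded nondominated archive with a secondary store. The crowding rule used to pick which solution to discard (drop the point closest to its nearest neighbour) and the recycling schedule are illustrative assumptions; the work's three actual recycling methods are not reproduced here.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class RecyclingArchive:
    """Bounded nondominated archive with a secondary 'recycle' store:
    nondominated solutions discarded for lack of space are kept aside
    and can later be retried against the evolving main archive."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.main = []     # current nondominated set (objective vectors)
        self.recycle = []  # discarded nondominated solutions

    def add(self, f):
        if any(dominates(g, f) or g == f for g in self.main):
            return False
        self.main = [g for g in self.main if not dominates(f, g)]
        self.main.append(f)
        if len(self.main) > self.capacity:
            # Discard the most crowded point, but remember it for recycling.
            crowd = lambda g: min(sum((a - b) ** 2 for a, b in zip(g, h))
                                  for h in self.main if h is not g)
            victim = min(self.main, key=crowd)
            self.main.remove(victim)
            self.recycle.append(victim)
        return True

    def recycle_back(self):
        """Periodically retry discarded solutions against the current archive."""
        old, self.recycle = self.recycle, []
        for f in old:
            self.add(f)

arc = RecyclingArchive(capacity=4)
for i in range(8):
    arc.add((i, 7 - i))  # eight mutually nondominated points; four get discarded
arc.recycle_back()       # bring the discarded ones back into play
```

The point of the secondary archive is that a recycled solution may re-enter the main set later, when the region of the front it covers has deteriorated.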
Abstract:
The Quadratic Minimum Spanning Tree (QMST) problem is a generalization of the Minimum Spanning Tree problem in which, in addition to the linear cost associated with each edge, quadratic costs associated with pairs of edges must be considered. The quadratic costs are due to interaction costs between the edges. When interactions occur between adjacent edges only, the problem is named the Adjacent Only Quadratic Minimum Spanning Tree (AQMST) problem. Both QMST and AQMST are NP-hard and model a number of real-world applications involving infrastructure network design. Linear and quadratic costs are summed in the mono-objective versions of the problems. However, real-world applications often deal with conflicting objectives. In those cases, considering linear and quadratic costs separately is more appropriate, and multi-objective optimization provides more realistic modelling. Exact and heuristic algorithms are investigated in this work for the Bi-objective Adjacent Only Quadratic Spanning Tree problem. The following techniques are proposed: backtracking, branch-and-bound, Pareto Local Search, Greedy Randomized Adaptive Search Procedure, Simulated Annealing, NSGA-II, Transgenetic Algorithm, Particle Swarm Optimization, and a hybridization of the Transgenetic Algorithm with the MOEA-D technique. Pareto-compliant quality indicators are used to compare the algorithms on a set of benchmark instances proposed in the literature.
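For illustration, evaluating the two separate objectives of the bi-objective AQMST for one candidate spanning tree might look like the sketch below; the edge costs and interaction costs are made-up placeholders, not instances from the benchmark set.

```python
from itertools import combinations

def aqmst_costs(tree_edges, linear_cost, pair_cost):
    """Evaluate the two objectives of the bi-objective AQMST for one
    spanning tree: (total linear edge cost, total interaction cost over
    pairs of *adjacent* edges, i.e. edges sharing an endpoint)."""
    lin = sum(linear_cost[e] for e in tree_edges)
    quad = 0
    for e, f in combinations(tree_edges, 2):
        if set(e) & set(f):  # adjacent-only interaction
            quad += pair_cost.get((e, f), pair_cost.get((f, e), 0))
    return lin, quad

# Toy 3-node instance: edges as vertex pairs, with hypothetical costs.
linear = {(0, 1): 3, (1, 2): 4, (0, 2): 6}
pairs = {((0, 1), (1, 2)): 5}
costs = aqmst_costs([(0, 1), (1, 2)], linear, pairs)
```

A multi-objective algorithm would compare such cost pairs by Pareto dominance rather than summing them, which is the distinction the abstract draws from the mono-objective QMST.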
Abstract:
The current study includes theoretical and methodological reflections on the quality of life in the city of Uberlândia, Minas Gerais. It starts from the premise that quality of life is multifactorial and permanently under construction, with the main objective of analyzing it as one of the components of the Healthy Cities movement. The theoretical research focused on the concepts of healthy cities, quality of life, health, sustainability, well-being, happiness, indexes, and indicators. Through multiple research strategies, documentary and field-based, of a quantitative and qualitative character, this exploratory and descriptive research offers a contribution to studies on the quality of life in cities. It is proposed that such studies begin by working with a concept, such as a notion of quality of life suited to the particular reality under study, which can draw on concepts already established, such as health; this step is important in exploratory research. The studies may address objective aspects, subjective aspects, or both. In the objective dimension, the most common approach, the variables and indicators traditionally considered relate to: urban infrastructure (health, education, leisure, security, mobility), dwelling (quantitative and qualitative housing deficits), urban structure (density and mixed uses), socioeconomic characteristics (age, income, education), urban infrastructure (sanitation, communication), and governance (social mobilization and participation). For the subjective dimension, more recent and less usual, it is proposed to consider (dis)satisfaction, that is, personal assessments of the objective aspects. In conclusion, being intrinsically related to health, quality of life also has a number of determinants, and reaching the ideal of quality of life depends on the action of all citizens, based on the recognition of networks and territories, in an interscalar and intersectoral perspective.
Therefore, emphasis is given to the potential of tools such as observatories to monitor and intervene in reality, with the aim of building healthy cities.