910 results for Multi-Objective Optimization


Relevance:

30.00%

Publisher:

Abstract:

Multiobjective Generalized Disjunctive Programming (MO-GDP) optimization has been used for the synthesis of an important industrial process, isobutane alkylation. The two objective functions to be optimized simultaneously are the environmental impact, determined by means of Life Cycle Assessment (LCA), and the economic potential of the process. The main reason for including the minimization of the environmental impact in the optimization is the widespread environmental concern among the general public. To solve the problem we employed a hybrid simulation-optimization methodology: the superstructure of the process was developed directly in a chemical process simulator connected to a state-of-the-art optimizer. The model was formulated as a GDP and solved using a logic algorithm that avoids the reformulation as an MINLP (Mixed-Integer Non-Linear Programming problem). Our research yielded Pareto curves comprising three different configurations, where the LCA has been assessed by two different parameters: global warming potential and Eco-indicator 99.
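The bi-objective trade-off in this abstract is resolved by retaining only non-dominated designs. As a minimal sketch (not the MO-GDP algorithm itself, and with made-up numbers), a Pareto filter over candidate (cost, impact) pairs, with both objectives minimized, looks like:

```python
def pareto_front(points):
    """Return the non-dominated subset for bi-objective minimization.

    Each point is a (cost, impact) tuple; a point is dominated when some
    other point is no worse in both objectives and differs from it.
    """
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

# hypothetical (cost, global-warming-potential) pairs for four configurations
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
print(pareto_front(candidates))  # → [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
```

Each retained pair is one point on a Pareto curve; a maximized economic potential can be folded into this form by negating it.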

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to, and compared with, previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results demonstrate the dynamics of the new algorithm on a set of simple test problems.
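A minimal one-dimensional sketch of the idea, assuming a Gaussian model density whose mean is nudged by a stochastic step toward the fitness-weighted sample average (an illustration in the spirit of continuous PBIL, not the paper's exact update rule):

```python
import math
import random

def kl_descent_step(mu, sigma, f, n=200, lr=0.5):
    """One stochastic step for a Gaussian search density N(mu, sigma^2):
    draw samples, weight them by the objective f (treated as an
    unnormalized target density), and move the mean toward the
    weighted sample average."""
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    ws = [f(x) for x in xs]
    target = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
    return mu + lr * (target - mu)

# hypothetical unimodal objective peaked at x = 2
random.seed(0)
f = lambda x: math.exp(-(x - 2.0) ** 2)
mu = -1.0
for _ in range(30):
    mu = kl_descent_step(mu, 1.0, f)
print(mu)  # the model mean drifts toward the peak at 2
```

Repeated steps contract the model density onto the high-fitness region, which is the population-as-model view the paper formalizes.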

Relevance:

30.00%

Publisher:

Abstract:

In this paper, numerical simulations are used in an attempt to find optimal source profiles for high-frequency radiofrequency (RF) volume coils. Biologically loaded, shielded/unshielded circular and elliptical birdcage coils operating at 170 MHz, 300 MHz and 470 MHz are modelled using the FDTD method for both 2D and 3D cases. Taking advantage of the fact that some aspects of the electromagnetic system are linear, two approaches have been proposed for the determination of the drives for individual elements in the RF resonator. The first method is an iterative optimization technique with a kernel for the evaluation of RF fields inside an imaging plane of a human head model using pre-characterized sensitivity profiles of the individual rungs of a resonator; the second method is a regularization-based technique. In the second approach, a sensitivity matrix is explicitly constructed and a regularization procedure is employed to solve the ill-posed problem. Test simulations show that both methods can improve the B1-field homogeneity in both focused and non-focused scenarios. While the regularization-based method is more efficient, the first optimization method is more flexible, as it can take into account other issues such as controlling SAR or reshaping the resonator structures. It is hoped that these schemes and their extensions will be useful for the determination of multi-element RF drives in a variety of applications.
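The regularization step can be illustrated with a small Tikhonov (ridge) solve: given a pre-computed sensitivity matrix S mapping per-rung drive weights to the field at sample points, and a target field b, the regularized drives minimize ||Sw - b||^2 + lam*||w||^2. The matrices and numbers below are hypothetical, and a tiny Gaussian-elimination solver keeps the sketch dependency-free:

```python
def ridge_drives(S, b, lam):
    """Tikhonov-regularized least squares: solve (S^T S + lam*I) w = S^T b."""
    m = len(S[0])
    # normal-equation matrix and right-hand side
    A = [[sum(S[k][i] * S[k][j] for k in range(len(S))) + (lam if i == j else 0.0)
          for j in range(m)] for i in range(m)]
    rhs = [sum(S[k][i] * b[k] for k in range(len(S))) for i in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    # back-substitution
    w = [0.0] * m
    for i in range(m - 1, -1, -1):
        w[i] = (rhs[i] - sum(A[i][j] * w[j] for j in range(i + 1, m))) / A[i][i]
    return w

# identity sensitivity matrix, uniform target field, lam = 1.0
print(ridge_drives([[1.0, 0.0], [0.0, 1.0]], [1.0, 1.0], 1.0))  # → [0.5, 0.5]
```

As lam grows the solution shrinks toward zero, which is how the regularization tames the ill-posed inversion at the cost of a slightly biased fit.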

Relevance:

30.00%

Publisher:

Abstract:

Objective - To evaluate the association between maintaining joint hospital and maternity pens and the persistence of multi-drug-resistant (MDR) Salmonella enterica serovar Newport on 2 dairy farms. Design - Observational study. Sample Population - Feces and environmental samples from 2 dairy herds. Procedure - Herds were monitored for fecal shedding of S enterica Newport after outbreaks of clinical disease. Fecal and environmental samples were collected approximately monthly from pens housing sick cows and calving cows and from pens containing lactating cows. Cattle shedding the organism were tested serially on subsequent visits to determine carrier status. One farm was resampled after initiation of interventional procedures, including separation of hospital and maternity pens. Isolates were characterized via serotyping, determination of antimicrobial resistance phenotype, detection of the CMY-2 gene, and DNA fingerprinting. Results - The prevalence (32.4% and 33.3% on farms A and B, respectively) of isolating Salmonella from samples from joint hospital-maternity pens was significantly higher than the prevalence in samples from pens housing preparturient cows (0.8%, both farms) and postparturient cows on Farm B (8.8%). Multi-drug-resistant Salmonella Newport was isolated in high numbers from bedding material, feed refusals, lagoon slurry, and milk filters. One cow excreted the organism for 190 days. Interventional procedures yielded significant reductions in the prevalences of isolating the organism from fecal and environmental samples. Most isolates were of the C2 serogroup and were resistant to third-generation cephalosporins. Conclusions and Clinical Relevance - Management practices may be effective at reducing the persistence of MDR Salmonella spp in dairy herds, thus mitigating animal and public health risk.

Relevance:

30.00%

Publisher:

Abstract:

The use of gene guns in ballistically delivering DNA vaccine coated gold micro-particles to skin can potentially damage targeted cells, therefore influencing transfection efficiencies. In this paper, we assess cell death in the viable epidermis by non-invasive near infrared two-photon microscopy following micro-particle bombardment of murine skin. We show that the ballistic delivery of micro-particles to the viable epidermis can result in localised cell death. Furthermore, experimental results show the degree of cell death is dependent on the number of micro-particles delivered per unit of tissue surface area. Micro-particle densities of 0.16 +/- 0.27 (mean +/- S.D.), 1.35 +/- 0.285 and 2.72 +/- 0.47 per 1000 mu m(2) resulted in cell-death percentages of 3.96 +/- 5.22, 45.91 +/- 10.89 and 90.52 +/- 12.28, respectively. These results suggest that optimization of transfection by genes administered with gene guns is, among other effects, a compromise between micro-particle payload and cell death. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Multiresolution (or multi-scale) techniques make it possible for Web-based GIS applications to access large datasets. The performance of such systems relies on data transmission over the network and on multiresolution query processing. The latter has so far received little research attention in the literature, and the existing methods are not capable of processing large datasets. In this paper, we aim to improve multiresolution query processing in an online environment. A cost model for such queries is proposed first, followed by three strategies for their optimization. Significant theoretical improvement can be observed when comparing against available methods. Application of these strategies is also discussed, and similar performance enhancement can be expected if they are implemented in online GIS applications.

Relevance:

30.00%

Publisher:

Abstract:

Reducing fossil-fuel consumption and developing energy-saving technologies are issues of central importance for both industry and research, because of the drastic effects that anthropogenic pollutant emissions are having on the environment. While a growing number of norms and regulations are being issued to address these problems, the need to develop low-emission technologies is driving research in numerous industrial sectors. Although the deployment of renewable energy sources is seen as the most promising long-term solution, a complete and effective integration of these technologies is currently impractical, owing both to technical constraints and to the sheer share of energy production, currently met by fossil sources, that the alternative technologies would have to cover. Optimizing energy production and management, together with developing technologies that reduce energy consumption, instead represents an adequate solution to the problem, and one that can be deployed within shorter time horizons. The objective of this thesis is to investigate, develop and apply a set of numerical tools for optimizing the design and management of energy processes, to be used to reduce fuel consumption and optimize energy efficiency. The methodology developed relies on a model-based numerical approach, which exploits the predictive capabilities deriving from a mathematical representation of the processes to develop optimization strategies for them under realistic operating conditions.
In developing these procedures, particular emphasis is placed on the need to derive correct management strategies, accounting for the dynamics of the plants analysed, so as to obtain the best performance during actual operation. The energy-optimization problem is addressed with reference to three different technological applications. In the first, a multi-source plant serving the energy demand of a commercial building is considered. Since this system uses several technologies to produce the thermal and electrical energy required by the users, the correct load-allocation strategy must be identified to guarantee the plant's maximum energy efficiency. Based on a simplified model of the plant, the problem was solved with a deterministic Dynamic Programming algorithm, and the results were compared with those of a simpler rule-based strategy, thereby demonstrating the advantages of an optimal control strategy. The second application investigates the design of a hybrid solution for energy recovery from a hydraulic excavator. Since several technological layouts can be conceived to implement this solution, and the additional components must be sized correctly, a methodology is needed to evaluate the maximum performance obtainable from each alternative. The layouts were therefore compared on the basis of the machine's energy performance over a standardized digging cycle, estimated with the help of a detailed plant model.
Since adding energy-recovery devices introduces additional degrees of freedom into the system, their optimal control strategy also had to be determined in order to assess the maximum performance obtainable from each layout. This problem was again solved with a Dynamic Programming algorithm, exploiting a simplified system model devised for the purpose. Once the optimal performance of each design had been determined, a fair comparison among the alternatives was possible. In the third and final application, an organic Rankine cycle (ORC) plant for recovering waste heat from car exhaust gases is analysed. Although ORC plants can potentially produce significant increases in a vehicle's fuel savings, their correct operation requires complex control strategies able to cope with the variability of the process heat source; moreover, while fuel savings are maximized, the system must be kept within safe operating conditions. To address the problem, a robust and efficient plant model was built on the Moving Boundary Methodology to simulate the phase-change dynamics of the organic fluid and estimate plant performance. This model was then used to design a model predictive controller (MPC) able to estimate the optimal control parameters for managing the system during transient operation. To solve the corresponding non-linear dynamic optimization problem, an algorithm based on Particle Swarm Optimization was developed.
The results obtained with this controller were compared with those of a classical proportional-integral (PI) controller, again showing the energy advantages of adopting an optimal control strategy.
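The deterministic Dynamic Programming used for the load-allocation and excavator problems can be sketched on a toy dispatch task: an ideal store with a few discrete energy levels is charged when energy is cheap and discharged when it is expensive. The prices, demand, and store parameters below are illustrative only, not from the thesis:

```python
def dp_dispatch(prices, demand, cap=2, max_rate=1):
    """Minimum-cost dispatch by deterministic dynamic programming.

    State: integer energy level 0..cap of an ideal store (starts empty).
    At each step, buy demand[t] + u units at prices[t], where u is the
    store charge (+) or discharge (-), limited to max_rate per step.
    """
    INF = float("inf")
    cost = [0.0] + [INF] * cap                      # cost-to-reach each level
    for p, d in zip(prices, demand):
        new = [INF] * (cap + 1)
        for s in range(cap + 1):
            if cost[s] == INF:
                continue
            for u in range(-max_rate, max_rate + 1):
                s2, buy = s + u, d + u
                if 0 <= s2 <= cap and buy >= 0:
                    new[s2] = min(new[s2], cost[s] + p * buy)
        cost = new
    return min(cost)

# two periods, cheap then expensive: charge early, discharge later
print(dp_dispatch([1.0, 5.0], [1, 1]))  # → 2.0
```

A rule-based strategy that ignores future prices would buy one unit per period (cost 6.0 here), which mirrors the thesis's comparison between optimal and rule-based control.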

Relevance:

30.00%

Publisher:

Abstract:

The main objective of food-safety policy is to guarantee consumer health through specific safety rules and protocols. In order to meet the requirements of food safety and quality standardization, in 2002 the European Parliament and the Council of the EU (Regulation (EC) 178/2002 (EC, 2002)) sought to harmonize concepts, principles and procedures so as to provide a common basis for the regulation of food and feed originating from Member States at Community level. The formalization of standardization rules and protocols should, however, proceed through a more detailed and accurate understanding and harmonization of the global (macroscopic), pseudo-local (mesoscopic) and, possibly, local (microscopic) properties of food products. The main objective of this doctoral thesis is to illustrate how computational techniques can provide valid support for this analysis, through (i) the application of protocols and (ii) the improvement of widely applied techniques. A direct demonstration of the potential already offered by computational approaches is given in the first work, in which a docking-based virtual screening was applied to assess the preliminary xeno-androgenicity of some food contaminants. The second and third works concern the development and validation of new physico-chemical descriptors in a 3D-QSAR context. Named HyPhar (Hydrophobic Pharmacophore), the new methodology thus developed was used to explore the issue of selectivity among structurally related molecular targets, and thereby proved to possess the applicability and adaptability required in a food context.
Overall, the results allow us to be confident in the potential impact that in silico techniques may have on the identification and clarification of molecular events involved in the toxicological and nutritional aspects of foods.

Relevance:

30.00%

Publisher:

Abstract:

There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying 'causes' which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, training such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm which retains the general non-linear capabilities of previous models but uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
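As a toy stand-in for EM-based training (not the paper's latent-variable model itself), the E-step/M-step alternation can be shown on a two-component 1-D Gaussian mixture with fixed, equal variances and mixing weights:

```python
import math

def em_gmm_1d(data, mu, iters=50, sigma=1.0):
    """EM for a two-component 1-D Gaussian mixture with fixed variance
    and equal mixing weights: the E-step computes responsibilities of
    component 1 for each point, the M-step re-estimates the two means."""
    m1, m2 = mu
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in data:
            p1 = math.exp(-(x - m1) ** 2 / (2 * sigma ** 2))
            p2 = math.exp(-(x - m2) ** 2 / (2 * sigma ** 2))
            r.append(p1 / (p1 + p2))
        # M-step: responsibility-weighted means
        m1 = sum(ri * x for ri, x in zip(r, data)) / sum(r)
        m2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - sum(r))
    return m1, m2

# made-up data with two clusters near 0 and 5
data = [0.1, -0.2, 0.0, 4.9, 5.2, 5.1]
print(em_gmm_1d(data, (0.0, 1.0)))  # means converge near the two cluster centres
```

Each iteration increases the data likelihood, which is the property the paper exploits to train its non-linear latent variable model efficiently.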

Relevance:

30.00%

Publisher:

Abstract:

Several parties (stakeholders) are involved in a construction project. The conventional Risk Management Process (RMP) manages risks from a single party's perspective, which does not give adequate consideration to the needs of the others. The objective of multi-party risk management is to assist decision-makers in managing risk systematically and most efficiently in a multi-party environment. The Multi-party Risk Management Process (MRMP) consists of risk identification, structuring, analysis and response development from the perspectives of all parties. The MRMP has been applied to a cement plant construction project in Thailand to demonstrate its effectiveness.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents two hybrid genetic algorithms (HGAs) to optimize the component placement operation for collect-and-place machines in printed circuit board (PCB) assembly. The component placement problem is to optimize simultaneously (i) the assignment of components to a movable revolver head or assembly tour, (ii) the sequence of component placements on a stationary PCB in each tour, and (iii) the arrangement of component types on stationary feeders. The objective is to minimize the total traveling time spent by the revolver head in assembling all components on the PCB. The major difference between the HGAs lies in the initialization procedure: initial solutions are generated randomly in HGA1, whereas the Clarke and Wright savings method, the nearest-neighbor heuristic, and the neighborhood frequency heuristic are incorporated into HGA2. A computational study is carried out to compare the algorithms with different population sizes. The results show that HGA2 is superior to HGA1 in terms of total assembly time.
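One of HGA2's seeding heuristics, the nearest-neighbor rule, is easy to sketch: from the current placement, always move the head to the closest remaining component. The pad coordinates below are made up for illustration:

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Greedy nearest-neighbor placement sequence: repeatedly append the
    unplaced point closest to the last placed one."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(points[i], last))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# hypothetical pad coordinates on the board
pads = [(0, 0), (5, 0), (1, 0), (1, 1)]
print(nearest_neighbor_tour(pads))  # → [0, 2, 3, 1]
```

In HGA2 such heuristic tours seed part of the initial population, giving the genetic search a better starting point than purely random sequences.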

Relevance:

30.00%

Publisher:

Abstract:

A method has been constructed for the solution of a wide range of chemical plant simulation models including differential equations and optimization. Double orthogonal collocation on finite elements is applied to convert the model into an NLP problem that is solved either by the VF13AD package, based on successive quadratic programming, or by the GRG2 package, based on the generalized reduced gradient method. This approach is termed the simultaneous optimization and solution strategy. The objective functional can contain integral terms. The state and control variables can have time delays. Equalities and inequalities containing state and control variables can be included in the model, as well as algebraic equations and inequalities. The maximum number of independent variables is 2. Problems containing 3 independent variables can be transformed into problems having 2 independent variables using finite differencing. The maximum number of NLP variables and constraints is 1500. The method is also suitable for solving ordinary and partial differential equations. The state functions are approximated by a linear combination of Lagrange interpolation polynomials. The control function can either be approximated by a linear combination of Lagrange interpolation polynomials or by a piecewise constant function over finite elements. The number of internal collocation points can vary by finite elements. The residual error is evaluated at arbitrarily chosen equidistant grid-points, thus enabling the user to check the accuracy of the solution between collocation points, where the solution is exact. The solution functions can be tabulated. There is an option to use control vector parameterization to solve optimization problems containing initial value ordinary differential equations. When there are many differential equations, or when the upper integration limit should be selected optimally, this approach should be used.
The portability of the package has been addressed by converting it from VAX FORTRAN 77 into IBM PC FORTRAN 77 and into SUN SPARC 2000 FORTRAN 77. Computer runs have shown that the method can reproduce optimization problems published in the literature. The GRG2 and VF13AD packages, integrated into the optimization package, proved to be robust and reliable. The package contains an executive module, a module performing control vector parameterization and 2 nonlinear problem solver modules, GRG2 and VF13AD. There is a stand-alone module that converts the differential-algebraic optimization problem into a nonlinear programming problem.
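A one-element illustration of the collocation idea (far simpler than the package's finite-element scheme): approximate the solution of y' = -y, y(0) = 1 by a quadratic trial function and force the residual to vanish at the two Gauss-Legendre points of [0, 1]:

```python
import math

def collocate_decay():
    """Single-element orthogonal collocation for y' = -y, y(0) = 1 on [0, 1].

    Trial function y(t) = 1 + a*t + b*t^2; the residual y' + y equals
    a*(1 + t) + b*(2t + t^2) + 1 and is forced to zero at the two
    Gauss-Legendre collocation points, giving a 2x2 linear system.
    """
    t1 = (3 - math.sqrt(3)) / 6
    t2 = (3 + math.sqrt(3)) / 6
    a11, a12, r1 = 1 + t1, 2 * t1 + t1 ** 2, -1.0
    a21, a22, r2 = 1 + t2, 2 * t2 + t2 ** 2, -1.0
    det = a11 * a22 - a12 * a21          # solve by Cramer's rule
    a = (r1 * a22 - a12 * r2) / det
    b = (a11 * r2 - r1 * a21) / det
    return lambda t: 1 + a * t + b * t ** 2

y = collocate_decay()
print(y(1.0))  # close to exp(-1) ≈ 0.368
```

Between collocation points the residual is non-zero, which is exactly why the package evaluates it on an equidistant grid to let the user check accuracy.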

Relevance:

30.00%

Publisher:

Abstract:

This project has been undertaken for Hamworthy Hydraulics Limited. Its objective was to design and develop a controller package for a variable-displacement hydraulic pump for use mainly on mobile earth-moving machinery. A survey was undertaken of control options used in practice, and from this a design specification was formulated, the successful implementation of which would give Hamworthy an advantage over its competitors. Two different modes for the controller were envisaged. One consisted of using conventional hydro-mechanics and the other was based upon a microprocessor. To meet short-term customer prototype requirements, the first section of work was the realisation of the hydro-mechanical system. Mathematical models were made to evaluate controller stability and hence aid the design. The final package met the requirements of the specification, and a single version could operate all sizes of variable-displacement pumps in the Hamworthy range. The choice of controller options and combinations totalled twenty-four. The hydro-mechanical controller was complex, and it was realised that a microprocessor system would allow all options to be implemented with just one design of hardware, thus greatly simplifying production. The final section of this project was to determine whether such a design was feasible. This entailed finding cheap, reliable transducers, using mathematical models to predict electro-hydraulic interface stability, testing such interfaces and finally incorporating a microprocessor in an interactive control loop. The study revealed that such a system was technically possible but would cost 60% more than its hydro-mechanical counterpart. It was therefore concluded that, in the short term, for the markets considered, the hydro-mechanical design was the better solution.
Regarding the microprocessor system, the final conclusion was that, because the relative costs of the two systems are decreasing, the electro-hydraulic controller will gradually become more attractive and therefore Hamworthy should continue with its development.