946 results for Particle swarm optimization
Abstract:
This paper describes the optimization of a multiresidue chromatographic analysis for the identification and quantification of 20 pesticides in bovine milk, including three carbamates, a carbamate oxime, six organophosphates, two strobilurins, a pyrethroid, an oxazolidinedione, an aryloxyphenoxypropionate acid/ester, a neonicotinoid, a dicarboximide, and three triazoles. The influences of different chromatographic columns and gradients were evaluated. Furthermore, four different extraction methods were evaluated; each utilized both different solvents, including ethyl acetate, methanol, and acetonitrile, and different workup steps. The best results were obtained by a modified QuEChERS method that lacked a workup step and that included freezing the sample for 2 hours at -20 °C. The results were satisfactory, yielding coefficients of variation of less than 20%, with the exception of the 50 µg L-1 sample of famoxadone, and recoveries between 70 and 120%, with the exception of acephate and bifenthrin; however, both analytes exhibited coefficients of variation of less than 20%.
Abstract:
The purpose of this thesis was to create a design guideline for an LCL filter. The thesis briefly reviews the relevant harmonics standards, previous filter designs, and the problems encountered with them. It proposes a modified design method based on Liserre's method presented in the literature; the modification takes network parameters into account more thoroughly. As input parameters, the method uses the nominal power, the allowed ripple current on the converter and network sides, and the desired resonant frequency of the filter. Essential component-selection issues for the LCL filter, such as heating, voltage withstand, and current rating, are also discussed. Furthermore, the thesis includes a simulation model used to verify the operation of the designed filter at nominal power and in transient situations.
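As a rough illustration of such a procedure, the sketch below computes filter components from the same kind of inputs (nominal power, allowed ripple, grid data). The formulas and ratios are common textbook approximations, not the thesis's exact method, and all names and default values are illustrative.

```python
import math

def lcl_filter(p_nom, v_ll, f_grid, f_sw, ripple=0.10, cap_frac=0.05):
    """Rough LCL-filter sizing from nominal data (illustrative sketch).

    p_nom: nominal power [W]; v_ll: line-to-line RMS voltage [V]
    f_grid / f_sw: grid and switching frequency [Hz]
    ripple: allowed converter-side ripple (fraction of rated current)
    cap_frac: filter capacitance as a fraction of the base capacitance
    """
    z_base = v_ll ** 2 / p_nom                        # base impedance
    c_f = cap_frac / (2 * math.pi * f_grid * z_base)  # filter capacitor
    i_rated = p_nom / (math.sqrt(3) * v_ll)           # rated RMS line current
    di_max = ripple * math.sqrt(2) * i_rated          # allowed peak ripple
    l_conv = v_ll / (2 * math.sqrt(6) * f_sw * di_max)  # converter-side inductor
    l_grid = 0.5 * l_conv                             # grid-side inductor (assumed ratio)
    # resonant frequency of the resulting LCL network
    f_res = math.sqrt((l_conv + l_grid) / (l_conv * l_grid * c_f)) / (2 * math.pi)
    return l_conv, l_grid, c_f, f_res
```

A common sanity check on such a design is that the resonant frequency lands well above the grid frequency and below half the switching frequency.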
Abstract:
The goal of this Master's thesis is to develop and analyze an optimization method for finding the geometry of classical horizontal-axis wind turbine blades based on a set of criteria. The thesis develops a technique that allows the designer to determine the weights of factors such as the power coefficient, the sound pressure level, and the cost function in the overall blade-shape optimization. The optimization technique applies the desirability function, which had not previously been used for this kind of technical problem; in this sense the work can claim originality. To make the analysis and optimization processes more convenient, a software application was developed.
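The desirability approach can be sketched as follows: each criterion is mapped onto a [0, 1] desirability and the weighted geometric mean of the individual desirabilities is maximized. This is the standard Derringer-Suich form; the bounds, weights, and example numbers below are illustrative, not taken from the thesis.

```python
import math

def desirability(y, low, target, s=1.0):
    """One-sided larger-is-better desirability in [0, 1]."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** s

def overall_desirability(ds, weights):
    """Weighted geometric mean of individual desirabilities."""
    if any(d == 0.0 for d in ds):
        return 0.0
    total = sum(weights)
    return math.exp(sum(w * math.log(d) for d, w in zip(ds, weights)) / total)

# Illustrative criteria: maximize power coefficient; noise and cost are
# "smaller is better", so they are negated before mapping.
d_power = desirability(0.45, low=0.30, target=0.50)      # (0.45-0.30)/(0.50-0.30) = 0.75
d_noise = desirability(-52.0, low=-60.0, target=-45.0)   # lower dB is better
d_cost = desirability(-1.2e4, low=-2.0e4, target=-0.8e4)
D = overall_desirability([d_power, d_noise, d_cost], weights=[2.0, 1.0, 1.0])
```

Because the geometric mean is zero whenever any single desirability is zero, a blade shape that fails even one criterion completely is rejected outright, which is the main reason this aggregate is preferred over a weighted arithmetic mean.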
Abstract:
This work is devoted to the development of a numerical method for convection-dominated convection-diffusion problems with a reaction term, covering both non-stiff and stiff chemical reactions. The technique is based on unifying Eulerian-Lagrangian schemes (the particle transport method) within the framework of operator splitting. In the computational domain, a particle set is assigned to solve the convection-reaction subproblem along the characteristic curves created by the convective velocity. At each time step, the convection, diffusion, and reaction terms are solved separately, assuming that each phenomenon occurs sequentially. Moreover, adaptivity and projection techniques are used to add particles in regions of high gradients (steep fronts) and discontinuities, and to transfer the solution from the particle set onto the grid points, respectively. The numerical results show that the particle transport method improves the solutions of CDR problems. Nevertheless, the method is time-consuming compared with other classical techniques, e.g., the method of lines. Despite this drawback, the particle transport method can be used to simulate problems that involve moving steep/smooth fronts, such as the separation of two or more elements in a system.
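The splitting idea can be illustrated on the 1-D model problem u_t + a u_x = D u_xx - k u: each time step treats convection, diffusion, and reaction one after another. The grid-based sketch below replaces the thesis's particle set with simple semi-Lagrangian interpolation along the characteristics; all parameters are illustrative.

```python
import numpy as np

def split_step(u, dx, dt, a=1.0, D=0.01, k=0.5):
    """One Lie-splitting step for u_t + a u_x = D u_xx - k u on a periodic grid.

    A simplified grid-based sketch of sequential operator splitting; the thesis
    itself solves the convection-reaction part on a moving particle set.
    """
    n = len(u)
    # 1) convection: trace each grid point back along its characteristic
    x_foot = (np.arange(n) * dx - a * dt) % (n * dx)
    u = np.interp(x_foot, np.arange(n) * dx, u, period=n * dx)
    # 2) diffusion: explicit central difference (stable for D*dt/dx**2 <= 0.5)
    u = u + D * dt / dx**2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))
    # 3) reaction: exact solution of the linear decay ODE u' = -k u
    return u * np.exp(-k * dt)
```

Each substep uses the method best suited to its operator (characteristics for convection, a diffusion stencil, an ODE solve for reaction), which is exactly what makes the splitting framework attractive for stiff reaction terms.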
Abstract:
The demand for electricity is constantly growing in the contemporary world and, at the same time, quality and reliability requirements are becoming more rigid. In addition, renewable sources of energy have been widely introduced for power generation, and they create specific challenges for the network. Consequently, a new solution for the distribution system is required, and the Low Voltage Direct Current (LVDC) system is the proposed one. This thesis focuses on the investigation of specific cable features for the LVDC distribution system. The LVDC system is a public 750 VDC distribution system currently being developed at Lappeenranta University of Technology. The aspects considered in the thesis are reliable and economical power transmission in distribution networks and possible power-line communication over the LVDC cable.
Abstract:
In the theoretical part, membrane emulsification was studied. Emulsions are used in many industrial areas. Traditionally, emulsions are prepared using high shear in rotor-stator systems or in high-pressure homogenizer systems. In membrane emulsification, two immiscible liquids are mixed by pressing one liquid through a membrane into the other. With this technique, energy can be saved, more homogeneous droplets can be formed, and the amount of surfactant can be decreased. Ziegler-Natta and single-site catalysts are used in olefin polymerization processes. Nowadays, these catalysts are prepared by traditional mixing emulsification. More homogeneous catalyst particles with a narrower particle size distribution might be prepared with membrane emulsification. The aim of the experimental part was to examine the possibility of preparing a single-site polypropylene catalyst using the membrane emulsification technique. Different membrane materials and solidification techniques for the emulsion were examined. The toluene-PFC phase diagram was also successfully measured during this thesis work and was used for process optimization. The polytetrafluoroethylene (PTFE) membranes had the largest contact angles with toluene and also the biggest difference between the contact angles measured with PFC and with toluene. Despite the contact-angle results, no significant difference was noticed between particles prepared using the PTFE membrane and those prepared using a metal sinter. The particle size distributions of the catalysts prepared in these tests were quite wide; this could probably be fixed by using a membrane with a more homogeneous pore size distribution. It is also possible that the solidification rate affects particle size and morphology. Among the polymeric membranes compared, PTFE is probably still the best material for the process, as it had the best chemical durability.
Abstract:
N-methylpyrrolidone (NMP) is a powerful solvent for a variety of chemical processes due to its versatile chemical properties. It has been used in the manufacture of polymers, detergents, pharmaceuticals, rubber, and many other chemical substances. However, in some of these processes it creates a large amount of residue that has to be dealt with. Many well-known processes, such as BASF's rubber production units, have tried to regenerate the solvent at the end of each run; nevertheless, a large amount of NMP-containing residue is still discarded, which could, over time, cause environmental concerns. In this study, we have tried to optimize the regeneration of NMP used for extraction in butadiene production. It is shown that at higher temperatures NMP is separated from the residue with close to 90% efficiency, and a solvent-to-residue ratio of 6:1 proved to be the most effective.
Abstract:
New luminometric particle-based methods were developed to quantify protein and to count cells. The developed methods rely on the interaction of the sample with nano- or microparticles and on different principles of detection. In the fluorescence quenching, time-resolved luminescence resonance energy transfer (TR-LRET), and two-photon excitation fluorescence (TPX) methods, the sample prevents the adsorption of labeled protein to the particles. Depending on the system, the addition of the analyte increases or decreases the luminescence. In the dissociation method, the adsorbed protein protects the Eu(III) chelate on the surface of the particles from dissociation at low pH. The experimental setups are user-friendly and rapid, and do not require hazardous test compounds or elevated temperatures. The sensitivity of protein quantification (from 40 to 500 pg of bovine serum albumin per sample) was 20-500-fold better than that of the most sensitive commercial methods. The quenching method exhibited low protein-to-protein variability, and the dissociation method was insensitive to the assay contaminants commonly found in biological samples. Fewer than ten eukaryotic cells were detected and quantified with all the developed methods under optimized assay conditions. Furthermore, two applications, a method for detecting protein aggregation and a cell viability test, were developed utilizing the TR-LRET method. Protein aggregation was detected at a more than 10,000 times lower concentration, 30 µg/L, than with the known methods of UV240 absorbance and dynamic light scattering. The TR-LRET method was combined with a nucleic acid assay using a cell-impermeable dye to measure the percentage of dead cells in a single-tube test at cell counts below 1000 cells/tube.
Abstract:
A company's competence to manage its product portfolio complexity is becoming critically important in the rapidly changing business environment. The continuous evolution of customer needs, the competitive market environment, and internal product development lead to increasing complexity in product portfolios. Companies that manage this complexity in product development are more profitable in the long run. The complexity derives from product development and management processes in which new product variant development is not managed efficiently. Complexity is managed with modularization, a method that divides the product structure into modules. In modularization, it is essential to take into account the trade-off between perceived customer value and module or component commonality across products. Another goal is to make product configuration more flexible. The benefits are achieved by optimizing complexity in the module offering and by deriving new product variants more flexibly and accurately. The developed modularization process includes steps for preparation, mapping the current situation, creating a modular strategy, and implementing that strategy. The organization and support systems also have to be adapted to follow up on targets and to execute modularization in practice.
Abstract:
The objective of this work was to define the optimal conditions for the invertase assay, seeking to determine the ideal parameters for the different isoenzymes of leaf and bark tissues in adult rubber trees. Assays varying the pH, sucrose concentration, and temperature of the reaction medium were conducted for the two investigated isoenzymes. The results pointed to the existence of two pH-related isoforms in the two analyzed tissues, with one isoenzyme more active at pH 5.5 and the other at neutral/alkaline pH. Leaf blade isoenzymes presented similar values for substrate concentration, whereas the bark isoenzyme presented maximum values below those previously reported. The assays at different temperatures presented similar values for the leaf isoenzymes, although the values obtained differed significantly.
Abstract:
Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
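A minimal random-walk Metropolis sketch shows the kind of sampler such MCMC methods build on: propose a perturbed parameter, accept it with probability min(1, posterior ratio), and collect the chain. This is illustrative code, not the specific algorithms developed in the thesis.

```python
import numpy as np

def metropolis(log_post, theta0, n_samples, step=1.0, rng=None):
    """Random-walk Metropolis: sample a posterior given its log-density."""
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        # accept with probability min(1, exp(lp_prop - lp))
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Example: sample a 1-D standard-normal "posterior"
chain = metropolis(lambda t: -0.5 * float(t @ t), [0.0], 20000, step=2.0, rng=0)
```

The resulting chain approximates the whole posterior distribution, so point estimates, credible intervals, and the downstream tasks mentioned above (experiment design, model-based optimization) can all be computed from the same set of samples.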
Abstract:
In any decision making under uncertainty, the goal is mostly to minimize the expected cost. Minimizing cost under uncertainty is usually done by optimization. For simple models, the optimization can easily be done using deterministic methods. However, many practical models contain complex and varying parameters that cannot easily be taken into account using the usual deterministic optimization methods. Thus, it is very important to look for other methods that can give insight into such models. The MCMC method is one practical method that can be used for the optimization of stochastic models under uncertainty. It is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. The MCMC method is important for practical applications because it is a unified estimation procedure that simultaneously estimates both parameters and state variables, computing their distributions from the given measurement data. The MCMC method is also faster in terms of computing time than other optimization methods. This thesis discusses the use of Markov chain Monte Carlo (MCMC) methods for the optimization of stochastic models under uncertainty. The thesis begins with a short discussion of Bayesian inference, MCMC, and stochastic optimization methods. An example is then given of how MCMC can be applied to maximize production at minimum cost in a chemical reaction process. It is observed that this method performs well in optimizing the given cost function with very high certainty.
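The use of posterior samples for optimization can be sketched as follows: the expected cost is approximated by averaging the cost function over MCMC samples of the uncertain parameter, and the control variable minimizing that average is selected. This is a toy illustration; the model, names, and numbers are not from the thesis.

```python
import numpy as np

def expected_cost(x, theta_samples, cost):
    """Monte Carlo estimate of E[cost(x, theta)] over posterior samples."""
    return np.mean([cost(x, th) for th in theta_samples])

def optimize_under_uncertainty(theta_samples, cost, x_grid):
    """Pick the control x minimizing the sample-averaged cost on a grid."""
    costs = [expected_cost(x, theta_samples, cost) for x in x_grid]
    return x_grid[int(np.argmin(costs))]

# Toy example with cost (x - theta)^2 and "posterior" samples theta ~ N(2, 0.3^2)
# (drawn directly here for brevity; in practice they come from an MCMC chain).
# The expected-cost minimizer is then the posterior mean, close to 2.
rng = np.random.default_rng(1)
samples = rng.normal(2.0, 0.3, size=500)
best = optimize_under_uncertainty(samples,
                                  lambda x, th: (x - th) ** 2,
                                  np.linspace(0.0, 4.0, 81))
```

Averaging over the full posterior rather than plugging in a single point estimate is what lets the optimization account for parameter uncertainty directly.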
Abstract:
Optimization is a common step, for example, after a process has been modified or renewed. Optimization seeks the best way to run a process, or certain parts of it, with respect to, say, particular quality properties. The purpose of this work was to optimize, following an investment, four variables, namely the refining and amount of a certain base-ply pulp, wet pressing, and the amount of spray starch, with respect to three quality properties: ply bond strength, geometric bending stiffness, and smoothness. Five mill-scale trial runs were conducted for this work. In the first trial run, the aim was to add water or spray starch to one of the interfaces between the layers of a three-ply board; in the second, the refining of the aforementioned base-ply pulp and the refiner combinations were changed. The first trial run examined the development of ply bond strength, the second that of the other strength properties. The third trial run examined the effect of the refining and amount of the base-ply pulp, and of changes in the shoe press nip load, on ply bond strength, geometric bending stiffness, and smoothness. In the fourth trial run, an attempt was made to repeat the best point of the previous run and, by slightly adjusting the parameters, to achieve even better quality properties; this trial likewise examined the effect of the variables on ply bond strength, geometric bending stiffness, and smoothness. The purpose of the final trial was to study the effect of reducing the amount of the same base-ply pulp on ply bond strength. Owing to various setbacks, the results obtained from the trial runs remained rather meager. The trials nevertheless showed that the strength properties did not improve even when refining was continued. Refining that was unnecessary for developing the strength properties could thus be omitted, saving energy and avoiding the other problems that extensive refining may cause. With less refining, the specific edge load could also be kept below the level desired at the mill. The missing strength properties must be achieved by other means.
Abstract:
The RBDA (Recovery Boiler Dust Analyzer) is a device that measures the amount of particles in the flue gases of a recovery boiler. One of its possible applications is the optimization of recovery boiler sootblowing. The purpose of this work was to improve the operation of the RBD analyzer and, in particular, to study its suitability for optimizing recovery boiler sootblowing. Optimizing sootblowing can reduce the amount of steam consumed by the sootblowers and thereby yield economic savings. The operation of the RBD analyzer was tested on an operating recovery boiler. To support the study, the flue gas temperatures and the adhesion properties of the ash of the boiler used in the tests were determined. The performance of the device was tested by comparing the RBDA measurement results against a standard flue gas particle measurement method. The suitability of the measurement system for sootblowing optimization was studied by running the sootblowers manually after idle periods of different lengths and by measuring the boiler's normal sootblowing sequence. Based on the measurements, it was concluded that the RBDA reliably detects the dust concentration of the flue gases and its changes. In its current form, the RBDA was found suitable for sootblowing optimization only for the water preheaters (economizers); sufficiently reliable measurement results were not obtained for optimizing the sootblowers of the boiler bank and superheaters. Further development of the RBDA in follow-up studies was considered feasible.
A Study on the Health Effects of Fine Particle Concentrations in the Tampere Area during a 2.5-Year Follow-up