42 results for robust and stochastic optimization

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

In decision making under uncertainty, the goal is usually to minimize the expected cost, and this minimization is typically carried out by optimization. For simple models the optimization can easily be done with deterministic methods. Many practical models, however, contain complex and varying parameters that cannot easily be taken into account with the usual deterministic optimization methods, so it is important to look for other methods that give insight into such models. The Markov chain Monte Carlo (MCMC) method is one practical method for the optimization of stochastic models under uncertainty. It is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. MCMC is attractive for practical applications because it is a unified estimation procedure that estimates parameters and state variables simultaneously, computing their distributions from the measured data. MCMC is also fast in terms of computing time compared with other optimization methods. This thesis discusses the use of MCMC methods for the optimization of stochastic models under uncertainty. It begins with a short discussion of Bayesian inference, MCMC, and stochastic optimization methods, and then gives an example of how MCMC can be applied to maximize production at minimum cost in a chemical reaction process. The method is observed to optimize the given cost function with very high certainty.
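
As a rough illustration of the idea, the sketch below uses a random-walk Metropolis sampler whose target density is exp(-cost/temperature), so the chain concentrates on low-cost regions of a noisy objective. The quadratic cost function, step size, and temperature are arbitrary choices for the example, not quantities from the thesis.

```python
import numpy as np

def noisy_cost(x, rng):
    """Hypothetical stochastic cost: a quadratic plus measurement noise."""
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2 + 0.1 * rng.standard_normal()

def mcmc_minimize(cost, x0, n_iter=20000, step=0.3, temperature=0.5, seed=0):
    """Random-walk Metropolis targeting exp(-cost/temperature).

    Low-cost regions get high target density, so the chain concentrates
    around (approximate) minimizers of the expected cost.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    c = cost(x, rng)
    best_x, best_c = x.copy(), c
    for _ in range(n_iter):
        x_prop = x + step * rng.standard_normal(x.shape)
        c_prop = cost(x_prop, rng)
        # Metropolis acceptance ratio for the Boltzmann-type target density.
        if rng.random() < np.exp(-(c_prop - c) / temperature):
            x, c = x_prop, c_prop
            if c < best_c:
                best_x, best_c = x.copy(), c
    return best_x, best_c

x_opt, c_opt = mcmc_minimize(noisy_cost, x0=[0.0, 0.0])
print(x_opt, c_opt)  # should land near (2, -1)
```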

Relevance:

100.00%

Publisher:

Abstract:

Stochastic approximation methods for stochastic optimization are considered. The main stochastic approximation methods are reviewed: the stochastic quasi-gradient (SQG) algorithm, the Kiefer-Wolfowitz algorithm together with adaptive rules for them, and the simultaneous perturbation stochastic approximation (SPSA) algorithm. A model and a solution of the retailer's profit optimization problem are suggested, and an application of the SQG algorithm to optimization problems whose objective functions are given in the form of an ordinary differential equation is considered.
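
For reference, a minimal SPSA iteration is sketched below with commonly used gain exponents. SPSA needs only two noisy function evaluations per iteration regardless of dimension; the gain constants and the noisy quadratic test objective are illustrative assumptions, not values from the work summarized above.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=2000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimal SPSA sketch: simultaneous perturbation gradient estimate."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** alpha          # step-size gain
        ck = c / k ** gamma          # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher perturbation
        g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2.0 * ck) * (1.0 / delta)
        theta = theta - ak * g_hat
    return theta

# Hypothetical noisy objective whose expected value is minimized at (1, -3).
rng = np.random.default_rng(1)
noisy = lambda th: (th[0] - 1.0) ** 2 + (th[1] + 3.0) ** 2 + 0.01 * rng.standard_normal()
print(spsa_minimize(noisy, [0.0, 0.0]))
```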

Relevance:

100.00%

Publisher:

Abstract:

This thesis provides an overview of temporal and stochastic software reliability models and examines a few of the models in practice. The theoretical part contains the central definitions and metrics used in describing and assessing software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of risk-based models. The second group comprises models based on the "seeding" and marking of faults. The empirical part of the thesis contains the descriptions and results of the experiments. The experiments were carried out using three models belonging to the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved the most sensitive to the distribution because of convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
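
The Jelinski-Moranda model mentioned above assumes that the hazard rate before the i-th failure is phi * (N - i + 1), where N is the unknown initial number of faults. A sketch of its maximum-likelihood fit from inter-failure times is given below; the usage data are synthetic, and the fit only exists when the data actually show reliability growth (otherwise the root finder raises an error).

```python
import numpy as np
from scipy.optimize import brentq

def fit_jelinski_moranda(inter_failure_times):
    """Maximum-likelihood fit of the Jelinski-Moranda model.

    Assumes exponential inter-failure times with rate phi * (N - i + 1)
    before the i-th failure. Returns the estimates (N, phi).
    """
    t = np.asarray(inter_failure_times, dtype=float)
    n = len(t)
    i = np.arange(1, n + 1)
    T = t.sum()
    S = np.sum((i - 1) * t)

    # MLE equation in N alone (phi eliminated); solved by bracketing.
    def g(N):
        return np.sum(1.0 / (N - i + 1)) - n * T / (N * T - S)

    N_hat = brentq(g, n + 1e-6, 100.0 * n)   # search above the observed fault count
    phi_hat = n / (N_hat * T - S)
    return N_hat, phi_hat

# Synthetic inter-failure times (hours) showing reliability growth.
times = [7, 11, 8, 10, 15, 22, 20, 25, 30, 40]
print(fit_jelinski_moranda(times))
```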

Relevance:

100.00%

Publisher:

Abstract:

This study examines the excess returns provided by G10 currency carry trading during the Euro era. The currency carry trade has been a popular strategy throughout the past decades, offering excess returns to investors. The thesis aims to contribute to the existing research on the topic by utilizing a new set of data for the Euro era and by using the Euro as the basis for the study. The focus of the thesis is specifically on the performance, risk, and diversification benefits of different carry trade strategies. The study finds evidence of the failure of the uncovered interest rate parity theory through multiple regression analyses. Furthermore, it finds evidence of significant diversification benefits in terms of the Sharpe ratio and improved return distributions. The results suggest that currency carry trades offered excess returns during 1999-2014 and that volatility plays an important role in carry trade returns. The risk, however, is diversifiable, and the results therefore support previous quantitative research findings on the topic.
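
The uncovered interest rate parity test referred to above is commonly run as a Fama-style regression of the exchange rate change on the interest rate differential. The sketch below fits that regression and computes an annualized Sharpe ratio on synthetic monthly data; the data-generating numbers are arbitrary and only illustrate the mechanics, not the thesis's dataset or results.

```python
import numpy as np

def fama_regression(spot_change, rate_differential):
    """OLS fit of the UIP (Fama) regression  ds_{t+1} = a + b * (i_t - i_t*) + e.

    Under uncovered interest rate parity b should equal 1; carry-trade returns
    are typically linked to estimates well below 1 (often negative).
    """
    X = np.column_stack([np.ones_like(rate_differential), rate_differential])
    coef, *_ = np.linalg.lstsq(X, spot_change, rcond=None)
    return coef  # [a_hat, b_hat]

def annualized_sharpe(monthly_excess_returns):
    """Annualized Sharpe ratio of a monthly excess-return series."""
    r = np.asarray(monthly_excess_returns, dtype=float)
    return float(np.sqrt(12.0) * r.mean() / r.std(ddof=1))

rng = np.random.default_rng(0)
diff = rng.normal(0.002, 0.001, size=180)              # monthly interest differential
ds = 0.0005 - 0.5 * diff + rng.normal(0, 0.02, 180)    # spot change violating UIP
carry = diff - ds                                      # simple carry-trade excess return
print(fama_regression(ds, diff), annualized_sharpe(carry))
```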

Relevance:

100.00%

Publisher:

Abstract:

Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
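
One of the tasks mentioned above, model-based optimization using the output of MCMC parameter estimation, can be illustrated by averaging the objective over posterior draws. The sketch below does this for a hypothetical first-order reaction yield with an arbitrary operating-cost weight and synthetic "posterior" samples standing in for real MCMC output; it shows the general idea only, not the procedures developed in the thesis.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def expected_objective(t, k_samples):
    """Posterior-averaged objective for a hypothetical first-order batch reaction.

    Yield 1 - exp(-k t) minus a linear operating-cost term; `k_samples` are
    draws of the rate constant. Negated because we minimize.
    """
    yields = 1.0 - np.exp(-k_samples * t)
    return -(yields.mean() - 0.02 * t)

# Stand-in for an MCMC posterior of the rate constant k (1/h).
k_samples = np.random.default_rng(0).lognormal(mean=np.log(0.5), sigma=0.3, size=2000)

res = minimize_scalar(expected_objective, bounds=(0.0, 50.0), args=(k_samples,),
                      method="bounded")
print("optimal reaction time (h):", res.x)
```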

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to simulate and optimize an integrated gasification combined cycle (IGCC) process for power generation and hydrogen (H2) production using low-grade Thar lignite coal and cotton stalk. Lignite coal is high in moisture and ash content; the idea of adding cotton stalk is to increase the mass of combustible material per mass of feed, to reduce the consumption of coal, and to make efficient use of cotton stalk in the IGCC process. Aspen Plus software was used to simulate the process with different mass ratios of coal to cotton stalk. In the optimization, the process efficiencies, net power generation, and H2 production were considered, while environmentally hazardous emissions were kept at an acceptable level. With the addition of cotton stalk to the feed, the process efficiencies started to decline along with the net power production. For H2 production, the effect was positive at first, but beyond 40% cotton stalk addition H2 production also started to decline. Cotton stalk addition also had a negative effect on hazardous emissions, with the mass of emissions per unit of net power production increasing linearly with the share of cotton stalk in the feed mixture. In summary, the overall effect of cotton stalk addition appears negative, and it becomes clearly more negative above 40%. It is therefore concluded that, to obtain maximum process efficiencies and high production, only a small amount of cotton stalk should be added to the feed, with the maximum level of addition estimated at 40%. The gasification temperature should be kept relatively low, around 1140 °C, and the preferred technique for the studied feed in the IGCC process is a fluidized-bed gasifier (dry ash) rather than a slagging gasifier.

Relevance:

100.00%

Publisher:

Abstract:

The last decade has shown that the global paper industry needs new processes and products in order to reassert its position. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to illustrate the phenomena inside a real process, and mathematical optimization is exploited to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective owing to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. The same methodology is, however, applicable to all types of industrial processes, e.g., the design of biorefineries, because it is fully generalized and can be easily modified.
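
The bi-level structure can be pictured as an outer design optimization wrapped around an inner operational optimization that is re-solved for every candidate design. The sketch below shows that nesting with small hypothetical quadratic cost functions; it does not reproduce the dynamic, multiobjective papermaking models of the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def inner_cost(u, d):
    """Hypothetical operating cost for control u under design d."""
    return (u[0] - d[0]) ** 2 + 0.5 * (u[1] + d[1]) ** 2 + 0.1 * np.sum(u ** 2)

def lower_level(d):
    """Best achievable operation (control) for a fixed design d."""
    res = minimize(inner_cost, x0=np.zeros(2), args=(d,))
    return res.fun, res.x

def upper_level_objective(d):
    """Design cost = capital-like term + optimally operated process cost."""
    operating_cost, _ = lower_level(d)
    capital_cost = 0.2 * np.sum(d ** 2)
    return capital_cost + operating_cost

res = minimize(upper_level_objective, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print("design:", res.x, "total cost:", res.fun)
```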

Relevance:

100.00%

Publisher:

Abstract:

Bioprocess technology is a multidisciplinary industry that combines knowledge of biology and chemistry with process engineering. It is a growing industry because its applications have an important role in the food, pharmaceutical, diagnostics, and chemical industries. In addition, the current pressure to decrease our dependence on fossil fuels motivates new, innovative research on replacements for petrochemical products. Bioprocesses are processes that utilize cells and/or their components in the production of desired products. Bioprocesses are already used to produce fuels and chemicals, especially ethanol and building-block chemicals such as carboxylic acids. To enable more efficient, sustainable, and economically feasible bioprocesses, the raw materials must be cheap and the bioprocesses must be operated under optimal conditions. It is essential to measure parameters that provide information about the process conditions and the main critical process parameters, including cell density, substrate concentrations, and products. In addition to offline analysis methods, online monitoring tools are becoming increasingly important in the optimization of bioprocesses. Capillary electrophoresis (CE) is a versatile analysis technique with no limitations concerning polar solvents, analytes, or samples. Its resolution and efficiency are high in optimized methods, creating great potential for rapid detection and quantification. This work demonstrates the potential and possibilities of CE as a versatile bioprocess monitoring tool. As part of this study, a commercial CE device was modified for use as an online analysis tool for automated monitoring. The work describes three offline CE analysis methods for the determination of carboxylic, phenolic, and amino acids that are present in bioprocesses, and an online CE analysis method for the monitoring of carboxylic acid production during bioprocesses. The detection methods were indirect and direct UV, and laser-induced fluorescence. The results of this work can be used for the optimization of bioprocess conditions, for the development of more robust and tolerant microorganisms, and to study the dynamics of bioprocesses.

Relevance:

100.00%

Publisher:

Abstract:

The optimization of different production processes in industry is a highly topical subject. Many control systems date from a time when the computing power of computers was very modest compared with today. This thesis presents a production process that involves the problem of forming a cutting plan for steel. The casting process is one of the intermediate stages of steel manufacturing: molten steel brought to a suitable grade is cast into a line, where it solidifies and is cut into billets. In later stages the steel billets are worked into smaller units, the final products of the mill. Depending on the order book, the continuously cast billets can be cut in many different ways. A cutting plan is therefore needed, and forming it requires solving a mixed-integer optimization problem. Mixed-integer optimization problems are the most challenging form of optimization, and they have been studied little compared with simpler optimization problems. The computing power of modern computers has, however, made it possible to use and develop heavier and more complex optimization algorithms. One stochastic optimization method, the differential evolution algorithm, is used and presented in this thesis. A cutting optimization algorithm for steel is presented. The developed optimization method operates dynamically in the mill environment according to parameters defined by the users. The work is part of a control system delivered by Syncron Tech Oy to Ovako Bar Oy Ab.
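
Differential evolution, the stochastic optimization method used in the thesis, maintains a population of candidate solutions and improves it through mutation, crossover, and greedy selection. A generic DE/rand/1/bin sketch over box constraints is given below; the cutting-plan formulation itself, including its integer variables, is not reproduced here.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           n_gen=200, seed=0):
    """Minimal DE/rand/1/bin sketch for a box-constrained minimization problem."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, dim))
    cost = np.array([f(x) for x in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), bounds[:, 0], bounds[:, 1])
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True          # guarantee one mutated gene
            trial = np.where(cross, mutant, pop[i])
            trial_cost = f(trial)
            if trial_cost <= cost[i]:                # greedy selection
                pop[i], cost[i] = trial, trial_cost
        # (a mixed-integer variant would round the integer coordinates here)
    best = np.argmin(cost)
    return pop[best], cost[best]

# Illustrative test: sphere function, optimum at the origin.
x_best, f_best = differential_evolution(lambda x: float(np.sum(x ** 2)),
                                        bounds=[(-5, 5)] * 4)
print(x_best, f_best)
```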

Relevance:

100.00%

Publisher:

Abstract:

The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways; an algorithm most efficient for one representation may be less efficient for others. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its application. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with the centers, widths, and weights of the basis functions as possible variables, both with fixed control parameters and with parameters adjusted by the fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to parameter setting and that the best setting is problem dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than versions with all parameters fixed. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
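
A Gaussian radial basis function network of the kind trained above can be encoded as a single flat parameter vector of centers, widths, and weights, which is exactly the representation a population-based optimizer such as differential evolution can operate on. The sketch below shows one such encoding and the corresponding mean-squared training error; the layout of the vector and the sine-approximation example are illustrative assumptions, not the thesis's setup.

```python
import numpy as np

def rbf_predict(params, X, n_centers, input_dim):
    """Gaussian RBF network output for a flat parameter vector.

    params = [centers (n_centers*input_dim), widths (n_centers), weights (n_centers)].
    """
    c_end = n_centers * input_dim
    centers = params[:c_end].reshape(n_centers, input_dim)
    widths = np.abs(params[c_end:c_end + n_centers]) + 1e-8
    weights = params[c_end + n_centers:]
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    return phi @ weights

def training_error(params, X, y, n_centers, input_dim):
    """Mean squared approximation error; usable as a DE objective function."""
    return float(np.mean((rbf_predict(params, X, n_centers, input_dim) - y) ** 2))

# Hypothetical 1-D target: approximate sin(x) with 5 Gaussian basis functions.
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = np.sin(X).ravel()
p0 = np.random.default_rng(0).standard_normal(5 * 1 + 5 + 5)
print(training_error(p0, X, y, n_centers=5, input_dim=1))
```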

Relevance:

100.00%

Publisher:

Abstract:

This study explores the early phases of intercompany relationship building, which is a very important topic for purchasing and business development practitioners as well as for companies' upper management. There is a lot of evidence that a proper engagement with markets increases a company's potential for achieving business success. Taking full advantage of the market possibilities requires, however, a holistic view of managing the related decision-making chain. Most of the literature, as well as the business processes of companies, lacks this holism; typically the process is observed from the perspective of individual stages, which leads to discontinuity and sub-optimization. This study contains a comprehensive introduction to and evaluation of the literature related to the various steps of the decision-making process. The process is studied from the holistic perspective of determining a company's vertical integration position within its demand/supply network context, translating the vertical integration objectives into feasible strategies and objectives, and operationalizing the decisions made through engagement with collaborative intercompany relationships. The empirical part of the research was conducted in two sections. First, the phenomenon of intercompany engagement was studied using two complementary case studies. Second, a survey was conducted among the purchasing and business development managers of several electronics manufacturing companies to analyze the processes, decision-making criteria, and success factors of engagement for collaboration. The aim was to identify the reasons why companies and their management act the way they do. As a combination of theoretical and empirical research, an analysis was produced of what would be an ideal way of engaging with markets. Based on the respective findings, the study concludes by proposing a holistic framework for successful engagement. The evidence presented throughout the study demonstrates clear gaps, discontinuities, and limitations in both current research and practical purchasing decision-making chains. The most significant discontinuity is the identified disconnection between the supplier selection process and related criteria and the relationship success factors.

Relevance:

100.00%

Publisher:

Abstract:

Healthcare today makes use of the possibilities of information technology (IT) to improve the quality of care, to reduce the costs related to care, and to simplify and clarify physicians' workflow. The information systems that form the core of every IT solution must be developed to meet numerous requirements, one of which is the ability to integrate seamlessly with other information systems. System integration is, however, still a challenging task, even though several standards have been developed for it. This thesis describes the interfacing solution of a newly developed medical information system. The requirements placed on such an application are discussed, and the way in which these requirements are fulfilled is presented. The interfacing solution is divided into two parts: the information system interface and the interfacing engine. The former comprises the basic functionality needed to receive data from and send data to other systems, while the latter provides support for the standards used in the production environment. The design of both parts is presented thoroughly in this thesis. The problem was solved by means of a modular and generic design. This approach is shown to be a durable and flexible solution that can be used to address the wide range of requirements placed on an interfacing solution. It is also shown how, thanks to its flexibility, the solution can easily be adapted to requirements that have not been identified in advance, thereby providing a foundation for future needs as well.
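
The modular, generic design described above can be illustrated by a core engine that routes incoming messages to pluggable handlers, one per supported standard. The sketch below is a hypothetical illustration of that pattern (the class names, the dictionary message format, and the HL7 example are inventions for the sketch), not the actual architecture of the system described in the thesis.

```python
from abc import ABC, abstractmethod

class MessageHandler(ABC):
    """One pluggable module per supported messaging standard (hypothetical)."""

    @abstractmethod
    def can_handle(self, message: dict) -> bool: ...

    @abstractmethod
    def process(self, message: dict) -> dict: ...

class Hl7V2Handler(MessageHandler):
    def can_handle(self, message: dict) -> bool:
        return message.get("format") == "hl7v2"

    def process(self, message: dict) -> dict:
        # A real handler would parse segments here; this just tags the message.
        return {"standard": "HL7 v2", "payload": message["body"]}

class InterfacingEngine:
    """Generic core: routes incoming messages to whichever module accepts them."""

    def __init__(self) -> None:
        self._handlers: list[MessageHandler] = []

    def register(self, handler: MessageHandler) -> None:
        self._handlers.append(handler)

    def receive(self, message: dict) -> dict:
        for handler in self._handlers:
            if handler.can_handle(message):
                return handler.process(message)
        raise ValueError("no handler registered for message format")

engine = InterfacingEngine()
engine.register(Hl7V2Handler())
print(engine.receive({"format": "hl7v2", "body": "MSH|^~\\&|example"}))
```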

Relevance:

100.00%

Publisher:

Abstract:

The goal of this Master's thesis is to develop and analyze an optimization method for finding the geometry of a classical horizontal-axis wind turbine blade based on a set of criteria. The thesis develops a technique that allows the designer to determine the weight of factors such as the power coefficient, the sound pressure level, and the cost function in the overall blade shape optimization process. The optimization technique applies the desirability function, which has not previously been used for this kind of technical problem; in this sense the research can claim originality. To make the analysis and optimization processes more convenient, a software application was developed.
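
The desirability approach maps each criterion onto a [0, 1] scale and combines the individual desirabilities into one overall score, typically a weighted geometric mean, which the optimizer then maximizes. The sketch below shows Derringer-Suich style desirabilities for larger-is-better and smaller-is-better responses; the ranges, weights, and the example blade candidate are made-up values, not the thesis's actual criteria.

```python
import numpy as np

def desirability_larger_is_better(y, low, high, weight=1.0):
    """0 below `low`, 1 above `high`, a power ramp in between (response to maximize)."""
    d = np.clip((y - low) / (high - low), 0.0, 1.0)
    return d ** weight

def desirability_smaller_is_better(y, low, high, weight=1.0):
    """Desirability for a response to minimize (e.g. noise, cost)."""
    d = np.clip((high - y) / (high - low), 0.0, 1.0)
    return d ** weight

def overall_desirability(ds, importances=None):
    """Weighted geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    w = np.ones_like(ds) if importances is None else np.asarray(importances, float)
    return float(np.prod(ds ** (w / w.sum())))

# Hypothetical blade candidate: power coefficient 0.45, noise 52 dB, cost 1.8 (a.u.).
d_cp = desirability_larger_is_better(0.45, low=0.30, high=0.50)
d_noise = desirability_smaller_is_better(52.0, low=40.0, high=60.0)
d_cost = desirability_smaller_is_better(1.8, low=1.0, high=3.0)
print(overall_desirability([d_cp, d_noise, d_cost], importances=[2, 1, 1]))
```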

Relevance:

100.00%

Publisher:

Abstract:

Cutting of thick-section stainless steel and mild steel, and medium-section aluminium, using a high-power ytterbium fibre laser was experimentally investigated in this study. Theoretical models of the laser power required for cutting a metal workpiece and of the melt removal rate were also developed. The calculated laser power requirement was correlated with the laser power used for cutting a 10 mm stainless steel workpiece and a 15 mm mild steel workpiece with the ytterbium fibre laser and the CO2 laser. Nitrogen assist gas was used for cutting stainless steel and oxygen for cutting mild steel. It was found that the incident laser power required for cutting at a given cutting speed was lower for fibre laser cutting than for CO2 laser cutting, indicating a higher absorptivity of the fibre laser beam by the workpiece and a higher melting efficiency for the fibre laser beam than for the CO2 laser beam. The difficulty in achieving efficient melt removal during high-speed cutting of the 15 mm mild steel workpiece with oxygen assist gas using the ytterbium fibre laser can be attributed to the high melting efficiency of the ytterbium fibre laser. The calculated melt flow velocity and melt film thickness correlated well with the location of the boundary layer separation point on the 10 mm stainless steel cut edges. An increase in the melt film thickness, caused by deceleration of the melt particles in the boundary layer by viscous shear forces, results in flow separation. The melt flow velocity increases with increasing assist gas pressure and cut kerf width, resulting in a reduction in the melt film thickness, and the boundary layer separation point moves closer to the bottom cut edge. The cut edge quality was examined by visual inspection of the cut samples and by measuring the cut kerf width, the boundary layer separation point, the cut edge squareness (perpendicularity) deviation, and the cut edge surface roughness as output quality factors. Different regions of cut edge quality in 10 mm stainless steel and 4 mm aluminium workpieces were defined for different combinations of cutting speed and laser power. Optimization of the processing parameters for a high cut edge quality in 10 mm stainless steel was demonstrated.
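
A laser power requirement model of the kind mentioned above typically starts from an energy balance: the absorbed power must at least melt the material removed from the kerf per unit time. The sketch below implements that lower-bound balance with assumed stainless steel property values and an assumed absorptivity; it neglects conduction losses and oxidation energy, so it is only an order-of-magnitude illustration, not the model developed in the thesis.

```python
def required_laser_power(speed_mm_s, thickness_mm, kerf_mm,
                         absorptivity,
                         density=7900.0,            # kg/m^3, stainless steel (assumed)
                         specific_heat=500.0,       # J/(kg K) (assumed)
                         melt_temp_rise=1430.0,     # K above ambient (assumed)
                         latent_heat=2.7e5):        # J/kg (assumed)
    """Simple melting-energy balance for the incident laser power (lower bound).

    P * absorptivity ~= volume_rate * density * (c_p * dT + L_m); conduction and
    exothermic reactions are neglected.
    """
    volume_rate = (speed_mm_s / 1000.0) * (thickness_mm / 1000.0) * (kerf_mm / 1000.0)  # m^3/s
    melt_energy_per_m3 = density * (specific_heat * melt_temp_rise + latent_heat)       # J/m^3
    return volume_rate * melt_energy_per_m3 / absorptivity                               # W

# Example: 10 mm stainless steel, 20 mm/s, 0.5 mm kerf, 40 % of the beam absorbed.
print(round(required_laser_power(20.0, 10.0, 0.5, 0.4)))
```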

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT

Maria Peltola
Electrical status epilepticus during sleep – Continuous spikes and waves during sleep
Department of Clinical Neurophysiology, University of Turku; Department of Clinical Neurophysiology and Department of Pediatric Neurology, Children's Hospital, Helsinki University Central Hospital
Annales Universitatis Turkuensis, Medica-Odontologica, Turku, Finland, 2014

Background: Electrical status epilepticus during sleep (ESES) is an EEG phenomenon of frequent spikes and waves occurring in slow sleep. ESES relates to cognitive deterioration in heterogeneous childhood epilepsies. Validated methods to quantitate ESES are missing. The clinical syndrome, called epileptic encephalopathy with continuous spikes and waves during sleep (CSWS), is pharmacoresistant in half of the patients. Limited data exist on the surgical treatment of CSWS.

Aims and methods: The effects of surgical treatment were studied by investigating electroclinical outcomes in 13 operated patients (nine callosotomies, four resections) with pharmacoresistant CSWS and cognitive decline. Secondly, an objective paradigm was sought for assessing ESES by the semiautomatic quantification of the spike index (SI) and by measuring spike strength from the EEG.

Results: Postoperatively, cognitive deterioration was stopped in 12 (92%) patients. Three out of four patients became seizure-free after resective surgery. Callosotomy resulted in a greater than 90% reduction of atypical absences in six out of eight patients. The preoperative propagation of ESES from one hemisphere to the other was associated with a good response. Semiautomatic quantification of the SI was a robust method when a maximal interspike interval of three seconds was used to define the "continuous" discharge in ten EEGs. The SI of the first hour of sleep appeared representative of the whole-night SI. Furthermore, the spikes' root mean square was found to be a stable measure of spike strength when spatially integrated over multiple electrodes during steady NREM sleep.

Conclusions: Patients with pharmacoresistant CSWS of structural etiology may benefit from resective surgery or corpus callosotomy with regard to both seizure outcome and cognitive prognosis. The semiautomated SI quantification, with proper user-defined settings, and the new spatially integrated measure of spike strength are robust and promising tools for quantifying ESES.

Keywords: Electrical status epilepticus during sleep, ESES, continuous spikes and waves during sleep, CSWS, epilepsy surgery, spike index, spike strength, RMS

TIIVISTELMÄ

Maria Peltola
Electrical status epilepticus during sleep
Clinical Neurophysiology, University of Turku; Clinical Neurophysiology and Pediatric Neurology, Children's Hospital, Helsinki University Central Hospital
Annales Universitatis Turkuensis, Medica-Odontologica, Turku, Finland, 2014

Background: Electrical status epilepticus during sleep (ESES) is an electroencephalographic (EEG) phenomenon in which a dense spike-and-wave discharge occurs during slow-wave sleep. There are no validated methods for quantifying ESES. ESES has been linked to a decline in cognitive level, in which case the condition is referred to as the CSWS (continuous spikes and waves during sleep) syndrome. CSWS does not respond to medication in half of the patients, and only little information exists on its surgical treatment.

Aims and methods: We retrospectively investigated the effect of epilepsy surgery on electroclinical findings in 13 children with pharmacoresistant CSWS syndrome and a structural brain abnormality. The second aim was to find an objective, semiautomatic way to measure the amount of the discharge and the strength of the spikes from the EEG.

Results: The continuing decline in cognitive level stopped in 12 (92%) patients after surgery. Three of the four resection patients became seizure-free. After callosotomy, daily seizures decreased by more than 90% in six of eight patients. Preoperative propagation of the discharge from only one hemisphere to the other was associated with a good surgical response. The spike index, using a maximum of three seconds between spikes as the definition of a continuous discharge, proved to be a reliable method for quantifying ESES. The root mean square of the spikes, integrated over several electrodes, was a stable measure of spike strength in undisturbed NREM sleep.

Conclusions: Patients with pharmacoresistant CSWS, a structural brain abnormality, and a unidirectional propagation pattern of the discharge appear to benefit from epilepsy surgery cognitively as well as through a reduction in seizures. Semiautomatic spike index quantification with suitable user-defined settings and the new spatially integrated measure of spike strength are stable and promising quantitative measures of ESES.

Keywords: Electrical status epilepticus during sleep, ESES, CSWS, epilepsy surgery, spike index, spike strength, root mean square
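
The spike index and spike-strength measures described above can be sketched roughly as follows: consecutive spikes closer together than the three-second criterion are merged into discharge segments whose total duration is divided by the sleep time, and spike strength is taken as the root mean square over a multichannel spike epoch. The spike detection itself, artifact handling, and the exact definitions used in the thesis are not reproduced; the example numbers are synthetic.

```python
import numpy as np

def spike_index(spike_times_s, sleep_duration_s, max_gap_s=3.0):
    """Fraction of sleep time covered by 'continuous' spike-and-wave activity.

    Consecutive spikes no more than `max_gap_s` apart are merged into one
    discharge segment; the spike index is the summed segment duration divided
    by the total sleep time. Spike detection is assumed to be done elsewhere.
    """
    t = np.sort(np.asarray(spike_times_s, dtype=float))
    if t.size < 2:
        return 0.0
    gaps = np.diff(t)
    covered = float(np.sum(gaps[gaps <= max_gap_s]))
    return covered / float(sleep_duration_s)

def spike_strength_rms(epoch_uv):
    """Root mean square amplitude of a multichannel spike epoch (channels x samples)."""
    x = np.asarray(epoch_uv, dtype=float)
    return float(np.sqrt(np.mean(x ** 2)))

# Synthetic example: spikes every 2 s for the first 600 s of a 3600 s recording.
spikes = np.arange(0.0, 600.0, 2.0)
print(spike_index(spikes, sleep_duration_s=3600.0))   # about 0.17
```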