76 results for Aeroelasticity, Optimization, Uncertainty
Abstract:
In this thesis, programmatic, application-layer means for improving energy efficiency in the VoIP application domain are studied. The work concentrates on optimizations suitable for VoIP implementations that use SIP and IEEE 802.11 technologies. Because energy-saving optimizations can affect perceived call quality, energy-saving means are studied together with the factors that influence perceived call quality. The thesis first gives a general view of the topic. Based on the theory, adaptive optimization schemes for dynamically controlling the application's operation are proposed. A runtime quality model, suitable for integration into the optimization schemes, is developed for estimating VoIP call quality. Power consumption measurements based on the proposed optimization schemes are then carried out to determine the achievable gains. The results show that a reduction in power consumption can be achieved with the adaptive optimization schemes.
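The abstract does not detail the quality model or the control logic. As an illustration only, the following Python sketch shows how an adaptive scheme of this kind could trade the 802.11 power-save sleep interval against an estimated call-quality score; the function names, thresholds and the heavily simplified E-model style MOS mapping are assumptions, not the model developed in the thesis.

```python
# Hypothetical sketch: adapt the 802.11 power-save sleep interval from an
# estimated call-quality score (MOS). Thresholds, names and the simplified
# E-model style mapping are illustrative assumptions.

def estimate_mos(delay_ms: float, loss_rate: float) -> float:
    """Very rough MOS estimate from one-way delay and packet loss."""
    r = 93.2 - 0.024 * delay_ms - 110.0 * loss_rate   # crude R-factor
    r = max(0.0, min(100.0, r))
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

def adapt_sleep_interval(current_ms: int, delay_ms: float, loss_rate: float) -> int:
    """Lengthen the sleep interval while quality is good, shorten it otherwise."""
    mos = estimate_mos(delay_ms, loss_rate)
    if mos > 4.0:
        return min(current_ms + 20, 200)   # save more energy
    if mos < 3.5:
        return max(current_ms - 20, 20)    # protect call quality
    return current_ms

print(adapt_sleep_interval(100, delay_ms=150.0, loss_rate=0.01))
```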
Abstract:
The Russian and Baltic electricity markets are being reformed and developed towards a competitive and transparent market, and the Nordic market is also undergoing changes on the way to market integration. Old structures and practices are expiring while new laws and rules come into force. This master's thesis describes the structure and functioning of wholesale electricity markets and the cross-border connections between countries. In addition, methods of cross-border trading using different capacity allocation mechanisms are presented. The main goal of the thesis is to study the current situation in the different electricity markets, to follow the changes coming into force, and to forecast capacity and electricity balances in order to optimize short-term power trading between countries and estimate the possible profit for the company.
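The abstract does not state how the short-term trading is optimized. Purely as an illustration, the sketch below formulates a toy version of capacity-constrained cross-border trading as a linear program with SciPy; the hourly price spreads, capacities and the daily allocated-energy cap are hypothetical numbers, not data from the thesis.

```python
# Illustrative sketch: choose hourly cross-border flows that maximize the captured
# price spread between two market areas, subject to hourly transmission capacity
# and a hypothetical daily allocated-energy cap.
import numpy as np
from scipy.optimize import linprog

spread = np.array([5.0, 12.0, -3.0, 8.0])        # EUR/MWh, area B minus area A
cap_mw = np.array([300.0, 300.0, 200.0, 300.0])  # hourly transmission capacity
daily_energy_cap = 800.0                          # MWh from capacity allocation (assumed)

# linprog minimizes, so negate the spread to maximize profit.
res = linprog(
    c=-spread,
    A_ub=np.ones((1, len(spread))),               # total traded energy <= daily cap
    b_ub=[daily_energy_cap],
    bounds=[(0.0, c) for c in cap_mw],            # 0 <= hourly flow <= capacity
    method="highs",
)
print("hourly flows (MWh):", res.x.round(1), "profit (EUR):", -res.fun)
```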
Abstract:
The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for the machine design variables. The goals, minimizing the Life Cycle Cost (LCC) function that represents the price of the saved energy and maximizing the momentary heat recovery output while satisfying the given constraints and taking the uncertainty in the models into account, were successfully met. The Non-dominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of the system is presented and implemented in the Matlab environment. Markov Chain Monte Carlo (MCMC) methods are also used to take the uncertainty in the models into account. The results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger even though its construction is more expensive (160 EUR/m2) than that of a dry heat exchanger (50 EUR/m2). It is also found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa, and the effect of the uncertainty in the models is illustrated in a simplified case of minimizing the area of a dry heat exchanger.
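As a small illustration of the multi-objective setting (not the thesis' Matlab implementation), the sketch below filters hypothetical heat-exchanger designs down to the non-dominated set, which is the first Pareto front that NSGA-II builds on.

```python
# Illustrative sketch: Pareto filtering of candidate designs with two objectives,
# LCC to be minimized and heat recovery output to be maximized (written as -Q so
# that both objectives are minimized).

def dominates(a, b):
    """True if objective vector a dominates b (all <= and at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(objectives):
    """Return indices of non-dominated points, as in NSGA-II's first front."""
    return [i for i, fi in enumerate(objectives)
            if not any(dominates(fj, fi) for j, fj in enumerate(objectives) if j != i)]

# (LCC in EUR, -heat recovery in kW) for a few hypothetical designs
designs = [(12e3, -80.0), (9e3, -60.0), (15e3, -95.0), (11e3, -55.0)]
print(pareto_front(designs))   # indices of the trade-off designs
```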
Abstract:
In this master's thesis, a four-stage 1 MWe steam turbine prototype model was optimized using evolutionary algorithms, and the cost benefits obtained from the optimization were studied. The DE (Differential Evolution) algorithm was used for the optimization. The optimization was made to work, but because of the nature of the calculation application used in it (models based on semi-empirical equations), the accuracy of the optimization compared to the verification modelling performed with CFD was somewhat lower than hoped. This inaccuracy of the results would hardly have been avoidable, since the problem stemmed from the initial assumptions of the semi-empirical calculation models and from uncertainty about the absolute ranges of validity of the fitted correlations. For the optimization to succeed, such algebraic modelling was nevertheless necessary, because CFD calculations, for example, could not possibly have been carried out at every optimization step. During the optimization, problems still arose with the sufficiency of computing power and with finding a suitable penalty model that would keep the algorithm within the mathematically feasible region without overly restricting the progress of the optimization. The remaining problems were due to the novelty of the application and to precision issues in handling the validity ranges of the fitted correlations. Although the accuracy of the optimization results did not quite meet the target, they nevertheless had a favourable guiding effect on the machine design. The optimization performed with the DE algorithm yielded about 2.2% more power from the turbine, which corresponds to a cost benefit of about 15,000 EUR per machine. For the company this is a very significant per-machine cost benefit. In the end, it can arguably be said that evolutionary algorithms were not at their best in optimizing a prototype product. Evolutionary algorithms hold enormous potential for the optimization of technical devices, but this requires a mature application that is already known extremely well beforehand or that is simple and can be calculated without gaps.
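As an illustration of the approach (not the thesis code, whose semi-empirical turbine model is not reproduced here), the sketch below runs SciPy's Differential Evolution on a hypothetical placeholder model with a simple quadratic penalty that discourages leaving an assumed validity region.

```python
# Illustrative sketch: Differential Evolution with a penalty term, in the spirit of
# keeping the search inside the region where the semi-empirical model is valid.
# The "turbine_power" model and the constraint below are hypothetical placeholders.
import numpy as np
from scipy.optimize import differential_evolution

def turbine_power(x):
    """Placeholder stage-geometry model: returns power (kW) for design vector x."""
    return 1000.0 - np.sum((x - np.array([0.6, 0.4, 0.3])) ** 2) * 50.0

def penalized_objective(x):
    power = turbine_power(x)
    violation = max(0.0, x[0] + x[1] - 1.2)       # hypothetical validity limit
    return -power + 1e4 * violation ** 2          # minimize negative power + penalty

bounds = [(0.2, 1.0), (0.2, 1.0), (0.1, 0.6)]
result = differential_evolution(penalized_objective, bounds, seed=1, maxiter=200)
print(result.x, -result.fun)
```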
Abstract:
The objective of the thesis was to examine the possibilities of designing better-performing nozzles for the heatset drying oven at Forest Pilot Center. To achieve this, two predesigned nozzle types, along with replicas of the current nozzles in the heatset drying oven, were tested on a pilot-scale dryer. During the runnability trials, the pilot dryer was installed between the last printing unit and the drying oven, and the two sets of predesigned nozzles were installed in the dryer in turn. Four web tension values and four impingement air velocities were used, and the web behavior at each trial point was evaluated and recorded. The runnability in all trial conditions was adequate or even good. During the heat transfer trials, each nozzle type was tested at no fewer than two nozzle-to-surface distances and four impingement air velocities. In each test, an aluminum plate fitted with thermocouples was set below a nozzle and the temperature of each measurement block was logged. From the measurements, a heat transfer coefficient profile for the nozzle was calculated, so the performance of each nozzle type under the tested conditions could be rated and compared. The results verified that the predesigned, simpler nozzles were better than the replicas. For runnability reasons, the current nozzles have rows of inclined orifices on the leading and trailing edges. These were believed to deteriorate the overall performance of the nozzle, and trials were conducted to test this hypothesis: the perpendicular orifices and the inclined orifices of a replica nozzle were taped shut in turn, and the performance of the modified nozzles was measured as before and compared to the performance of the complete nozzle. It was found that beyond a certain nozzle-to-surface distance the jets from the two orifice rows collide, which deteriorates the heat transfer.
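The abstract does not describe how the heat transfer coefficient profile was calculated. One common approach with a thermocouple-instrumented block is a lumped-capacitance fit of the temperature response; the sketch below shows that assumed method with hypothetical block properties and jet temperature.

```python
# Illustrative sketch (assumed method): estimate a local heat transfer coefficient
# from the temperature response of an instrumented block with the lumped-capacitance
# model  T(t) - T_jet = (T0 - T_jet) * exp(-h*A*t / (m*cp)),
# so ln|T - T_jet| changes linearly with slope -h*A/(m*cp).
import numpy as np

def heat_transfer_coefficient(t, T, T_jet, m, cp, A):
    """Fit the log-temperature response and return h in W/(m^2*K)."""
    slope, _ = np.polyfit(t, np.log(np.abs(T - T_jet)), 1)
    return -slope * m * cp / A

# hypothetical measurement: block heated by a 150 C impingement jet
t = np.linspace(0.0, 60.0, 61)                               # s
h_true, m, cp, A, T_jet, T0 = 120.0, 0.05, 900.0, 0.0025, 150.0, 25.0
T = T_jet + (T0 - T_jet) * np.exp(-h_true * A * t / (m * cp))
print(round(heat_transfer_coefficient(t, T, T_jet, m, cp, A), 1))   # ~120.0
```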
Abstract:
This study examines ways to increase power generation in pulp mills. The main purpose was to identify and verify the best means of increasing power generation. The literature part of the study presents the operation of the departments of an energy pulp mill and the energy consumption and generation of the recovery and power boilers. The second chapter of this part describes the main directions for increasing electricity generation: raising the black liquor dry solids content, raising the main steam parameters, flue gas heat recovery technologies, and feed water and combustion air preheating. The third chapter of the literature part presents the possible technical, environmental and corrosion risks arising from the described alternatives. In the experimental part of the study, the calculations and results for the possible models combining the alternatives are presented. The possible combinations of alternatives were generated as 44 models of the energy pulp mill. The aim of this part was to determine the extra electricity generation achieved by applying the alternatives and to estimate the profitability of the generated models. The calculations were made with the PROSIM computer program. In the conclusions, the results are assessed on the basis of the extra electricity generation and the equipment design data of the models, and the profitability of the cases is evaluated through their payback periods and additional income.
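As a small illustration of the profitability measure mentioned above (the figures are hypothetical, not results of the study), a simple payback period can be computed from the investment cost and the income from the extra electricity:

```python
# Illustrative sketch: simple payback period for a power-generation upgrade, from the
# extra electricity sold and the investment cost. All numbers are hypothetical.
def simple_payback_years(investment_eur, extra_mwh_per_year, price_eur_per_mwh,
                         extra_operating_cost_eur_per_year=0.0):
    annual_income = extra_mwh_per_year * price_eur_per_mwh - extra_operating_cost_eur_per_year
    return investment_eur / annual_income

# hypothetical upgrade: flue gas heat recovery giving 8 GWh/a of extra generation
print(round(simple_payback_years(3.5e6, 8000.0, 45.0), 1), "years")   # ~9.7 years
```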
Abstract:
Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization. The topic has received increasing interest, especially in the evolutionary computation community, and several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms so that they are effective in both phases in the context of multimodal optimization. In this thesis, several multimodal optimization algorithms are studied with regard to how the implementation of their global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to establish the properties of the different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and sufficiently varied problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists. As a part of this thesis, such a framework for generating tunable test functions for experimentally evaluating multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may nullify the advantages gained from the global phase.
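The tunable test-function framework itself is not described in the abstract. As an assumed illustration of the idea, the sketch below builds a multimodal test function as the upper envelope of Gaussian peaks, so the number, positions, heights and widths of the optima can be controlled directly.

```python
# Illustrative sketch: a tunable multimodal test function with known optima,
# built as the maximum over a set of Gaussian peaks.
import numpy as np

def make_test_function(centers, heights, widths):
    centers, heights, widths = map(np.asarray, (centers, heights, widths))
    def f(x):
        x = np.asarray(x, dtype=float)
        d2 = np.sum((centers - x) ** 2, axis=1)          # squared distance to each peak
        return float(np.max(heights * np.exp(-d2 / (2.0 * widths ** 2))))
    return f

# four known optima in 2-D: three global peaks of height 1.0 and one local peak of 0.8
f = make_test_function(centers=[[0, 0], [3, 3], [-3, 2], [2, -3]],
                       heights=[1.0, 1.0, 1.0, 0.8],
                       widths=[0.5, 0.5, 0.5, 0.7])
print(f([0, 0]), f([2, -3]))   # 1.0 at a global peak, 0.8 at the local peak
```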
Abstract:
The uncertainty of any analytical determination depends on both the analysis and the sampling. The uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work, Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainty estimates obtained are reliable. Variographic experiments, introduced in Gy's sampling theory, are usefully applied in analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of such processes can be observed by variographic analysis as well as with the fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, since it is easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources are used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may ultimately have beneficial effects on the economics of chemical analysis.
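As an illustration of the variographic idea (an assumed sketch, not the thesis code), the experimental variogram below estimates the variability between observations as a function of the sampling interval for a synthetic auto-correlated process series.

```python
# Illustrative sketch: experimental variogram for an auto-correlated 1-D process
# series, V(j) = sum_i (h[i+j] - h[i])^2 / (2*(N - j)), showing how the uncertainty
# contribution grows with the sampling interval j.
import numpy as np

def variogram(h, max_lag):
    h = np.asarray(h, dtype=float)
    n = len(h)
    return [np.sum((h[j:] - h[:n - j]) ** 2) / (2.0 * (n - j)) for j in range(1, max_lag + 1)]

# hypothetical process data: a slow periodic drift plus measurement noise
rng = np.random.default_rng(0)
t = np.arange(500)
data = 10.0 + 0.5 * np.sin(2 * np.pi * t / 100) + rng.normal(0.0, 0.1, size=t.size)
print(np.round(variogram(data, 5), 4))   # variance estimate per sampling interval
```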
Abstract:
Stratospheric ozone can be measured accurately using a limb scatter remote sensing technique in the UV-visible spectral region of solar light. The advantages of this technique include good vertical resolution and good daytime coverage of the measurements. In addition to ozone, UV-visible limb scatter measurements contain information about NO2, NO3, OClO, BrO and aerosols. Several satellite instruments currently scan the atmosphere continuously and measure the UV-visible region of the spectrum, e.g., the Optical Spectrograph and Infrared Imager System (OSIRIS) launched on the Odin satellite in February 2001, and the Scanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY (SCIAMACHY) launched on Envisat in March 2002. Envisat also carries the Global Ozone Monitoring by Occultation of Stars (GOMOS) instrument, which also measures limb-scattered sunlight under bright limb occultation conditions; these conditions occur during daytime occultation measurements. The global coverage of the satellite measurements is far better than that of any other ozone measurement technique, but the measurements are still sparse in the spatial domain. Measurements over a given area are also repeated relatively rarely, and the composition of the Earth's atmosphere changes dynamically. Assimilation methods are therefore needed to combine the information from the measurements with an atmospheric model. In recent years, the focus of assimilation algorithm research has turned towards filtering methods. The traditional Extended Kalman filter (EKF) method takes into account not only the uncertainty of the measurements, but also the uncertainty of the evolution model of the system. However, the computational cost of a full-blown EKF increases rapidly as the number of model parameters increases, so the EKF method cannot be applied directly to the stratospheric ozone assimilation problem. The work in this thesis is devoted to the development of inversion methods for satellite instruments and of assimilation methods used with atmospheric models.
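To make the cost argument concrete, the sketch below shows one predict/update cycle of a linear Kalman filter, the recursion that the EKF linearizes around; it is an assumed illustration, not the assimilation scheme developed in the thesis. The state covariance P is an n x n matrix, and handling it is what becomes prohibitive as the number of model parameters n grows.

```python
# Illustrative sketch: one Kalman predict/update cycle. In an EKF, M and H would be
# Jacobians of the nonlinear evolution and observation models.
import numpy as np

def kalman_step(x, P, M, Q, y, H, R):
    """Predict with model M (noise Q), then update with observation y (operator H, noise R)."""
    x_pred = M @ x
    P_pred = M @ P @ M.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

n = 4                                            # tiny state for illustration
x, P = np.zeros(n), np.eye(n)
M, Q = np.eye(n), 0.01 * np.eye(n)
H, R = np.eye(1, n), np.array([[0.1]])           # observe only the first component
x, P = kalman_step(x, P, M, Q, np.array([1.0]), H, R)
print(x.round(3))
```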
Abstract:
An optimization tool has been developed to help companies optimize their production cycles and thus improve their overall supply chain management processes. The application combines the functionality of traditional APS (Advanced Planning System) and ARP (Automatic Replenishment Program) systems into one optimization run. A qualitative study was organized to investigate opportunities to expand the product's market base. Twelve personal interviews were conducted and the results were collected into industry-specific production planning analyses. Five process industries were analyzed to identify the product's suitability for each industry sector and the most important product development areas. Based on the research, the paper and plastic film industries are currently the most promising industry sectors. To succeed in other industry sectors, some product enhancements would be required, including capabilities to optimize multiple sequential and parallel production cycles, to handle the sequencing of complex finishing operations, and to include master planning capabilities that support overall supply chain optimization. In product sales and marketing, the key to success is to find and reach the people who are directly involved with the problems that the optimization tool can help to solve.
Abstract:
The purpose of this thesis was to create a design guideline for an LCL filter. The thesis briefly reviews the relevant harmonics standards, old filter designs and the problems faced with the previous filters. It then proposes a modified design method based on the "Liserre's method" presented in the literature; the modified method takes network parameters into account better. As input parameters, the method uses the nominal power, the allowed ripple current on the converter and network sides, and the desired resonant frequency of the filter. Essential component selection issues for the LCL filter, such as heating, voltage rating and current rating, are also discussed. Furthermore, a simulation model used to verify the operation of the designed filter at nominal power and in transient situations is included in the thesis.
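As an assumed illustration of such a design sequence (following the commonly cited steps of the Liserre approach rather than the modified, network-aware method of the thesis), the sketch below derives the filter components from the nominal power, the allowed ripple current and the desired resonant-frequency band; all numerical values are hypothetical.

```python
# Illustrative sketch: base values from the nominal power, converter-side inductance
# from the allowed ripple current, capacitance as a fraction of the base capacitance,
# and a check that the resonance lands between 10*f_grid and f_sw/2.
import math

def lcl_sketch(P_n, V_ll, V_dc, f_grid, f_sw, ripple_frac=0.10, cap_frac=0.05, r=0.5):
    Z_b = V_ll ** 2 / P_n                          # base impedance
    C_b = 1.0 / (2 * math.pi * f_grid * Z_b)       # base capacitance
    I_rated = P_n / (math.sqrt(3) * V_ll)
    dI_max = ripple_frac * math.sqrt(2) * I_rated  # allowed peak ripple current
    L1 = V_dc / (6.0 * f_sw * dI_max)              # converter-side inductance
    L2 = r * L1                                    # grid-side inductance
    Cf = cap_frac * C_b                            # filter capacitance
    f_res = math.sqrt((L1 + L2) / (L1 * L2 * Cf)) / (2 * math.pi)
    assert 10 * f_grid < f_res < 0.5 * f_sw, "resonance outside the recommended band"
    return L1, L2, Cf, f_res

print(lcl_sketch(P_n=10e3, V_ll=400.0, V_dc=650.0, f_grid=50.0, f_sw=8e3))
```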