995 results for Continuous Optimization
Abstract:
In this undergraduate thesis (TFG), the results of a comparison between different methods of obtaining a recombinant protein, by orthologous and heterologous expression, are presented. This study will help us identify the best way to express and purify a recombinant protein to be used in biotechnology applications. In the first part of the project, the goal was to find the best expression and purification system for obtaining the recombinant protein of interest. To achieve this objective, expression systems in bacteria and in yeast were designed. The DNA was cloned into two different expression vectors to create a fusion protein with two different tags, and expression of the protein was induced by IPTG or glucose. Additionally, in yeast, two promoters were used to express the protein: the promoter of the protein itself (orthologous expression) and the ENO2 promoter (heterologous expression). The protein of interest is a NAD-dependent enzyme, so, in a second phase, its specific activity was evaluated by coenzyme conversion. The results of the TFG suggest that, comparing the model organisms, bacteria are more efficient than yeast, because the quantity of protein obtained is higher and the purification is better. Regarding yeast, comparing the two expression mechanisms that were designed, heterologous expression works much better than orthologous expression, so if yeast is to be used as the expression model for the protein of interest, ENO2 will be the best option. Finally, the enzymatic assays, performed to compare the effectiveness of the different expression mechanisms with respect to protein activity, revealed that the protein purified from yeast had more activity in converting the NAD coenzyme.
Abstract:
The threats posed by global warming motivate different stakeholders to deal with and control them. This Master's thesis focuses on analyzing carbon trade permits in an optimization framework. The studied model determines the optimal emission and uncertainty levels that minimize the total cost. Research questions are formulated and answered using different optimization tools. The model is developed and calibrated using available consistent data in the area of carbon emission technology and control. Data and some basic modeling assumptions were extracted from reports and the existing literature. Data collected from the countries in the Kyoto treaty are used to estimate the cost functions. The theory and methods of constrained optimization are briefly presented. A two-level optimization problem (individual and between the parties) is analyzed using several optimization methods. The combined cost optimization between the parties leads to a multivariate model and calls for advanced techniques; Lagrangian methods, Sequential Quadratic Programming, and the Differential Evolution (DE) algorithm are discussed. The role of inherent measurement uncertainty in the monitoring of emissions is considered. We briefly investigate an approach in which emission uncertainty would be described in a stochastic framework. MATLAB software has been used to provide visualizations, including the relationship between decision variables and objective function values. Interpretations in the context of carbon trading are briefly presented. Suggestions for future work are given in stochastic modeling, emission trading, and the coupled analysis of energy prices and carbon permits.
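As a hedged illustration of the kind of constrained cost minimization described above, the sketch below uses SciPy's SLSQP solver. The abatement-cost functions, coefficients, and permit cap are hypothetical placeholders, not the thesis's calibrated model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical convex abatement-cost functions for two parties:
# cost grows quadratically as emissions e_i drop below baseline e0_i.
def total_cost(e, a=(2.0, 3.5), e0=(10.0, 8.0)):
    return sum(ai * (e0i - ei) ** 2 for ai, e0i, ei in zip(a, e0, e))

# Joint cap: combined emissions must not exceed an assumed permit total of 12.
constraints = [{"type": "ineq", "fun": lambda e: 12.0 - e.sum()}]
bounds = [(0.0, 10.0), (0.0, 8.0)]

res = minimize(total_cost, x0=np.array([6.0, 6.0]), method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)  # optimal emission split and minimum total cost
```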
Abstract:
Society's dependence on electricity has grown strongly over recent decades. Short and long interruptions in electricity distribution have demonstrated society's vulnerability, and society tolerates disturbances in electricity distribution less and less. The valuation of the harm caused by interruptions has grown, and this has created the economic justification for investments that improve power quality. The medium-voltage lines in sparsely populated rural areas have been built as overhead lines and are therefore exposed to storm and snow-load damage caused by weather conditions. Climate change is predicted to increase windiness, and problems in electricity distribution may therefore increase. In urban areas more cables are used and feeders are short, so there are fewer storm-caused interruptions than in rural areas. Existing distribution networks will remain in use for decades, so alongside the development of new technology, the existing distribution network and its maintenance must also be developed. In addition to improving reliability, the goal of maintenance is to ensure that the assets tied up in distribution networks retain their value as well as possible until the end of their service life. Large investments were made in distribution networks in the 1950s–70s. Wooden poles from that era are still in use, and as they age the need for replacement investments grows. A positive aspect of this is that the existing distribution network does not need to be renewed prematurely in order to improve reliability. The study focuses on developing the 20 kV medium-voltage network in rural areas, since over 90% of the interruptions experienced by customers are caused by faults in the medium-voltage network. Particular attention must be paid to line structures and line routing. In addition to reliability, the factors guiding the development of distribution networks are economy, consideration of the environment, regulatory supervision, and the expectations of customers and owners. In rural areas the economic challenges are great because of the decline in the permanent population and a possible decrease in electricity demand. Economic efficiency becomes more important and risks grow as revenues shrink relative to the required network investments and maintenance costs. A conflict arises from the fact that customers expect better reliability from electricity distribution but are hardly willing to pay more than at present for better power quality. Regulatory supervision may also slow the development of distribution networks if revenues cannot be increased in proportion to the additional investment needs. The study analyses, at a general level, increased cabling, the use of tall poles, wide line corridors, the construction of inexpensive and simple substations in rural areas, and the addition of automation stations at nodes of the medium-voltage network. In particular, the study analyses, as a new technique, the potential of a 1000 V voltage level in the development of distribution networks. Moving power lines alongside roads improves reliability even if the lines are built with the same technology as the existing lines. With substations built in rural areas, long feeders can be divided into smaller supply areas, so the harm caused by interruptions affects a smaller number of customers at a time. The same result is achieved with properly located and implemented automation stations. According to the study, adopting a 1000 V voltage level in addition to the 400 V low voltage is proving to be a promising technique in distribution network development.
1000 V networks can replace fault-prone short branch lines of the 20 kV medium-voltage network, less than five kilometres long, and extensions of branch lines where the transmitted power is small. In the new distribution system, electricity is brought at 1000 V close to the customer, where the voltage is transformed to the normal 400/230 V suitable for customers. The cost advantage is based on using the same low-voltage lines in construction as in the 400 V low-voltage network serving customers. In 1000 V distribution, both investment and maintenance costs are lower than in conventional 20 kV overhead-line technology. 1000 V lines spare the landscape, since they do not require a wide line corridor like 20 kV medium-voltage lines. The use of 1000 V networks is therefore particularly suitable for electrifying holiday homes in sensitive shore and lake landscapes. 1000 V networks make it possible to increase cable ploughing and thus reduce the use of environmentally harmful impregnated poles. The results of the research on 1000 V distribution networks have been applied at the Finnish company Suur-Savon Sähkö Oy, and practical experience of the 1000 V distribution system has been gained from several dozen sites. The results show that underground cabling of the medium-voltage network in rural areas is not economically viable at the current valuations of interruption harm, but if the valuation of interruption costs grows, cabling becomes profitable in many places. An increase in storminess and in storm-caused distribution interruptions would also make cabling profitable. In the future, building distribution networks will be an increasingly multifaceted task in which, besides economy and reliability, customers, owners, authorities, and the environment must be taken into account. Further research into developing distribution technology is still needed, and the future development of rural distribution networks involves many uncertainties. The growth of distributed, property-specific electricity generation may make distribution networks less necessary than today, but, for example, the electrification of transport may increase their importance. For this reason, flexibility is needed in building distribution networks so that different development paths can easily be accommodated when necessary.
Abstract:
Current technology trends in the medical device industry call for the fabrication of massive arrays of microfeatures, such as microchannels, onto non-silicon material substrates with high accuracy, superior precision, and high throughput. Microchannels are typical features used in medical devices for dosing medication into the human body and for analyzing DNA arrays or cell cultures. In this study, the capabilities of machining systems for micro-end milling have been evaluated by conducting experiments, regression modeling, and response surface methodology. In the machining experiments, arrays of microchannels are fabricated on aluminium and titanium plates by micromilling, and the feature size and accuracy (width and depth) and surface roughness are measured. Multicriteria decision making for material and process parameter selection for the desired accuracy is investigated using the particle swarm optimization (PSO) method, a population-based evolutionary computation method related to genetic algorithms (GA). Appropriate regression models are utilized within the PSO, and optimum selection of micromilling parameters for microchannel feature accuracy and surface roughness is performed. An analysis of optimal micromachining parameters in the decision variable space is also conducted. This study demonstrates the advantages of evolutionary computing algorithms in micromilling decision making and process optimization investigations, and can be expanded to other applications.
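A minimal sketch of the PSO pattern the abstract describes follows. The objective here is a hypothetical stand-in for a fitted regression model of feature error; the parameter names, coefficients, and scaling are illustrative assumptions, not the study's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a fitted regression model: predicted channel-width
# error as a function of (spindle speed, feed rate), both scaled to [0, 1].
def predicted_error(p):
    s, f = p
    return (s - 0.6) ** 2 + 2.0 * (f - 0.3) ** 2 + 0.1 * s * f

n, dim, iters = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration coefficients
x = rng.random((n, dim))                 # particle positions (parameter sets)
v = np.zeros((n, dim))                   # particle velocities
pbest = x.copy()                         # personal best positions
pbest_f = np.array([predicted_error(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()   # global best position

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 1.0)         # keep parameters in the feasible range
    f = np.array([predicted_error(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(gbest, pbest_f.min())              # best parameter set and predicted error
```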
Abstract:
The objective of the thesis was to examine the possibilities of designing better-performing nozzles for the heatset drying oven at Forest Pilot Center. To achieve this objective, two predesigned nozzle types, along with replicas of the current nozzles in the heatset drying oven, were tested on a pilot-scale dryer. During the runnability trials, the pilot dryer was installed between the last printing unit and the drying oven. The two sets of predesigned nozzles were consecutively installed in the dryer. Four web tension values and four different impingement air velocities were used, and the web behavior during the trial points was evaluated and recorded. The runnability in all trial conditions was adequate or even good. During the heat transfer trials, each nozzle type was tested at two or more different nozzle-to-surface distances and four different impingement air velocities. In a test situation, an aluminum plate fitted with thermocouples was set below a nozzle, and the temperature measurement of each block was logged. From the measurements, a heat transfer coefficient profile for the nozzle was calculated. The performance of each nozzle type in the tested conditions could then be rated and compared. The results verified that the predesigned, simpler nozzles were better than the replicas. For runnability reasons, there were rows of inclined orifices on the leading and trailing edges of the current nozzles. These were believed to deteriorate the overall performance of the nozzle, and trials were conducted to test this hypothesis. The perpendicular orifices and inclined orifices of a replica nozzle were consecutively taped shut, the performance of the modified nozzles was measured as before, and the results were compared to the performance of the whole nozzle. It was found that, beyond a certain nozzle-to-surface distance, the jets from the two orifice rows collide, which deteriorates the heat transfer.
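One common way to turn logged plate temperatures into a heat transfer coefficient is a lumped-capacitance balance; the sketch below shows that calculation. The temperature trace, block geometry, material constants, and jet temperature are hypothetical values for illustration, not the trial's measurements or the thesis's exact method.

```python
import numpy as np

# Hypothetical logged data: block temperature (deg C) sampled once per second.
t = np.arange(0.0, 30.0, 1.0)
T_block = 20.0 + 60.0 * (1.0 - np.exp(-t / 12.0))  # synthetic heating curve

# Assumed block properties and jet temperature (illustrative values only).
rho, cp = 2700.0, 900.0   # aluminium density (kg/m^3) and heat capacity (J/kg K)
thickness = 0.005         # block thickness (m); volume/area ratio = thickness
T_jet = 80.0              # impingement air temperature (deg C)

# Lumped-capacitance balance: rho*V*cp*dT/dt = h*A*(T_jet - T), so
# h = rho*thickness*cp*(dT/dt) / (T_jet - T), evaluated pointwise.
dTdt = np.gradient(T_block, t)
h = rho * thickness * cp * dTdt / (T_jet - T_block)
print(h.mean())           # average heat transfer coefficient (W/m^2 K)
```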
Abstract:
A continuous random variable is expanded as a sum of a sequence of uncorrelated random variables. These variables are the principal dimensions in continuous scaling on a distance function, an extension of classic scaling on a distance matrix. For a particular distance, these dimensions are principal components. Some properties are then studied and an inequality is obtained. Diagonal expansions are considered from the same continuous scaling point of view, by means of the chi-square distance. The geometric dimension of a bivariate distribution is defined and illustrated with copulas. It is shown that the dimension can have the power of the continuum.
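In schematic notation, a hedged sketch of the expansion the abstract describes (the symbols are chosen here for illustration and are not taken from the paper):

\[
X - \mathbb{E}[X] = \sum_{k=1}^{\infty} X_k, \qquad \operatorname{Cov}(X_j, X_k) = 0 \quad (j \neq k),
\]

so that in particular \(\operatorname{Var}(X) = \sum_{k} \operatorname{Var}(X_k)\), with each \(X_k\) a principal dimension obtained by continuous scaling on the distance function.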
Abstract:
Today's organizations must have the ability to react to rapid changes in the market. These rapid changes create pressure to continuously find new, efficient ways to organize work practices. Increased competition requires businesses to become more effective, to pay attention to the quality of management, and to make people understand their work's impact on the final result. The fundamentals of continuous improvement are the systematic and agile tackling of identified individual process constraints and the fact that nothing ultimately improves without changes. Successful continuous improvement requires management commitment, education, implementation, measurement, recognition, and regeneration. These ingredients form the foundation both for breakthrough projects and for small-step ongoing improvement activities. One part of the organization's management system is the set of quality tools, which provide systematic methodologies for identifying problems, defining their root causes, finding solutions, gathering and sorting data, supporting decision making, implementing changes, and many other management tasks. Organizational change management comprises the processes and tools for managing people in an organization-level change. These tools include a structured approach that can be used for the effective transition of organizations through change. When combined with an understanding of how individuals experience change, these tools provide a framework for managing people in change.
Abstract:
This study examines ways to increase power generation in pulp mills. The main purpose was to identify and verify the best ways of increasing power generation. The literature part of this study presented the operation of pulp mill energy departments and the energy consumption and generation of the recovery and power boilers. The second chapter of this part described the main directions for increasing electricity generation: raising the black liquor dry solids content, increasing the main steam parameters, flue gas heat recovery technologies, and feed water and combustion air preheating. The third chapter of the literature part presented the possible technical, environmental, and corrosion risks arising from the described alternatives. In the experimental part of this study, the calculations and results of possible models with these alternatives were presented. The possible combinations of alternatives were generated into 44 models of the energy pulp mill. The target of this part was to determine the extra electricity generation obtained by applying the alternatives and to estimate the profitability of the generated models. The calculations were made with the computer programme PROSIM. In the conclusions, the results were evaluated on the basis of the extra electricity generation and the equipment design data of the models. The profitability of the cases was verified by their payback periods and additional income.
Abstract:
We generalize to arbitrary waiting-time distributions some results which were previously derived for discrete distributions. We show that, for any two waiting-time distributions with the same mean delay time, the one with higher dispersion will lead to a faster front. Experimental data on the speed of virus infections in a plaque are correctly explained by the theoretical predictions using a Gaussian delay-time distribution, which is more realistic for this system than the Dirac delta distribution considered previously [J. Fort and V. Méndez, Phys. Rev. Lett. 89, 178101 (2002)].
Abstract:
Software integration is the stage in a software development process in which separate components are assembled into a single product. It is important to manage the risks involved and to be able to integrate smoothly, because software cannot be released without integrating it first. Furthermore, it has been shown that the integration and testing phase can make up 40% of the overall project costs. These issues can be mitigated by using a software engineering practice called continuous integration. This thesis presents how continuous integration was introduced into the author's employer organisation. This includes studying how the continuous integration process works and creating the technical basis for using the process on future projects. The implemented system supports software written in the C and C++ programming languages on the Linux platform, but the general concepts can be applied to any programming language and platform by selecting the appropriate tools. The results demonstrate in detail which issues need to be solved when the process is adopted in a corporate environment. Additionally, they provide an implementation and a process description suited to the organisation. The results show that continuous integration can reduce the risks involved in a software process and increase the quality of the product as well.
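A minimal sketch of the kind of automated build-and-test step a continuous integration server runs for a make-based C/C++ project on Linux. The step names, make targets, and report format here are assumptions for illustration, not the thesis's actual implementation.

```python
import subprocess
import sys

# Hypothetical CI step: checkout is assumed done; build and test a make-based
# C/C++ project, failing fast so the CI server can flag the offending commit.
def run(step, cmd):
    print(f"[ci] {step}: {' '.join(cmd)}")
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print(result.stdout)
        print(result.stderr, file=sys.stderr)
        sys.exit(f"[ci] {step} failed with exit code {result.returncode}")

run("build", ["make", "all"])    # compile the C/C++ sources
run("test", ["make", "check"])   # run the unit test target
print("[ci] build and tests passed")
```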
Abstract:
Metaheuristic methods have become increasingly popular approaches to solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization. The topic has received increasing interest, especially in the evolutionary computation community, and several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima sets additional requirements for the design of algorithms, which must be effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementations of the global and local search phases affect their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method; a sketch of this hybrid pattern is given below. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to determine the properties of different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varied problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists. As a part of this thesis, such a framework for generating tunable test functions for experimentally evaluating different multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may invalidate the advantages gained from the global phase.
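A minimal sketch of the global/local hybrid pattern: standard DE/rand/1/bin as the global phase, followed by a Nelder-Mead refinement of each surviving population member as the local phase. The test function, parameters, and the absence of any niching mechanism are simplifications for illustration; they are not the thesis's proposed algorithms.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Illustrative multimodal test function (Rastrigin: many local minima).
def f(x):
    x = np.asarray(x)
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

dim, pop_size, F, CR, gens = 2, 30, 0.5, 0.9, 60
pop = rng.uniform(-5.12, 5.12, (pop_size, dim))
fit = np.array([f(p) for p in pop])

# Global phase: DE/rand/1/bin with greedy selection.
for _ in range(gens):
    for i in range(pop_size):
        others = [j for j in range(pop_size) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        mutant = a + F * (b - c)
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True  # guarantee at least one mutated gene
        trial = np.where(cross, mutant, pop[i])
        f_trial = f(trial)
        if f_trial < fit[i]:
            pop[i], fit[i] = trial, f_trial

# Local phase: refine each member with Nelder-Mead; the distinct refined
# points show which basins the population still covers.
optima = {tuple(np.round(minimize(f, p, method="Nelder-Mead").x, 2)) for p in pop}
print(sorted(optima))
```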
Abstract:
Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often based on theoretical descriptions of the methods. An engineer, however, requires evidence from experimental evaluations in order to make an appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyzed a set of representative state-of-the-art techniques for the problem we are dealing with, namely the road passenger transportation problem. This is an optimization problem in which drivers must be assigned to transport services, fulfilling some constraints and minimizing some cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on our experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
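In its simplest unconstrained form, the core driver-to-service assignment can be sketched with SciPy's Hungarian-algorithm solver. The cost matrix below is a toy example; the paper compares far richer techniques and constraint handling than this.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: cost[d, s] is the cost of assigning driver d
# to transport service s (illustrative numbers only).
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])

drivers, services = linear_sum_assignment(cost)  # minimizes total cost
for d, s in zip(drivers, services):
    print(f"driver {d} -> service {s} (cost {cost[d, s]})")
print("total cost:", cost[drivers, services].sum())
```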
Abstract:
In this study, different solutions for extracting vitamin C were tested. High-performance liquid chromatography was chosen, with conditions based on isocratic elution on a reversed-phase column. Dehydroascorbic acid was determined indirectly, after its reduction with dithiothreitol. The use of metaphosphoric acid to stabilize the vitamin C was shown to be required, and it was necessary to neutralize the pH of the extract in order to apply dithiothreitol. The average recovery was 90% in collard and tomato samples. The presence of oil did not interfere with the extraction, and the methodology can be used to analyze stir-fried vegetables.
Abstract:
The carrot leaf dehydration conditions in an air circulation oven were optimized through response surface methodology (RSM) to minimize the degradation of polyunsaturated fatty acids, particularly alpha-linolenic acid (LNA, 18:3n-3). The optimized leaf drying time and temperature were 43 h and 70 ºC, respectively. The fatty acids (FA) were investigated using gas chromatography with a flame ionization detector and a fused silica capillary column; the FA were identified against standards and on the basis of equivalent chain length. LNA and the other FA were quantified against a C21:0 internal standard. After dehydration, the amount of LNA in the dehydrated carrot leaves was 984 mg/100 g dry matter.
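A minimal sketch of the RSM step the abstract relies on: fitting a second-order response surface to design points and locating its stationary point. The design points and responses below are synthetic stand-ins, not the study's measurements.

```python
import numpy as np

# Synthetic design points: (time h, temperature deg C) -> LNA retention (%)
# (illustrative values only, not the study's data).
X = np.array([[30, 60], [30, 80], [50, 60], [50, 80], [40, 70],
              [40, 60], [40, 80], [30, 70], [50, 70]])
y = np.array([70.0, 62.0, 68.0, 55.0, 75.0, 72.0, 63.0, 71.0, 69.0])

t, T = X[:, 0], X[:, 1]
# Second-order model: y = b0 + b1*t + b2*T + b11*t^2 + b22*T^2 + b12*t*T
A = np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point: solve grad = 0, i.e. [[2*b11, b12], [b12, 2*b22]] x = -[b1, b2]
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
stationary = np.linalg.solve(H, -b[1:3])
print("fitted coefficients:", b.round(4))
print("stationary point (time, temperature):", stationary.round(1))
```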