956 results for Scheduling optimization


Relevance: 20.00%

Abstract:

The iron and steelmaking industry is among the major contributors to the anthropogenic carbon dioxide emissions in the world. The rising level of CO2 in the atmosphere and the global concern about the greenhouse effect and climate change have prompted considerable research on how to reduce the energy intensity and CO2 emissions of this industrial sector. In this thesis the problem is tackled by mathematical modeling and optimization using three different approaches. First, the possibility of using biomass in the integrated steel plant, particularly as an auxiliary reductant in the blast furnace, is investigated. Pre-processing the biomass increases its heating value and carbon content while decreasing its oxygen content. As the compression strength of the pre-processed biomass is lower than that of coke, it is not suitable for replacing a major part of the coke in the blast furnace burden; the biomass is therefore assumed to be injected at the tuyere level of the blast furnace. Second, carbon capture and storage (CCS), nowadays mostly associated with power plants, can also be used to reduce the CO2 emissions of an integrated steel plant. In the case of a blast furnace, the effect of CCS can be further increased by recycling the CO2-stripped top gas back into the process. However, this affects the economy of the integrated steel plant, as the amount of top gas available, e.g., for power and heat production, is decreased. Third, high-quality raw materials are a prerequisite for smooth blast furnace operation: high-quality coal is especially needed to produce coke with sufficient properties to ensure proper gas permeability and smooth burden descent. Lower-quality coals, as well as natural gas, which some countries have in great volumes, can be utilized in various direct and smelting reduction processes.
The DRI (direct reduced iron) produced with a direct reduction process can be utilized as a feed material for the blast furnace, basic oxygen furnace, or electric arc furnace. The liquid hot metal from a smelting reduction process can in turn be used in a basic oxygen furnace or an electric arc furnace. The unit sizes and investment costs of these alternative ironmaking processes are also lower than those of a blast furnace. In this study, the economy of an integrated steel plant is investigated by simulation and optimization. The studied system consists of linearly described unit processes from the coke plant to the steelmaking units, with a more detailed thermodynamic model of the blast furnace. The results from blast furnace operation with biomass injection revealed the importance of proper pre-processing of the raw biomass, as the composition, heating value, and yield of the biomass are all affected by the pyrolysis temperature. As for recycling of CO2-stripped blast furnace top gas, substantial reductions in the emission rates are achieved if the stripped CO2 can be stored. However, the optimal recycling degree, together with the other operating conditions, is heavily dependent on the cost structure of CO2 emissions and stripping/storage. The economic feasibility of using DRI in the blast furnace depends on the price ratio between DRI pellets and BF pellets. The high amount of energy needed in the rotary hearth furnace to reduce the iron ore leads to increased CO2 emissions.
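The dependence of the optimal recycling degree on the cost structure, noted above, can be illustrated with a deliberately simplified sketch. All coefficients below are hypothetical and chosen only to show the shape of the trade-off; they are not taken from the thesis:

```python
def total_cost(recycling_degree, c_emission=30.0, c_strip=12.0):
    """Cost per tonne of steel as a function of top-gas recycling degree (0..1).

    Hypothetical linear model: recycling cuts emitted CO2, but the
    stripped CO2 incurs a stripping/storage cost instead.
    """
    base_emissions = 1.6                               # t CO2 / t steel (assumed)
    emitted = base_emissions * (1.0 - 0.4 * recycling_degree)
    stripped = base_emissions * 0.4 * recycling_degree
    return c_emission * emitted + c_strip * stripped

# Brute-force scan over recycling degrees
degrees = [i / 100 for i in range(101)]
best = min(degrees, key=total_cost)                    # with cheap storage
costly = min(degrees, key=lambda r: total_cost(r, c_strip=40.0))  # with costly storage
```

With cheap stripping/storage the scan favors full recycling; when the stripping cost is high enough relative to the emission cost, the optimum flips to no recycling, mirroring the cost-structure dependence described above.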

Relevance: 20.00%

Abstract:

Many quantitative problems from widely different fields can be formulated as optimization problems: a measure of the quality of the solution is optimized while certain conditions on the solution are fulfilled. The quality measure, usually called the objective function, may describe costs (e.g., production, logistics), potential energy (molecular modeling, protein folding), risk (finance, insurance), or some other relevant quantity. This doctoral thesis specifically discusses nonlinear programming, NLP, in finite dimensions. Problems with a simple structure, for example some form of convexity, can be solved efficiently. Unfortunately, not all quantitative relationships can be modeled in a convex way. Non-convex problems can be attacked with heuristic methods, algorithms that search for solutions using deterministic or stochastic rules of thumb. Sometimes this works well, but heuristics can rarely guarantee the quality of the solution, or even that a solution is found at all. For some applications this is unacceptable. Instead, so-called global optimization can be applied: by successively dividing the variable domain into smaller parts and computing stronger bounds on the optimal value, a solution within the error tolerance is found. This method is called branch-and-bound. To provide lower bounds (in minimization), the problem is approximated by simpler problems, for example convex ones, which can be solved efficiently. The thesis studies approaches for approximating differentiable functions with convex underestimators, in particular the so-called alphaBB method. This method adds perturbations of a certain form and guarantees convexity by imposing conditions on the perturbed Hessian matrix. My research has brought forward a natural extension of the perturbations used in alphaBB. New methods for determining underestimation parameters have been described and compared. The summary part of the thesis discusses global optimization from a broader perspective on optimization and computational algorithms.
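As a minimal illustration of the alphaBB construction discussed above (the classical one-dimensional underestimator, not the extended perturbations studied in the thesis):

```python
import math

def alpha_bb_underestimator(f, alpha, lo, hi):
    """Classical alphaBB convex underestimator of f on [lo, hi]:

        L(x) = f(x) + alpha * (lo - x) * (hi - x)

    The added quadratic is non-positive on the interval, so L <= f there,
    and choosing alpha >= max(0, -min f''/2) over [lo, hi] makes L convex.
    """
    def L(x):
        return f(x) + alpha * (lo - x) * (hi - x)
    return L

# Example: f = sin on [0, 2*pi]; f''(x) = -sin(x) >= -1, so alpha = 0.5 suffices.
f = math.sin
L = alpha_bb_underestimator(f, 0.5, 0.0, 2 * math.pi)
```

The minimum of the convex L over the box is then a valid lower bound on the minimum of f, which is exactly what the branch-and-bound lower-bounding step needs.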

Relevance: 20.00%

Abstract:

The objective of this study was to optimize and validate a solid-liquid extraction (ESL) technique for the determination of picloram residues in soil samples. At the optimization stage, the optimal conditions for the extraction of the soil samples were determined using univariate analysis; the soil/extraction-solution ratio, the type and duration of agitation, and the ionic strength and pH of the extraction solution were evaluated. Based on the optimized parameters, the following method of extraction and analysis of picloram was developed: weigh 2.00 g of soil, dried and sieved through a 2.0 mm mesh sieve; add 20.0 mL of 0.5 mol L-1 KCl; shake the bottle in a vortex mixer for 10 seconds to form a suspension and adjust to pH 7.00 with 0.1 mol L-1 KOH. Homogenize the system in a shaker for 60 minutes and then let it stand for 10 minutes. The bottles are centrifuged for 10 minutes at 3,500 rpm. After settling of the soil particles and cleaning of the supernatant extract, an aliquot is withdrawn and analyzed by high-performance liquid chromatography. The optimized method was validated by determining selectivity, linearity, detection and quantification limits, precision and accuracy. The ESL methodology was efficient for the analysis of residues of the pesticide studied, with recovery percentages above 90%. The limits of detection and quantification were 20.0 and 66.0 mg kg-1 soil for the PVA soil, and 40.0 and 132.0 mg kg-1 soil for the VLA soil. The coefficients of variation (CV) were 2.32 and 2.69 for the PVA and VLA soils, respectively. The methodology resulted in low organic solvent consumption and cleaner extracts, and no purification steps were required before chromatographic analysis. The parameters evaluated in the validation process indicated that the ESL methodology is efficient for the extraction of picloram residues in soils, with low limits of detection and quantification.

Relevance: 20.00%

Abstract:

This study examines the excess returns provided by G10 currency carry trading during the Euro era. The currency carry trade has been a popular strategy throughout the past decades, offering excess returns to investors. The thesis aims to contribute to existing research on the topic by utilizing a new set of data for the Euro era and by using the euro as the base currency of the study. The focus of the thesis is specifically on the performance, risk, and diversification benefits of different carry trade strategies. The study finds evidence of the failure of the uncovered interest rate parity theory through multiple regression analyses. Furthermore, it finds evidence of significant diversification benefits in terms of the Sharpe ratio and improved return distributions. The results suggest that currency carry trades offered excess returns during 1999-2014 and that volatility plays an important role in carry trade returns. The risk, however, is diversifiable, and the results therefore support previous quantitative research findings on the topic.
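The Sharpe ratio used above to quantify diversification benefits is simply the mean excess return per unit of return volatility. A sketch with made-up monthly return series (not the thesis's data):

```python
import statistics

def sharpe_ratio(returns, risk_free=0.0):
    """Mean excess return per unit of volatility (annualization omitted)."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Hypothetical monthly returns: a single-pair carry trade vs. a diversified basket
single = [0.02, -0.01, 0.03, -0.02, 0.015, 0.01]
basket = [0.012, 0.002, 0.018, -0.004, 0.011, 0.009]
```

Even with a similar mean return, the basket's lower volatility yields the higher Sharpe ratio, which is the sense in which carry-trade risk is diversifiable.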

Relevance: 20.00%

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes, and thereby also the parallelism, explicit. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is very natural within this field: digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independence of the nodes also implies that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then an as-small-as-possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model for model checking: the model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
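The firing rule described above, where a node fires only when all of its input queues hold sufficient tokens, independently of every other node, can be sketched as a toy actor. This illustrates the general dataflow model only; RVC-CAL actors are considerably richer:

```python
from collections import deque

class Node:
    """Minimal dataflow actor: input queues are the only communication."""

    def __init__(self, func, n_inputs):
        self.func = func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.need = 1                    # tokens consumed per input per firing
        self.output = deque()            # edge toward downstream nodes

    def can_fire(self):
        """Firing rule: every input queue must hold enough tokens."""
        return all(len(q) >= self.need for q in self.inputs)

    def fire(self):
        """Consume one token per input, produce one output token."""
        args = [q.popleft() for q in self.inputs]
        self.output.append(self.func(*args))

# A two-input adder: it can fire once, then blocks on the empty second queue.
adder = Node(lambda a, b: a + b, n_inputs=2)
adder.inputs[0].extend([1, 2])
adder.inputs[1].append(10)
while adder.can_fire():
    adder.fire()
```

The scheduling problem discussed in the thesis is precisely the question of when, and on which core, loops like the `while` above should run each actor.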

Relevance: 20.00%

Abstract:

Identification of low-dimensional structures and the main sources of variation in multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered as the relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust-region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential-equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate. Examples include the identification of faults from seismic data and the identification of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated. A further contribution of the thesis is an efficient semidefinite optimization method for embedding graphs into Euclidean space.
The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but it also has potential applications in graph theory and various areas of physics, chemistry, and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
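For a flavor of the kernel-density ingredient, here is a sketch that finds a maximum (not a ridge) of a one-dimensional Gaussian kernel density using a plain mean-shift fixed-point iteration; the thesis itself uses a trust-region Newton method and handles the general ridge case:

```python
import math

def kde(x, data, h):
    """Gaussian kernel density estimate at x with bandwidth h."""
    c = len(data) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / c

def mean_shift(x, data, h, iters=200):
    """Fixed-point iteration converging to a local maximum of the KDE."""
    for _ in range(iters):
        w = [math.exp(-0.5 * ((x - d) / h) ** 2) for d in data]
        x = sum(wi * di for wi, di in zip(w, data)) / sum(w)
    return x

# Two clusters; starting near the first, the iteration climbs to its center.
data = [0.9, 1.0, 1.1, 4.8, 5.0, 5.2]
mode = mean_shift(0.5, data, h=0.5)
```

A ridge generalizes this: instead of a point where the full gradient vanishes, one seeks points that are maxima only in the directions of the smallest Hessian eigenvalues, which is why stronger Newton-type machinery is needed.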

Relevance: 20.00%

Abstract:

The objective of this thesis is to examine distribution network designs and modeling practices and to create a framework for identifying the best possible distribution network structure for the case company. The main research question therefore is: how can the case company's distribution network be optimized in terms of customer needs and costs? The theory chapters introduce the basic building blocks of distribution network design and the required calculation methods and models. A framework for distribution network projects was created based on the theory, and the case study was carried out by following the defined framework. The distribution network calculations were based on the company's sales plan for the years 2014-2020. The main conclusions and recommendations were that the new Asian business strategy requires high investments in logistics; the first step is to open a new satellite DC in China as soon as possible to support sales, and a second possible step is to open a regional DC in Asia within 2-4 years.

Relevance: 20.00%

Abstract:

Almost every problem of design, planning, and management in technical and organizational systems has several conflicting goals or interests, and multicriteria decision models nowadays represent a rapidly developing area of operations research. When solving practical optimization problems, it is necessary to take into account various kinds of uncertainty due to lack of data, inadequacy of the mathematical models to the real processes, calculation errors, etc. In practice, this uncertainty usually leads to undesirable outcomes where the solutions are very sensitive to any changes in the input parameters; investment management is one example. Stability analysis of multicriteria discrete optimization problems investigates how the solutions found behave in response to changes in the initial data (input parameters). This thesis is devoted to stability analysis in the problem of selecting investment project portfolios, which are optimized by considering different types of risk and the efficiency of the investment projects. The stability analysis is carried out with two approaches: qualitative and quantitative. The qualitative approach describes the behavior of solutions under small perturbations of the initial data. The stability of solutions is defined in terms of the existence of a neighborhood in the initial data space such that any perturbed problem from this neighborhood is stable with respect to the set of efficient solutions of the initial problem. The other approach to stability analysis studies quantitative measures such as the stability radius. This approach gives information about the limits of perturbations of the input parameters that do not lead to changes in the set of efficient solutions. In the present thesis, several results were obtained, including attainable bounds for the stability radii of Pareto-optimal and lexicographically optimal portfolios of the investment problem with Savage's and Wald's criteria and the criterion of extreme optimism.
In addition, special classes of the problem for which the stability radii can be expressed by explicit formulae were identified. The investigations were carried out using different combinations of the Chebyshev, Manhattan, and Hölder metrics, which allowed perturbations of the input parameters to be measured in different ways.
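A toy numerical sketch of the qualitative notion above, with entirely hypothetical (risk, return) data: the efficient set of a small discrete instance is computed, and a Chebyshev-bounded perturbation of one portfolio is checked for whether it alters that set:

```python
def pareto_set(portfolios):
    """Indices of efficient portfolios when risk is minimized and return maximized."""
    eff = []
    for i, (risk1, ret1) in enumerate(portfolios):
        dominated = any(
            risk2 <= risk1 and ret2 >= ret1 and (risk2 < risk1 or ret2 > ret1)
            for j, (risk2, ret2) in enumerate(portfolios) if j != i
        )
        if not dominated:
            eff.append(i)
    return eff

# Four candidate portfolios as (risk, return) pairs -- hypothetical data
ports = [(1.0, 5.0), (2.0, 7.0), (3.0, 7.5), (2.5, 6.0)]
base = pareto_set(ports)

# Perturb portfolio 3 by eps in the Chebyshev sense (both criteria improved by eps)
eps = 0.6
perturbed = list(ports)
perturbed[3] = (ports[3][0] - eps, ports[3][1] + eps)
changed = pareto_set(perturbed) != base
```

With eps = 0.6 the previously dominated portfolio becomes efficient, so the efficient set changes; with eps = 0.4 it would remain dominated. The largest perturbation size for which no such change is possible is, in spirit, the stability radius.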

Relevance: 20.00%

Abstract:

This thesis considers optimization problems arising in printed circuit board (PCB) assembly. In particular, the case in which the electronic components of a single circuit board are placed using a single placement machine is studied. Although there is a large number of different placement machines, collect-and-place gantry machines are discussed because of their flexibility and increasing popularity in the industry. Instead of solving the entire control optimization problem of a collect-and-place machine with a single application, the problem is divided into multiple subproblems because of its hard combinatorial nature. This dividing technique is called hierarchical decomposition. All the subproblems of the one-PCB, one-machine context are described, classified, and reviewed. The derived subproblems are then either solved with exact methods, or new heuristic algorithms are developed and applied. The exact methods include, for example, a greedy algorithm and a solution based on dynamic programming. Some of the proposed heuristics contain constructive parts, while others utilize local search or are based on frequency calculations. Comprehensive experimental tests ensure that the heuristics are applicable and feasible. A number of quality functions are proposed for evaluation and applied to the subproblems. In the experimental tests, artificially generated data from Markov models and data from real-world PCB production are used. The thesis consists of an introduction and five publications in which the developed and applied solution methods are described in full detail. For all the problems stated in this thesis, the proposed methods are efficient enough to be used in practical PCB assembly production and are readily applicable in the PCB manufacturing industry.
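As an illustration of the constructive-heuristic flavor mentioned above, here is a generic nearest-neighbor sketch for a placement-sequencing subproblem, with hypothetical board coordinates; it is not one of the thesis's actual algorithms:

```python
import math

def greedy_sequence(points, start=(0.0, 0.0)):
    """Nearest-neighbor heuristic: from the current head position,
    always place the closest remaining component next."""
    remaining = list(range(len(points)))
    pos, order = start, []
    while remaining:
        nxt = min(remaining, key=lambda i: math.dist(pos, points[i]))
        order.append(nxt)
        pos = points[nxt]
        remaining.remove(nxt)
    return order

# Hypothetical component coordinates on the board (mm)
components = [(10, 0), (0, 10), (2, 1), (11, 1)]
order = greedy_sequence(components)
```

Such a constructive solution is typically only a starting point; local search can then shorten the head's travel path further.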

Relevance: 20.00%

Abstract:

The purpose of this thesis is to find the optimal heat recovery solution for Wärtsilä's dynamic district heating power plant in the German energy markets, where the government pays subsidies to CHP plants in order to increase their share of domestic power production to 25% by 2020. Dozens of different heat recovery connections have been simulated in order to determine the most efficient ones. The purpose is also to study the feasibility of the different heat recovery connections in a dynamic district heating power plant in the German markets, taking into account the day-ahead electricity prices, district heating network temperatures, and CHP subsidies. The auxiliary cooling, dynamic operation, and cost efficiency of the power plant are also investigated.

Relevance: 20.00%

Abstract:

Biofilm formed by Staphylococcus aureus is considered an important virulence trait in the pathogenesis of infections associated with implantable medical devices. Gene expression analyses are important strategies for determining the mechanisms involved in the production and regulation of biofilm, and obtaining intact RNA preparations is the first and most critical step in these studies. In this article, we describe an optimized protocol for obtaining total RNA from sessile cells of S. aureus using the RNeasy Mini Kit. This method essentially consists of a few steps, as follows: 1) addition of acetone-ethanol to the sessile cells; 2) lysis with lysostaphin at 37°C for 10 min; 3) vigorous mixing; 4) three cycles of freezing and thawing; and 5) purification of the lysate on the RNeasy column. This simple pre-kit procedure yields high-quality total RNA from both planktonic and sessile cells of S. aureus.

Relevance: 20.00%

Abstract:

This study aimed to analyze the agreement between measurements of unloaded oxygen uptake and peak oxygen uptake based on the equations proposed by Wasserman and real measurements obtained directly with an ergospirometry system. We performed an incremental cardiopulmonary exercise test (CPET) in two groups of sedentary male subjects: one apparently healthy group (HG, n=12) and one with stable coronary artery disease (CG, n=16). The mean age was 47±4 years in the HG and 57±8 years in the CG. Both groups performed CPET on a cycle ergometer with a ramp-type protocol at an intensity calculated according to the Wasserman equation. In the HG, there was no significant difference between the measurements predicted by the formula and the real measurements obtained in CPET in the unloaded condition. However, at peak effort, a significant difference was observed between V̇O2peak(predicted) and V̇O2peak(real) (nonparametric Wilcoxon test). In the CG, there was a significant difference of 116.26 mL/min between the values predicted by the formula and the real values obtained in the unloaded condition. A significant difference was also found at peak effort, where V̇O2peak(real) was 40% lower than V̇O2peak(predicted) (nonparametric Wilcoxon test). There was no agreement between the real and predicted measurements as analyzed by Lin's coefficient or the Bland and Altman model. The Wasserman formula therefore does not appear to be appropriate for predicting the functional capacity of these volunteers, and it cannot precisely predict the increase in power in incremental CPET on a cycle ergometer.
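The Bland-Altman model referred to above assesses agreement between two measurement methods through the bias and 95% limits of agreement of their paired differences. A minimal sketch with invented V̇O2peak values (not the study's data):

```python
import statistics

def bland_altman(real, predicted):
    """Bias and 95% limits of agreement between paired measurements."""
    diffs = [r - p for r, p in zip(real, predicted)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented VO2peak values (mL/min): directly measured vs. formula-predicted
real_vals = [1800, 2100, 1650, 1900, 2000]
pred_vals = [2100, 2350, 2000, 2150, 2300]
bias, lo, hi = bland_altman(real_vals, pred_vals)
```

A systematic negative bias with limits of agreement that exclude zero, as in this invented example, is the kind of disagreement pattern the study reports between predicted and real values.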

Relevance: 20.00%

Abstract:

Lophius gastrophysus has important commercial value in Brazil, particularly in foreign trade. In this study, we describe the optimization of a Random Amplified Polymorphic DNA (RAPD) protocol for the identification of L. gastrophysus. Different conditions (annealing temperatures, MgCl2 concentrations, and DNA quantities) were tested to find reproducible and adequate profiles. Amplifications performed with primers A01, A02, and A03 generated the best RAPD profiles under the following conditions: an annealing temperature of 36°C, 25 ng of DNA, and 2.5 mM MgCl2. Exact identification of the species and origin of marine products is necessary, and RAPD could be used as an accurate, rapid tool to expose commercial fraud.