885 results for Variational principles
Abstract:
Static process simulation has traditionally been used to model complex processes for various purposes. However, using static process simulators to prepare holistic examinations aimed at improving profit-making capability requires a lot of work, because producing results requires assessing the applicability of detailed data that may be irrelevant to the objective. The data relevant to the total assessment gets buried by irrelevant data. Furthermore, the models do not include an examination of maintenance or risk management, and economic examination is often an extra property added to them, which can be performed with a spreadsheet program. A process model applicable to holistic economic examinations has been developed in this work. The model is based on the life cycle profit philosophy developed by Hagberg and Henriksson in 1996. The construction of the model has utilized life cycle assessment and life cycle costing methodologies with a view to developing, above all, a model that would be applicable to the economic examination of complete wholes and that would restrict the need for information to aspects essential to the objectives. Life cycle assessment and life cycle costing differ from each other in their modeling principles, but features of both methodologies can be used in the development of economic process modeling. Methods applicable to the modeling of complex processes can be examined from the viewpoint of life cycle methodologies, because they involve the collection and management of large bodies of information as well as the production of information for the needs of decision-makers. The results of the study show that, on the basis of the principles of life cycle modeling, a process model can be created which can be used to produce holistic efficiency examinations of the profit-making capability of a production line with fewer resources than with traditional methods. The calculations of the model are based to the maximum extent on the information system of the factory, which means that the accuracy of the results can be improved by developing the information systems so that they provide the best possible information for this kind of examination.
Abstract:
Background: Optimization methods allow designing changes in a system so that specific goals are attained. These techniques are fundamental for metabolic engineering. However, they are not directly applicable for investigating the evolution of metabolic adaptation to environmental changes. Although biological systems have evolved by natural selection and result in well-adapted systems, we can hardly expect actual metabolic processes to be at the theoretical optimum that could result from an optimization analysis. More likely, natural systems are to be found in a feasible region compatible with global physiological requirements. Results: We first present a new method for globally optimizing nonlinear models of metabolic pathways that are based on the Generalized Mass Action (GMA) representation. The optimization task is posed as a nonconvex nonlinear programming (NLP) problem that is solved by an outer-approximation algorithm. This method relies on iteratively solving reduced NLP slave subproblems and mixed-integer linear programming (MILP) master problems, which provide valid upper and lower bounds, respectively, on the global solution of the original NLP. The capabilities of this method are illustrated through its application to the anaerobic fermentation pathway in Saccharomyces cerevisiae. We next introduce a method to identify the feasible parametric regions that allow a system to meet a set of physiological constraints that can be represented in mathematical terms through algebraic equations. This technique is based on applying the outer-approximation algorithm iteratively over a reduced search space in order to identify regions that contain feasible solutions to the problem and to discard others in which no feasible solution exists. As an example, we characterize the feasible enzyme activity changes that are compatible with an appropriate adaptive response of the yeast Saccharomyces cerevisiae to heat shock. Conclusion: Our results show the utility of the suggested approach for investigating the evolution of adaptive responses to environmental changes. The proposed method can also be used in other important applications, such as the evaluation of parameter changes that are compatible with health and disease states.
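The outer-approximation scheme described above can be pictured as a simple bounding loop that alternates between the two problem levels until the gap closes. The sketch below is only a generic illustration of that loop for a minimization problem; solve_nlp_subproblem and solve_milp_master are hypothetical placeholders, not the GMA-specific routines of the paper.

```python
# Minimal sketch of an outer-approximation bounding loop (minimization form).
# solve_nlp_subproblem and solve_milp_master are hypothetical callbacks that
# stand in for the reduced NLP slave and the MILP master described above.

def outer_approximation(solve_nlp_subproblem, solve_milp_master,
                        tol=1e-4, max_iter=50):
    upper = float("inf")   # best feasible (NLP) objective found so far
    lower = -float("inf")  # best MILP relaxation bound
    cuts = []              # linearizations accumulated from NLP solutions
    best_x = None

    for _ in range(max_iter):
        # Slave: solve the reduced NLP; a feasible point yields a valid
        # upper bound and new linearization cuts for the master.
        x_nlp, obj_nlp, new_cuts = solve_nlp_subproblem(cuts)
        if obj_nlp < upper:
            upper, best_x = obj_nlp, x_nlp
        cuts.extend(new_cuts)

        # Master: MILP built from the accumulated cuts; its optimum is a
        # valid lower bound on the global solution of the original NLP.
        obj_milp = solve_milp_master(cuts)
        lower = max(lower, obj_milp)

        # Stop once upper and lower bounds agree within the tolerance.
        if upper - lower <= tol * max(1.0, abs(upper)):
            break
    return best_x, lower, upper
```

The design point is simply that the slave supplies feasible points (upper bounds) while the relaxed master supplies lower bounds, so convergence of the two certifies global optimality within the tolerance.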
Abstract:
Background: Understanding the relationship between gene expression changes, enzyme activity shifts, and the corresponding physiological adaptive response of organisms to environmental cues is crucial for explaining how cells cope with stress. For example, adaptation of yeast to heat shock involves a characteristic profile of changes in the expression levels of genes coding for enzymes of the glycolytic pathway and some of its branches. The experimental determination of changes in gene expression profiles provides a descriptive picture of the adaptive response to stress. However, it does not explain why a particular profile is selected for any given response. Results: We used mathematical models and analysis of in silico gene expression profiles (GEPs) to understand how changes in gene expression correlate with an efficient response of yeast cells to heat shock. An exhaustive set of GEPs, matched with the corresponding set of enzyme activities, was simulated and analyzed. The effectiveness of each profile in the response to heat shock was evaluated according to relevant physiological and functional criteria. The small subset of GEPs that lead to effective physiological responses after heat shock was identified as the result of the tuning of several evolutionary criteria. The experimentally observed transcriptional changes in response to heat shock belong to this set and can be explained by quantitative design principles at the physiological level that ultimately constrain changes in gene expression. Conclusion: Our theoretical approach suggests a method for understanding the combined effect of changes in the expression of multiple genes on the activity of metabolic pathways, and consequently on the adaptation of cellular metabolism to heat shock. This method identifies quantitative design principles that facilitate understanding of the response of the cell to stress.
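The exhaustive simulation and screening of gene expression profiles described above can be sketched as enumerating fold-change combinations for the relevant enzyme activities and keeping only those that satisfy the physiological criteria. The code below is illustrative only: the enzyme names, fold-change levels, and the simulate_pathway routine are hypothetical placeholders, not the model used in the study.

```python
import itertools

# Hypothetical fold-change levels tried for each enzyme activity.
FOLD_CHANGES = [0.5, 1.0, 2.0, 4.0, 8.0]
ENZYMES = ["HXT", "HXK", "PFK", "TDH", "PYK"]  # illustrative names only

def screen_profiles(simulate_pathway, criteria):
    """Return the profiles whose simulated steady state meets all criteria.

    simulate_pathway(profile) -> dict of steady-state fluxes/metabolites
    criteria: list of predicates over that dict (e.g. ATP demand met,
    metabolite pools within bounds, total protein cost below a limit).
    """
    effective = []
    # Enumerate every combination of fold changes (the "exhaustive set").
    for levels in itertools.product(FOLD_CHANGES, repeat=len(ENZYMES)):
        profile = dict(zip(ENZYMES, levels))
        state = simulate_pathway(profile)
        if all(ok(state) for ok in criteria):
            effective.append(profile)
    return effective
```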
Abstract:
Meta-analyses are considered an important pillar of evidence-based medicine. The aim of this review is to describe the main principles of a meta-analysis and to use examples from head and neck oncology to demonstrate their clinical impact and methodological interest. The major role of individual patient data is outlined, as well as the superiority of meta-analyses based on individual patient data over those based on published summary data. The major clinical breakthroughs of head and neck meta-analyses are summarized, regarding concomitant chemotherapy, altered fractionation radiotherapy, new regimens of induction chemotherapy, and the use of radioprotectants. Recent methodological developments are described, including network meta-analyses and the validation of surrogate markers. Lastly, the future of meta-analyses is discussed in the context of personalized medicine.
Abstract:
The aim of this work was to describe the current production control principle of the printed circuit board manufacturer Aspocomp Oy's Espoo plant, to identify its shortcomings, and to develop an alternative production control principle for printed circuit board manufacturing. The starting point of the alternative control principle was to adapt production control to a demanding and constantly changing business environment. The theoretical part of the work focused on different approaches to production control. The literature review presents the key contents of the various production control principles, which form the framework for a workable production control practice. The experimental part of the work focused on analyzing the production control principle of the Espoo printed circuit board plant. Based on the problem areas identified in the plant's current production control principle and on the requirements of the business environment, an alternative production control approach was developed. The goal of the alternative approach was to shorten lead times and to make production easier to manage. The alternative operating model for achieving these goals is based on bottleneck theory (the theory of constraints); the key change compared with the current operating model was the stocking of semi-finished products in order to shorten delivery times and to dampen the effects of fluctuations in production volume. The experimental part of the work revealed that changes in demand and a lack of capacity planning caused problems in the production control of the printed circuit board plant.
Abstract:
Oxygen vacancies in metal oxides are known to determine their chemistry and physics. The properties of neutral oxygen vacancies in metal oxides of increasing complexity (MgO, CaO, alpha-Al2O3, and ZnO) have been studied using density functional theory. Vacancy formation energies, vacancy-vacancy interaction, and the barriers for vacancy migration are determined and rationalized in terms of the ionicity, the Madelung potential, and lattice relaxation. It is found that the Madelung potential controls the oxygen vacancy properties of highly ionic oxides whereas a more complex picture arises for covalent ZnO.
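For reference, the formation energy of a neutral oxygen vacancy discussed above is conventionally obtained from the total energies of the defective and perfect supercells together with an oxygen reservoir; one common convention (an assumption here, since the abstract does not state the reference state used) takes half the energy of the O2 molecule:

\[
E_{\mathrm{f}}(V_{\mathrm{O}}) = E_{\mathrm{tot}}(\text{defective}) + \tfrac{1}{2}\,E_{\mathrm{tot}}(\mathrm{O}_2) - E_{\mathrm{tot}}(\text{perfect})
\]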
Abstract:
The MgO/Ag(001) interface has been studied with density functional theory applied to slabs. We have found that regular MgO films show only a small adhesion to the silver substrate; the binding can be increased in off-stoichiometric regimes, either by the presence of O vacancies in the oxide film or by a small excess of O atoms at the interface between the ceramic and the metal. By means of theoretical methods, the scanning tunneling microscopy signatures of these films are also analyzed in some detail. For defect-free deposits of 1 or 2 ML and at low voltages, tunnelling takes place from the surface of the Ag substrate, while at large positive voltages Mg atoms are imaged. If defects (oxygen vacancies) are present on the surface of the oxide, they introduce much easier channels for tunnelling, resulting in large protrusions and controlling the shape of the image; the extra O stored at the interface can also be detected for very thin films.
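The adhesion mentioned above is usually quantified as a work of separation per unit interface area; the following standard definition is given only as an assumed convention, not as a quantity quoted from the study:

\[
W_{\mathrm{adh}} = \frac{E_{\mathrm{slab}}(\mathrm{MgO}) + E_{\mathrm{slab}}(\mathrm{Ag}) - E(\mathrm{MgO/Ag})}{A}
\]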
Abstract:
In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been used successfully in musculoskeletal imaging with applications ranging from detection, characterization, and quantification of crystal and iron deposits; to simulation of noncalcium (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images of varying kiloelectron volts, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT including radiation dose considerations. The second part focuses on applications of DECT to musculoskeletal imaging including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, as well as the detection of hemosiderin and metal particles.
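The monoenergetic-image postprocessing mentioned above amounts to recombining the basis-material maps with the materials' attenuation coefficients at the chosen energy. The sketch below assumes a two-material decomposition and tabulated mass attenuation coefficients supplied by the caller; it is a schematic of the general technique, not any vendor's implementation.

```python
import numpy as np

def virtual_monoenergetic_image(basis_maps, mass_atten_at_E):
    """Synthesize a virtual monoenergetic image from a two-material
    decomposition (a schematic of the postprocessing described above).

    basis_maps      : two arrays of basis-material density maps [g/cm^3]
    mass_atten_at_E : two mass attenuation coefficients [cm^2/g] for the
                      chosen keV (assumed to come from tabulated data such
                      as NIST, not computed here).
    Returns a map of linear attenuation coefficients [1/cm] at that energy.
    """
    mu = np.zeros_like(basis_maps[0], dtype=float)
    for density_map, mass_atten in zip(basis_maps, mass_atten_at_E):
        # Linear combination of basis maps weighted by their attenuation.
        mu += density_map * mass_atten
    return mu
```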
Abstract:
In geriatrics, driving cessation is addressed within the biopsychosocial model. This has broadened the scope of practitioners, not only in terms of assessing fitness to drive, but also by helping to maintain social engagements and provide support for transport transition. Causes can be addressed at different levels by adapting medication, improving physical health, modifying behaviour, adapting lifestyle, or bringing changes to the environment. This transdisciplinary approach requires an understanding of how different disciplines are linked to each other. This article reviews the philosophical principles of causality between fields and provides a framework for understanding causality within the biopsychosocial model. Understanding interlevel constraints should help practitioners overcome their differences, and favor transversal approaches to driving cessation.
Abstract:
The Extended Kalman Filter (EKF) and the four-dimensional variational assimilation method (4D-Var) are both advanced data assimilation methods. The EKF is impractical for large-scale problems, and 4D-Var requires considerable effort to build the adjoint model. In this work we have formulated a data assimilation method that tackles both of these difficulties, hereafter called the Variational Ensemble Kalman Filter (VEnKF). The method has been tested with the Lorenz95 model. Data were simulated from the solution of the Lorenz95 equations with normally distributed noise. Two experiments were conducted, the first with full observations and the second with partial observations. In each experiment we assimilate data with three-hour and six-hour time windows. Different ensemble sizes were tested to examine the method. There is no strong difference between the results for the two time windows in either experiment. Experiment I gave similar results for all ensemble sizes tested, and a small ensemble was enough to produce good results, while in Experiment II larger ensembles produced better results. Computational speed is not as good as we would like; using the limited-memory BFGS method instead of the current BFGS method might improve this. The method has proven successful: even though it is unable to match the quality of the EKF analyses, it attains significant skill in forecasts starting from the analyses it produces. It has two advantages over the EKF: VEnKF does not require an adjoint model, and it can easily be parallelized.
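To make the experimental setup concrete, the sketch below implements the Lorenz95 model and a generic perturbed-observation ensemble Kalman analysis step. It is not the VEnKF formulation of this work, which replaces the explicit Kalman update with a variational (BFGS) minimization; the generic update is shown only as a reference point.

```python
import numpy as np

def lorenz95(x, F=8.0):
    """Tendency dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F (cyclic)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    """One fourth-order Runge-Kutta step of the Lorenz95 model."""
    k1 = lorenz95(x, F)
    k2 = lorenz95(x + 0.5 * dt * k1, F)
    k3 = lorenz95(x + 0.5 * dt * k2, F)
    k4 = lorenz95(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def enkf_analysis(ensemble, y_obs, H, obs_var, rng):
    """Perturbed-observation ensemble Kalman analysis (generic, not VEnKF).

    ensemble : (n_state, n_ens) forecast ensemble
    y_obs    : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    """
    n_ens = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)               # obs-space anomalies
    R = obs_var * np.eye(H.shape[0])
    P_yy = HA @ HA.T / (n_ens - 1) + R
    P_xy = A @ HA.T / (n_ens - 1)
    K = P_xy @ np.linalg.solve(P_yy, np.eye(H.shape[0]))   # Kalman gain
    # Perturb observations so the analysis ensemble keeps the right spread.
    Y = y_obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), size=HX.shape)
    return ensemble + K @ (Y - HX)
```

A full experiment would integrate each ensemble member with rk4_step over the assimilation window, apply the analysis step whenever observations arrive, and compare forecast errors between the full- and partial-observation cases.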