905 results for Stochastic simulation methods


Relevance: 90.00%

Abstract:

In this paper, I present a number of leading examples from the empirical literature that use simulation-based estimation methods. For each example, I describe the model, why simulation is needed, and how to simulate the relevant object. There is a section on simulation methods and another on simulation-based estimation methods. The paper concludes by considering the significance of each of the examples discussed and commenting on potential future areas of interest.
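
As a concrete illustration of the kind of estimator the paper surveys, here is a minimal sketch of the simulated method of moments (SMM) on an invented censored-data toy model; the model, the chosen moments, and the grid search are assumptions for illustration, not taken from the paper.

```python
# Minimal SMM sketch: censoring makes analytic moments of the toy model
# awkward, so we match simulated moments to data moments instead.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, shocks):
    # Tobit-style censoring: y = max(0, theta + shock).
    return np.maximum(0.0, theta + shocks)

# "Observed" data generated at the true parameter value 1.0.
data = simulate(1.0, rng.standard_normal(5_000))
data_moments = np.array([data.mean(), (data == 0).mean()])

# Fix the simulation shocks across candidate parameters so the
# objective is smooth in theta (standard practice in SMM).
shocks = rng.standard_normal(50_000)

def smm_objective(theta):
    sim = simulate(theta, shocks)
    sim_moments = np.array([sim.mean(), (sim == 0).mean()])
    diff = sim_moments - data_moments
    return diff @ diff  # identity weighting matrix for simplicity

grid = np.linspace(0.0, 2.0, 201)
theta_hat = grid[np.argmin([smm_objective(t) for t in grid])]
print(f"SMM estimate: {theta_hat:.2f}")  # should land near 1.0
```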

Relevance: 90.00%

Abstract:

In this paper the issue of finding uncertainty intervals for queries in a Bayesian Network is reconsidered. The investigation focuses on Bayesian Nets with discrete nodes and finite populations. An earlier asymptotic approach is compared with a simulation-based approach, together with further alternatives: one based on a single sample from the Bayesian Net at a particular finite population size, and another which uses expected population sizes together with exact probabilities. We conclude that a query of a Bayesian Net should be expressed as a probability embedded in an uncertainty interval. Based on an investigation of two Bayesian Net structures, the preferred method is the simulation method. However, both the single-sample method and the expected-sample-size method may be useful and are simpler to compute. Any of these methods is more useful than none when assessing a Bayesian Net under development, or when drawing conclusions from an ‘expert’ system.
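
The simulation method the abstract prefers can be sketched in a few lines: for a toy two-node net (A -> B), sample many finite populations, evaluate the query in each, and report a percentile interval around the exact probability. The structure and probabilities below are invented for illustration.

```python
# Simulation-based uncertainty interval for a Bayesian Net query.
import numpy as np

rng = np.random.default_rng(1)
p_a = 0.3                   # P(A=1)
p_b = {0: 0.2, 1: 0.7}      # P(B=1 | A=a)

def sample_population(n):
    a = rng.random(n) < p_a
    return np.where(a, rng.random(n) < p_b[1], rng.random(n) < p_b[0])

n_pop, n_reps = 200, 2_000  # finite population size, simulation repetitions
queries = np.array([sample_population(n_pop).mean() for _ in range(n_reps)])

exact = p_a * p_b[1] + (1 - p_a) * p_b[0]  # exact P(B=1)
lo, hi = np.percentile(queries, [2.5, 97.5])
print(f"P(B=1) = {exact:.3f}; for populations of {n_pop}, "
      f"95% uncertainty interval: [{lo:.3f}, {hi:.3f}]")
```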

Relevance: 90.00%

Abstract:

BACKGROUND Many koala populations around Australia are in serious decline, with a substantial component of this decline in some Southeast Queensland populations attributed to the impact of Chlamydia. A Chlamydia vaccine for koalas is in development and has shown promise in early trials. This study contributes to implementation preparedness by simulating vaccination strategies designed to reverse population decline and by identifying which age and sex category it would be most effective to target. METHODS We used field data to inform the development and parameterisation of an individual-based stochastic simulation model of a koala population endemic with Chlamydia. The model took into account transmission, morbidity and mortality caused by Chlamydia infections. We calibrated the model to characteristics of typical Southeast Queensland koala populations. As there is uncertainty about the effectiveness of the vaccine in real-world settings, a variety of potential vaccine efficacies, half-lives and dosing schedules were simulated. RESULTS Assuming other threats remain constant, it is expected that current population declines could be reversed in around 5-6 years if female koalas aged 1-2 years are targeted, average vaccine protective efficacy is 75%, and vaccine coverage is around 10% per year. At lower vaccine efficacies the immunological effects of boosting become important: at 45% vaccine efficacy population decline is predicted to reverse in 6 years under optimistic boosting assumptions but in 9 years under pessimistic boosting assumptions. Terminating a successful vaccination programme at 5 years would lead to a rise in Chlamydia prevalence towards pre-vaccination levels. CONCLUSION For a range of vaccine efficacy levels it is projected that population decline due to endemic Chlamydia can be reversed under realistic dosing schedules, potentially in just 5 years. However, a vaccination programme might need to continue indefinitely in order to maintain Chlamydia prevalence at a sufficiently low level for population growth to continue.
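
A heavily simplified individual-based sketch in the spirit of the model described above: yearly time steps, prevalence-driven transmission, extra mortality for infected animals, and vaccination of young females. Every rate below is an invented placeholder, not one of the study's calibrated parameters.

```python
# Toy individual-based stochastic simulation of a Chlamydia-endemic
# population with vaccination of females aged 1-2 years.
import numpy as np

rng = np.random.default_rng(2)

def simulate(years=15, n0=400, coverage=0.10, efficacy=0.75):
    age = rng.integers(0, 12, n0).astype(float)
    female = rng.random(n0) < 0.5
    infected = rng.random(n0) < 0.3
    vaccinated = np.zeros(n0, dtype=bool)
    sizes = [n0]
    for _ in range(years):
        n = age.size
        if n == 0:
            sizes.append(0)
            continue
        # Transmission, damped by vaccine efficacy for vaccinated animals.
        risk = 0.4 * infected.mean() * np.where(vaccinated, 1 - efficacy, 1.0)
        infected = infected | (~infected & (rng.random(n) < risk))
        # Background mortality plus a Chlamydia penalty.
        alive = rng.random(n) >= 0.10 + 0.15 * infected
        age, female, infected, vaccinated = (
            v[alive] for v in (age, female, infected, vaccinated))
        # Births from healthy adult females; newborns enter unvaccinated.
        n_born = rng.binomial((female & ~infected & (age >= 2)).sum(), 0.4)
        age = np.append(age + 1, np.zeros(n_born))
        female = np.append(female, rng.random(n_born) < 0.5)
        infected = np.append(infected, np.zeros(n_born, dtype=bool))
        vaccinated = np.append(vaccinated, np.zeros(n_born, dtype=bool))
        # Vaccinate a fraction of unvaccinated females aged 1-2 years.
        target = np.flatnonzero(female & (age >= 1) & (age <= 2) & ~vaccinated)
        vaccinated[target] = rng.random(target.size) < coverage
        sizes.append(age.size)
    return sizes

print("no vaccination:   ", simulate(coverage=0.0)[-1], "koalas after 15 years")
print("10%/year coverage:", simulate(coverage=0.10)[-1], "koalas after 15 years")
```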

Relevance: 90.00%

Abstract:

Part I (Manjunath et al., 1994, Chem. Engng Sci. 49, 1451-1463) of this paper showed that the random particle numbers and size distributions in precipitation processes in very small drops obtained by stochastic simulation techniques deviate substantially from the predictions of conventional population balance. The foregoing problem is considered in this paper in terms of a mean field approximation obtained by applying a first-order closure to an unclosed set of mean field equations presented in Part I. The mean field approximation consists of two mutually coupled partial differential equations featuring (i) the probability distribution for residual supersaturation and (ii) the mean number density of particles for each size and supersaturation, from which all average properties and fluctuations can be calculated. The mean field equations have been solved by finite difference methods for (i) crystallization and (ii) precipitation of a metal hydroxide, both occurring in a single drop of specified initial supersaturation. The results for the average number of particles, average residual supersaturation, the average size distribution, and fluctuations about the average values have been compared with those obtained by stochastic simulation techniques and by population balance. This comparison shows that the mean field predictions are substantially superior to those of population balance, as judged by the close proximity of the former to the stochastic simulation results. The agreement is excellent for broad initial supersaturation distributions at short times but deteriorates progressively at longer times. For steep initial supersaturation distributions, the predictions of the mean field theory are not satisfactory, thus calling for higher-order approximations. The merit of the mean field approximation over stochastic simulation lies in its potential to reduce the expensive computation times involved in simulation. More effective computational techniques could not only enhance this advantage of the mean field approximation but also make it possible to use higher-order approximations, eliminating the constraints under which the stochastic dynamics of the process can be predicted accurately.

Relevance: 90.00%

Abstract:

Background: Coronary tortuosity (CT) is a common coronary angiographic finding. Whether CT leads to an apparent reduction in coronary pressure distal to the tortuous segment of the coronary artery is still unknown. The purpose of this study is to determine the impact of CT on coronary pressure distribution by numerical simulation. Methods: 21 idealized models were created to investigate the influence of coronary tortuosity angle (CTA) and coronary tortuosity number (CTN) on coronary pressure distribution. A 2D incompressible Newtonian flow was assumed and the computational simulation was performed using the finite volume method. CTAs of 30°, 60°, 90°, and 120° and CTNs of 0, 1, 2, 3, 4, and 5 were examined under both steady and pulsatile conditions, and the changes of outlet pressure and inlet velocity during the cardiac cycle were taken into account. Results: Coronary pressure distribution was affected by both CTA and CTN. We found that the pressure drop between the start and the end of the CT segment decreased with increasing CTA, as did the length of the CT segment, while an increase in CTN resulted in an increase in the pressure drop. Conclusions: Compared to no CT, CT can result in a greater decrease in coronary blood pressure, depending on the severity of tortuosity, and severe CT may cause myocardial ischemia.

Relevance: 90.00%

Abstract:

Summary. Interim analysis is important in a large clinical trial for ethical and cost considerations. Sometimes, an interim analysis needs to be performed at an earlier than planned time point. In that case, methods using stochastic curtailment are useful for examining the data for early stopping while controlling the inflation of type I and type II errors. We consider a three-arm randomized study of treatments to reduce perioperative blood loss following major surgery. Owing to slow accrual, an unplanned interim analysis was required by the study team to determine whether the study should be continued. We distinguish two different cases: when all treatments are under direct comparison and when one of the treatments is a control. We used simulations to study the operating characteristics of five different stochastic curtailment methods. We also considered the influence of the timing of the interim analyses on the type I error and power of the test. We found that type I error and power can differ considerably between the methods. The analysis for the perioperative blood loss trial was carried out at approximately a quarter of the planned sample size. We found little evidence that the active treatments are better than placebo and recommended closure of the trial.
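
One common stochastic-curtailment calculation, conditional power, can be approximated by simulation: given the interim data, repeatedly complete the trial under the design alternative and count how often the final test would reject. The two-arm normal setting, effect sizes, and futility threshold below are illustrative assumptions, not the paper's five methods.

```python
# Simulated conditional power at an unplanned interim look.
import numpy as np

rng = np.random.default_rng(3)
n_total, n_interim = 200, 50            # patients per arm
delta_alt, sigma, z_crit = 0.5, 1.0, 1.96

# Interim data, here simulated with a weaker true effect than designed for.
trt_i = rng.normal(0.3, sigma, n_interim)
ctl_i = rng.normal(0.0, sigma, n_interim)

def conditional_power(n_sims=5_000):
    n_rem = n_total - n_interim
    rejections = 0
    for _ in range(n_sims):
        # Complete the trial assuming the design alternative holds.
        trt = np.concatenate([trt_i, rng.normal(delta_alt, sigma, n_rem)])
        ctl = np.concatenate([ctl_i, rng.normal(0.0, sigma, n_rem)])
        se = np.sqrt(trt.var(ddof=1) / n_total + ctl.var(ddof=1) / n_total)
        rejections += (trt.mean() - ctl.mean()) / se > z_crit
    return rejections / n_sims

cp = conditional_power()
print(f"simulated conditional power: {cp:.2f}")
if cp < 0.2:  # illustrative futility threshold; its choice drives the errors
    print("curtailment rule would suggest stopping for futility")
```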

Relevance: 90.00%

Abstract:

A forest simulator is a computerized model for predicting forest growth and future development as well as the effects of forest harvests and treatments. A forest planning system is a decision support tool, usually including a forest simulator and an optimisation model, for finding the optimal forest management actions. The information produced by forest simulators and forest planning systems is used for various analytical purposes and in support of decision making. However, the quality and reliability of this information can often be questioned. Natural variation in forest growth and estimation errors in forest inventory, among other things, cause uncertainty in predictions of forest growth and development. This uncertainty, stemming from different sources, has various undesirable effects; in many cases the outcomes of decisions based on uncertain information differ from what was intended. The objective of this thesis was to study various sources of uncertainty and their effects in forest simulators and forest planning systems. The study focused on three notable sources of uncertainty: errors in forest growth predictions, errors in forest inventory data, and stochastic fluctuation of timber assortment prices. The effects of uncertainty were studied using two types of forest growth models, individual-tree-level models and stand-level models, and various error simulation methods. A new method for simulating more realistic forest inventory errors was introduced and tested. In addition, the three sources of uncertainty were combined and their joint effects on stand-level net present value estimates were simulated. According to the results, the various sources of uncertainty can have distinct effects in different forest growth simulators. The new forest inventory error simulation method proved to produce more realistic errors, and the analysis of the joint effects of the various sources of uncertainty provided new insight into uncertainty in forest simulators.
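
A minimal Monte Carlo sketch of the error-propagation idea: perturb inventory inputs with correlated errors, push them through a growth model, and inspect the spread of the prediction. The toy growth model and error magnitudes are invented, not the thesis's models.

```python
# Propagating correlated forest inventory errors through a toy growth model.
import numpy as np

rng = np.random.default_rng(4)

def growth_model(volume, age, years=10):
    # Toy stand-level model: relative growth slows with stand age.
    return volume * (1.0 + 0.05 * np.exp(-age / 40.0)) ** years

true_volume, true_age = 180.0, 50.0       # m3/ha and years for one stand
n_sims = 10_000

# Correlated inventory errors: stands whose volume is overestimated tend
# to have their age misjudged too; independent errors would miss this.
sd_vol, sd_age, corr = 15.0, 5.0, 0.5
cov = np.array([[sd_vol**2, corr * sd_vol * sd_age],
                [corr * sd_vol * sd_age, sd_age**2]])
err = rng.multivariate_normal([0.0, 0.0], cov, size=n_sims)
predictions = growth_model(true_volume + err[:, 0], true_age + err[:, 1])

print(f"error-free prediction: {growth_model(true_volume, true_age):.0f} m3/ha")
print(f"simulated 90% interval: [{np.percentile(predictions, 5):.0f}, "
      f"{np.percentile(predictions, 95):.0f}] m3/ha")
```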

Relevance: 90.00%

Abstract:

Precipitation in small droplets involving emulsions, microemulsions or vesicles is important for producing multicomponent ceramics and nanoparticles. Because of the random nature of nucleation and the small number of particles in a droplet, the use of a deterministic population balance equation for predicting the number density of particles may lead to erroneous results, even for evaluating the mean behavior of such systems. A comparison between the predictions made through stochastic simulation and deterministic population balance involving small droplets has been made for two simple systems, one involving crystallization and the other a single-component precipitation. The two approaches have been found to yield quite different results under a variety of conditions. Contrary to expectation, the smallness of the population alone does not cause these deviations. Thus, if fluctuation in supersaturation is negligible, the population balance and simulation predictions concur. However, for large fluctuations in supersaturation, the predictions differ significantly, indicating the need to take the stochastic nature of the phenomenon into account. This paper describes the stochastic treatment of populations, which involves a sequence of so-called product density equations and forms an appropriate framework for handling small systems.
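
The contrast the abstract draws can be reproduced on a toy model: a Gillespie-style stochastic simulation of nucleation in a single droplet against the deterministic mean-field (population-balance-like) rate equation. All constants are invented for illustration.

```python
# Stochastic vs deterministic particle counts for nucleation in a droplet.
import numpy as np

rng = np.random.default_rng(5)
k, m, S0, t_end = 1e-5, 20, 200, 50.0   # rate const, units per nucleus,
                                        # initial supersaturation, horizon

def gillespie_droplet():
    # One droplet: nucleation at rate k*S*(S-1); each event consumes m units.
    t, S, particles = 0.0, S0, 0
    while S >= m:
        rate = k * S * (S - 1)
        t += rng.exponential(1.0 / rate)
        if t > t_end:
            break
        S -= m
        particles += 1
    return particles

counts = np.array([gillespie_droplet() for _ in range(5_000)])
print(f"stochastic: mean {counts.mean():.2f} particles, std {counts.std():.2f}")

# Deterministic mean-field analogue: dS/dt = -m*k*S^2, dn/dt = k*S^2,
# integrated with explicit Euler steps.
S, n_det, dt = float(S0), 0.0, 0.001
for _ in range(int(t_end / dt)):
    rate = k * S * S
    S -= m * rate * dt
    n_det += rate * dt
print(f"deterministic particle count: {n_det:.2f}")
```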

Relevance: 90.00%

Abstract:

Proofreading/editing in protein synthesis is essential for accurate translation of information from the genetic code. In this article we present a theoretical investigation of the efficiency of a kinetic proofreading mechanism that employs hydrolysis of the wrong substrate as the discriminatory step in enzyme catalytic reactions. We consider aminoacylation of tRNA(Ile), which is a crucial step in protein synthesis and for which experimental results are now available. We present an augmented kinetic scheme and then employ the stochastic simulation algorithm to obtain time-dependent concentrations of the different substances involved in the reaction and their rates of formation. We obtain the rates of product formation and ATP hydrolysis for both correct and wrong substrates (isoleucine and valine in our case, respectively), in single-enzyme as well as ensemble enzyme kinetics. The present theoretical scheme correctly reproduces (i) the amplitude of the discrimination factor in the overall rates between isoleucine and valine, which is obtained as (1.8 × 10^2) × (4.33 × 10^2) ≈ 7.8 × 10^4, and (ii) the rates of ATP hydrolysis for both Ile and Val at different substrate concentrations in the aminoacylation of tRNA(Ile). The present study shows a non-Michaelis-type dependence of the rate of reaction on tRNA(Ile) concentration in the case of valine. The overall editing in steady state is found to be independent of amino acid concentration. Interestingly, the computed ATP hydrolysis rate for valine at high substrate concentration is the same as the rate of formation of Ile-tRNA(Ile), whereas at intermediate substrate concentration the ATP hydrolysis rate is relatively low. We find that the presence of the additional editing domain in the class I editing enzyme makes the kinetic proofreading more efficient through enhanced hydrolysis of the wrong product at the editing CP1 domain.
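
A minimal sketch of the stochastic simulation algorithm (Gillespie) applied to a proofreading-like scheme: the enzyme-substrate complex either dissociates, is hydrolyzed (the editing step, consuming ATP and rejecting the substrate), or goes on to product. The rate constants are invented to illustrate the algorithm, not to reproduce the paper's kinetic scheme or numbers.

```python
# Gillespie SSA for a toy editing/proofreading reaction network.
import numpy as np

rng = np.random.default_rng(6)

def ssa(k_off, k_edit, k_on=0.01, k_cat=1.0, t_end=200.0):
    E, S, ES, P, hydrolyses = 10, 500, 0, 0, 0
    t = 0.0
    while t < t_end:
        rates = np.array([k_on * E * S,  # E + S -> ES
                          k_off * ES,    # ES -> E + S
                          k_edit * ES,   # ES -> E (S hydrolyzed, rejected)
                          k_cat * ES])   # ES -> E + P
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)
        r = rng.choice(4, p=rates / total)
        if r == 0:
            E, S, ES = E - 1, S - 1, ES + 1
        elif r == 1:
            E, S, ES = E + 1, S + 1, ES - 1
        elif r == 2:
            E, ES, hydrolyses = E + 1, ES - 1, hydrolyses + 1
        else:
            E, ES, P = E + 1, ES - 1, P + 1
    return P, hydrolyses

# Correct substrate: rarely edited; wrong substrate: aggressively edited.
p_c, h_c = ssa(k_off=0.1, k_edit=0.01)
p_w, h_w = ssa(k_off=5.0, k_edit=2.0)
print(f"correct: {p_c} products, {h_c} ATP hydrolyses")
print(f"wrong:   {p_w} products, {h_w} ATP hydrolyses")
```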

Relevance: 90.00%

Abstract:

Motivated by several recent experimental observations that vitamin-D can interact with antigen-presenting cells (APCs) and T-lymphocyte cells (T-cells) to promote and regulate different stages of the immune response, we developed a coarse-grained but general kinetic model to capture the role of vitamin-D in immunomodulatory responses. Our kinetic model, developed using the ideas of chemical network theory, leads to a system of nine coupled equations that we solve both by direct and by stochastic (Gillespie) methods. Both analyses consistently provide detailed information on the dependence of the immune response on critical rate parameters. We find that although vitamin-D plays a negligible role in the initial immune response, it exerts a profound influence in the long term, especially in helping the system to achieve a new, stable steady state. The study explores the role of vitamin-D in preserving an observed bistability in the phase diagram (spanned by system parameters) of immune regulation, thus allowing the response to tolerate a wide range of pathogenic stimulation, which could help in resisting autoimmune diseases. We also study how vitamin-D affects the time-dependent population of dendritic cells that connect innate and adaptive immune responses. Variations in the dose-dependent response of anti-inflammatory and pro-inflammatory T-cell populations to vitamin-D correlate well with recent experimental results. Our kinetic model allows for an estimation of the optimum range of vitamin-D required for smooth functioning of the immune system and for control of both hyper-regulation and inflammation. Most importantly, the present study reveals that an overdose or toxic level of vitamin-D, or of any steroid analogue, could give rise to too large a tolerant response, leading to an ineffective adaptive immune function.
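
A toy two-variable caricature of the kind of coupled rate equations such a model comprises: an effector population activated by pathogen load and a regulatory population whose induction scales with a vitamin-D-like parameter d, integrated with fixed-step RK4. All functional forms and constants are invented for illustration; the paper's model has nine equations.

```python
# Direct (deterministic) integration of a toy effector/regulator network.
import numpy as np

def rhs(state, d, pathogen=1.0):
    x, y = state
    dx = pathogen * x / (1 + y) - 0.5 * x + 0.05  # activation curbed by y
    dy = d * x - 0.3 * y                          # regulation induced via d
    return np.array([dx, dy])

def integrate(d, t_end=100.0, dt=0.01):
    s = np.array([0.1, 0.1])
    for _ in range(int(t_end / dt)):              # classic RK4 steps
        k1 = rhs(s, d)
        k2 = rhs(s + 0.5 * dt * k1, d)
        k3 = rhs(s + 0.5 * dt * k2, d)
        k4 = rhs(s + dt * k3, d)
        s = s + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

for d in (0.05, 0.5, 5.0):  # low, moderate, excessive "vitamin-D"
    x, y = integrate(d)
    print(f"d={d:>4}: effector {x:.3f}, regulator {y:.3f}")
```

In this caricature a low d leaves the effector arm unchecked, a moderate d balances the two arms, and a very large d drives the effector population toward zero, echoing the over-tolerant regime the abstract warns about.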

Relevance: 90.00%

Abstract:

Monte Carlo simulation methods involving splitting of Markov chains have been used to evaluate multi-fold integrals in different application areas. In this paper we examine the performance of these methods in the context of the evaluation of reliability integrals, from the point of view of characterizing the sampling fluctuations. The methods discussed include the Au-Beck subset simulation, the Holmes-Diaconis-Ross method, and the generalized splitting algorithm. A few improvisations based on the first-order reliability method are suggested to select the algorithmic parameters of the latter two methods. The bias and sampling variance of the alternative estimators are discussed, and an approximation to the sampling distribution of some of these estimators is obtained. Illustrative examples involving component and series system reliability analyses are presented to bring out the relative merits of the alternative methods.
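
A compact sketch of subset simulation in the Au-Beck style for a toy rare-event problem (failure when the mean of d standard normals exceeds a threshold), with each conditional level resampled by a component-wise Metropolis kernel. Sample sizes and the proposal scale are illustrative tuning choices.

```python
# Subset simulation estimate of a small failure probability.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(7)
d, thresh = 10, 1.2                    # failure event: mean(x) > thresh

def g(x):
    return x.mean(axis=-1)             # performance function

def subset_simulation(n=2_000, p0=0.1, max_levels=12):
    x = rng.standard_normal((n, d))
    prob = 1.0
    for _ in range(max_levels):
        y = g(x)
        level = np.quantile(y, 1 - p0)
        if level >= thresh:            # failure region reached
            return prob * (y > thresh).mean()
        prob *= p0
        seeds = x[y > level]
        # Regenerate n samples conditional on g > level with a
        # component-wise Metropolis kernel targeting the standard normal.
        cur = seeds.copy()
        batches = [seeds.copy()]
        while sum(b.shape[0] for b in batches) < n:
            prop = cur.copy()
            j = rng.integers(d)
            cand = prop[:, j] + 0.8 * rng.standard_normal(cur.shape[0])
            ratio = np.exp(0.5 * (prop[:, j] ** 2 - cand ** 2))
            accept = rng.random(cur.shape[0]) < ratio
            prop[accept, j] = cand[accept]
            ok = g(prop) > level       # reject moves leaving the subset
            cur = np.where(ok[:, None], prop, cur)
            batches.append(cur.copy())
        x = np.vstack(batches)[:n]
    return prob * (g(x) > thresh).mean()

p_hat = subset_simulation()
p_exact = 0.5 * erfc(thresh * sqrt(d) / sqrt(2.0))  # mean ~ N(0, 1/d)
print(f"subset simulation: {p_hat:.2e}   exact: {p_exact:.2e}")
```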

Relevance: 90.00%

Abstract:

Separating the dynamics of variables that evolve on different timescales is a common assumption in exploring complex systems, and a great deal of progress has been made in understanding chemical systems by treating the fast processes of an activated chemical species independently from the slower processes that precede activation. Protein motion underlies all biocatalytic reactions, and understanding the nature of this motion is central to understanding how enzymes catalyze reactions with such specificity and such rate enhancement. This understanding is challenged by evidence of breakdowns in the separability of the timescales of dynamics in the active site from motions of the solvating protein. Quantum simulation methods that bridge these timescales by simultaneously evolving quantum and classical degrees of freedom provide an important means of exploring this breakdown. In the following dissertation, three problems of enzyme catalysis are explored through quantum simulation.

Relevance: 90.00%

Abstract:

Several relevant industrial applications involve adsorption processes, examples being product purification, separation of substances, and pollution and humidity control, among others. The growing interest in biomolecule purification processes is mainly due to the development of biotechnology and to the demand of the pharmaceutical and chemical industries for products of high purity. The simulated moving bed (SMB) is a continuous chromatographic process that simulates the movement of the adsorbent bed, countercurrent to the movement of the liquid, through the periodic switching of the positions of the inlet and outlet streams; it is operated continuously, without loss of purity in the outlet streams. These consist of the extract, rich in the more strongly adsorbed component, and the raffinate, rich in the more weakly adsorbed component, the process being particularly suited to binary separations. The objective of this thesis is to study and evaluate different approaches that use stochastic optimisation methods for the inverse problem of the phenomena involved in the SMB separation process. Discrete models with different mass-transfer treatments were used, which offer the advantage of representing a large number of theoretical plates in a column of moderate length; in this process the separation improves as the solutes flow through the bed, that is, with the number of times the molecules exchange between the mobile and stationary phases, thereby approaching equilibrium. The modelling and simulation carried out under these approaches allowed the main characteristics of an SMB separation unit to be evaluated and identified. The application under study concerns the simulation of separation processes for baclofen and ketamine. These compounds were chosen because they are well characterised in the literature, with adsorption kinetics and equilibrium studies available among the experimental results. With these experimental results in hand, the behaviour of the direct and inverse problems of an SMB separation unit was evaluated, comparing the computed results with the experimental ones, always on the basis of criteria of separation efficiency between the mobile and stationary phases. The methods studied were the GA (Genetic Algorithm) and the PCA (Particle Collision Algorithm), and a hybridisation of the GA and the PCA was also developed. As a result of this thesis, the optimisation methods were analysed and compared with respect to different aspects of the kinetic mechanism of mass transfer by adsorption and desorption involving the solid phase of the adsorbent.
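
A minimal genetic-algorithm sketch for the kind of inverse problem treated in the thesis: recover a mass-transfer rate constant by matching a toy breakthrough curve to synthetic data. The model, bounds, and GA settings are invented, and the PCA and hybrid GA-PCA variants are omitted for brevity.

```python
# GA for an inverse (parameter-estimation) problem on a toy model.
import numpy as np

rng = np.random.default_rng(8)
t = np.linspace(0, 10, 50)

def model(k):
    # Toy linear-driving-force breakthrough curve, not an SMB column model.
    return 1.0 - np.exp(-k * t)

data = model(0.7) + rng.normal(0, 0.01, t.size)   # synthetic "experiment"

def fitness(k):
    return -np.sum((model(k) - data) ** 2)        # higher is better

pop = rng.uniform(0.01, 5.0, 40)                  # initial population of k
for gen in range(60):
    fit = np.array([fitness(k) for k in pop])
    parents = pop[np.argsort(fit)[::-1][:20]]     # truncation selection
    # Crossover: blend random parent pairs; mutation: Gaussian jitter.
    mates = rng.permutation(parents)
    children = 0.5 * (parents + mates) + rng.normal(0, 0.05, parents.size)
    pop = np.concatenate([parents, np.clip(children, 0.01, 5.0)])

best = pop[np.argmax([fitness(k) for k in pop])]
print(f"recovered k ~ {best:.3f} (true value 0.7)")
```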

Relevance: 90.00%

Abstract:

As an important means of understanding oil and gas accumulation during petroleum exploration and development, a petroleum geological model is an integrated system of theories and methods, drawing on sedimentology, reservoir geology, structural geology, petroleum geology and other geological disciplines, used to describe or predict the distribution of oil and gas. Progressive exploration and development is commonly practised in the terrestrial sedimentary basins of China, where hydrocarbon generation, accumulation and exploitation are very intricate, so it is necessary to establish petroleum geological models adapted to the different periods of progressive exploration and development practice. At the same time, an integrated system of theories and methods for petroleum geological models suited to the different exploration and development stages is lacking, because the current models are intercrossed and each emphasizes different aspects. Given the characteristics of exploration and development of the Triassic oil and gas pools in the Lunnan area, Tarim Basin, the Lunnan horst belt was selected as the main study object of this paper. On the basis of a study of the petroleum geological model system, petroleum geological models for the different exploration and development stages are established and applied to predict the distribution of oil and gas. The main results are as follows. (1) Hydrocarbon generation-accumulation and exploration-development are treated as one integrated system evolving through time, so petroleum exploration and development are closely combined. The writer holds that any kind of petroleum geological model can be used to predict and guide exploration and development practice, but that no single kind should be regarded as the sole model for guiding petroleum exploration and development. Based on the differences in the extent and detail of research work in the various stages of exploration and development, a classification system for petroleum geological models is established, which provides a theoretical basis for progressive petroleum exploration and development. (2) A petroleum geological model was established based on detailed research on the Triassic stratigraphy, structure, sedimentology and reservoir rocks in the Lunnan area, northern Tarim Basin. Sub-belts of hydrocarbon accumulation in the Lunnan area are delineated and the predominant controlling factors for oil and gas distribution in the area are identified. (3) Geological models for the Lunnan and Jiefangqudong oil fields were rebuilt by combining seismology and geology, exploration and development, and dynamic and static behavior, thus identifying the distribution of potential zones for oil and gas accumulation. Oil and gas accumulations were treated as the basic unit in progressive exploration and development, and a classification was made for the Lunnan Triassic pools. A petroleum geological model was created through fine 3D seismic interpretation and detailed description of the reservoir rocks and the distribution of oil and gas, especially for the LN3 and LN26 well zones. The possible distribution of Triassic oil traps in the Lunnan area and their efficiency have been forecast, and a quantitative analysis of original oil (water) saturation in the oil pools was performed. (4) The concept of the oil cell is proposed by the writer for the first time. It denotes the relatively oil-rich zones in an oil pool, formed by differences in fluid flow during the middle stage of reservoir development; a classification of oil cells is also given in this paper. Through physical and numerical modeling, the dominant controlling factors for the formation of the various types of oil cells are analyzed. Oil cells are considered the most important hydrocarbon potential zones after primary recovery and are the main object of progressive development adjustment and improved oil recovery. The various oil cells of the Triassic reservoir in the LN2 well area were analyzed as an example. (5) The classification of flow units and the establishment of a geological model of flow units are important and necessary; they rest on forecasts of inter-well reservoir parameters combined with statistical analysis of the reservoir character of horizontal wells. With the help of self-adaptive interpolation and stochastic simulation, the geological model of flow units was built on the basis of the division and correlation of flow units, with which the residual oil distribution in the TIII reservoir in the LN2 well area after water flooding can be established.