889 results for Stochastic processes -- Mathematical models


Relevance: 100.00%

Abstract:

An emergency is a deviation from a planned course of events that endangers people, property, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, Emergency Managers are expected to have a response plan for the most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster: specifically, where to preposition State Staging Areas (SSAs), which Points of Distribution (PODs) to activate, and the allocation of commodities to each POD. Previous research has addressed several of these issues, but not with the incorporation of the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes and models them as discrete Markov chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input that is determined using Hazards United States (HAZUS), software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic has been developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA), in which the initial temperature and the cooling rate were determined via a Design of Experiments. The experiment showed that the initial temperature (T0) is irrelevant, but that the temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov chains performed as well as or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner. Finally, an illustrative example shows that the meta-model is practical.
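
As a minimal, illustrative sketch of the idea of treating hurricane intensity as a discrete Markov chain (the transition probabilities below are invented for demonstration, not the ones estimated in the research):

```python
# Hurricane intensity (Saffir-Simpson category) as a discrete Markov chain.
# P[i, j] = probability of moving from category i+1 to category j+1 over one
# forecast period; the values are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

P = np.array([
    [0.70, 0.20, 0.07, 0.02, 0.01],
    [0.25, 0.50, 0.18, 0.05, 0.02],
    [0.05, 0.25, 0.50, 0.15, 0.05],
    [0.02, 0.08, 0.30, 0.45, 0.15],
    [0.01, 0.04, 0.15, 0.35, 0.45],
])

def simulate_intensity(start_state, n_steps):
    """Sample one intensity trajectory from the Markov chain."""
    path = [start_state]
    for _ in range(n_steps):
        path.append(rng.choice(5, p=P[path[-1]]))
    return path

# One realisation starting from a category-3 storm (state index 2),
# e.g. five 24-hour forecast periods before landfall.
print(simulate_intensity(start_state=2, n_steps=5))
```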

Relevance: 100.00%

Abstract:

Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenging problems in this field is the reduction of the huge computing times in stochastic simulations. Based on the system of the mitogen-activated protein kinase cascade that is activated by epidermal growth factor, this work gives a parallel implementation using OpenMP and parallelism across the simulation. Special attention is paid to the independence of the random numbers generated in parallel computing, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
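
The key computational point, running many stochastic simulation replicates in parallel with statistically independent random-number streams, can be sketched as follows. The original work uses OpenMP; this Python sketch uses multiprocessing and NumPy's SeedSequence for stream independence, and a toy birth-death process stands in for the MAPK cascade:

```python
# Parallelism across the simulation with one independent random stream per
# replicate. The birth-death model is a stand-in, not the EGF/MAPK system.
import numpy as np
from multiprocessing import Pool

def ssa_birth_death(seed_seq, k_birth=10.0, k_death=0.1, x0=0, t_end=50.0):
    """One Gillespie (SSA) trajectory of a birth-death process."""
    rng = np.random.default_rng(seed_seq)
    t, x = 0.0, x0
    while True:
        rates = np.array([k_birth, k_death * x])
        total = rates.sum()
        t += rng.exponential(1.0 / total)
        if t > t_end:
            return x
        x += 1 if rng.random() < rates[0] / total else -1

if __name__ == "__main__":
    n_runs = 1000
    # Spawn one statistically independent child seed per replicate.
    seeds = np.random.SeedSequence(2024).spawn(n_runs)
    with Pool() as pool:
        finals = pool.map(ssa_birth_death, seeds)
    print(np.mean(finals), np.var(finals))  # both ~ k_birth / k_death = 100
```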

Relevance: 100.00%

Abstract:

This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and jump telegraph processes. The study presented in this first part includes the computation of the distributions of each process, their means and variances, and their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump telegraph processes. This part describes how to compute the risk-neutral measures, establishes the no-arbitrage condition for this type of model and, finally, computes the prices of European call and put options.
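
For readers unfamiliar with the basic object, a symmetric telegraph process can be simulated in a few lines: the velocity flips between +c and -c at the events of a Poisson process, and the position integrates the velocity. The parameters below are purely illustrative:

```python
# Minimal simulation of a symmetric telegraph process.
import numpy as np

def telegraph_path(t_end, c=1.0, lam=2.0, seed=0):
    rng = np.random.default_rng(seed)
    t, x = 0.0, 0.0
    v = c if rng.random() < 0.5 else -c
    times, positions = [0.0], [0.0]
    while t < t_end:
        tau = rng.exponential(1.0 / lam)   # time until the next velocity flip
        tau = min(tau, t_end - t)
        t += tau
        x += v * tau
        v = -v
        times.append(t)
        positions.append(x)
    return np.array(times), np.array(positions)

times, positions = telegraph_path(t_end=10.0)
print(positions[-1])  # X(10) for this realisation
```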

Relevance: 100.00%

Abstract:

Wind-excited vibrations in the frequency range of 10 to 50 Hz due to vortex shedding often cause fatigue failures in the cables of overhead transmission lines. Damping devices, such as Stockbridge dampers, have long been in use for suppressing these vibrations. The dampers are conveniently modelled by means of their driving-point impedance, measured in the lab over the frequency range under consideration. The cables can be modelled as strings with a small additional bending stiffness. The main problem in modelling the vibrations lies, however, in the aerodynamic forces, which usually are approximated by the forces acting on a rigid cylinder in planar flow. In the present paper, the wind forces are represented by stochastic processes with arbitrary cross-correlation in space; the case of a Kármán vortex street on a rigid cylinder in planar flow is contained in this approach as a limiting case. The authors believe that this new view of the problem may yield useful results, particularly concerning the reliability of the lines and the probability of fatigue damage. © 1987.
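
A quick order-of-magnitude check (not from the paper) of why the 10 to 50 Hz band is the relevant one: the Kármán shedding frequency of a circular cylinder is roughly f = St·U/d with Strouhal number St ≈ 0.2, and typical conductor diameters and wind speeds put f in exactly this range. The diameter and wind speeds below are assumed values:

```python
# Vortex-shedding frequency estimate f = St * U / d for a circular cylinder.
strouhal = 0.2
diameter = 0.03                       # conductor diameter in metres (assumed)
for wind_speed in (2.0, 5.0, 8.0):    # m/s, light to moderate winds
    f = strouhal * wind_speed / diameter
    print(f"U = {wind_speed:.1f} m/s  ->  shedding frequency ~ {f:.1f} Hz")
```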

Relevance: 100.00%

Abstract:

Stochastic models for competing clonotypes of T cells, formulated as multivariate, continuous-time, discrete-state Markov processes, have been proposed in the literature by Stirk, Molina-París and van den Berg (2008). A stochastic modelling framework is important because of rare events associated with small populations of some critical cell types. Usually, computational methods for these problems employ a trajectory-based approach based on Monte Carlo simulation. This is partly because the complementary probability density function (PDF) approaches can be expensive, but here we describe some efficient PDF approaches that directly solve the governing equations, known as the Master Equation. These computations are made very efficient through an approximation of the state space by the Finite State Projection and through the use of Krylov subspace methods when evaluating the action of the matrix exponential. These computational methods allow us to explore the evolution of the PDFs associated with these stochastic models, and bimodal distributions arise in some parameter regimes. Time-dependent propensities naturally arise in immunological processes due to, for example, age-dependent effects. Incorporating time-dependent propensities into the framework of the Master Equation significantly complicates the corresponding computational methods, but here we describe an efficient approach via Magnus formulas. Although this contribution focuses on the example of competing clonotypes, the general principles are relevant to multivariate Markov processes and provide fundamental techniques for computational immunology.
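
A minimal sketch of the PDF approach on a toy model (a single birth-death process rather than the two-clonotype system): truncate the state space as in the Finite State Projection, assemble the Master Equation generator, and propagate the probability vector using a Krylov-based action of the matrix exponential, here SciPy's expm_multiply. All rates and the truncation size are illustrative:

```python
import numpy as np
from scipy.sparse import lil_matrix, csc_matrix
from scipy.sparse.linalg import expm_multiply

k_birth, k_death, n_max = 5.0, 0.5, 60   # FSP truncation at n_max molecules

# Generator A of the Master Equation dp/dt = A p on states {0, ..., n_max}.
A = lil_matrix((n_max + 1, n_max + 1))
for n in range(n_max + 1):
    if n < n_max:                    # birth: n -> n + 1
        A[n + 1, n] += k_birth
        A[n, n] -= k_birth
    if n > 0:                        # death: n -> n - 1
        A[n - 1, n] += k_death * n
        A[n, n] -= k_death * n
A = csc_matrix(A)

p0 = np.zeros(n_max + 1)
p0[0] = 1.0                          # start with zero molecules
p_t = expm_multiply(A * 10.0, p0)    # PDF at t = 10

print("mass kept by the truncation:", p_t.sum())
print("mean copy number:", (np.arange(n_max + 1) * p_t).sum())  # ~ k_birth/k_death = 10
```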

Relevance: 100.00%

Abstract:

One of the fundamental motivations underlying computational cell biology is to gain insight into the complicated dynamical processes taking place, for example, on the plasma membrane or in the cytosol of a cell. These processes are often so complicated that purely temporal mathematical models cannot adequately capture the complex chemical kinetics and transport processes of, for example, proteins or vesicles. On the other hand, spatial models such as Monte Carlo approaches can have very large computational overheads. This chapter gives an overview of the state of the art in the development of stochastic simulation techniques for the spatial modelling of dynamic processes in a living cell.
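
As one concrete instance of the techniques surveyed, here is a compartment-based (reaction-diffusion master equation style) simulation of pure diffusion on a 1D row of compartments, driven by the Gillespie algorithm; molecules hop to a neighbouring compartment at rate d = D/h² per molecule. All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n_comp, D, h = 20, 1e-2, 0.05        # compartments, diffusion coefficient, width
d = D / h**2                         # per-molecule jump rate to one neighbour

x = np.zeros(n_comp, dtype=int)
x[0] = 100                           # all molecules start at the left end
t, t_end = 0.0, 5.0

while t < t_end:
    right = d * x                    # propensity to hop right, per compartment
    left = d * x                     # propensity to hop left, per compartment
    right[-1] = 0.0                  # zero-flux (reflecting) boundaries
    left[0] = 0.0
    props = np.concatenate([right, left])
    total = props.sum()
    if total == 0.0:
        break
    t += rng.exponential(1.0 / total)
    j = rng.choice(props.size, p=props / total)
    if j < n_comp:                   # hop right: j -> j + 1
        x[j] -= 1
        x[j + 1] += 1
    else:                            # hop left: (j - n_comp) -> one to the left
        i = j - n_comp
        x[i] -= 1
        x[i - 1] += 1

print(x)                             # molecule counts per compartment at t ~ t_end
```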

Relevance: 100.00%

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that were conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge about the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
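
For orientation, the "classical" GASP emulator that the abstract contrasts against can be sketched in a few lines: condition a Gaussian-process prior on a small design data set and use the posterior mean as a cheap interpolant of the simulator. The kernel, its hyperparameters, and the toy simulator below are assumptions; the paper's state-space prior and Kalman-smoothing construction are not shown:

```python
import numpy as np

def kernel(a, b, ell=0.3, sigma=1.0):
    """Squared-exponential covariance between 1D input vectors a and b."""
    return sigma**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def simulator(x):
    """Stand-in for an expensive deterministic model."""
    return np.sin(3.0 * x) + 0.5 * x

x_design = np.linspace(0.0, 2.0, 8)                   # small design data set
y_design = simulator(x_design)

x_new = np.linspace(0.0, 2.0, 101)                    # points to emulate
K = kernel(x_design, x_design) + 1e-8 * np.eye(len(x_design))  # jitter
K_star = kernel(x_new, x_design)
weights = np.linalg.solve(K, y_design)
y_emulated = K_star @ weights                         # GP posterior mean

print(np.max(np.abs(y_emulated - simulator(x_new))))  # interpolation error
```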

Relevance: 100.00%

Abstract:

Wound healing and tumour growth involve collective cell spreading, which is driven by individual motility and proliferation events within a population of cells. Mathematical models are often used to interpret experimental data and to estimate the parameters so that predictions can be made. Existing methods for parameter estimation typically assume that these parameters are constants and often ignore any uncertainty in the estimated values. We use approximate Bayesian computation (ABC) to estimate the cell diffusivity, D, and the cell proliferation rate, λ, from a discrete model of collective cell spreading, and we quantify the uncertainty associated with these estimates using Bayesian inference. We use a detailed experimental data set describing the collective cell spreading of 3T3 fibroblast cells. The ABC analysis is conducted for different combinations of initial cell densities and experimental times in two separate scenarios: (i) where collective cell spreading is driven by cell motility alone, and (ii) where collective cell spreading is driven by combined cell motility and cell proliferation. We find that D can be estimated precisely, with a small coefficient of variation (CV) of 2–6%. Our results indicate that D appears to depend on the experimental time, which is a feature that has been previously overlooked. Assuming that the values of D are the same in both experimental scenarios, we use the information about D from the first experimental scenario to obtain reasonably precise estimates of λ, with a CV between 4 and 12%. Our estimates of D and λ are consistent with previously reported values; however, our method is based on a straightforward measurement of the position of the leading edge whereas previous approaches have involved expensive cell counting techniques. Additional insights gained using a fully Bayesian approach justify the computational cost, especially since it allows us to accommodate information from different experiments in a principled way.
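
A bare-bones sketch of the ABC rejection idea used here: draw D from a prior, simulate the leading-edge position, and keep the draw if the simulated summary lies within a tolerance of the data. The forward model below (diffusive spread of about sqrt(4Dt) plus noise) and all numbers are toy stand-ins for the discrete spreading model and the 3T3 data; a proposal for the proliferation rate would be handled in the same way:

```python
import numpy as np

rng = np.random.default_rng(7)

def leading_edge(D, times, noise=5.0):
    """Toy simulator: diffusive spread plus measurement noise (microns)."""
    return np.sqrt(4.0 * D * times) + rng.normal(0.0, noise, size=times.size)

times = np.array([12.0, 24.0, 36.0, 48.0])            # hours
true_D = 1200.0                                       # microns^2 / hour (assumed)
data = leading_edge(true_D, times)                    # synthetic "observations"

accepted = []
epsilon = 20.0                                        # acceptance tolerance (microns)
for _ in range(20000):
    D_candidate = rng.uniform(100.0, 3000.0)          # flat prior on D
    simulated = leading_edge(D_candidate, times)
    if np.sqrt(np.mean((simulated - data)**2)) < epsilon:
        accepted.append(D_candidate)

accepted = np.array(accepted)
print(accepted.mean(), accepted.std() / accepted.mean())   # posterior mean and CV
```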

Relevance: 100.00%

Abstract:

Iteration is unavoidable in the design process and should be incorporated when planning and managing projects in order to minimize surprises and reduce schedule distortions. However, planning and managing iteration is challenging because the relationships between its causes and effects are complex. Most approaches which use mathematical models to analyze the impact of iteration on the design process focus on a relatively small number of its causes and effects. Therefore, insights derived from these analytical models may not be robust under a broader consideration of potential influencing factors. In this article, we synthesize an explanatory framework which describes the network of causes and effects of iteration identified from the literature, and introduce an analytic approach which combines a task network modeling approach with System Dynamics simulation. Our approach models the network of causes and effects of iteration alongside the process architecture which is required to analyze the impact of iteration on design process performance. We show how this allows managers to assess the impact of changes to process architecture and to management levers which influence iterative behavior, accounting for the fact that these changes can occur simultaneously and can accumulate in non-linear ways. We also discuss how the insights resulting from this analysis can be visualized for easier consumption by project participants not familiar with simulation methods. Copyright © 2010 by ASME.
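
A toy illustration (not the paper's model) of the System Dynamics side of the approach: the classic rework cycle, in which a fraction of "completed" work later returns as rework, integrated with a simple Euler scheme. In the paper this kind of loop is coupled to a task-network model of the actual process; all rates and fractions below are invented:

```python
import numpy as np

dt, t_end = 0.1, 100.0                 # weeks
work_to_do, undiscovered_rework, done = 100.0, 0.0, 0.0
productivity = 2.0                     # tasks completed per week
rework_fraction = 0.3                  # share of completed work done incorrectly
discovery_rate = 0.1                   # fraction of hidden rework found per week

for _ in np.arange(0.0, t_end, dt):
    completion = min(productivity, work_to_do / dt)           # cannot exceed backlog
    discovery = discovery_rate * undiscovered_rework
    work_to_do += (discovery - completion) * dt
    undiscovered_rework += (rework_fraction * completion - discovery) * dt
    done += (1.0 - rework_fraction) * completion * dt

print(round(done, 1), round(work_to_do, 2), round(undiscovered_rework, 2))
```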

Relevance: 100.00%

Abstract:

The last 30 years have seen Fuzzy Logic (FL) emerge as a method either complementing or challenging stochastic methods as the traditional way of modelling uncertainty. But the circumstances under which FL or stochastic methods should be used are shrouded in disagreement, because the areas of application of statistical and FL methods overlap and opinions differ as to when which method should be used. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity using the example of pharmaceutical high purity water (HPW) utility systems. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence suggesting that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical cases where extreme events (for example peaks in demand) or day-to-day variation, rather than average values, are of interest. The average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. Furthermore, the results indicate that the stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems, and by extension in similar process systems, because extreme events and the modelling of day-to-day variation are important in capacity extension projects. Other reasons supporting the suggestion that stochastic HPW models are preferable to FL HPW models include the following (see the sketch after this list).

1. The computer code for stochastic models is typically less complex than for FL models, reducing code maintenance and validation issues.

2. In many respects FL models are similar to deterministic models. The need for a FL model over a deterministic model is therefore questionable in the case of industrial-scale HPW systems as presented here (as well as other similar systems), since the latter requires simpler models.

3. A FL model may be difficult to "sell" to an end-user, as its results represent "approximate reasoning", a definition of which is, however, lacking.

4. Stochastic models may be applied, with some relatively minor modifications, to other systems, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not, because the FL and stochastic model philosophies of a HPW system are fundamentally different. The stochastic model sees schedule and volume uncertainties as random phenomena described by statistical distributions based on either estimated or historical data. The FL model, on the other hand, simulates schedule uncertainties based on estimated operator behaviour, e.g. the tiredness of the operators and their working schedule, and in a municipal drinking water distribution system the notion of an "operator" breaks down.

5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW system model does not account for dispensed-volume uncertainty, as there appears to be no reasonable way to account for it with FL, whereas the stochastic model includes volume uncertainty.
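
To make the "extreme events rather than averages" point concrete, here is a toy Monte Carlo of a day of HPW draws with random start times, durations, and flow rates, used to estimate a peak-demand percentile; sizing capacity to that percentile rather than to the mean demand is the kind of question the stochastic model answers directly. All numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n_days, minutes = 10000, 24 * 60
peaks = np.empty(n_days)

for day in range(n_days):
    demand = np.zeros(minutes)                     # litres-per-minute profile
    for _ in range(rng.poisson(12)):               # ~12 dispensing events per day
        start = rng.integers(0, minutes - 60)
        duration = rng.integers(10, 60)            # minutes
        rate = rng.normal(8.0, 2.0)                # litres per minute
        demand[start:start + duration] += max(rate, 0.0)
    peaks[day] = demand.max()

# Capacity sized to the 99th percentile of simulated daily peaks, not the mean.
print(np.percentile(peaks, 99))
```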

Relevance: 100.00%

Abstract:

Many deterministic models with hysteresis have been developed in the areas of economics, finance, terrestrial hydrology and biology. These models lack any stochastic element, which can often have a strong effect in these areas. In this work, stochastically driven closed-loop systems with hysteresis-type memory are studied. This type of system is presented as a possible stochastic counterpart to the deterministic models in these areas. Some price dynamics models are presented as a motivation for the development of this type of model. Numerical schemes for solving this class of stochastic differential equation are developed in order to examine the prototype models presented. As a further test of the developed numerical schemes, the behaviour near equilibrium of coupled ordinary differential equations, in which the time derivative of the Preisach operator is included in one of the equations, is examined numerically. A model of two phenotype bacteria is also presented. This model is examined to explore memory effects and related hysteresis effects in the area of biology. The memory effects found in this model are similar to those found in the non-ideal relay. This non-ideal relay behaviour is then used to model a colony of bacteria with multiple switching thresholds. This model contains a Preisach-type memory with a variable Preisach weight function. A pattern formation in the distribution of the phenotypes among the available thresholds is shown numerically for this multi-threshold model.
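
A minimal sketch of the kind of stochastically driven hysteresis element studied here: a non-ideal relay whose input is an Ornstein-Uhlenbeck process integrated by Euler-Maruyama, switching to +1 above the upper threshold and to -1 below the lower one. Thresholds and noise parameters are illustrative, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, beta = -0.5, 0.5                 # lower and upper switching thresholds
theta, sigma = 1.0, 1.0                 # OU mean reversion and noise strength
dt, n_steps = 1e-3, 200_000

xi = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)   # Brownian increments
u, relay, switches = 0.0, -1, 0
for k in range(n_steps):
    u += -theta * u * dt + xi[k]        # Euler-Maruyama step for the OU input
    if relay == -1 and u >= beta:       # relay switches up only above beta
        relay, switches = +1, switches + 1
    elif relay == +1 and u <= alpha:    # and back down only below alpha
        relay, switches = -1, switches + 1

print("number of relay switches:", switches)
```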

Relevance: 100.00%

Abstract:

Continuing our development of a mathematical theory of stochastic microlensing, we study the random shear and expected number of random lensed images of different types. In particular, we characterize the first three leading terms in the asymptotic expression of the joint probability density function (pdf) of the random shear tensor due to point masses in the limit of an infinite number of stars. Up to this order, the pdf depends on the magnitude of the shear tensor, the optical depth, and the mean number of stars through a combination of radial position and the star's mass. As a consequence, the pdf's of the shear components are seen to converge, in the limit of an infinite number of stars, to shifted Cauchy distributions, which shows that the shear components have heavy tails in that limit. The asymptotic pdf of the shear magnitude in the limit of an infinite number of stars is also presented. All the results on the random microlensing shear are given for a general point in the lens plane. Extending to the general random distributions (not necessarily uniform) of the lenses, we employ the Kac-Rice formula and Morse theory to deduce general formulas for the expected total number of images and the expected number of saddle images. We further generalize these results by considering random sources defined on a countable compact covering of the light source plane. This is done to introduce the notion of global expected number of positive parity images due to a general lensing map. Applying the result to microlensing, we calculate the asymptotic global expected number of minimum images in the limit of an infinite number of stars, where the stars are uniformly distributed. This global expectation is bounded, while the global expected number of images and the global expected number of saddle images diverge as the order of the number of stars. © 2009 American Institute of Physics.
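
As a purely numerical illustration of the heavy tails (not the paper's asymptotic derivation): sum the shear contribution of N equal point masses scattered uniformly in a disc, where each star contributes components proportional to cos(2φ)/r² and sin(2φ)/r²; the occasional very close star makes the summed components Cauchy-like rather than Gaussian. Units and proportionality constants are suppressed:

```python
import numpy as np

rng = np.random.default_rng(13)

def sample_shear(n_stars, radius=100.0, n_samples=20000):
    """One gamma_1 sample per random star field, stars uniform in a disc."""
    gamma_1 = np.empty(n_samples)
    for i in range(n_samples):
        r = radius * np.sqrt(rng.random(n_stars))      # uniform in the disc
        phi = 2.0 * np.pi * rng.random(n_stars)
        gamma_1[i] = np.sum(np.cos(2.0 * phi) / r**2)  # equal unit masses
    return gamma_1

g1 = sample_shear(n_stars=200)
# For Gaussian tails the 99th percentile of |gamma_1| would sit within a few
# multiples of the median; here the ratio is far larger, i.e. heavy tails.
print(np.percentile(np.abs(g1), [50, 90, 99]))
```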

Relevance: 100.00%

Abstract:

The transition of the mammalian cell from quiescence to proliferation is a highly variable process. Over the last four decades, two lines of apparently contradictory, phenomenological models have been proposed to account for such temporal variability. These include various forms of the transition probability (TP) model and the growth control (GC) model, which lack mechanistic details. The GC model was further proposed as an alternative explanation for the concept of the restriction point, which we recently demonstrated as being controlled by a bistable Rb-E2F switch. Here, through a combination of modeling and experiments, we show that these different lines of models in essence reflect different aspects of stochastic dynamics in cell cycle entry. In particular, we show that the variable activation of E2F can be described by stochastic activation of the bistable Rb-E2F switch, which in turn may account for the temporal variability in cell cycle entry. Moreover, we show that temporal dynamics of E2F activation can be recast into the frameworks of both the TP model and the GC model via parameter mapping. This mapping suggests that the two lines of phenomenological models can be reconciled through the stochastic dynamics of the Rb-E2F switch. It also suggests a potential utility of the TP or GC models in defining concise, quantitative phenotypes of cell physiology. This may have implications in classifying cell types or states.
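
A generic illustration of stochastic activation of a bistable switch (not the actual Rb-E2F network): a single species with basal production, cooperative positive feedback, and first-order degradation, simulated with the Gillespie algorithm. The first-passage time to a high threshold varies widely between runs, mirroring the cell-to-cell variability in cycle entry discussed above. All rate constants are invented:

```python
import numpy as np

rng = np.random.default_rng(17)

def activation_time(k_basal=1.0, k_fb=20.0, K=6.0, n=4, k_deg=1.0,
                    threshold=15, t_max=500.0):
    """First time the copy number of E crosses `threshold` (one 'cell')."""
    t, E = 0.0, 0
    while t < t_max:
        production = k_basal + k_fb * E**n / (K**n + E**n)   # positive feedback
        degradation = k_deg * E
        total = production + degradation
        t += rng.exponential(1.0 / total)
        if rng.random() < production / total:
            E += 1
        else:
            E -= 1
        if E >= threshold:
            return t
    return np.inf                      # no switch within t_max (rare here)

times = np.array([activation_time() for _ in range(200)])
# Median activation time and its coefficient of variation across "cells".
print(round(float(np.median(times)), 1), round(float(times.std() / times.mean()), 2))
```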

Relevance: 100.00%

Abstract:

Extreme arid regions in the world's major deserts are typified by quartz pavement terrain. Cryptic hypolithic communities colonize the ventral surface of quartz rocks, and this habitat is characterized by a relative lack of environmental and trophic complexity. Combined with readily identifiable major environmental stressors, this provides a tractable model system for determining the relative role of stochastic and deterministic drivers in community assembly. Through analyzing an original, worldwide data set of 16S rRNA-gene defined bacterial communities from the most extreme deserts on Earth, we show that functional assemblages within the communities were subject to different assembly influences. Null models applied to the photosynthetic assemblage revealed that stochastic processes exerted most effect on the assemblage, although the level of community dissimilarity varied between continents in a manner not always consistent with neutral models. The heterotrophic assemblages displayed signatures of niche processes across four continents, whereas in other cases they conformed to neutral predictions. Importantly, for continents where neutrality was either rejected or accepted, assembly drivers differed between the two functional groups. This study demonstrates that multi-trophic microbial systems may not be fully described by a single set of niche or neutral assembly rules and that stochasticity is likely a major determinant of such systems, with significant variation in the influence of these determinants on a global scale.
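
A sketch of the kind of null-model comparison used in such analyses: compute the mean between-site Jaccard dissimilarity of a presence/absence matrix and compare it with a null distribution obtained by reshuffling taxa within each site (preserving site richness); a standardised effect size near zero is consistent with stochastic assembly, while strong deviations point to niche processes. The community matrix below is random toy data, not the 16S rRNA data set:

```python
import numpy as np

rng = np.random.default_rng(19)
n_sites, n_taxa = 12, 80
observed = (rng.random((n_sites, n_taxa)) < 0.2).astype(int)   # toy data

def mean_jaccard(mat):
    """Mean pairwise Jaccard dissimilarity across all site pairs."""
    dis = []
    for i in range(len(mat)):
        for j in range(i + 1, len(mat)):
            shared = np.sum((mat[i] == 1) & (mat[j] == 1))
            union = np.sum((mat[i] == 1) | (mat[j] == 1))
            dis.append(1.0 - shared / union)
    return np.mean(dis)

obs_stat = mean_jaccard(observed)
null_stats = []
for _ in range(999):
    # Permute taxa within each site: richness per site is preserved.
    shuffled = np.array([rng.permutation(row) for row in observed])
    null_stats.append(mean_jaccard(shuffled))

ses = (obs_stat - np.mean(null_stats)) / np.std(null_stats)
print(round(ses, 2))   # standardised effect size of the observed dissimilarity
```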