969 results for STOCHASTIC PROCESSES
Abstract:
We provide a general framework for estimating persistence in populations which may be affected by catastrophic events, and which are either unbounded or have very large ceilings. We model the population using a birth-death process modified to allow for downward jumps of arbitrary size. For such processes, it is typically necessary to truncate the process in order to make the evaluation of expected extinction times (and higher-order moments) computationally feasible. Hence, we give particular attention to the selection of a cut-off point at which to truncate the process, and we present a simple method for obtaining quantitative indicators of the suitability of a chosen cut-off. (c) 2005 Elsevier Inc. All rights reserved.
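In the jump-free special case (a plain birth-death process absorbed at 0, without catastrophes), the expected extinction time from state 1 has a classical closed-form series, and comparing successive truncation points gives a simple quantitative indicator of whether a chosen cut-off is adequate. A minimal sketch, using illustrative linear rates rather than the authors' general framework:

```python
import math

def expected_extinction_time(birth, death, N):
    """Expected time to extinction from state 1 for a birth-death
    process absorbed at 0, truncating the classical series
        E_1[T] = sum_{n>=1} (lam_1...lam_{n-1}) / (mu_1...mu_n)
    at the cut-off N."""
    total = 0.0
    ratio = 1.0  # running product lam_1..lam_{n-1} / mu_1..mu_{n-1}
    for n in range(1, N + 1):
        total += ratio / death(n)
        ratio *= birth(n) / death(n)
    return total

# Illustrative subcritical linear rates: lam_n = 0.5 n, mu_n = 1.0 n.
lam = lambda n: 0.5 * n
mu = lambda n: 1.0 * n

# The gap between successive cut-offs is a crude truncation-error indicator.
t200 = expected_extinction_time(lam, mu, 200)
t400 = expected_extinction_time(lam, mu, 400)
print(t200, abs(t400 - t200))
```

For these particular rates the series can be summed exactly (it equals 2 ln 2), which makes the convergence of the truncated values easy to check.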
Abstract:
Let S be a countable set and let Q = (q_ij, i, j ∈ S) be a conservative q-matrix over S with a single instantaneous state b. Suppose that we are given a real number μ ≥ 0 and a strictly positive probability measure m = (m_j, j ∈ S) such that Σ_{i∈S} m_i q_ij = −μ m_j, j ≠ b. We prove that there exists a Q-process P(t) = (p_ij(t), i, j ∈ S) for which m is a μ-invariant measure, that is, Σ_{i∈S} m_i p_ij(t) = e^{−μt} m_j, j ∈ S. We illustrate our results with reference to the Kolmogorov 'K1' chain and a birth-death process with catastrophes and instantaneous resurrection.
Abstract:
The published requirements for accurate measurement of heat transfer at the interface between two bodies have been reviewed. A strategy for reliable measurement has been established, based on the depth of the temperature sensors in the medium, on the inverse method parameters and on the time response of the sensors. Sources of both deterministic and stochastic errors have been investigated and a method to evaluate them, with the help of a normalisation technique, has been proposed. The key normalisation variables are the duration of the heat input and the maximum heat flux density. An example application of this technique in the field of high-pressure die casting is demonstrated. The normalisation study, coupled with prior determination of the heat input duration, makes it possible to determine the optimum location for the sensors, along with an acceptable sampling rate and the thermocouples' critical response time (as well as any filter characteristics). Results from the gauge are used to assess the suitability of the initial design choices. In particular, the unavoidable response time of the thermocouples is estimated by comparison with the normalised simulation. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
A stochastic metapopulation model accounting for habitat dynamics is presented. This is the stochastic SIS logistic model with the novel aspect that it incorporates a varying carrying capacity. We present results of Kurtz and Barbour, which provide deterministic and diffusion approximations for a wide class of stochastic models, in a form that most easily allows their direct application to population models. These results are used to show that a suitably scaled version of the metapopulation model converges, uniformly in probability over finite time intervals, to a deterministic model previously studied in the ecological literature. Additionally, they allow us to establish a bivariate normal approximation to the quasi-stationary distribution of the process. This allows us to consider the effects of habitat dynamics on metapopulation modelling through a comparison with the stochastic SIS logistic model and provides an effective means for modelling metapopulations inhabiting dynamic landscapes.
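A metapopulation on a dynamic landscape of this kind can be simulated directly as a two-dimensional Markov chain (H suitable patches out of N, of which n are occupied) with the Gillespie algorithm. A minimal sketch with illustrative transition rates, not the paper's exact parameterisation:

```python
import random

def simulate(N=100, r=0.2, s=0.1, c=1.0, e=0.3, T=50.0, seed=1):
    """Gillespie simulation of an SIS-logistic-type metapopulation on a
    dynamic landscape: H suitable patches out of N, n of them occupied.
    Rates below are illustrative, not the paper's model."""
    rng = random.Random(seed)
    t, H, n = 0.0, N, N // 2
    while t < T:
        rates = [
            r * (N - H),          # unsuitable patch becomes suitable
            s * H,                # suitable patch destroyed
            c * n * (H - n) / N,  # colonisation of an empty suitable patch
            e * n,                # local extinction in an occupied patch
        ]
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)
        u = rng.random() * total
        if u < rates[0]:
            H += 1
        elif u < rates[0] + rates[1]:
            # a destroyed patch is occupied with probability n/H
            if rng.random() < n / H:
                n -= 1
            H -= 1
        elif u < rates[0] + rates[1] + rates[2]:
            n += 1
        else:
            n -= 1
    return H, n

H, n = simulate()
print(H, n)
```

By construction the chain preserves the constraint 0 ≤ n ≤ H ≤ N; the colonisation rate vanishes when n = H and the destruction step removes an occupant with the appropriate probability.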
Abstract:
This paper has three primary aims: to establish an effective means for modelling mainland-island metapopulations inhabiting a dynamic landscape; to investigate the effect of immigration and dynamic changes in habitat on metapopulation patch occupancy dynamics; and to illustrate the implications of our results for decision-making and population management. We first extend the mainland-island metapopulation model of Alonso and McKane [Bull. Math. Biol. 64:913-958, 2002] to incorporate a dynamic landscape. It is shown, for both the static and the dynamic landscape models, that a suitably scaled version of the process converges to a unique deterministic model as the size of the system becomes large. We also establish that, under quite general conditions, the density of occupied patches, and the densities of suitable and occupied patches, for the respective models, have approximate normal distributions. Our results not only provide us with estimates for the means and variances that are valid at all stages in the evolution of the population, but also provide a tool for fitting the models to real metapopulations. We discuss the effect of immigration and habitat dynamics on metapopulations, showing that mainland-like patches heavily influence metapopulation persistence, and we argue for adopting measures to increase connectivity between this large patch and the other island-like patches. We illustrate our results with specific reference to example populations of butterflies and the grasshopper Bryodema tuberculata.
Abstract:
Stochastic models based on Markov birth processes are constructed to describe the process of invasion of a fly larva by entomopathogenic nematodes. Various forms for the birth (invasion) rates are proposed. These models are then fitted to data sets describing the observed numbers of nematodes that have invaded a fly larva after a fixed period of time. Non-linear birth rates are required to achieve good fits to these data, with their precise form identifying different patterns of invasion for the three populations of nematodes considered. One of these (Nemasys) showed the greatest propensity for invasion. This form of modelling may be useful more generally for analysing data that show variation different from that expected under a binomial distribution.
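A Markov birth process of this type is straightforward to simulate: the count of invaders jumps from n to n+1 at a state-dependent rate. A minimal sketch with one illustrative non-linear rate (earlier invaders facilitating later ones); the functional form and parameters here are invented for illustration, not the fitted models:

```python
import random

def invaded_after(t_end, M=50, beta=0.05, delta=0.1, seed=2):
    """Markov birth (invasion) process: n of M nematodes have invaded
    the larva by time t. Illustrative non-linear rate
        lam_n = beta * (M - n) * (1 + delta * n),
    so the per-capita invasion rate rises with the number already inside."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while n < M:
        rate = beta * (M - n) * (1 + delta * n)
        t += rng.expovariate(rate)  # exponential holding time in state n
        if t > t_end:
            break
        n += 1
    return n

print(invaded_after(1.0))
```

With delta = 0 the rate reduces to the linear case, whose invasion counts at a fixed time are binomial; a non-zero delta is one way to generate the extra-binomial variation the data exhibit.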
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system's states and model parameters) and full weak-constraint 4D-Var. Empirical analysis of their asymptotic behaviour, as the observation density or the length of the time window increases, is provided.
Abstract:
Stochastic differential equations arise naturally in a range of contexts, from financial to environmental modeling. Current solution methods are limited in their representation of the posterior process in the presence of data. In this work, we present a novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations. The method is applied to two simple problems: the Ornstein-Uhlenbeck process, for which the exact solution is known and provides a benchmark, and the double-well system, for which standard approaches such as the ensemble Kalman smoother fail to provide a satisfactory result. Experiments show that our variational approximation is viable and that the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
Abstract:
This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz ’63 (3D model). As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well known methods, such as hybrid Monte Carlo, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) part of the model evolution equations using our new methods.
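The Ornstein-Uhlenbeck process serves as the benchmark above precisely because its exact likelihood is available in closed form: the discretely observed process is Gaussian with a known transition density. A minimal sketch of that exact computation (parameter names are mine, not the paper's):

```python
import math

def ou_loglik(x, dt, theta, sigma):
    """Exact log-likelihood of a discretely observed Ornstein-Uhlenbeck
    path dX = -theta*X dt + sigma dW, using the Gaussian transition
        X_{t+dt} | X_t ~ N(a * X_t, v),
    with a = exp(-theta*dt) and v = sigma**2 * (1 - a**2) / (2*theta)."""
    a = math.exp(-theta * dt)
    v = sigma ** 2 * (1.0 - a * a) / (2.0 * theta)
    ll = 0.0
    for prev, cur in zip(x, x[1:]):
        resid = cur - a * prev  # deviation from the conditional mean
        ll += -0.5 * math.log(2.0 * math.pi * v) - resid * resid / (2.0 * v)
    return ll
```

Maximising this over (theta, sigma) gives the exact maximum-likelihood estimates against which approximate inference schemes can be scored.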
Abstract:
In 1974 Dr D. M. Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared to site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
Abstract:
Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
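The idea of ranking designs by the Fisher information of the variance parameters can be illustrated with a deliberately simple scalar analogue (not the paper's GP emulator): if observations at input x have variance exp(beta*x), the per-observation information about beta is x**2/2, so the total information is a simple sum over the design, and concentrating replicated observations can beat spreading points out. All model choices here are mine, for illustration only:

```python
def fisher_info_variance_model(design):
    """Total Fisher information for beta in the toy heteroscedastic model
        y ~ N(0, exp(beta * x)),
    where `design` is a list of (x, replicates) pairs. Each observation
    at x contributes x**2 / 2, independent of beta, because
    d log sigma^2 / d beta = x."""
    return sum(r * x * x / 2.0 for x, r in design)

# Fixed budget of 12 observations on [0, 1]:
space_filling = [(x / 11.0, 1) for x in range(12)]  # 12 distinct points
replicated = [(0.0, 6), (1.0, 6)]                   # 6 replicates at each end
print(fisher_info_variance_model(space_filling),
      fisher_info_variance_model(replicated))
```

In this toy case the replicated endpoint design carries more information about beta than the space-filling one under the same budget, echoing (in a much simpler setting) the paper's empirical comparison.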
Abstract:
Since wind at the earth's surface has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safe and economic use of wind energy. In this paper, we investigated a combination of numeric and probabilistic models: a Gaussian process (GP) combined with a numerical weather prediction (NWP) model was applied to wind-power forecasting up to one day ahead. First, the wind-speed data from the NWP model were corrected by a GP; then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, wind power forecasts were produced by modeling the relationship between the corrected wind speed and the power output using a censored GP. To validate the proposed approach, three real-world datasets were used for model training and testing. The empirical results were compared with several classical wind forecast models; based on the mean absolute error (MAE), the proposed model provides around 9% to 14% improvement in forecasting accuracy compared to an artificial neural network (ANN) model, and nearly 17% improvement on a third dataset from a newly-built wind farm for which there is a limited amount of training data. © 2013 IEEE.
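The censoring idea can be sketched with a generic Tobit-style likelihood: observations below the rated capacity contribute a Gaussian density term, while observations pinned at the cap contribute the probability mass of exceeding it. This is a minimal generic sketch, not the paper's censored GP (which places this construction on top of a GP prior):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def censored_loglik(y, mu, sigma, cap):
    """Tobit-style log-likelihood for power observations clipped at the
    turbine's rated capacity `cap`: below the cap, a Gaussian density;
    at the cap, the censoring probability P(Y >= cap)."""
    ll = 0.0
    for yi, mi in zip(y, mu):
        if yi < cap:
            z = (yi - mi) / sigma
            ll += -math.log(sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
        else:
            # guard against log(0) deep in the censored tail
            ll += math.log(max(1.0 - norm_cdf((cap - mi) / sigma), 1e-300))
    return ll
```

Treating capped observations through their exceedance probability, rather than as exact values, is what keeps the fitted relationship between wind speed and power from being biased by the flat rated-capacity region.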
Abstract:
Mathematics Subject Classification: 26A33, 76M35, 82B31
Abstract:
Multitype branching processes (MTBP) model branching structures, where the nodes of the resulting tree are particles of different types. Usually such a process is not observable in the sense of the whole tree, but only as the “generation” at a given moment in time, which consists of the number of particles of every type. This requires an EM-type algorithm to obtain a maximum likelihood (ML) estimate of the parameters of the branching process. Using a version of the inside-outside algorithm for stochastic context-free grammars (SCFG), such an estimate could be obtained for the offspring distribution of the process.
Abstract:
The classical Bienaymé-Galton-Watson (BGW) branching process can be interpreted as a mathematical model of population dynamics in which the members of an isolated population reproduce independently of each other according to a stochastic law.
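When a BGW process is observed generation by generation (only the generation sizes, not the family tree), the mean offspring number has the classical Harris maximum-likelihood estimator: total offspring divided by total parents. A minimal sketch of this standard textbook formula, not specific to this work:

```python
def offspring_mean_mle(generations):
    """Harris maximum-likelihood estimator of the mean offspring number
    of a Bienayme-Galton-Watson process from observed generation sizes
    z_0, z_1, ..., z_n:
        m_hat = (z_1 + ... + z_n) / (z_0 + ... + z_{n-1})."""
    return sum(generations[1:]) / sum(generations[:-1])

print(offspring_mean_mle([1, 2, 4, 8]))  # -> 2.0
```

The process is subcritical, critical or supercritical according to whether this mean is below, equal to or above 1, which determines whether extinction is certain.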