947 results for Stochastic models
Abstract:
This report studies when and why two Hidden Markov Models (HMMs) may represent the same stochastic process. HMMs are characterized in terms of equivalence classes whose elements represent identical stochastic processes. This characterization yields polynomial time algorithms to detect equivalent HMMs. We also find fast algorithms to reduce HMMs to essentially unique and minimal canonical representations. The reduction to a canonical form leads to the definition of 'Generalized Markov Models' which are essentially HMMs without the positivity constraint on their parameters. We discuss how this generalization can yield more parsimonious representations of stochastic processes at the cost of the probabilistic interpretation of the model parameters.
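The paper's polynomial-time equivalence test is not reproduced here, but the definition it rests on, that two HMMs represent the same process iff they assign the same probability to every finite observation string, can be illustrated with a brute-force (exponential) sketch using the standard forward algorithm. The example HMMs below are invented for illustration:

```python
import itertools

def seq_prob(pi, A, B, seq):
    # Forward algorithm: probability that the HMM (pi, A, B) emits `seq`,
    # summed over all hidden state paths.
    n = len(pi)
    alpha = [pi[s] * B[s][seq[0]] for s in range(n)]
    for sym in seq[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][sym]
                 for t in range(n)]
    return sum(alpha)

def equivalent_up_to(hmm1, hmm2, n_symbols, max_len, tol=1e-12):
    # Brute-force check of string probabilities up to length max_len; the
    # paper replaces this exponential search with a polynomial-time test.
    for L in range(1, max_len + 1):
        for seq in itertools.product(range(n_symbols), repeat=L):
            if abs(seq_prob(*hmm1, seq) - seq_prob(*hmm2, seq)) > tol:
                return False
    return True

# Two different parameterizations of the same i.i.d. fair-coin process:
hmm_a = ([1.0], [[1.0]], [[0.5, 0.5]])
hmm_b = ([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]],
         [[0.5, 0.5], [0.5, 0.5]])
print(equivalent_up_to(hmm_a, hmm_b, n_symbols=2, max_len=5))  # -> True
```

The two models above are a minimal instance of the equivalence classes the paper characterizes: a one-state and a two-state HMM generating the identical stochastic process.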
Abstract:
A method for reconstruction of 3D polygonal models from multiple views is presented. The method uses sampling techniques to construct a texture-mapped semi-regular polygonal mesh of the object in question. Given a set of views and segmentation of the object in each view, constructive solid geometry is used to build a visual hull from silhouette prisms. The resulting polygonal mesh is simplified and subdivided to produce a semi-regular mesh. Regions of model fit inaccuracy are found by projecting the reference images onto the mesh from different views. The resulting error images for each view are used to compute a probability density function, and several points are sampled from it. Along the epipolar lines corresponding to these sampled points, photometric consistency is evaluated. The mesh surface is then pulled towards the regions of higher photometric consistency using free-form deformations. This sampling-based approach produces a photometrically consistent solution in much less time than possible with previous multi-view algorithms given arbitrary camera placement.
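The step of turning per-view error images into a probability density and sampling refinement points from it can be sketched as inverse-CDF sampling over pixel errors. This is an illustrative stand-in, not the paper's implementation, and the toy error image is invented:

```python
import bisect, itertools, random

def sample_pixels_by_error(error_image, n_samples, rng):
    # Treat per-pixel model-fit error as an unnormalized discrete PDF and
    # draw pixel coordinates with probability proportional to the error,
    # so refinement effort concentrates where the current mesh fits worst.
    h, w = len(error_image), len(error_image[0])
    cdf = list(itertools.accumulate(error_image[y][x]
                                    for y in range(h) for x in range(w)))
    total = cdf[-1]
    samples = []
    for _ in range(n_samples):
        i = bisect.bisect_right(cdf, rng.random() * total)
        samples.append((i // w, i % w))
    return samples

# Toy 4x4 error image with one badly fit pixel at (row 2, col 3).
img = [[0.01] * 4 for _ in range(4)]
img[2][3] = 10.0
samples = sample_pixels_by_error(img, 200, random.Random(7))
```

In the method described above, each sampled pixel would then define an epipolar line along which photometric consistency is evaluated.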
Abstract:
Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90. © 1995.
Abstract:
We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
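The core idea of using a mixture approximation as a global Metropolis proposal can be sketched as an independence Metropolis-Hastings sampler, where the acceptance ratio involves the proposal density as well as the target. The bimodal toy target and the mixture parameters below are invented for illustration; the paper's sequential mixture construction and Gibbs embedding are not reproduced:

```python
import math, random

def mixture_pdf(x, comps):
    # Density of a normal mixture; comps is a list of (weight, mean, std).
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in comps)

def mixture_sample(comps, rng):
    r, acc = rng.random(), 0.0
    for w, m, s in comps:
        acc += w
        if r <= acc:
            return rng.gauss(m, s)
    return rng.gauss(comps[-1][1], comps[-1][2])

def independence_mh(log_target, comps, n, x0, rng):
    # Independence Metropolis-Hastings: proposals come from a fixed normal
    # mixture approximating the target, rather than a local random walk.
    x = x0
    lx = log_target(x) - math.log(mixture_pdf(x, comps))
    chain = []
    for _ in range(n):
        y = mixture_sample(comps, rng)
        ly = log_target(y) - math.log(mixture_pdf(y, comps))
        if rng.random() < math.exp(min(0.0, ly - lx)):
            x, lx = y, ly
        chain.append(x)
    return chain

# Toy bimodal "posterior" and a two-component mixture proposal.
log_target = lambda x: math.log(0.5 * math.exp(-0.5 * (x + 2) ** 2)
                                + 0.5 * math.exp(-0.5 * (x - 2) ** 2))
chain = independence_mh(log_target, [(0.5, -2.0, 1.2), (0.5, 2.0, 1.2)],
                        5000, 0.0, random.Random(1))
```

Because the proposal is global, the sampler hops freely between both modes, which is the property that makes mixture-based proposals attractive for whole time series of states.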
Abstract:
A discretized series of events is a binary time series that indicates whether or not events of a point process in the line occur in successive intervals. Such data are common in environmental applications. We describe a class of models for them, based on an unobserved continuous-time discrete-state Markov process, which determines the rate of a doubly stochastic Poisson process, from which the binary time series is constructed by discretization. We discuss likelihood inference for these processes and their second-order properties and extend them to multiple series. An application involves modeling the times of exposures to air pollution at a number of receptors in Western Europe.
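The construction described above can be simulated directly: a hidden Markov chain modulates the Poisson rate in each interval, and only the indicator of at least one event is observed. The two-state discrete-time chain and all parameter values below are illustrative assumptions:

```python
import math, random

def simulate_binary_series(n_intervals, rates, switch_prob, seed=0):
    # A hidden two-state Markov chain (discrete-time proxy for the paper's
    # continuous-time chain) sets the rate of a doubly stochastic Poisson
    # process; the binary series records only whether an event occurred
    # in each interval.
    rng = random.Random(seed)
    state, series, states = 0, [], []
    for _ in range(n_intervals):
        if rng.random() < switch_prob:
            state = 1 - state
        p_event = 1.0 - math.exp(-rates[state])  # P(N >= 1) for Poisson(rate)
        series.append(1 if rng.random() < p_event else 0)
        states.append(state)
    return series, states

series, states = simulate_binary_series(1000, rates=(0.05, 2.0), switch_prob=0.02)
```

Runs of ones in the simulated series correspond to sojourns in the high-rate state, which is the clustering structure the likelihood inference has to capture.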
Abstract:
Host-parasitoid models including integrated pest management (IPM) interventions with impulsive effects at both fixed and unfixed times were analyzed with regard to host-eradication, host-parasitoid persistence and host-outbreak solutions. The host-eradication periodic solution with fixed moments is globally stable if the host's intrinsic growth rate is less than the summation of the mean host-killing rate and the mean parasitization rate during the impulsive period. Solutions for all three categories can coexist, with switch-like transitions among their attractors showing that varying dosages and frequencies of insecticide applications and the numbers of parasitoids released are crucial. Periodic solutions also exist for models with unfixed moments for which the maximum amplitude of the host is less than the economic threshold. The dosages and frequencies of IPM interventions for these solutions are much reduced in comparison with the pest-eradication periodic solution. Our results, which are robust to the inclusion of stochastic effects and hold for a wide range of parameter values, confirm that IPM is more effective than any single control tactic.
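The flavor of an impulsive IPM model can be sketched with the classical Nicholson-Bailey host-parasitoid map, used here as a stand-in for the paper's model, with pulsed host-killing and parasitoid releases at fixed moments. All parameter values are invented for illustration:

```python
import math

def nicholson_bailey_ipm(h0, p0, r, a, generations, period, kill_frac, release):
    # Nicholson-Bailey host-parasitoid dynamics with impulsive IPM control:
    # every `period` generations a fraction of hosts is killed (insecticide)
    # and a fixed number of parasitoids is released.
    h, p = h0, p0
    traj = []
    for t in range(1, generations + 1):
        h, p = r * h * math.exp(-a * p), h * (1.0 - math.exp(-a * p))
        if t % period == 0:
            h *= (1.0 - kill_frac)  # impulsive host kill
            p += release            # impulsive parasitoid release
        traj.append((h, p))
    return traj

# Strong control every generation drives the host toward eradication,
# mirroring the condition that host growth is outweighed by the combined
# mean killing and parasitization rates.
traj = nicholson_bailey_ipm(h0=10.0, p0=1.0, r=2.0, a=0.5, generations=50,
                            period=1, kill_frac=0.9, release=2.0)
```

Weakening `kill_frac`, `release`, or the pulse frequency in this sketch moves the system away from the host-eradication regime toward persistence or outbreaks, the trichotomy analyzed in the paper.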
Abstract:
In all but the most sterile environments, bacteria will reside in fluid being transported through conduits, and some of these will attach and grow as biofilms on the conduit walls. The concentration and diversity of bacteria in the fluid at the point of delivery will be a mix of those present when it entered the conduit and those that have become entrained into the flow due to seeding from biofilms. Examples include fluids transported through conduits such as drinking water pipe networks, endotracheal tubes, catheters and ventilation systems. Here we present two probabilistic models to describe changes in the composition of bulk fluid microbial communities as they are transported through a conduit whilst exposed to biofilm communities. The first (discrete) model simulates absolute numbers of individual cells, whereas the other (continuous) model simulates the relative abundance of taxa in the bulk fluid. The discrete model is founded on a birth-death process whereby the community changes one individual at a time and the numbers of cells in the system can vary. The continuous model is a stochastic differential equation derived from the discrete model and can also accommodate changes in the carrying capacity of the bulk fluid. These models provide a novel Lagrangian framework to investigate and predict the dynamics of migrating microbial communities. In this paper we compare the two models, discuss their merits and possible applications, and present simulation results in the context of drinking water distribution systems. Our results provide novel insight into the effects of stochastic dynamics on the composition of non-stationary microbial communities that are exposed to biofilms and provide a new avenue for modelling microbial dynamics in systems where fluids are being transported.
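A minimal discrete sketch in the spirit of the birth-death model described above: one cell at a time dies and is replaced either by an immigrant seeded from the biofilm or by the offspring of an existing bulk cell. The taxa names, community sizes, and immigration rate are all invented for illustration:

```python
import random
from collections import Counter

def bulk_community_step(bulk, biofilm, m, rng):
    # One birth-death event: a random cell in the bulk fluid dies and is
    # replaced either by an immigrant detached from the wall biofilm
    # (probability m) or by a copy of an existing bulk cell (probability 1-m).
    i = rng.randrange(len(bulk))
    if rng.random() < m:
        bulk[i] = rng.choice(biofilm)
    else:
        bulk[i] = bulk[rng.randrange(len(bulk))]

# Bulk fluid enters as a monoculture of taxon A; the biofilm seeds B and C.
rng = random.Random(42)
bulk = ['A'] * 100
biofilm = ['B'] * 80 + ['C'] * 20
for _ in range(20000):
    bulk_community_step(bulk, biofilm, m=0.1, rng=rng)
composition = Counter(bulk)
```

After many events the entering taxon is washed out and the bulk composition drifts toward that of the biofilm, the kind of Lagrangian community turnover the paper's models are built to predict.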
Abstract:
A nonperturbative nonlinear statistical approach is presented to describe turbulent magnetic systems embedded in a uniform mean magnetic field. A general formula in the form of an ordinary differential equation for magnetic field-line wandering (random walk) is derived. By considering the solution of this equation for different limits several new results are obtained. As an example, it is demonstrated that the stochastic wandering of magnetic field-lines in a two-component turbulence model leads to superdiffusive transport, contrary to an existing diffusive picture. The validity of quasilinear theory for field-line wandering is discussed, with respect to different turbulence geometry models, and previous diffusive results are shown to be deduced in appropriate limits.
Abstract:
We propose a new approach for modeling nonlinear multivariate interest rate processes based on time-varying copulas and reducible stochastic differential equations (SDEs). In the modeling of the marginal processes, we consider a class of nonlinear SDEs that are reducible to the Ornstein--Uhlenbeck (OU) process or the Cox, Ingersoll, and Ross (1985) (CIR) process. The reducibility is achieved via a nonlinear transformation function. The main advantage of this approach is that these SDEs can account for nonlinear features observed in short-term interest rate series while at the same time leading to exact discretization and closed-form likelihood functions. Although a rich set of specifications may be entertained, our exposition focuses on a couple of nonlinear constant elasticity volatility (CEV) processes, denoted OU-CEV and CIR-CEV, respectively. These two processes encompass a number of existing models that have closed-form likelihood functions. The transition density, the conditional distribution function, and the steady-state density function are derived in closed form, as are the conditional and unconditional moments, for both processes. In order to obtain a more flexible functional form over time, we allow the transformation function to be time varying. Results from our study of U.S. and UK short-term interest rates suggest that the new models outperform existing parametric models with closed-form likelihood functions. We also find the time-varying effects in the transformation functions statistically significant. To examine the joint behavior of interest rate series, we propose flexible nonlinear multivariate models by joining univariate nonlinear processes via appropriate copulas. We study the conditional dependence structure of the two rates using the Patton (2006a) time-varying symmetrized Joe--Clayton copula. We find evidence of asymmetric dependence between the two rates, and that the level of dependence is positively related to the level of the two rates.
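The "exact discretization" property of the OU base process can be shown concretely: the transition of dX = kappa*(theta - X)dt + sigma*dW over any step dt is Gaussian with known mean and variance, so simulation and likelihood evaluation need no Euler approximation. The parameter values below are illustrative only, and the nonlinear CEV transformation of the paper is not applied here:

```python
import math, random

def simulate_ou_exact(x0, kappa, theta, sigma, dt, n, seed=0):
    # Exact OU transition: X_{t+dt} | X_t is Gaussian with
    #   mean = theta + (X_t - theta) * exp(-kappa*dt)
    #   var  = sigma^2 * (1 - exp(-2*kappa*dt)) / (2*kappa)
    rng = random.Random(seed)
    e = math.exp(-kappa * dt)
    sd = sigma * math.sqrt((1.0 - e * e) / (2.0 * kappa))
    path = [x0]
    for _ in range(n):
        path.append(theta + (path[-1] - theta) * e + sd * rng.gauss(0.0, 1.0))
    return path

# Weekly steps (dt = 1/52), mean-reverting toward theta = 5%.
path = simulate_ou_exact(x0=0.10, kappa=0.5, theta=0.05, sigma=0.02,
                         dt=1 / 52, n=5200)
```

Because the same Gaussian transition yields a closed-form density, the exact likelihood of observed data follows by multiplying these transition densities, which is the feature the reducible SDE construction preserves.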
(JEL: C13, C32, G12) © The Author 2010. Published by Oxford University Press. All rights reserved.
Abstract:
The relationships among organisms and their surroundings can be of immense complexity. To describe and understand an ecosystem as a tangled bank, multiple ways of interaction and their effects have to be considered, such as predation, competition, mutualism and facilitation. Understanding the resulting interaction networks is a challenge in changing environments, e.g. to predict knock-on effects of invasive species and to understand how climate change impacts biodiversity. The elucidation of complex ecological systems with their interactions will benefit enormously from the development of new machine learning tools that aim to infer the structure of interaction networks from field data. In the present study, we propose a novel Bayesian regression and multiple changepoint model (BRAM) for reconstructing species interaction networks from observed species distributions. The model has been devised to allow robust inference in the presence of spatial autocorrelation and distributional heterogeneity. We have evaluated the model on simulated data that combines a trophic niche model with a stochastic population model on a 2-dimensional lattice, and we have compared the performance of our model with L1-penalized sparse regression (LASSO) and non-linear Bayesian networks with the BDe scoring scheme. In addition, we have applied our method to plant ground coverage data from the western shore of the Outer Hebrides with the objective to infer the ecological interactions. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
We present results for a suite of 14 three-dimensional, high-resolution hydrodynamical simulations of delayed-detonation models of Type Ia supernova (SN Ia) explosions. This model suite comprises the first set of three-dimensional SN Ia simulations with detailed isotopic yield information. As such, it may serve as a data base for Chandrasekhar-mass delayed-detonation model nucleosynthetic yields and for deriving synthetic observables such as spectra and light curves. We employ a physically motivated, stochastic model based on turbulent velocity fluctuations and fuel density to calculate in situ the deflagration-to-detonation transition probabilities. To obtain different strengths of the deflagration phase and thereby different degrees of pre-expansion, we have chosen a sequence of initial models with 1, 3, 5, 10, 20, 40, 100, 150, 200, 300 and 1600 (two different realizations) ignition kernels in a hydrostatic white dwarf with a central density of 2.9 × 10⁹ g cm⁻³, as well as one high central density (5.5 × 10⁹ g cm⁻³) and one low central density (1.0 × 10⁹ g cm⁻³) rendition of the 100 ignition kernel configuration. For each simulation, we determined detailed nucleosynthetic yields by postprocessing 10⁶ tracer particles with a 384-nuclide reaction network. All delayed-detonation models result in explosions unbinding the white dwarf, producing a range of ⁵⁶Ni masses from 0.32 to 1.11 M⊙. As a general trend, the models predict that the stable neutron-rich iron-group isotopes are not found at the lowest velocities, but rather at intermediate velocities (~3000–10 000 km s⁻¹) in a shell surrounding a Ni-rich core. The models further predict relatively low-velocity oxygen and carbon, with typical minimum velocities around 4000 and 10 000 km s⁻¹, respectively. © 2012 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society.
Abstract:
1. Ecologists are debating the relative role of deterministic and stochastic determinants of community structure. Although the high diversity and strong spatial structure of soil animal assemblages could provide ecologists with an ideal ecological scenario, surprisingly little information is available on these assemblages.
2. We studied species-rich soil oribatid mite assemblages from a Mediterranean beech forest and a grassland. We applied multivariate regression approaches and analysed spatial autocorrelation at multiple spatial scales using Moran's eigenvectors. Results were used to partition community variance in terms of the amount of variation uniquely accounted for by environmental correlates (e.g. organic matter) and geographical position. Estimated neutral diversity and immigration parameters were also applied to a soil animal group for the first time to simulate patterns of community dissimilarity expected under neutrality, thereby testing neutral predictions.
3. After accounting for spatial autocorrelation, the correlation between community structure and key environmental parameters disappeared: about 40% of community variation consisted of spatial patterns independent of measured environmental variables such as organic matter. Environmentally independent spatial patterns encompassed the entire range of scales accounted for by the sampling design (from tens of cm to 100 m). This spatial variation could be due to either unmeasured but spatially structured variables or stochastic drift mediated by dispersal. Observed levels of community dissimilarity were significantly different from those predicted by neutral models.
4. Oribatid mite assemblages are dominated by processes involving both deterministic and stochastic components and operating at multiple scales. Spatial patterns independent of the measured environmental variables are a prominent feature of the targeted assemblages, but patterns of community dissimilarity do not match neutral predictions. This suggests that either niche-mediated competition or environmental filtering or both are contributing to the core structure of the community. This study indicates new lines of investigation for understanding the mechanisms that determine the signature of the deterministic component of animal community assembly.
Abstract:
Extreme arid regions in the world's major deserts are typified by quartz pavement terrain. Cryptic hypolithic communities colonize the ventral surface of quartz rocks and this habitat is characterized by a relative lack of environmental and trophic complexity. Combined with readily identifiable major environmental stressors this provides a tractable model system for determining the relative role of stochastic and deterministic drivers in community assembly. Through analyzing an original, worldwide data set of 16S rRNA-gene defined bacterial communities from the most extreme deserts on the Earth, we show that functional assemblages within the communities were subject to different assembly influences. Null models applied to the photosynthetic assemblage revealed that stochastic processes exerted most effect on the assemblage, although the level of community dissimilarity varied between continents in a manner not always consistent with neutral models. The heterotrophic assemblages displayed signatures of niche processes across four continents, whereas in other cases they conformed to neutral predictions. Importantly, for continents where neutrality was either rejected or accepted, assembly drivers differed between the two functional groups. This study demonstrates that multi-trophic microbial systems may not be fully described by a single set of niche or neutral assembly rules and that stochasticity is likely a major determinant of such systems, with significant variation in the influence of these determinants on a global scale.
Abstract:
This paper investigates sub-integer implementations of the adaptive Gaussian mixture model (GMM) for background/foreground segmentation to allow the deployment of the method on low cost/low power processors that lack a Floating Point Unit (FPU). We propose two novel integer computer arithmetic techniques to update Gaussian parameters. Specifically, the mean value and the variance of each Gaussian are updated by a redefined and generalised "round" operation that emulates the original updating rules for a large set of learning rates. Weights are represented by counters that are updated following stochastic rules to allow a wider range of learning rates, and the weight trend is approximated by a line or a staircase. We demonstrate that the memory footprint and computational cost of the GMM are significantly reduced, without significantly affecting the performance of background/foreground segmentation.
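The idea of integer-only parameter updates with stochastic rounding can be sketched as follows. This is not the paper's exact update rule, just an illustration of how an FPU-free processor can emulate mu += alpha*(x - mu) with alpha = 2**-k without the update stalling when the residual is smaller than 2**k:

```python
import random

def stochastic_round_update(mu, x, k, rng):
    # Integer emulation of mu += alpha*(x - mu) with alpha = 2**-k.
    # The fractional part of the ideal step becomes the probability of
    # adding one extra unit (stochastic rounding), so the update stays
    # unbiased even when |x - mu| < 2**k, where plain truncation would
    # leave mu stuck.
    diff = x - mu
    q, r = divmod(abs(diff), 1 << k)
    step = q + (1 if rng.random() * (1 << k) < r else 0)
    return mu + step if diff >= 0 else mu - step

# Track a constant signal of 100 from an integer state of 0 (alpha = 1/64).
rng = random.Random(0)
mu = 0
for _ in range(5000):
    mu = stochastic_round_update(mu, 100, k=6, rng=rng)
```

Only integer shifts, subtraction, and a comparison against a random word are needed per update, which is the kind of operation budget the paper targets for FPU-less processors.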