947 results for Stochastic models
Abstract:
In 2005, the ECMWF held a workshop on stochastic parameterisation, at which convection was seen as a key issue. That much is clear from the working group reports, and particularly from the statement of working group 1 that "it is clear that a stochastic convection scheme is desirable". The present note considers our current status in comparison with some of the issues raised and hopes expressed in that working group report.
Abstract:
The understanding of the statistical properties and of the dynamics of multistable systems is gaining more and more importance in a vast variety of scientific fields. This is especially relevant for the investigation of the tipping points of complex systems. Sometimes, in order to understand the time series of given observables exhibiting bimodal distributions, simple one-dimensional Langevin models are fitted to reproduce the observed statistical properties and used to investigate the projected dynamics of the observable. This is of great relevance for studying potential catastrophic changes in the properties of the underlying system or resonant behaviours such as those related to stochastic-resonance-like mechanisms. In this paper, we propose a framework for this kind of study, using simple box models of the oceanic circulation and choosing as observable the strength of the thermohaline circulation. We study the statistical properties of the transitions between the two modes of operation of the thermohaline circulation under symmetric boundary forcings and test their agreement with simplified one-dimensional phenomenological theories. We extend our analysis to include stochastic-resonance-like amplification processes. We conclude that fitted one-dimensional Langevin models, when closely scrutinised, may turn out to be more ad hoc than they seem, lacking robustness and/or well-posedness. They should be treated with care, more as an empirical descriptive tool than as a methodology with predictive power.
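As a concrete illustration of the kind of one-dimensional Langevin model discussed above, the sketch below integrates dx = -V'(x) dt + sigma dW for a symmetric double-well potential V(x) = x^4/4 - x^2/2 with the Euler-Maruyama scheme. The potential, noise level and step sizes are illustrative choices, not the fitted model of the paper.

```python
import random
import math

def simulate_langevin(n_steps=200_000, dt=0.01, sigma=0.4, seed=1):
    """Euler-Maruyama integration of dx = -V'(x) dt + sigma dW for the
    illustrative double-well potential V(x) = x**4/4 - x**2/2."""
    rng = random.Random(seed)
    x, xs = 1.0, []
    for _ in range(n_steps):
        drift = -(x**3 - x)  # -V'(x); stable states at x = +1 and x = -1
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

xs = simulate_langevin()
# Fraction of time spent in the right-hand well; bimodality shows up as
# the trajectory splitting its time between neighbourhoods of +1 and -1.
frac_right = sum(1 for x in xs if x > 0) / len(xs)
```

With moderate noise the trajectory hops between the wells, producing exactly the bimodal distribution such phenomenological models are fitted to.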
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
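A minimal sketch of the simplest stochastic ingredient used in such parameterizations: an AR(1) ("red noise") process, e_t = phi * e_{t-1} + sigma * w_t, of the kind added to model tendencies to represent unresolved variability. The parameters below are illustrative, not taken from any particular scheme.

```python
import random

def red_noise_tendency(n=5000, phi=0.9, sigma=0.1, seed=0):
    """Generate an AR(1) ('red noise') perturbation series of the kind
    used in stochastic subgrid parameterizations:
        e_t = phi * e_{t-1} + sigma * w_t,  w_t ~ N(0, 1)."""
    rng = random.Random(seed)
    e, out = 0.0, []
    for _ in range(n):
        e = phi * e + sigma * rng.gauss(0.0, 1.0)
        out.append(e)
    return out

pert = red_noise_tendency()
# The lag-1 autocorrelation of the series should sit close to phi,
# giving the perturbation the temporal memory white noise lacks.
```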
Abstract:
Two stochastic epidemic lattice models, the susceptible-infected-recovered (SIR) and the susceptible-exposed-infected (SEI) models, are studied on a Cayley tree of coordination number k. The spreading of the disease in the former is found to occur when the infection probability b is larger than b_c = k/[2(k-1)]. In the latter, which is equivalent to a dynamic site percolation model, the spreading occurs when the infection probability p is greater than p_c = 1/(k-1). We set up and solve the time evolution equations for both models and determine the final and time-dependent properties, including the epidemic curve. We show that the two models are closely related by revealing that their relevant properties are exactly mapped into each other when p = b/[k - (k-1)b]. These include the cluster size distribution and the density of individuals of each type, quantities that have been determined in closed form.
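The two thresholds and the mapping quoted above are mutually consistent, and this can be checked with exact rational arithmetic: the snippet below verifies that p = b/[k - (k-1)b] carries the SIR threshold b_c = k/[2(k-1)] onto the SEI threshold p_c = 1/(k-1) for every coordination number k tested.

```python
from fractions import Fraction

def b_critical(k):
    """SIR threshold on a Cayley tree of coordination number k: b_c = k / [2(k-1)]."""
    return Fraction(k, 2 * (k - 1))

def p_critical(k):
    """SEI (dynamic site percolation) threshold: p_c = 1 / (k-1)."""
    return Fraction(1, k - 1)

def p_of_b(b, k):
    """Mapping between the two models quoted in the abstract: p = b / [k - (k-1) b]."""
    return b / (k - (k - 1) * b)

# The mapping carries one critical point exactly onto the other for each k.
checks = [p_of_b(b_critical(k), k) == p_critical(k) for k in range(3, 11)]
```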
Abstract:
We study a stochastic process describing the onset of spreading dynamics of an epidemic in a population composed of individuals of three classes: susceptible (S), infected (I), and recovered (R). The stochastic process is defined by local rules and involves the following cyclic process: S -> I -> R -> S (SIRS). The open process S -> I -> R (SIR) is studied as a particular case of the SIRS process. The epidemic process is analyzed at different levels of description: by a stochastic lattice gas model and by a birth and death process. By means of Monte Carlo simulations and dynamical mean-field approximations we show that the SIRS stochastic lattice gas model exhibits a line of critical points separating two phases: an absorbing phase where the lattice is completely full of S individuals and an active phase where S, I and R individuals coexist, which may or may not present population cycles. The critical line, which corresponds to the onset of epidemic spreading, is shown to belong to the directed percolation universality class. By considering the birth and death process we analyze the role of noise in stabilizing the oscillations. (C) 2009 Elsevier B.V. All rights reserved.
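A toy Monte Carlo version of such an SIRS stochastic lattice gas, to make the absorbing/active distinction concrete. The rules here are a simplification (infection with probability proportional to the number of infected neighbours, spontaneous recovery I -> R, loss of immunity R -> S), not the paper's exact transition rates, and the parameter values are invented for illustration.

```python
import random

S, I, R = 0, 1, 2

def sirs_sweep(grid, L, b, c, a, rng):
    """One Monte Carlo sweep of a toy SIRS lattice gas on an L x L torus.
    Simplified rules: S -> I with prob b * (infected neighbours)/4,
    I -> R with prob c, and R -> S with prob a."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        s = grid[i][j]
        if s == S:
            n_inf = sum(grid[(i + di) % L][(j + dj) % L] == I
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            if rng.random() < b * n_inf / 4:
                grid[i][j] = I
        elif s == I:
            if rng.random() < c:
                grid[i][j] = R
        elif rng.random() < a:
            grid[i][j] = S

rng = random.Random(0)
L = 32
grid = [[I if rng.random() < 0.1 else S for _ in range(L)] for _ in range(L)]
for _ in range(200):
    sirs_sweep(grid, L, b=0.8, c=0.2, a=0.1, rng=rng)
density_I = sum(row.count(I) for row in grid) / L**2  # 0 in the absorbing phase
```

Setting b = 0 removes infection entirely, so every I eventually recovers and the lattice falls into the all-S absorbing state; with strong infection the coexistence (active) phase can persist.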
Abstract:
We have studied by numerical simulations the relaxation of the stochastic seven-state Potts model after a quench from a high temperature down to a temperature below the first-order transition. For quench temperatures just below the transition temperature, the phase ordering occurs by simple coarsening under the action of surface tension. For sufficiently low temperatures, however, the straightening of the interfaces between domains drives the system toward a metastable disordered state, identified as a glassy state. Escape from this state occurs, if the quench temperature is nonzero, by thermally activated dynamics that eventually drives the system toward the equilibrium state. (C) 2009 Elsevier B.V. All rights reserved.
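A minimal Metropolis sketch of the quench protocol described above: start from a random (high-temperature) seven-state configuration and evolve at a temperature below the first-order transition (for the 2D q-state Potts model, T_c = 1/ln(1 + sqrt(q)) with J = k_B = 1, about 0.773 for q = 7). Coarsening shows up as growth of the fraction of aligned nearest-neighbour bonds. Lattice size, temperature and sweep count are illustrative, not the paper's.

```python
import random
import math

def potts_sweep(spins, L, q, T, rng):
    """One Metropolis sweep of the q-state Potts model on an L x L torus.
    Energy convention: E = -sum over n.n. pairs of delta(s_i, s_j)."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        old, new = spins[i][j], rng.randrange(q)
        nbrs = [spins[(i + 1) % L][j], spins[(i - 1) % L][j],
                spins[i][(j + 1) % L], spins[i][(j - 1) % L]]
        dE = sum(n == old for n in nbrs) - sum(n == new for n in nbrs)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] = new

def aligned_fraction(spins, L):
    """Fraction of nearest-neighbour bonds whose spins agree."""
    same = sum((spins[i][j] == spins[(i + 1) % L][j]) +
               (spins[i][j] == spins[i][(j + 1) % L])
               for i in range(L) for j in range(L))
    return same / (2 * L * L)

rng = random.Random(0)
L, q = 24, 7
spins = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]  # high-T start
before = aligned_fraction(spins, L)
for _ in range(100):
    potts_sweep(spins, L, q, T=0.5, rng=rng)  # quench below the transition
after = aligned_fraction(spins, L)
```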
Abstract:
The coexistence between different types of templates has been the solution of choice to the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null. (C) 2008 Elsevier Ltd. All rights reserved.
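The trade-off stated above (raising d forces L down) can be made concrete with the simplest fidelity estimate, which is not the paper's full model: if u is the per-nucleotide error rate, the chance that all d templates of length L are copied without any error is (1-u)^{Ld}, and it depends on L and d only through the product Ld. The numbers below are purely illustrative.

```python
import math

def p_all_intact(u, L, d):
    """Probability that one replication round copies all d templates of
    length L without any error, at per-nucleotide error rate u."""
    return (1.0 - u) ** (L * d)

# Holding the copying fidelity fixed pins down the product L*d, so any
# increase of d must be compensated by a decrease of L.
u = 0.01        # illustrative error rate per nucleotide
target = 0.5    # illustrative fidelity requirement
Ld = math.log(target) / math.log(1.0 - u)  # the L*d giving p_all_intact = target
```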
Abstract:
Mathematical models, as instruments for understanding the workings of nature, are a traditional tool of physics, but they also play an ever-increasing role in biology - in the description of fundamental processes as well as that of complex systems. In this review, the authors discuss two examples of the application of group theoretical methods, which constitute the mathematical discipline for a quantitative description of the idea of symmetry, to genetics. The first one appears, in the form of a pseudo-orthogonal (Lorentz-like) symmetry, in the stochastic modelling of what may be regarded as the simplest possible example of a genetic network and, hopefully, a building block for more complicated ones: a single self-interacting or externally regulated gene with only two possible states: 'on' and 'off'. The second is the algebraic approach to the evolution of the genetic code, according to which the current code results from a dynamical symmetry breaking process, starting out from an initial state of complete symmetry and ending in the presently observed final state of low symmetry. In both cases, symmetry plays a decisive role: in the first, it is a characteristic feature of the dynamics of the gene switch and its decay to equilibrium, whereas in the second, it provides the guidelines for the evolution of the coding rules.
Abstract:
This paper presents the techniques of likelihood prediction for generalized linear mixed models. The method of likelihood prediction is explained through a series of examples, from a classical one to more complicated ones. The examples show, in simple cases, that likelihood prediction (LP) coincides with already known best frequentist practice, such as the best linear unbiased predictor. The paper outlines a way to deal with covariate uncertainty while producing predictive inference. Using a Poisson error-in-variables generalized linear model, it is shown that in complicated cases LP produces better results than already known methods.
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS-IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS-IV estimator to be asymptotically equivalent to an optimal GLS-IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
Abstract:
Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models that incorporate the dynamic structure of volatility and are capable of forecasting future behavior of risk should perform better than constant, rolling-window volatility models. For the same asset, the model that is the 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms others over a class of re-sampled time-series data. The test is based on re-sampling the data using stationary bootstrapping. For each re-sample we check the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk management measurement as comparison criteria. No model consistently outperforms the benchmark.
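Of the volatility measures listed, EWMA is the simplest to sketch: the RiskMetrics-style recursion sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2 with the conventional lam = 0.94. The simulated two-regime return series below is purely illustrative; the point is that the recursion tracks a shift in volatility, which a constant-volatility estimate cannot.

```python
import random

def ewma_variance(returns, lam=0.94):
    """RiskMetrics-style EWMA variance recursion:
    sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2,
    seeded with the first squared return."""
    sigma2 = returns[0] ** 2 if returns else 0.0
    out = []
    for r in returns:
        sigma2 = lam * sigma2 + (1 - lam) * r * r
        out.append(sigma2)
    return out

# Illustrative returns: a quiet regime followed by a volatile regime.
rng = random.Random(42)
rets = ([rng.gauss(0.0, 0.01) for _ in range(500)] +
        [rng.gauss(0.0, 0.03) for _ in range(500)])
var_path = ewma_variance(rets)  # rises after the regime change
```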
Abstract:
We aim to provide a review of the stochastic discount factor bounds usually applied to diagnose asset pricing models. In particular, we mainly discuss the bounds used to analyze the disaster model of Barro (2006). Our attention is focused on this disaster model since the stochastic discount factor bounds that are applied to study the performance of disaster models usually consider the approach of Barro (2006). We first present the entropy bounds that provide a diagnosis of the analyzed disaster model, namely the methods of Almeida and Garcia (2012, 2016) and Ghosh et al. (2016). Then, we discuss how their results for the disaster model are related to each other, and we also present the findings of other methodologies that are similar to these bounds but provide different evidence about the performance of the framework developed by Barro (2006).
Abstract:
Ionospheric scintillations are caused by time-varying electron density irregularities in the ionosphere, occurring more often at equatorial and high latitudes. This paper focuses exclusively on experiments undertaken in Europe, at geographic latitudes between ~50°N and ~80°N, where a network of GPS receivers capable of monitoring Total Electron Content and ionospheric scintillation parameters was deployed. The widely used ionospheric scintillation indices S4 and σφ represent a practical measure of the intensity of amplitude and phase scintillation affecting GNSS receivers. However, they do not provide sufficient information regarding the actual tracking errors that degrade GNSS receiver performance. Suitable receiver tracking models, sensitive to ionospheric scintillation, allow the computation of the variance of the output error of the receiver PLL (Phase Locked Loop) and DLL (Delay Locked Loop), which expresses the quality of the range measurements used by the receiver to calculate user position. The ability of such models to incorporate phase and amplitude scintillation effects into the variance of these tracking errors underpins our proposed method of applying relative weights to measurements from different satellites. That gives the least squares stochastic model used for position computation a more realistic representation, vis-a-vis the otherwise 'equal weights' model. For pseudorange processing, relative weights were computed so that a 'scintillation-mitigated' solution could be performed and compared to the (non-mitigated) 'equal weights' solution. An improvement of between 17 and 38% in height accuracy was achieved when an epoch-by-epoch differential solution was computed over baselines ranging from 1 to 750 km.
The method was then compared with alternative approaches that can be used to improve the least squares stochastic model, such as weighting according to satellite elevation angle and by the inverse of the square of the standard deviation of the code/carrier divergence (σ CCDiv). The influence of multipath effects on the proposed mitigation approach is also discussed. With the use of high-rate scintillation data in addition to the scintillation indices, a carrier-phase-based mitigated solution was also implemented and compared with the conventional solution. During a period of high phase scintillation it was observed that problems related to ambiguity resolution can be reduced by the use of the proposed mitigated solution.
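The core idea behind the 'scintillation-mitigated' stochastic model is inverse-variance weighting: measurements from satellites whose PLL/DLL tracking-error variance is inflated by scintillation receive proportionally less weight in the least squares solution. The one-parameter sketch below is a hypothetical illustration (the satellite count, variances and 'true' value are invented), not the paper's full positioning model.

```python
import random

def weighted_estimate(measurements, variances):
    """Inverse-variance weighted least squares for a single unknown:
    each measurement gets weight 1/sigma_i^2, where sigma_i^2 would come
    from a scintillation-sensitive PLL/DLL tracking-error model."""
    w = [1.0 / v for v in variances]
    return sum(wi * m for wi, m in zip(w, measurements)) / sum(w)

# Hypothetical example: true value 10.0 observed by five 'satellites',
# two of them badly affected by scintillation (large tracking-error sigma).
rng = random.Random(7)
sigmas = [0.3, 0.3, 0.3, 3.0, 3.0]
meas = [10.0 + rng.gauss(0.0, s) for s in sigmas]
mitigated = weighted_estimate(meas, [s * s for s in sigmas])
equal_w = sum(meas) / len(meas)  # the 'equal weights' counterpart
```

Averaged over many epochs, the inverse-variance estimate has a markedly smaller error than the equal-weights one, which is the effect the paper exploits for position computation.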
Abstract:
Power-law distributions, i.e., Lévy flights, have been observed in various economical, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with respect to the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series in some physical and financial systems. The agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
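A toy sampler in the spirit of a gradually truncated Lévy flight: increments follow a power-law (Pareto) body, and beyond a cut-off length x_c the tail is damped by a smooth exponential factor, which restores a finite variance so that sums of many increments drift toward a Gaussian. The exponent, cut-off and damping rate are illustrative, not the paper's fitted values.

```python
import random
import math

def truncated_levy_step(alpha=1.5, x_c=10.0, k=0.3, rng=random):
    """Draw one symmetric step with a power-law body ~ |x|**(-1 - alpha)
    on |x| >= 1 and a gradual exponential cut beyond x_c, via rejection
    sampling (a toy GTLF-style increment, not the paper's exact form)."""
    while True:
        u = rng.random()
        x = (1.0 - u) ** (-1.0 / alpha)  # Pareto(alpha) sample on [1, inf)
        # Accept with the gradual truncation factor exp(-k * max(0, x - x_c)).
        if rng.random() < math.exp(-k * max(0.0, x - x_c)):
            return x if rng.random() < 0.5 else -x

rng = random.Random(3)
steps = [truncated_levy_step(rng=rng) for _ in range(20_000)]
# With the truncation in place the variance is finite, so aggregated sums
# of these steps approach a Gaussian in the low-frequency regime.
```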