16 results for Stochastic Processes
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
The development of the real estate swap market offers many opportunities for investors to adjust the exposure of their portfolios to real estate. A number of OTC transactions have been observed in markets around the world. In this paper we examine the Japanese commercial real estate market from the point of view of an investor holding a portfolio of properties who seeks to reduce its exposure to the real estate market by swapping an index of real estate for LIBOR. This paper explores the practicalities of hedging portfolios comprising small numbers of individual properties against an appropriate index. We use the returns from 74 properties owned by Japanese Real Estate Investment Trusts over the period up to September 2007. The paper also discusses and applies the appropriate stochastic processes required to model real estate returns in this application and presents alternative ways of reporting hedging effectiveness. We find that the development of the derivative does provide the capacity for hedging market risk, but that the effectiveness of the hedge varies considerably over time. We explore the factors that cause this variability.
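One common way of reporting hedging effectiveness is the fraction of portfolio return variance removed by the swap. A minimal sketch with synthetic quarterly returns (all series and figures below are invented for illustration; the paper's data and its alternative effectiveness measures differ):

```python
import numpy as np

# Synthetic stand-ins for a small property portfolio and the swapped index.
rng = np.random.default_rng(7)
n = 40  # quarterly returns over ten years
index = rng.normal(0.01, 0.02, size=n)                 # real estate index return
portfolio = 0.8 * index + rng.normal(0, 0.01, size=n)  # tracks the index imperfectly

# Minimum-variance hedge ratio: the beta of portfolio returns on index returns.
beta = np.cov(portfolio, index)[0, 1] / np.var(index)

# Hedged return: pay the index leg (the LIBOR leg's variance is ignored here).
hedged = portfolio - beta * index

# Effectiveness = fraction of portfolio variance removed by the hedge.
effectiveness = 1 - np.var(hedged) / np.var(portfolio)
print(f"variance reduction: {effectiveness:.2f}")
```

When the portfolio holds only a few properties, its idiosyncratic variance is large relative to its index exposure, so this ratio falls; that is one mechanism behind the time-varying effectiveness the paper reports.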
Abstract:
We investigate the super-Brownian motion with a single point source in dimensions 2 and 3, as constructed by Fleischmann and Mueller in 2004. Using analytic facts, we derive the long-time behavior of the mean in dimensions 2 and 3, thereby complementing previous work of Fleischmann, Mueller and Vogt. Using spectral theory and martingale arguments, we prove a version of the strong law of large numbers for the two-dimensional superprocess with a single point source and finite variance.
Abstract:
The transcriptome of an organism is its set of gene transcripts (mRNAs) at a defined spatial and temporal locus. Because gene expression is affected markedly by environmental and developmental perturbations, it is widely assumed that transcriptome divergence among taxa represents adaptive phenotypic selection. This assumption has been challenged by neutral theories which propose that stochastic processes drive transcriptome evolution. To test for evidence of neutral transcriptome evolution in plants, we quantified 18 494 gene transcripts in nonsenescent leaves of 14 taxa of Brassicaceae using robust cross-species transcriptomics which includes a two-step physical and in silico-based normalization procedure based on DNA similarity among taxa. Transcriptome divergence correlates positively with evolutionary distance between taxa and with variation in gene expression among samples. Results are similar for pseudogenes and chloroplast genes evolving at different rates. Remarkably, variation in transcript abundance among root-cell samples correlates positively with transcriptome divergence among root tissues and among taxa. Because neutral processes affect transcriptome evolution in plants, many differences in gene expression among or within taxa may be nonfunctional, reflecting ancestral plasticity and founder effects. Appropriate null models are required when comparing transcriptomes in space and time.
Abstract:
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme are tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
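The downscaling recipe described above (smooth to mimic coarse model output, then re-inject small-scale variability as noise) can be sketched as follows. The series, window, and resampling scheme are illustrative stand-ins, far simpler than an operational parameterization with observed spectral characteristics:

```python
import numpy as np

# Synthetic stand-in for hourly solar wind speed observations.
rng = np.random.default_rng(42)
n_hours = 24 * 30
observed = np.cumsum(rng.normal(size=n_hours)) + 400.0

# An 8 h running mean approximates what a solar wind model resolves.
kernel = np.ones(8) / 8.0
smoothed = np.convolve(observed, kernel, mode="same")

# The residuals are the small-scale "noise" below model resolution.
residuals = observed - smoothed

# Downscaled ensemble: smoothed signal plus resampled residuals, giving
# many plausible high-resolution realisations for the magnetospheric model.
ensemble = np.array([
    smoothed + rng.choice(residuals, size=n_hours, replace=True)
    for _ in range(20)
])
print(ensemble.shape)  # (20, 720)
```

Driving the magnetospheric model with each ensemble member separately, rather than with the smoothed input alone, is what yields both a best estimate and an uncertainty spread.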
Abstract:
Let X be a locally compact Polish space. A random measure on X is a probability measure on the space of all (nonnegative) Radon measures on X. Denote by K(X) the cone of all Radon measures η on X which are of the form η =
Abstract:
We establish a general framework for a class of multidimensional stochastic processes over [0,1] under which with probability one, the signature (the collection of iterated path integrals in the sense of rough paths) is well-defined and determines the sample paths of the process up to reparametrization. In particular, by using the Malliavin calculus we show that our method applies to a class of Gaussian processes including fractional Brownian motion with Hurst parameter H>1/4, the Ornstein–Uhlenbeck process and the Brownian bridge.
Abstract:
A driver controls a car by turning the steering wheel or by pressing on the accelerator or the brake. These actions are modelled by Gaussian processes, leading to a stochastic model for the motion of the car. The stochastic model is the basis of a new filter for tracking and predicting the motion of the car, using measurements obtained by fitting a rigid 3D model to a monocular sequence of video images. Experiments show that the filter easily outperforms traditional filters.
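A minimal sketch of such a stochastic motion model is a linear Kalman filter whose process noise represents the driver's random control inputs. The one-dimensional state, matrices, and noise levels below are illustrative assumptions, not the paper's actual 3D model-fitting filter:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate the state through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity kinematics
Q = 0.1 * np.array([[dt**3 / 3, dt**2 / 2],  # process noise from random
                    [dt**2 / 2, dt]])        # accelerator/brake inputs
H = np.array([[1.0, 0.0]])                 # only position is measured
R = np.array([[0.05]])

# Track a slowly turning trajectory from noisy position measurements.
rng = np.random.default_rng(3)
x, P = np.zeros(2), np.eye(2)
for t in range(100):
    z = np.array([np.sin(0.1 * t) + rng.normal(scale=0.2)])
    x, P = kalman_step(x, P, z, F, Q, H, R)
print(f"estimate: {x[0]:.2f}, truth: {np.sin(0.1 * 99):.2f}")
```

The paper's point is that a noise model shaped by actual driver behaviour (Gaussian processes on the controls) outperforms generic filters of this kind; the sketch only shows the filtering skeleton such a model plugs into.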
Abstract:
Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways of which conventional bulk-formula representations are incapable. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.
Abstract:
We report on a numerical study of the impact of short, fast inertia-gravity waves on the large-scale, slowly-evolving flow with which they co-exist. A nonlinear quasi-geostrophic numerical model of a stratified shear flow is used to simulate, at reasonably high resolution, the evolution of a large-scale mode which grows due to baroclinic instability and equilibrates at finite amplitude. Ageostrophic inertia-gravity modes are filtered out of the model by construction, but their effects on the balanced flow are incorporated using a simple stochastic parameterization of the potential vorticity anomalies which they induce. The model simulates a rotating, two-layer annulus laboratory experiment, in which we recently observed systematic inertia-gravity wave generation by an evolving, large-scale flow. We find that the impact of the small-amplitude stochastic contribution to the potential vorticity tendency, on the model balanced flow, is generally small, as expected. In certain circumstances, however, the parameterized fast waves can exert a dominant influence. In a flow which is baroclinically-unstable to a range of zonal wavenumbers, and in which there is a close match between the growth rates of the multiple modes, the stochastic waves can strongly affect wavenumber selection. This is illustrated by a flow in which the parameterized fast modes dramatically re-partition the probability-density function for equilibrated large-scale zonal wavenumber. In a second case study, the stochastic perturbations are shown to force spontaneous wavenumber transitions in the large-scale flow, which do not occur in their absence. These phenomena are due to a stochastic resonance effect. They add to the evidence that deterministic parameterizations in general circulation models, of subgrid-scale processes such as gravity wave drag, cannot always adequately capture the full details of the nonlinear interaction.
Abstract:
The processes that govern the predictability of decadal variations in the North Atlantic meridional overturning circulation (MOC) are investigated in a long control simulation of the ECHO-G coupled atmosphere–ocean model. We elucidate the roles of local stochastic forcing by the atmosphere, and other potential ocean processes, and use our results to build a predictive regression model. The primary influence on MOC variability is found to come from air–sea heat fluxes over the Eastern Labrador Sea. The maximum correlation between such anomalies and the variations in the MOC occurs at a lead time of 2 years, but we demonstrate that the MOC integrates the heat flux variations over a period of 10 years. The corresponding univariate regression model accounts for 74.5% of the interannual variability in the MOC (after the Ekman component has been removed). Dense anomalies to the south of the Greenland-Scotland ridge are also shown to precede the overturning variations by 4–6 years, and provide a second predictor. With the inclusion of this second predictor the resulting regression model explains 82.8% of the total variance of the MOC. This final bivariate model is also tested during large rapid decadal overturning events. The sign of the rapid change is always well represented by the bivariate model, but the magnitude is usually underestimated, suggesting that other processes are also important for these large rapid decadal changes in the MOC.
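A lagged bivariate regression of the kind described can be sketched on synthetic data. The variable names, lags, and coefficients below are assumptions for illustration, not the ECHO-G diagnostics or the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 300
heat_flux = rng.normal(size=n_years)   # Labrador Sea air-sea heat flux proxy
density = rng.normal(size=n_years)     # density anomaly south of the ridge

# Predictor 1: heat flux integrated over the preceding 10 years.
hf_integrated = np.convolve(heat_flux, np.ones(10), mode="full")[:n_years]
# Predictor 2: density anomaly leading by 5 years (illustrative lag).
density_lagged = np.roll(density, 5)

# Synthetic "MOC" index driven by both predictors plus noise.
moc = 0.8 * hf_integrated + 0.5 * density_lagged \
    + rng.normal(scale=0.5, size=n_years)

# Fit the bivariate regression (with intercept) on the overlapping years.
X = np.column_stack([hf_integrated[20:], density_lagged[20:],
                     np.ones(n_years - 20)])
y = moc[20:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"explained variance: {r2:.2f}")
```

Explained variance is high here by construction; in the paper the analogous bivariate model reaches 82.8% only after the Ekman component has been removed from the MOC series.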
Abstract:
The understanding of the statistical properties and of the dynamics of multistable systems is gaining more and more importance in a vast variety of scientific fields. This is especially relevant for the investigation of the tipping points of complex systems. Sometimes, in order to understand the time series of given observables exhibiting bimodal distributions, simple one-dimensional Langevin models are fitted to reproduce the observed statistical properties and used to investigate the projected dynamics of the observable. This is of great relevance for studying potential catastrophic changes in the properties of the underlying system or resonant behaviours like those related to stochastic resonance-like mechanisms. In this paper, we propose a framework for this kind of study, using simple box models of the oceanic circulation and choosing as observable the strength of the thermohaline circulation. We study the statistical properties of the transitions between the two modes of operation of the thermohaline circulation under symmetric boundary forcings and test their agreement with simplified one-dimensional phenomenological theories. We extend our analysis to include stochastic resonance-like amplification processes. We conclude that fitted one-dimensional Langevin models, when closely scrutinised, may turn out to be more ad hoc than they seem, lacking robustness and/or well-posedness. They should be treated with care, more as an empirical descriptive tool than as a methodology with predictive power.
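A one-dimensional Langevin model of a bistable observable, of the kind fitted in such studies, can be simulated with the Euler-Maruyama scheme. The double-well potential and noise level below are illustrative choices, not the paper's box-model dynamics:

```python
import numpy as np

def simulate_langevin(sigma=0.5, dt=0.01, n_steps=100_000, seed=0):
    """Euler-Maruyama for dX = -V'(X) dt + sigma dW with the double-well
    potential V(x) = x**4/4 - x**2/2 (minima at x = -1 and x = +1)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 1.0  # start in the right-hand well
    for t in range(1, n_steps):
        drift = -(x[t - 1] ** 3 - x[t - 1])  # -V'(x)
        x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

path = simulate_langevin()
# The stationary distribution is bimodal: the path visits both wells,
# with noise-induced transitions between the two modes of operation.
print(f"range: [{path.min():.2f}, {path.max():.2f}]")
```

Fitting such a model to a bimodal time series amounts to estimating an effective potential and noise amplitude from the data; the paper's caution is that this reduction can look adequate statistically while misrepresenting the underlying multidimensional dynamics.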
Abstract:
The detection of long-range dependence in time series analysis is an important task, to which this paper contributes by showing that, whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, long memory cannot be identified using the sum of the sample autocorrelations as usually defined. The reason is that this sample sum is a predetermined constant for any stationary time series, a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher-order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third-order sample (auto)bicorrelations at lags h,k≥1 is also a predetermined constant, different from that in the second-order case, for any stationary time series of arbitrary length.
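The predetermined-constant result is easy to verify numerically: for the usual estimator (deviations from the sample mean, with the lag-0 sum of squares in the denominator), the sample autocorrelations over lags 1,…,n−1 always sum to −1/2, whatever the data. A quick check (illustrative code, not from the paper):

```python
import numpy as np

def sample_acf(x, h):
    """Usual sample autocorrelation at lag h (mean-corrected, lag-0
    sum of squares in the denominator)."""
    n = len(x)
    xbar = x.mean()
    num = np.sum((x[: n - h] - xbar) * (x[h:] - xbar))
    den = np.sum((x - xbar) ** 2)
    return num / den

rng = np.random.default_rng(0)
for n in (50, 200):
    # The identity is purely algebraic, so any series works,
    # stationary or not; here, a random walk.
    x = rng.normal(size=n).cumsum()
    total = sum(sample_acf(x, h) for h in range(1, n))
    print(round(total, 10))  # -0.5 for every series and every n
```

The identity follows because the full double sum of cross-products of deviations from the sample mean is exactly zero, forcing the off-diagonal (lagged) terms to cancel the lag-0 sum of squares, which is why no diagnostic built on this sum can detect long memory.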