96 results for Piecewise deterministic Markov processes

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

We consider an equilibrium birth-and-death type process for a particle system in infinite volume, described by the space of all locally finite point configurations on R^d. These Glauber-type dynamics are Markov processes constructed for pre-given reversible measures. Representations for the ``carré du champ'' and ``second carré du champ'' of the associated infinitesimal generator L are calculated in infinite volume, for a large class of functions, in a generalized sense. The corresponding coercivity identity is derived, and explicit sufficient conditions for the existence of a spectral gap of L, together with bounds on its size, are given. These techniques are applied to Glauber dynamics associated with Gibbs measures, and conditions are derived that extend all previously known results; in particular, potentials with negative parts can now be treated. The high-temperature regime is extended substantially, and potentials with a non-trivial negative part can be included. Furthermore, a special class of potentials is defined for which the spectral gap is at least as large as for the free system and, surprisingly, independent of the activity. Potentials of this type should not exhibit a phase transition at any activity for a given temperature.
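The quantities named in the abstract can be written out explicitly. As a hedged sketch in standard Dirichlet-form notation (not taken from the paper itself): for a generator L symmetric with respect to the reversible measure μ, the carré du champ, the associated Dirichlet form, and the spectral gap are

```latex
\Gamma(f) = \tfrac12\bigl(L(f^2) - 2f\,Lf\bigr), \qquad
\mathcal{E}(f,f) = \int \Gamma(f)\,d\mu = -\int f\,Lf\,d\mu, \qquad
\operatorname{gap}(L) = \inf_{f \neq \mathrm{const}} \frac{\mathcal{E}(f,f)}{\operatorname{Var}_\mu(f)} .
```

A coercivity estimate of the form E(f,f) ≥ λ Var_μ(f) for all admissible f then yields gap(L) ≥ λ, which is the type of explicit bound the abstract refers to.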

Relevance:

40.00%

Publisher:

Abstract:

Varroa destructor is a parasitic mite of the Eastern honeybee Apis cerana. Fifty years ago, two distinct evolutionary lineages (Korean and Japanese) invaded the Western honeybee Apis mellifera. This haplo-diploid parasite species reproduces mainly through brother-sister matings, a system which largely favors the fixation of new mutations. In a worldwide sample of 225 individuals from 21 locations collected on Western honeybees and analyzed at 19 microsatellite loci, a series of de novo mutations was observed. Using historical data concerning the invasion, this original biological system has been exploited to compare three mutation models with allele size constraints for microsatellite markers: stepwise (SMM) and generalized (GSM) mutation models, and a model with mutation rate increasing exponentially with microsatellite length (ESM). Posterior probabilities of the three models have been estimated for each locus individually using reversible jump Markov Chain Monte Carlo. The relative support of each model varies widely among loci, but the GSM is the only model that always receives at least 9% support, whatever the locus. The analysis also provides robust estimates of mutation parameters for each locus and of the divergence time of the two invasive lineages (67,000 generations with a 90% credibility interval of 35,000-174,000). With an average of 10 generations per year, this divergence time fits with the last post-glacial Korea-Japan land separation. (c) 2005 Elsevier Inc. All rights reserved.
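The two best-supported step distributions can be illustrated with a minimal forward simulation. This is my own sketch, not the authors' code: the allele bounds, mutation rate, and function names are illustrative assumptions, and the reversible-jump model comparison itself is not reproduced here.

```python
import random

def mutate_smm(allele, lo=5, hi=50):
    """Stepwise mutation model (SMM): repeat length changes by exactly +/-1,
    truncated at assumed allele size constraints [lo, hi]."""
    step = random.choice([-1, 1])
    return min(hi, max(lo, allele + step))

def mutate_gsm(allele, p=0.3, lo=5, hi=50):
    """Generalized stepwise model (GSM): geometric step sizes with mean 1/p,
    in either direction, truncated at the same size constraints."""
    size = 1
    while random.random() > p:
        size += 1
    step = size * random.choice([-1, 1])
    return min(hi, max(lo, allele + step))

def evolve(allele, mu, generations, mutate):
    """Evolve one lineage for a number of generations at per-generation
    mutation rate mu."""
    for _ in range(generations):
        if random.random() < mu:
            allele = mutate(allele)
    return allele
```

Running two lineages from a common ancestral allele for the estimated number of generations gives a feel for how the step distribution shapes the divergence of repeat lengths.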

Relevance:

30.00%

Publisher:

Abstract:

Associative memory networks such as Radial Basis Functions, Neurofuzzy and Fuzzy Logic networks used for modelling nonlinear processes suffer from the curse of dimensionality (COD): as the input dimension increases, the parameterization, computational cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for the construction of Delaunay input-space-partitioned optimal piecewise locally linear models, which overcomes the COD and yields locally linear models directly amenable to linear control and estimation algorithms. Training is configured as a new mixture-of-experts network with a fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is used to search for a globally optimal Delaunay input-space partition. A benchmark nonlinear time series is used to demonstrate the new approach.
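The core idea of a Delaunay-partitioned piecewise linear model can be sketched in a few lines. This is a minimal illustration under my own assumptions (fixed vertex set, least-squares fits per simplex); it omits the mixture-of-experts training and the VFSR search over vertex placement that the paper actually proposes.

```python
import numpy as np
from scipy.spatial import Delaunay

def fit_local_linear(X, y, vertices):
    """Partition the input space with a Delaunay triangulation of the given
    vertices and fit an independent affine model in each simplex."""
    tri = Delaunay(vertices)
    simplex = tri.find_simplex(X)          # which simplex each sample falls in
    models = {}
    for s in range(tri.nsimplex):
        mask = simplex == s
        if mask.sum() >= X.shape[1] + 1:   # need enough points for a fit
            A = np.c_[X[mask], np.ones(mask.sum())]
            coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            models[s] = coef
    return tri, models

def predict(tri, models, X):
    """Evaluate the local affine model of whichever simplex each point is in."""
    simplex = tri.find_simplex(X)
    A = np.c_[X, np.ones(len(X))]
    return np.array([A[i] @ models[s] if s in models else np.nan
                     for i, s in enumerate(simplex)])
```

Because each local model is affine, every expert is directly usable by linear control and estimation algorithms, which is the property the abstract emphasizes.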

Relevance:

30.00%

Publisher:

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not add computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble for different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
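The RAW-filtered leapfrog scheme is compact enough to sketch directly. This follows the form described by Williams (2009), with the displacement split between the current and new time levels; the function names and the Euler start-up step are my own choices, and SPEEDY's implementation details are not reproduced here.

```python
def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = f(x) with the Robert-Asselin-Williams
    filter: the displacement d is split between the current (filtered) level,
    weight alpha, and the newly computed level, weight alpha - 1, so the
    three-level mean is preserved (alpha = 1 recovers the classic RA filter)."""
    xm = x0                       # filtered value at step n-1
    x = x0 + dt * f(x0)           # Euler start for the second level
    out = [x0, x]
    for _ in range(nsteps - 1):
        xp = xm + 2.0 * dt * f(x)             # leapfrog step
        d = 0.5 * nu * (xm - 2.0 * x + xp)    # Robert-Asselin displacement
        xm = x + alpha * d                    # filter the current level
        x = xp + (alpha - 1.0) * d            # partially filter the new level
        out.append(x)
    return out
```

On a neutral oscillation (f(x) = iωx) the filter damps the spurious computational mode while leaving the physical mode nearly untouched, which is why no retuning of the model is needed.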

Relevance:

30.00%

Publisher:

Abstract:

For a Lévy process ξ=(ξ_t)_{t≥0} drifting to −∞, we define the so-called exponential functional I_ξ := ∫_0^∞ exp(ξ_t) dt. Under mild conditions on ξ, we show that the following factorization of exponential functionals holds: I_ξ = I_{H−} × I_Y, where × stands for the product of independent random variables, H− is the descending ladder height process of ξ and Y is a spectrally positive Lévy process with a negative mean constructed from its ascending ladder height process. As a by-product, we generate an integral or power series representation for the law of I_ξ for a large class of Lévy processes with two-sided jumps and also derive some new distributional properties. The proof of our main result relies on a fine Markovian study of a class of generalized Ornstein–Uhlenbeck processes, which is itself of independent interest. We use and refine an alternative approach of studying the stationary measure of a Markov process which avoids some technicalities and difficulties that appear in the classical method of employing the generator of the dual Markov process.
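The exponential functional ∫_0^∞ exp(ξ_t) dt can be explored numerically. The following Monte Carlo sketch is my own illustration, not the paper's method: it takes ξ_t = 2B_t − 2μt (Brownian motion with drift, which drifts to −∞ for μ > 0), where the mean of the functional is known in closed form, E[I_ξ] = 1/(2μ − 2) for μ > 1.

```python
import numpy as np

def exp_functional_mc(mu, n_paths=2000, horizon=5.0, dt=0.005, seed=0):
    """Monte Carlo estimate of I = int_0^inf exp(xi_t) dt for the Levy
    process xi_t = 2*B_t - 2*mu*t, truncated at a finite horizon."""
    rng = np.random.default_rng(seed)
    n = int(horizon / dt)
    t = np.arange(1, n + 1) * dt
    # Brownian paths as cumulative sums of independent Gaussian increments
    B = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n)), axis=1)
    xi = 2.0 * B - 2.0 * mu * t
    # left-endpoint Riemann sum, including the exp(xi_0) = 1 term
    return float(np.mean(dt * (1.0 + np.exp(xi).sum(axis=1))))
```

For μ = 3 the exact mean is 1/4; the truncation error is negligible here because E[exp(ξ_t)] decays like exp(−4t).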

Relevance:

30.00%

Publisher:

Abstract:

We establish a general framework for a class of multidimensional stochastic processes over [0,1] under which, with probability one, the signature (the collection of iterated path integrals in the sense of rough paths) is well defined and determines the sample paths of the process up to reparametrization. In particular, by using the Malliavin calculus we show that our method applies to a class of Gaussian processes including fractional Brownian motion with Hurst parameter H>1/4, the Ornstein–Uhlenbeck process and the Brownian bridge.
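For smooth enough paths the first two signature levels are elementary iterated integrals, and for a piecewise linear path they have a closed form. This is a minimal sketch of my own (for the rough H > 1/4 Gaussian setting of the paper, the integrals require rough-path constructions instead):

```python
import numpy as np

def signature_level2(path):
    """Level-1 and level-2 signature (iterated path integrals) of a
    piecewise linear path given as an (n_points, d) array."""
    inc = np.diff(path, axis=0)         # segment increments
    s1 = inc.sum(axis=0)                # level 1: total increment
    d = path.shape[1]
    s2 = np.zeros((d, d))
    run = np.zeros(d)                   # increment accumulated so far
    for dx in inc:
        # int int_{s<t} dx_i dx_j over this segment plus cross terms
        s2 += np.outer(run, dx) + 0.5 * np.outer(dx, dx)
        run += dx
    return s1, s2
```

Two checks connect this to the abstract: the symmetric part satisfies the shuffle identity s2 + s2ᵀ = s1 ⊗ s1, and inserting extra points on the same path leaves the signature unchanged, which is the reparametrization invariance under which the signature can at best determine paths.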

Relevance:

20.00%

Publisher:

Abstract:

A case of long-range transport of a biomass burning plume from Alaska to Europe is analyzed using a Lagrangian approach. This plume was sampled several times in the free troposphere over North America, the North Atlantic and Europe by three different aircraft during the IGAC Lagrangian 2K4 experiment, which was part of the ICARTT/ITOP measurement intensive in summer 2004. Measurements in the plume showed enhanced values of CO, VOCs and NOy, mainly in the form of PAN. Observed O3 levels increased by 17 ppbv over 5 days. A photochemical trajectory model, CiTTyCAT, was used to examine processes responsible for the chemical evolution of the plume. The model was initialized with upwind data and compared with downwind measurements. The influence of high aerosol loading on photolysis rates in the plume was investigated using in situ aerosol measurements in the plume and lidar retrievals of optical depth as input into a photolysis code (Fast-J), run in the model. Significant impacts on photochemistry are found, with a decrease of 18% in O3 production and 24% in O3 destruction over 5 days when including aerosols. The plume is found to be chemically active, with large O3 increases attributed primarily to PAN decomposition during descent of the plume toward Europe. The predicted O3 changes are very dependent on temperature changes during transport and also on water vapor levels in the lower troposphere, which can lead to O3 destruction. Simulation of mixing/dilution was necessary to reproduce observed pollutant levels in the plume. Mixing was simulated using background concentrations from measurements in air masses in close proximity to the plume, and mixing timescales (averaging 6.25 days) were derived from CO changes. Observed and simulated O3/CO correlations in the plume were also compared in order to evaluate the photochemistry in the model. Observed slopes change from negative to positive over 5 days.
This change, which can be attributed largely to photochemistry, is well reproduced by multiple model runs, even if slope values are slightly underestimated, suggesting a small underestimation in modeled photochemical O3 production. The possible impact of this biomass burning plume on O3 levels in the European boundary layer was also examined by running the model for a further 5 days and comparing with data collected at surface sites, such as Jungfraujoch, which showed small O3 increases and elevated CO levels. The model predicts significant changes in O3 over the entire 10-day period due to photochemistry, but the signal is largely lost because of the effects of dilution. However, measurements in several other biomass burning plumes over Europe show that the O3 impact of Alaskan fires can potentially be significant over Europe.
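The mixing treatment described above can be sketched as relaxation toward a background concentration with a fixed timescale. The following box-model fragment is a minimal illustration under my own assumptions (function names, Euler stepping, and the example concentrations are not CiTTyCAT's):

```python
def box_model(c0, c_bg, tau_mix, prod, days, dt=0.01):
    """Lagrangian box model for a plume tracer: net chemical production
    prod(t) plus relaxation toward background with timescale tau_mix (days)."""
    c = c0
    for k in range(int(round(days / dt))):
        t = k * dt
        c += dt * (prod(t) - (c - c_bg) / tau_mix)
    return c
```

With prod = 0 this reduces to pure dilution: a 70 ppbv CO excess over an assumed 80 ppbv background decays by a factor exp(−5/6.25) ≈ 0.45 over 5 days at the derived 6.25-day mixing timescale, which is why mixing was essential for reproducing observed pollutant levels.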

Relevance:

20.00%

Publisher:

Abstract:

It is often assumed that ventilation of the atmospheric boundary layer is weak in the absence of fronts, but is this always true? In this paper we investigate the processes responsible for ventilation of the atmospheric boundary layer during a nonfrontal day that occurred on 9 May 2005 using the UK Met Office Unified Model. Pollution sources are represented by the constant emission of a passive tracer everywhere over land. The ventilation processes observed include shallow convection, turbulent mixing followed by large-scale ascent, a sea breeze circulation and coastal outflow. Vertical distributions of tracer are validated qualitatively against AMPEP (Aircraft Measurement of chemical Processing Export fluxes of Pollutants over the UK) CO aircraft measurements and are shown to agree impressively well. Budget calculations of tracers are performed in order to determine the relative importance of these ventilation processes. Coastal outflow and the sea breeze circulation were found to ventilate 26% of the boundary layer tracer by sunset, of which 2% was above 2 km. A combination of coastal outflow, the sea breeze circulation, turbulent mixing and large-scale ascent ventilated 46% of the boundary layer tracer, of which 10% was above 2 km. Finally, coastal outflow, the sea breeze circulation, turbulent mixing, large-scale ascent and shallow convection together ventilated 52% of the tracer into the free troposphere, of which 26% was above 2 km. Hence this study shows that significant ventilation of the boundary layer can occur in the absence of fronts (and thus during high-pressure events). Turbulent mixing and convection processes can double the amount of pollution ventilated from the boundary layer.

Relevance:

20.00%

Publisher:

Abstract:

There are various situations in which it is natural to ask whether a given collection of k functions, ρ_j(r_1,…,r_j), j=1,…,k, defined on a set X, are the first k correlation functions of a point process on X. Here we describe some necessary and sufficient conditions on the ρ_j's for this to be true. Our primary examples are X=ℝ^d, X=ℤ^d, and X an arbitrary finite set. In particular, we extend a result by Ambartzumian and Sukiasian showing realizability at sufficiently small densities ρ_1(r). Typically if any realizing process exists there will be many (even an uncountable number); in this case we prove, when X is a finite set, the existence of a realizing Gibbs measure with k-body potentials which maximizes the entropy among all realizing measures. We also investigate in detail a simple example in which a uniform density ρ and a translation-invariant ρ_2 are specified on ℤ; there is a gap between our best upper bound on possible values of ρ and the largest ρ for which realizability can be established.

Relevance:

20.00%

Publisher:

Abstract:

Egger (2008) constructs some idealised experiments to test the usefulness of piecewise potential vorticity inversion (PPVI) in the diagnosis of Rossby wave dynamics and baroclinic development. He concludes that, ``PPVI does not help us to understand the dynamics of linear Rossby waves. It provides local tendencies of the streamfunction which are unrelated to the true ones. The same way, the motion of baroclinic waves in shear flow cannot be understood by using PPVI. Moreover, the effect of boundary temperatures as determined by PPVI is unrelated to the flow evolution.'' He goes further in arguing that we should not consider velocities as ``induced'' by PV anomalies defined by carving up the global domain. However, these conclusions partly reflect the limitations of his idealised experiments and the manner in which the PV components were partitioned from one another.
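The notion of velocities "induced" by PV anomalies rests on the linearity of the inversion operator: anomalies inverted separately superpose exactly to the full field. A minimal one-dimensional stand-in of my own (the real operator is three-dimensional and includes the boundary-temperature terms the debate concerns) makes this concrete:

```python
import numpy as np

def invert_pv(q, dx=1.0):
    """Invert d2(psi)/dx2 = q on a 1-D grid with psi = 0 at the boundaries,
    a toy stand-in for piecewise PV inversion."""
    n = len(q)
    # second-difference operator with homogeneous Dirichlet conditions
    A = (np.diag(-2.0 * np.ones(n))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / dx**2
    return np.linalg.solve(A, q)
```

Because the solve is linear, invert_pv(q1 + q2) equals invert_pv(q1) + invert_pv(q2) to machine precision; the disagreement in the literature is about how to partition the anomalies and boundary terms, not about this superposition property itself.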

Relevance:

20.00%

Publisher:

Abstract:

Many modelling studies examine the impacts of climate change on crop yield, but few explore either the underlying bio-physical processes, or the uncertainty inherent in the parameterisation of crop growth and development. We used a perturbed-parameter crop modelling method together with a regional climate model (PRECIS) driven by the 2071-2100 SRES A2 emissions scenario in order to examine processes and uncertainties in yield simulation. Crop simulations used the groundnut (i.e. peanut; Arachis hypogaea L.) version of the General Large-Area Model for annual crops (GLAM). Two sets of GLAM simulations were carried out: control simulations and fixed-duration simulations, where the impact of mean temperature on crop development rate was removed. Model results were compared to sensitivity tests using two other crop models of differing levels of complexity: CROPGRO, and the groundnut model of Hammer et al. [Hammer, G.L., Sinclair, T.R., Boote, K.J., Wright, G.C., Meinke, H., and Bell, M.J., 1995, A peanut simulation model: I. Model development and testing. Agron. J. 87, 1085-1093]. GLAM simulations were particularly sensitive to two processes. First, elevated vapour pressure deficit (VPD) consistently reduced yield. The same result was seen in some simulations using both other crop models. Second, GLAM crop duration was longer, and yield greater, when the optimal temperature for the rate of development was exceeded. Yield increases were also seen in one other crop model. Overall, the models differed in their response to super-optimal temperatures, and that difference increased with mean temperature; percentage changes in yield between current and future climates were as diverse as -50% and over +30% for the same input data. The first process has been observed in many crop experiments, whilst the second has not. 
Thus, we conclude that there is a need for: (i) more process-based modelling studies of the impact of VPD on assimilation, and (ii) more experimental studies at super-optimal temperatures. Using the GLAM results, central values and uncertainty ranges were projected for mean 2071-2100 crop yields in India. In the fixed-duration simulations, ensemble mean yields mostly rose by 10-30%. The full ensemble range was greater than this mean change (20-60% over most of India). In the control simulations, yield stimulation by elevated CO2 was more than offset by other processes, principally accelerated crop development rates at elevated, but sub-optimal, mean temperatures. Hence, the quantification of uncertainty can facilitate relatively robust indications of the likely sign of crop yield changes in future climates. (C) 2007 Elsevier B.V. All rights reserved.
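The perturbed-parameter idea itself is simple to sketch. The following toy model is emphatically not GLAM: the response function, parameter names, and uniform ranges are invented for illustration only; what carries over is the structure of sampling parameter sets and reading off an ensemble of yields rather than a single value.

```python
import math
import random

def toy_yield(t_mean, co2, params):
    """Hypothetical toy crop response (not GLAM): logarithmic CO2
    fertilization minus a penalty above an optimal mean temperature."""
    gain = params["co2_sens"] * math.log(co2 / 370.0)
    penalty = params["t_sens"] * max(0.0, t_mean - params["t_opt"])
    return max(0.0, 1.0 + gain - penalty)

def perturbed_ensemble(n, t_mean, co2, seed=1):
    """Sample n parameter sets from assumed uniform ranges and return the
    resulting relative yields (1.0 = baseline)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        p = {"co2_sens": rng.uniform(0.1, 0.4),
             "t_sens":   rng.uniform(0.02, 0.10),
             "t_opt":    rng.uniform(26.0, 30.0)}
        out.append(toy_yield(t_mean, co2, p))
    return out
```

The spread of the ensemble, rather than any single member, is the quantity of interest: it is what allows a robust statement about the likely sign of the yield change even when the magnitude is uncertain.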

Relevance:

20.00%

Publisher:

Abstract:

Processes in the climate system that can either amplify or dampen the climate response to an external perturbation are referred to as climate feedbacks. Climate sensitivity estimates depend critically on radiative feedbacks associated with water vapor, lapse rate, clouds, snow, and sea ice, and global estimates of these feedbacks differ among general circulation models. By reviewing recent observational, numerical, and theoretical studies, this paper shows that there has been progress since the Third Assessment Report of the Intergovernmental Panel on Climate Change in (i) the understanding of the physical mechanisms involved in these feedbacks, (ii) the interpretation of intermodel differences in global estimates of these feedbacks, and (iii) the development of methodologies of evaluation of these feedbacks (or of some components) using observations. This suggests that continuing developments in climate feedback research will progressively help make it possible to constrain the GCMs’ range of climate feedbacks and climate sensitivity through an ensemble of diagnostics based on physical understanding and observations.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It worked at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change, especially how climate change pertains to extreme events, and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which were conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial-condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.