971 results for Stochastic modelling
Abstract:
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single-column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial-condition uncertainty, and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.
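As a rough illustration of the comparison advocated here, the sketch below contrasts the spread of an ensemble generated by a stochastically perturbed physics tendency with that of a deterministic initial-condition ensemble, using a toy relaxation equation as a stand-in for the single-column model. The surrogate dynamics, parameter values and function names are assumptions for illustration only, not the GCSS case.

```python
import numpy as np

def run_column(x0, n_steps=50, dt=0.1, relax=0.2, forcing=1.0,
               stoch_amp=0.0, rng=None):
    """Toy stand-in for a single-column model: relaxation towards a forced
    state, with an optional multiplicative stochastic perturbation of the
    physics tendency (illustrative only)."""
    rng = rng or np.random.default_rng(0)
    x = float(x0)
    for _ in range(n_steps):
        tendency = relax * (forcing - x)
        if stoch_amp:
            tendency *= 1.0 + stoch_amp * rng.standard_normal()
        x += dt * tendency
    return x

rng = np.random.default_rng(0)
stoch_ens = [run_column(0.0, stoch_amp=0.5, rng=rng) for _ in range(20)]   # stochastic physics ensemble
ic_ens = [run_column(x0) for x0 in rng.normal(0.0, 0.1, 20)]               # perturbed initial conditions
print(np.std(stoch_ens), np.std(ic_ens))                                   # compare ensemble spreads
```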
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and horticultural trade, with or without disease spread through commercial trade, and with or without inspections-with-eradication, to give a 2 × 2 × 2 × 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
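To make the structure of such an experiment concrete, here is a minimal toy sketch of a stochastic spread model with a trade network, long-distance natural dispersal and periodic inspections-with-eradication. All parameter values, rates and the network construction are illustrative assumptions, not those of the published model.

```python
import random

def simulate(n_sites=100, days=1825, p_natural=0.002, p_trade=0.01,
             trade_links=3, inspect_every=90, p_detect=0.8, seed=1):
    """Toy stochastic spread with trade links, natural dispersal and
    periodic inspections-with-eradication (illustrative parameters only)."""
    rng = random.Random(seed)
    # random trade network: each site ships to a few other sites
    trade = {i: rng.sample(range(n_sites), trade_links) for i in range(n_sites)}
    infected = {rng.randrange(n_sites)}               # one arbitrary introduction
    for day in range(1, days + 1):
        new = set()
        for i in infected:
            if rng.random() < p_natural:              # long-distance natural dispersal
                new.add(rng.randrange(n_sites))
            for j in trade[i]:                        # movement of diseased plants in trade
                if rng.random() < p_trade:
                    new.add(j)
        infected |= new
        if inspect_every and day % inspect_every == 0:
            infected = {i for i in infected if rng.random() > p_detect}
    return len(infected)

sizes = [simulate(seed=s) for s in range(10)]         # replicate runs (the paper uses 50)
print(min(sizes), max(sizes))                         # stochastic variation in epidemic size
```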
Abstract:
Terahertz (THz) frequency radiation, 0.1 THz to 20 THz, is being investigated for biomedical imaging applications following the introduction of pulsed THz sources that produce picosecond pulses and function at room temperature. Owing to the broadband nature of the radiation, spectral and temporal information is available from radiation that has interacted with a sample; this information is exploited in the development of biomedical imaging tools and sensors. In this work, models to aid interpretation of broadband THz spectra were developed and evaluated. THz radiation lies on the boundary between regions best considered using a deterministic electromagnetic approach and those better analysed using a stochastic approach incorporating quantum mechanical effects, so two computational models to simulate the propagation of THz radiation in an absorbing medium were compared. The first was a thin film analysis and the second a stochastic Monte Carlo model. The Cole–Cole model was used to predict the variation with frequency of the physical properties of the sample and scattering was neglected. The two models were compared with measurements from a highly absorbing water-based phantom. The Monte Carlo model gave a prediction closer to experiment over 0.1 to 3 THz. Knowledge of the frequency-dependent physical properties, including the scattering characteristics, of the absorbing media is necessary. The thin film model is computationally simple to implement but is restricted by the geometry of the sample it can describe. The Monte Carlo framework, despite being initially more complex, provides greater flexibility to investigate more complicated sample geometries.
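For reference, the Cole-Cole dielectric model mentioned above has the standard form eps*(w) = eps_inf + (eps_s - eps_inf) / (1 + (i w tau)^(1 - alpha)); the short sketch below evaluates it over the 0.1-3 THz band and converts the result to a power absorption coefficient. The parameter values are generic water-like placeholders, not those fitted to the phantom in this work.

```python
import numpy as np

def cole_cole(freq_hz, eps_s=78.4, eps_inf=4.9, tau=8.3e-12, alpha=0.0):
    """Cole-Cole complex permittivity (alpha = 0 reduces to the Debye model).
    The parameter values are illustrative water-like numbers only."""
    omega = 2 * np.pi * freq_hz
    return eps_inf + (eps_s - eps_inf) / (1 + (1j * omega * tau) ** (1 - alpha))

c = 3.0e8                                            # speed of light (m/s)
freqs = np.linspace(0.1e12, 3.0e12, 300)             # 0.1-3 THz band compared in the study
n_complex = np.sqrt(cole_cole(freqs))                # complex refractive index
mu_a = 2 * (2 * np.pi * freqs / c) * np.abs(n_complex.imag)   # absorption coefficient (1/m)
print(mu_a[0], mu_a[-1])                             # absorption at the band edges
```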
Abstract:
We propose a new modelling framework suitable for the description of atmospheric convective systems as a collection of distinct plumes. The literature contains many examples of models for collections of plumes in which strong simplifying assumptions are made, a diagnostic dependence of convection on the large-scale environment and the limit of many plumes often being imposed from the outset. Some recent studies have sought to remove one or the other of those assumptions. The proposed framework removes both, and is explicitly time-dependent and stochastic in its basic character. The statistical dynamics of the plume collection are defined through simple probabilistic rules applied at the level of individual plumes, and van Kampen's system size expansion is then used to construct the macroscopic limit of the microscopic model. Through suitable choices of the microscopic rules, the model is shown to encompass previous studies in the appropriate limits, and to allow their natural extensions beyond those limits.
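To illustrate the kind of microscopic, probabilistic rules such a framework is built on, the sketch below runs a toy birth-death process for a population of plumes whose macroscopic (large system size) limit is a simple relaxation equation. The specific rates and the constant-triggering assumption are illustrative, not the closure developed in the paper.

```python
import random

def gillespie_plumes(birth_rate=2.0, death_rate=0.1, n0=0, t_end=100.0, seed=0):
    """Toy birth-death model for a collection of convective plumes: plumes are
    triggered at a constant rate and each decays independently.  Rates and
    values are illustrative assumptions only."""
    rng = random.Random(seed)
    t, n, history = 0.0, n0, [(0.0, n0)]
    while t < t_end:
        total = birth_rate + death_rate * n
        t += rng.expovariate(total)                  # waiting time to the next event
        if rng.random() < birth_rate / total:
            n += 1                                   # a new plume is triggered
        else:
            n -= 1                                   # an existing plume terminates
        history.append((t, n))
    return history   # fluctuates around the macroscopic fixed point birth_rate / death_rate

print(gillespie_plumes()[-1])
```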
Abstract:
Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary parts of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.
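In symbols, the spectral statement above can be written schematically as follows, with notation assumed here rather than taken from the paper: S^sigma_A and S^0_A denote the power spectra of an observable A with and without noise of variance sigma^2, and chi_A(omega) is the linear susceptibility to forcings sharing the spatial pattern of the noise.

```latex
% Schematic form of the spectral relation described above (notation assumed):
\mathbb{E}\!\left[ S^{\sigma}_{A}(\omega) \right] - S^{0}_{A}(\omega)
   \;=\; \sigma^{2}\,\bigl|\chi_{A}(\omega)\bigr|^{2} \;\ge\; 0 ,
```

The non-negativity of the right-hand side is the positive-definiteness argument behind the statement that the perturbed power spectrum is larger at all frequencies.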
Abstract:
This paper uses a novel numerical optimization technique - robust optimization - that is well suited to solving the asset-liability management (ALM) problem for pension schemes. It requires the estimation of fewer stochastic parameters, reduces estimation risk and adopts a prudent approach to asset allocation. This study is the first to apply it to a real-world pension scheme, and the first ALM model of a pension scheme to maximise the Sharpe ratio. We disaggregate pension liabilities into three components - active members, deferred members and pensioners - and transform the optimal asset allocation into the scheme's projected contribution rate. The robust optimization model is extended to include liabilities and used to derive optimal investment policies for the Universities Superannuation Scheme (USS), benchmarked against the Sharpe and Tint, Bayes-Stein, and Black-Litterman models as well as the actual USS investment decisions. Over a 144-month out-of-sample period robust optimization is superior to the four benchmarks across 20 performance criteria, and has a remarkably stable asset allocation, essentially fix-mix. These conclusions are supported by six robustness checks.
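As background, a generic robust (worst-case) Sharpe-ratio problem of the kind this literature builds on can be written as below; the notation and the ellipsoidal uncertainty set are generic textbook assumptions, not the specific liability-extended formulation estimated for USS.

```latex
% Generic robust Sharpe-ratio maximisation (illustrative notation):
% w = asset weights, \hat{\mu}, \Sigma = estimated mean and covariance of
% returns, r_f = risk-free rate, U an ellipsoidal set of candidate mean
% returns with shape matrix \Omega and radius \kappa.
\max_{w}\ \min_{\mu \in U}\ \frac{w^{\top}\mu - r_f}{\sqrt{w^{\top}\Sigma\, w}},
\qquad
U = \bigl\{\mu : (\mu - \hat{\mu})^{\top}\Omega^{-1}(\mu - \hat{\mu}) \le \kappa^{2}\bigr\}.
```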
Abstract:
The critical behavior of the stochastic susceptible-infected-recovered (SIR) model on a square lattice is obtained by numerical simulations and finite-size scaling. The order parameter as well as the distribution in the number of recovered individuals is determined as a function of the infection rate for several values of the system size. The analysis around criticality is obtained by exploring the close relationship between the present model and standard percolation theory. The quantity UP, the product of the order parameter P and the ratio U between the second moment and the squared first moment of the size distribution, is shown to have, for a square system, a universal value 1.0167(1) that is the same for site and bond percolation, confirming further that the SIR model is also in the percolation universality class.
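A minimal sketch of a stochastic SIR process on a square lattice is given below, returning the size of the recovered cluster grown from a single seed. The update rule, lattice size and infection rate are illustrative choices, not the specific model or critical point studied in the paper.

```python
import random

def lattice_sir(L=64, infection_rate=0.6, seed=0):
    """Toy stochastic SIR on an L x L square lattice with periodic boundaries:
    an infected site recovers at rate 1 and attempts to infect random
    neighbours at the given rate.  Returns the number of recovered sites
    (the percolation-like cluster size)."""
    rng = random.Random(seed)
    S, I, R = 0, 1, 2
    state = [[S] * L for _ in range(L)]
    state[L // 2][L // 2] = I
    infected = [(L // 2, L // 2)]
    while infected:
        idx = rng.randrange(len(infected))
        x, y = infected[idx]
        if rng.random() < 1.0 / (1.0 + 4 * infection_rate):   # recovery event
            state[x][y] = R
            infected[idx] = infected[-1]
            infected.pop()
        else:                                                 # try to infect a random neighbour
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            nx, ny = (x + dx) % L, (y + dy) % L
            if state[nx][ny] == S:
                state[nx][ny] = I
                infected.append((nx, ny))
    return sum(row.count(R) for row in state)

print(lattice_sir())
```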
Abstract:
This paper is concerned with the cost efficiency of achieving the Swedish national air quality objectives under uncertainty. To realize an ecologically sustainable society, the parliament has approved a set of interim and long-term pollution reduction targets. However, there is considerable quantification uncertainty about the effectiveness of the proposed pollution reduction measures. In this paper, we develop a multivariate stochastic control framework to deal with the cost-efficiency problem with multiple pollutants. Based on cost and technological data collected by several national authorities, we explore the implications of alternative probabilistic constraints. It is found that a composite probabilistic constraint induces considerably lower abatement cost than separable probabilistic restrictions. This effect is reinforced by the presence of positive correlations between reductions in the multiple pollutants.
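The distinction between the two constraint types can be written schematically as below; the notation (decision vector x, cost c(x), stochastic reductions r_i(x), targets T_i, reliability level alpha) is assumed here for illustration rather than taken from the paper.

```latex
\begin{align*}
&\text{separable:} && \min_{x}\ c(x) \quad \text{s.t.} \quad
   \Pr\bigl[r_i(x) \ge T_i\bigr] \ge \alpha \quad \text{for each pollutant } i,\\
&\text{composite:} && \min_{x}\ c(x) \quad \text{s.t.} \quad
   \Pr\bigl[r_i(x) \ge T_i \ \text{for all } i\bigr] \ge \alpha .
\end{align*}
```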
Abstract:
In the biological world, the life of cells depends on their ability to sense and respond to a large variety of internal and external stimuli. In particular, excitable cells, such as muscle or nerve cells, produce rapid depolarizations in response to electrical, mechanical or chemical stimuli: they can change their internal potential through a quick exchange of ions between the cytoplasm and the external environment. This is possible thanks to ion channels, proteins that span the lipid bilayer and act like switches, opening and shutting stochastically to allow ionic current to flow. For a particular class of ion channels, ligand-gated ion channels, the gating process is strongly influenced by binding between receptive sites located on the channel surface and specific target molecules. These channels, inserted in biomimetic membranes and coupled to a suitable electronic system for acquiring and processing the electrical signal, could make it possible to detect and quantify concentrations of specific molecules in complex mixtures from the ionic currents across the membrane; this thesis investigates that possibility. In particular, it describes experiments focused on the creation and characterisation of artificial lipid membranes, the reconstitution of ion channels and the analysis of their electrical and statistical properties. Moreover, after a chapter on the basics of modelling the kinetic behaviour of ligand-gated ion channels, a possible approach for estimating the target molecule concentration, based on a statistical analysis of the ion channel open probability, is proposed. The fifth chapter describes the kinetic characterisation of a ligand-gated ion channel, the homomeric α2 isoform of the glycine receptor, involving both experimental acquisitions and signal analysis. The last chapter presents the conclusions of this thesis, with some remarks on the performance that may be achieved using ligand-gated ion channels as sensing elements.
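As a sketch of the estimation step described above, the code below computes an open probability from an idealised (thresholded) single-channel current trace and inverts a Hill-type dose-response curve for the ligand concentration. The Hill functional form and the EC50 and Hill-coefficient values are illustrative assumptions, not the fitted kinetics of the receptor characterised in the thesis.

```python
import numpy as np

def open_probability(current, threshold):
    """Fraction of samples above threshold in an idealised single-channel
    current trace (a crude estimate of the channel open probability)."""
    return float(np.mean(np.asarray(current) > threshold))

def concentration_from_popen(p_open, ec50=100e-6, hill_n=2.0):
    """Invert a Hill-type dose-response P_open = c^n / (c^n + EC50^n).
    EC50 and n are illustrative placeholders, not fitted values."""
    p_open = np.clip(p_open, 1e-6, 1 - 1e-6)       # avoid division by zero at the extremes
    return ec50 * (p_open / (1 - p_open)) ** (1.0 / hill_n)

rng = np.random.default_rng(0)
openings = rng.random(10000) < 0.3                 # simulated open/closed states (true P_open = 0.3)
trace = 3.0 * openings + rng.normal(0.0, 0.5, 10000)   # noisy current trace (arbitrary units)
p = open_probability(trace, threshold=1.5)
print(p, concentration_from_popen(p))              # estimated P_open and inferred concentration (M)
```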
Abstract:
Basic concepts and definitions relating to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closure adopted by the Eulerian model, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is its fulfilment of the Well-Mixed Condition (WMC). For the description of dispersion in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields. Different assumptions on the small-scale correlation time are made. Tests of the LSM on GCM fields suggest that an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative is mandatory if the model is to satisfy the WMC. A dynamic time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for selecting the integration step are discussed. Absolute and relative dispersion experiments are performed with various unresolved-motion settings for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved-turbulence parameterization has a negligible influence on absolute dispersion, while it affects the contributions of relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
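A minimal sketch of a Markov-order-1 LSM for a single velocity component in homogeneous, stationary turbulence is shown below: the classical Langevin form whose drift term satisfies the well-mixed condition in that idealised case. The parameter values are illustrative and are not taken from the LES or GCM fields used in the thesis.

```python
import numpy as np

def markov1_dispersion(n_particles=1000, n_steps=2000, dt=0.1,
                       sigma_u=0.5, t_l=10.0, seed=0):
    """Markov-order-1 Lagrangian stochastic model (Langevin equation) for one
    velocity component in homogeneous, stationary turbulence:
        du = -(u / T_L) dt + sqrt(2 sigma_u^2 / T_L) dW.
    Illustrative parameters only."""
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma_u, n_particles)     # initial velocities drawn from the Eulerian PDF
    x = np.zeros(n_particles)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_particles)
        u += -(u / t_l) * dt + np.sqrt(2.0 * sigma_u**2 / t_l) * dw
        x += u * dt
    return x

print(np.var(markov1_dispersion()))               # absolute dispersion after n_steps * dt
```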
Abstract:
Knowledge of the relative importance of alternative sources of human campylobacteriosis is important in order to implement effective disease prevention measures. The objective of this study was to assess the relative importance of three key exposure pathways (travelling abroad, poultry meat, pet contact) for different patient age groups in Switzerland. Using a stochastic exposure model, data on Campylobacter incidence for the years 2002-2007 were linked with data for the three exposure pathways and the results of a case-control study. Mean values for the population attributable fractions (PAF) over all age groups and years were 27% (95% CI 17-39) for poultry consumption, 27% (95% CI 22-32) for travelling abroad, 8% (95% CI 6-9) for pet contact and 39% (95% CI 25-50) for other risk factors. This model provided robust results when using the data available for Switzerland, but the uncertainties remained high. The output of the model could be improved if more accurate input data were available to estimate the infection rate per exposure. In particular, the relatively high proportion of cases attributed to 'other risk factors' requires further attention.
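The attribution step can be sketched as a simple Monte Carlo calculation: draw uncertain pathway-specific case counts, normalise, and report mean attributable fractions. The distributions below are arbitrary placeholders chosen only to echo the reported proportions, not the Swiss incidence and exposure data used in the study.

```python
import numpy as np

def attributable_fractions(n_draws=10000, seed=0):
    """Monte Carlo sketch of a source-attribution calculation: draw uncertain
    pathway-specific case counts and report mean population attributable
    fractions.  Distributions are illustrative placeholders only."""
    rng = np.random.default_rng(seed)
    cases = {
        "poultry": rng.lognormal(mean=np.log(2000), sigma=0.3, size=n_draws),
        "travel":  rng.lognormal(mean=np.log(2000), sigma=0.2, size=n_draws),
        "pets":    rng.lognormal(mean=np.log(600),  sigma=0.2, size=n_draws),
        "other":   rng.lognormal(mean=np.log(2900), sigma=0.4, size=n_draws),
    }
    total = sum(cases.values())
    return {k: float((v / total).mean()) for k, v in cases.items()}

print(attributable_fractions())
```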
Abstract:
OBJECTIVES: Treatment as prevention depends on retaining HIV-infected patients in care. We investigated the effect on HIV transmission of bringing patients lost to follow-up (LTFU) back into care. DESIGN: Mathematical model. METHODS: Stochastic mathematical model of cohorts of 1000 HIV-infected patients on antiretroviral therapy (ART), based on data from two clinics in Lilongwe, Malawi. We calculated cohort viral load (CVL; sum of individual mean viral loads each year) and used a mathematical relationship between viral load and transmission probability to estimate the number of new HIV infections. We simulated four scenarios: 'no LTFU' (all patients stay in care); 'no tracing' (patients LTFU are not traced); 'immediate tracing' (after a missed clinic appointment); and 'delayed tracing' (after six months). RESULTS: About 440 of 1000 patients were LTFU over five years. The CVL (million copies/ml per 1000 patients) was 3.7 (95% prediction interval [PrI] 2.9-4.9) for no LTFU, 8.6 (95% PrI 7.3-10.0) for no tracing, 7.7 (95% PrI 6.2-9.1) for immediate tracing, and 8.0 (95% PrI 6.7-9.5) for delayed tracing. Comparing no LTFU with no tracing, the number of new infections increased from 33 (95% PrI 29-38) to 54 (95% PrI 47-60) per 1000 patients. Immediate tracing prevented 3.6 (95% PrI -3.3-12.8) and delayed tracing 2.5 (95% PrI -5.8-11.1) new infections per 1000. Immediate tracing was more efficient than delayed tracing: 116 and 142 tracing efforts, respectively, were needed to prevent one new infection. CONCLUSION: Tracing of patients LTFU enhances the preventive effect of ART, but the number of transmissions prevented is small.
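The central modelling step, turning individual viral loads into an expected number of new infections, can be sketched as follows; the functional form linking transmission risk to log10 viral load and all constants here are illustrative assumptions, not the relationship fitted in the paper.

```python
import numpy as np

def expected_new_infections(mean_viral_loads, partners_per_year=1.0,
                            base_rate=0.05, per_log10=2.45, ref_log10=4.0):
    """Sketch of a viral-load-to-transmission step: per-patient annual
    transmission probability scales with log10 viral load relative to a
    reference.  Functional form and constants are illustrative only."""
    log_vl = np.log10(np.maximum(mean_viral_loads, 1.0))
    annual_risk = base_rate * per_log10 ** (log_vl - ref_log10)
    annual_risk = np.minimum(annual_risk, 1.0)                # cap at certainty
    return float(np.sum(annual_risk * partners_per_year))

# Hypothetical cohort of 1000 mean viral loads (copies/ml), heavily skewed.
cohort = np.random.default_rng(1).lognormal(mean=np.log(1e3), sigma=2.0, size=1000)
print(expected_new_infections(cohort))            # expected new infections in one year
```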
Abstract:
OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and percent of observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a 3-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a 3-times higher failure rate). Eleven percent was explained by non-HIV related mortality. CONCLUSIONS VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.
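A small sketch of the "percent of the observed mortality difference explained" calculation is given below, using the 10.9% and 8.6% 3-year mortalities quoted above as the observed gap; the simulated mortalities passed in are hypothetical illustrative numbers.

```python
def percent_difference_explained(sim_mortality_cd4, sim_mortality_vl,
                                 observed_cd4=0.109, observed_vl=0.086):
    """Share of the observed 3-year mortality gap (10.9% vs 8.6%) accounted
    for by the simulated effect of the monitoring strategy."""
    simulated_gap = sim_mortality_cd4 - sim_mortality_vl
    return 100.0 * simulated_gap / (observed_cd4 - observed_vl)

# Hypothetical scenario-A-like inputs: a small simulated gap explains only a few percent.
print(percent_difference_explained(0.095, 0.094))
```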