114 results for Deterministic imputation
Abstract:
We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing nonetheless remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating the transitions between active and suppressed periods of tropical convection.
Abstract:
A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, and this is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
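The sampling step described here — plumes drawn at random from a pdf whose normalization is fixed by the ensemble-mean mass flux from the closure — can be sketched as follows. This is a minimal illustration, not the scheme itself: the exponential form of the pdf, the Poisson plume count and all parameter values are assumptions for the example.

```python
import numpy as np

def sample_plumes(mean_total_flux, mean_plume_flux, grid_area, rng):
    """Draw a random plume ensemble for one grid box.

    The expected plume count follows from the ensemble-mean total mass
    flux <M> (per unit area, from a CAPE-type closure) divided by the
    mean per-plume flux <m>; individual plume fluxes are drawn from an
    exponential pdf p(m) = exp(-m/<m>)/<m>, so the pdf's normalization
    is set by the ensemble-mean flux.
    """
    expected_n = mean_total_flux * grid_area / mean_plume_flux
    n = rng.poisson(expected_n)                      # random plume count
    return rng.exponential(mean_plume_flux, size=n)  # per-plume mass fluxes

rng = np.random.default_rng(0)
fluxes = sample_plumes(mean_total_flux=0.02, mean_plume_flux=2.0e7,
                       grid_area=1.0e10, rng=rng)
# Over many draws the grid-box total fluctuates about the closure value,
# with relative variance shrinking as the grid box (plume count) grows.
```

In this construction the deterministic limit is recovered as the expected plume count becomes large, since the grid-box total then converges to the closure value.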
Abstract:
Sudden stratospheric warmings (SSWs) are usually considered to be initiated by planetary wave activity. Here it is asked whether small-scale variability (e.g., related to gravity waves) can lead to SSWs given a certain amount of planetary wave activity that is by itself not sufficient to cause a SSW. A highly vertically truncated version of the Holton–Mass model of stratospheric wave–mean flow interaction, recently proposed by Ruzmaikin et al., is extended to include stochastic forcing. In the deterministic setting, this low-order model exhibits multiple stable equilibria corresponding to the undisturbed vortex and SSW state, respectively. Momentum forcing due to quasi-random gravity wave activity is introduced as an additive noise term in the zonal momentum equation. Two distinct approaches are pursued to study the stochastic system. First, the system, initialized at the undisturbed state, is numerically integrated many times to derive statistics of first passage times of the system undergoing a transition to the SSW state. Second, the Fokker–Planck equation corresponding to the stochastic system is solved numerically to derive the stationary probability density function of the system. Both approaches show that even small to moderate strengths of the stochastic gravity wave forcing can be sufficient to cause a SSW for cases for which the deterministic system would not have predicted a SSW.
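The first approach described — repeatedly integrating the stochastic system to build first-passage-time statistics for the transition to the SSW state — can be illustrated on a generic double-well system standing in for the two stable equilibria. This is a schematic sketch with an assumed drift, noise strength and threshold, not the truncated Holton–Mass model itself.

```python
import numpy as np

def first_passage_time(sigma, rng, dt=0.01, x0=-1.0, x_target=0.9, t_max=200.0):
    """Euler-Maruyama integration of dx = (x - x^3) dt + sigma dW,
    started in the left well (standing in for the undisturbed vortex);
    returns the first time x exceeds x_target (standing in for the SSW
    state), or t_max if no transition occurs in the window."""
    x, t = x0, 0.0
    sqdt = np.sqrt(dt)
    while t < t_max:
        x += (x - x**3) * dt + sigma * sqdt * rng.standard_normal()
        t += dt
        if x >= x_target:
            return t
    return t_max

rng = np.random.default_rng(42)
times = [first_passage_time(sigma=0.7, rng=rng) for _ in range(50)]
# First-passage times shorten rapidly as the noise strengthens
# (Kramers' law); with sigma = 0 the system never leaves the well.
```

The histogram of `times` over many realizations is the first-passage-time statistic; the complementary Fokker–Planck route would instead solve for the stationary density of the same drift and noise.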
Abstract:
We report on a numerical study of the impact of short, fast inertia-gravity waves on the large-scale, slowly evolving flow with which they co-exist. A nonlinear quasi-geostrophic numerical model of a stratified shear flow is used to simulate, at reasonably high resolution, the evolution of a large-scale mode which grows due to baroclinic instability and equilibrates at finite amplitude. Ageostrophic inertia-gravity modes are filtered out of the model by construction, but their effects on the balanced flow are incorporated using a simple stochastic parameterization of the potential vorticity anomalies which they induce. The model simulates a rotating, two-layer annulus laboratory experiment, in which we recently observed systematic inertia-gravity wave generation by an evolving, large-scale flow. We find that the impact of the small-amplitude stochastic contribution to the potential vorticity tendency on the model balanced flow is generally small, as expected. In certain circumstances, however, the parameterized fast waves can exert a dominant influence. In a flow which is baroclinically unstable to a range of zonal wavenumbers, and in which there is a close match between the growth rates of the multiple modes, the stochastic waves can strongly affect wavenumber selection. This is illustrated by a flow in which the parameterized fast modes dramatically re-partition the probability-density function for equilibrated large-scale zonal wavenumber. In a second case study, the stochastic perturbations are shown to force spontaneous wavenumber transitions in the large-scale flow, which do not occur in their absence. These phenomena are due to a stochastic resonance effect. They add to the evidence that deterministic parameterizations of subgrid-scale processes, such as gravity wave drag, in general circulation models cannot always adequately capture the full details of the nonlinear interaction.
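The stochastic-resonance mechanism invoked here — noise enabling transitions that a sub-threshold deterministic forcing alone cannot produce — can be demonstrated on a bistable toy system. This is a schematic analogue with assumed parameters, not the quasi-geostrophic annulus model; the two wells stand in for competing equilibrated wavenumbers.

```python
import numpy as np

def count_transitions(sigma, amp=0.2, omega=0.05, dt=0.01, steps=100_000, seed=3):
    """Euler-Maruyama integration of the periodically forced bistable
    system dx = (x - x^3 + amp*sin(omega*t)) dt + sigma dW.  The
    forcing is sub-threshold (|amp| < 2/(3*sqrt(3)) ~ 0.385), so
    without noise the state never leaves its well; counting well-to-well
    jumps shows transitions appearing only once noise is added."""
    rng = np.random.default_rng(seed)
    x, well, jumps = -1.0, -1, 0
    sqdt = np.sqrt(dt)
    for k in range(steps):
        x += (x - x**3 + amp * np.sin(omega * k * dt)) * dt \
             + sigma * sqdt * rng.standard_normal()
        if x * well < -0.5:      # crossed past +/-0.5 into the other well
            well, jumps = -well, jumps + 1
    return jumps

quiet = count_transitions(sigma=0.0)   # sub-threshold forcing alone: no jumps
noisy = count_transitions(sigma=0.5)   # noise-enabled, forcing-assisted jumps
```

The contrast between `quiet` and `noisy` mirrors the abstract's second case study: the transitions exist only when the stochastic term is present.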
Abstract:
While the standard models of concentration addition and independent action predict the overall toxicity of multicomponent mixtures reasonably well, interactions may limit the predictive capability when a few compounds dominate a mixture. This study was conducted to test whether statistically significant systematic deviations from concentration addition (i.e. synergism/antagonism, dose ratio- or dose level-dependency) occur when two taxonomically unrelated species, the earthworm Eisenia fetida and the nematode Caenorhabditis elegans, were exposed to a full range of mixtures of the similarly acting neonicotinoid pesticides imidacloprid and thiacloprid. The effect of the mixtures on C. elegans was described significantly better (p<0.01) by a dose level-dependent deviation from the concentration addition model than by the reference model alone, while the reference model description of the effects on E. fetida could not be significantly improved. These results highlight that deviations from concentration addition are possible even with similarly acting compounds, but that the nature of such deviations is species dependent. For improving ecological risk assessment of simple mixtures, this implies that the concentration addition model may need to be used in a probabilistic context, rather than in its traditional deterministic manner. Crown Copyright (C) 2008 Published by Elsevier Inc. All rights reserved.
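The concentration addition (Loewe additivity) reference model used here can be sketched numerically: a mixture sits at the effect level at which its components' toxic units sum to one. The log-logistic curve form, EC50s and slopes below are hypothetical placeholders, not fitted values from the study.

```python
def ec(effect, ec50, slope):
    """Concentration of a single compound producing the given effect
    fraction (0-1), from a log-logistic dose-response curve."""
    return ec50 * (effect / (1.0 - effect)) ** (1.0 / slope)

def ca_mixture_effect(concs, ec50s, slopes, tol=1e-9):
    """Mixture effect under concentration addition: the effect level E
    at which the toxic units sum to one, sum_i c_i / EC_i(E) = 1,
    found by bisection (toxic units fall monotonically as E rises)."""
    def toxic_units(effect):
        return sum(c / ec(effect, e, s)
                   for c, e, s in zip(concs, ec50s, slopes)) - 1.0
    lo, hi = 1e-12, 1.0 - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if toxic_units(mid) > 0:   # still above one toxic unit: effect is higher
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical binary mixture: at half of each compound's EC50 (equal
# slopes), concentration addition predicts exactly a 50% effect.
effect = ca_mixture_effect(concs=[0.5, 1.0], ec50s=[1.0, 2.0], slopes=[2.0, 2.0])
```

A probabilistic use of the model, as the abstract suggests, would propagate uncertainty in the EC50s and slopes through this calculation rather than treating them as fixed.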
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of the bloom events in lakes and rivers. In this article, a new deterministic model is introduced which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of the bloom formation, vertical migration and lateral transport of colonies within river environments by taking into account the four major factors that affect the cyanobacterial bloom formation in freshwaters: light, nutrients, temperature and river flow. The model consists of two sub-models: a vertical migration model with respect to growth of cyanobacteria in relation to light, nutrients and temperature; and a hydraulic model to simulate the horizontal movement of the bloom. This article presents the model algorithms and highlights some important model results. The effects of nutrient limitation, varying illumination and river flow characteristics on cyanobacterial movement are simulated. The results indicate that under high light intensities and in nutrient-rich waters colonies sink further as a result of carbohydrate accumulation in the cells. In turbulent environments, vertical migration is retarded by the vertical velocity component generated by turbulent shear stress. (c) 2006 Elsevier B.V. All rights reserved.
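The buoyancy reversal described here — colonies sinking once carbohydrate ballast raises their density under high light — is commonly represented with a Stokes settling velocity. The sketch below is illustrative only, with assumed colony radius, densities and form-resistance factor; it is not the article's full migration model.

```python
def stokes_velocity(radius, rho_colony, rho_water=998.0, mu=1.0e-3,
                    g=9.81, phi=1.0):
    """Stokes settling velocity (m/s) of a spherical colony of the given
    radius (m) and density (kg/m^3) in water of viscosity mu (Pa s);
    phi is a form-resistance factor.  Positive values mean sinking,
    negative values mean floating upward."""
    return 2.0 * g * radius**2 * (rho_colony - rho_water) / (9.0 * mu * phi)

# Carbohydrate ballast raises colony density above that of water -> sinking;
# gas-vesicle buoyancy lowers it below -> upward migration.
v_sink = stokes_velocity(1.0e-4, 1018.0)   # light-saturated, ballasted colony
v_rise = stokes_velocity(1.0e-4, 985.0)    # buoyant, vesicle-rich colony
```

In a turbulent river reach, this migration velocity competes with the vertical velocity induced by turbulent shear, which is how the model's retardation effect arises.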
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting bloom occurrence in lakes and rivers. In this paper, existing key models of cyanobacteria are reviewed, evaluated and classified. Two major groups emerge: deterministic mathematical models and artificial neural network models. Mathematical models can be further subcategorized into those concerned with impounded water bodies and those concerned with rivers. Most existing models focus on a single aspect, such as growth or transport mechanisms, but there are a few models which couple both.
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of the bloom events in lakes and rivers. A new deterministic-mathematical model was developed, which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of the bloom formation, vertical migration and lateral transport of colonies within river environments by taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients and temperature. A technique called generalised sensitivity analysis was applied to the model to identify the critical parameter uncertainties and to investigate the interactions between the chosen parameters of the model. The results of the analysis suggested that 8 out of 12 parameters were significant in obtaining the observed cyanobacterial behaviour in a simulation. It was found that there was a high degree of correlation between the half-saturation rate constants used in the model.
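Generalised (regionalised) sensitivity analysis in the Hornberger–Spear style works by Monte Carlo sampling the parameters, splitting runs into behavioural and non-behavioural sets, and comparing the two sets' parameter distributions. The sketch below uses a toy two-parameter model, not the 12-parameter cyanobacteria model; the model, bounds and behavioural criterion are assumptions for illustration.

```python
import numpy as np

def gsa_ks(model, bounds, behavioural, n=5000, seed=0):
    """Generalised sensitivity analysis: sample parameters uniformly,
    split runs into behavioural and non-behavioural sets, and return
    per-parameter Kolmogorov-Smirnov distances between the two sets'
    parameter distributions.  Large distances flag parameters that
    control whether the observed behaviour is reproduced."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    params = rng.uniform(lo, hi, size=(n, len(bounds)))
    mask = np.array([behavioural(model(p)) for p in params])
    ks = []
    for j in range(len(bounds)):
        a = np.sort(params[mask, j])
        b = np.sort(params[~mask, j])
        grid = np.sort(np.concatenate([a, b]))
        cdf_a = np.searchsorted(a, grid, side="right") / len(a)
        cdf_b = np.searchsorted(b, grid, side="right") / len(b)
        ks.append(float(np.max(np.abs(cdf_a - cdf_b))))
    return ks

# Toy model: only the first parameter (say, a maximum growth rate)
# controls whether the output is behavioural.
toy = lambda p: p[0] * 10.0 + 0.01 * p[1]
ks = gsa_ks(toy, bounds=[(0, 1), (0, 1)], behavioural=lambda y: y > 5.0)
# ks[0] comes out large (sensitive), ks[1] near zero (insensitive).
```

Correlation between significant parameters — such as the half-saturation constants noted in the abstract — would show up here as joint structure in the behavioural parameter sample rather than in the marginal KS distances.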
Abstract:
Forecasting atmospheric blocking is one of the main problems facing medium-range weather forecasters in the extratropics. The European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) provides an excellent basis for medium-range forecasting as it provides a number of different possible realizations of the meteorological future. This ensemble of forecasts attempts to account for uncertainties in both the initial conditions and the model formulation. Since 18 July 2000, routine output from the EPS has included the field of potential temperature on the potential vorticity (PV) = 2 PV units (PVU) surface, the dynamical tropopause. This has enabled the objective identification of blocking using an index based on the reversal of the meridional potential-temperature gradient. A year of EPS probability forecasts of Euro-Atlantic and Pacific blocking have been produced and are assessed in this paper, concentrating on the Euro-Atlantic sector. Standard verification techniques such as Brier scores, Relative Operating Characteristic (ROC) curves and reliability diagrams are used. It is shown that Euro-Atlantic sector-blocking forecasts are skilful relative to climatology out to 10 days, and are more skilful than the deterministic control forecast at all lead times. The EPS is also more skilful than a probabilistic version of this deterministic forecast, though the difference is smaller. In addition, it is shown that the onset of a sector-blocking episode is less well predicted than its decay. As the lead time increases, the probability forecasts tend towards a model climatology with slightly less blocking than is seen in the real atmosphere. This small under-forecasting bias in the blocking forecasts is possibly related to a westerly bias in the ECMWF model. Copyright © 2003 Royal Meteorological Society
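The Brier score used in the verification here, and its skill relative to climatology, can be computed as follows. The forecast probabilities and observed blocking outcomes in the example are invented for illustration; they are not values from the paper.

```python
import numpy as np

def brier_score(p, o):
    """Brier score for probability forecasts p of binary events o
    (lower is better; 0 is a perfect deterministic forecast)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(p, o):
    """Skill relative to a climatological forecast that always issues
    the observed base rate; positive values beat climatology."""
    o = np.asarray(o, float)
    clim = np.full(len(o), o.mean())
    return 1.0 - brier_score(p, o) / brier_score(clim, o)

# Hypothetical blocking forecasts: ensemble probabilities vs. observed
# occurrence (1 = sector blocked, 0 = not blocked).
p = [0.9, 0.7, 0.2, 0.1, 0.8, 0.3]
o = [1,   1,   0,   0,   1,   0]
bss = brier_skill_score(p, o)   # positive: beats the climatological base rate
```

"Skilful relative to climatology out to 10 days" corresponds to the Brier skill score staying positive at those lead times.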
Abstract:
This paper presents a parallel genetic algorithm (GA) for the Steiner Problem in Networks (SPN). Several previous papers have proposed the adoption of GAs and other metaheuristics to solve the SPN, demonstrating the validity of their approaches. This work differs from them in two main respects: the dimension and characteristics of the networks adopted in the experiments, and its motivating aim, namely to build a comparison term for validating deterministic and computationally inexpensive algorithms which can be used in practical engineering applications, such as multicast transmission in the Internet. The large dimensions of our sample networks, in turn, require the adoption of a parallel implementation of the Steiner GA, which is able to deal with such large problem instances.
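A common GA encoding for the SPN, which the sketch below follows, is a bitstring over the optional Steiner nodes, with fitness given by the weight of a minimum spanning tree over the terminals plus the selected nodes. This is a toy serial sketch on an invented instance, not the paper's parallel implementation; the operators and parameters are simplified assumptions.

```python
import random

def mst_weight(nodes, w):
    """Prim's algorithm: weight of a minimum spanning tree over the
    given node set, with w[frozenset((u, v))] the edge weight
    (complete weighted graph assumed)."""
    nodes = list(nodes)
    in_tree, total = {nodes[0]}, 0.0
    while len(in_tree) < len(nodes):
        u, v = min(((a, b) for a in in_tree for b in nodes if b not in in_tree),
                   key=lambda e: w[frozenset(e)])
        in_tree.add(v)
        total += w[frozenset((u, v))]
    return total

def steiner_ga(terminals, steiner_nodes, w, pop=30, gens=60, seed=1):
    """Toy GA for the SPN: a bitstring selects which optional Steiner
    nodes to include; fitness is the MST weight of terminals plus the
    selected nodes (smaller is better).  Truncation selection,
    one-point crossover and single-bit mutation."""
    rng = random.Random(seed)
    fitness = lambda bits: mst_weight(
        terminals + [n for n, b in zip(steiner_nodes, bits) if b], w)
    population = [[rng.randint(0, 1) for _ in steiner_nodes] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[: pop // 2]
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(len(a))            # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(len(child))] ^= 1  # point mutation
            children.append(child)
        population = survivors + children
    best = min(population, key=fitness)
    return best, fitness(best)

# Toy instance: three terminals at the corners of a triangle (pairwise
# weight 2.0) and one optional Steiner node 'S' at the centre (weight
# 1.2 to each terminal).  Including 'S' is cheaper: 3 * 1.2 < 2 * 2.0.
w = {frozenset(e): 2.0 for e in [('A', 'B'), ('A', 'C'), ('B', 'C')]}
w.update({frozenset(('A', 'S')): 1.2, frozenset(('B', 'S')): 1.2,
          frozenset(('C', 'S')): 1.2})
best, cost = steiner_ga(['A', 'B', 'C'], ['S'], w)
```

A parallel version, as in the paper, typically distributes fitness evaluation or runs island subpopulations; the encoding and fitness above are unchanged by that.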
Abstract:
Improvements in the resolution of satellite imagery have enabled extraction of water surface elevations at the margins of the flood. Comparison between modelled and observed water surface elevations provides a new means for calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR guided snake algorithm is used to determine an outline of a flood event in 2006 on the River Dee, North Wales, UK, using a 12.5m ERS-1 image. Points at approximately 100m intervals along this outline are selected, and the water surface elevation recorded as the LiDAR DEM elevation at each point. Approximating the water surface as planar between the gauged upstream and downstream water elevations, the water surface elevations at points along the flooded extent are compared to their 'expected' values. The pattern of errors between the two shows a roughly normal distribution; however, when plotted against coordinates there is obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a T-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this T-test are selected based on their error, and the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error.
The identification of significant error (RMSE = 0.8m) between approximate expected and actual observed elevations from the remotely sensed data emphasises the limitations of using this data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
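The calibration loop described here compares each simulation's water surface elevations against the observed points and keeps the best-fitting friction parameters. The sketch below uses RMSE as the fit metric with invented friction pairs and elevations, purely to illustrate the deterministic calibration step that the probabilistic approach then replaces (the paper itself uses a T-test based fit).

```python
import numpy as np

def rmse(modelled, observed):
    """Root-mean-square error (m) between modelled and observed water
    surface elevations at the shoreline sample points."""
    d = np.asarray(modelled, float) - np.asarray(observed, float)
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical calibration: each candidate (channel, floodplain) friction
# pair maps to the elevations its simulation produced at the same points.
observed = np.array([10.2, 10.0, 9.8, 9.7])
candidates = {
    (0.03, 0.06): np.array([10.4, 10.1, 10.0, 9.9]),
    (0.05, 0.10): np.array([10.9, 10.7, 10.5, 10.4]),
}
best = min(candidates, key=lambda k: rmse(candidates[k], observed))
```

With an observed-data error of the magnitude reported (RMSE = 0.8 m), a single "best" pair chosen this way is fragile, which is the motivation for weighting or selecting observation points probabilistically instead.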
Abstract:
A dynamic, deterministic, economic simulation model was developed to estimate the costs and benefits of controlling Mycobacterium avium subsp. paratuberculosis (Johne's disease) in a suckler beef herd. The model is intended as a demonstration tool for veterinarians to use with farmers. The model design process involved user consultation and participation and the model is freely accessible on a dedicated website. The 'user-friendly' model interface allows the input of key assumptions and farm specific parameters enabling model simulations to be tailored to individual farm circumstances. The model simulates the effect of Johne's disease and various measures for its control in terms of herd prevalence and the shedding states of animals within the herd, the financial costs of the disease and of any control measures and the likely benefits of control of Johne's disease for the beef suckler herd over a 10-year period. The model thus helps to make more transparent the 'hidden costs' of Johne's in a herd and the likely benefits to be gained from controlling the disease. The control strategies considered within the model are 'no control', 'testing and culling of diagnosed animals', 'improving management measures' or a dual strategy of 'testing and culling in association with improving management measures'. An example 'run' of the model shows that the strategy 'improving management measures', which reduces infection routes during the early stages, results in a marked fall in herd prevalence and total costs. Testing and culling does little to reduce prevalence and does not reduce total costs over the 10-year period.
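The qualitative finding — management measures markedly reduce prevalence while test-and-cull alone does little — can be reproduced with a deliberately simple within-herd prevalence recursion. All rates below (transmission, replacement, test sensitivity, management effect) are invented for illustration and are not the published model's parameters or structure.

```python
def simulate(prev0=0.2, years=10, beta=0.25, replace=0.15,
             test_sens=0.1, cull=False, manage=False):
    """Toy within-herd Johne's disease prevalence model (illustrative
    rates only): new infections arise at beta*p*(1-p) per year, cut to
    30% by management measures; infected animals leave through routine
    replacement, plus annual test-and-cull that detects only a small
    fraction (test_sens) of infected animals."""
    prev = prev0
    transmission = beta * (0.3 if manage else 1.0)
    for _ in range(years):
        new_inf = transmission * prev * (1.0 - prev)
        removed = replace * prev + (test_sens * prev if cull else 0.0)
        prev = min(max(prev + new_inf - removed, 0.0), 1.0)
    return prev

# 10-year prevalence under the three single strategies: with a low test
# sensitivity, culling barely offsets transmission, whereas cutting the
# infection routes drives prevalence down markedly.
p_none = simulate()
p_cull = simulate(cull=True)
p_manage = simulate(manage=True)
```

The mechanism matches the abstract's example run: because the test misses most shedders, culling removes animals more slowly than transmission replaces them, while management attacks the transmission term directly.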
Abstract:
Process-based integrated modelling of weather and crop yield over large areas is becoming an important research topic. The production of the DEMETER ensemble hindcasts of weather allows this work to be carried out in a probabilistic framework. In this study, ensembles of crop yield (groundnut, Arachis hypogaea L.) were produced for ten 2.5° × 2.5° grid cells in western India using the DEMETER ensembles and the general large-area model (GLAM) for annual crops. Four key issues are addressed by this study. First, crop model calibration methods for use with weather ensemble data are assessed. Calibration using yield ensembles was more successful than calibration using reanalysis data (the European Centre for Medium-Range Weather Forecasts 40-yr reanalysis, ERA40). Secondly, the potential for probabilistic forecasting of crop failure is examined. The hindcasts show skill in the prediction of crop failure, with more severe failures being more predictable. Thirdly, the use of yield ensemble means to predict interannual variability in crop yield is examined and their skill assessed relative to baseline simulations using ERA40. The accuracy of multi-model yield ensemble means is equal to or greater than the accuracy using ERA40. Fourthly, the impact of two key uncertainties, sowing window and spatial scale, is briefly examined. The impact of uncertainty in the sowing window is greater with ERA40 than with the multi-model yield ensemble mean. Subgrid heterogeneity affects model accuracy: where correlations are low on the grid scale, they may be significantly positive on the subgrid scale. The implications of the results of this study for yield forecasting on seasonal time-scales are as follows. (i) There is the potential for probabilistic forecasting of crop failure (defined by a threshold yield value); forecasting of yield terciles shows less potential.
(ii) Any improvement in the skill of climate models has the potential to translate into improved deterministic yield prediction. (iii) Whilst model input uncertainties are important, uncertainty in the sowing window may not require specific modelling. The implications of the results of this study for yield forecasting on multidecadal (climate change) time-scales are as follows. (i) The skill in the ensemble mean suggests that the perturbation, within uncertainty bounds, of crop and climate parameters, could potentially average out some of the errors associated with mean yield prediction. (ii) For a given technology trend, decadal fluctuations in the yield-gap parameter used by GLAM may be relatively small, implying some predictability on those time-scales.
Abstract:
Despite decades of research, it remains controversial whether ecological communities converge towards a common structure determined by environmental conditions irrespective of assembly history. Here, we show experimentally that the answer depends on the level of community organization considered. In a 9-year grassland experiment, we manipulated initial plant composition on abandoned arable land and subsequently allowed natural colonization. Initial compositional variation caused plant communities to remain divergent in species identities, even though these same communities converged strongly in species traits. This contrast between species divergence and trait convergence could not be explained by dispersal limitation or community neutrality alone. Our results show that the simultaneous operation of trait-based assembly rules and species-level priority effects drives community assembly, making it both deterministic and historically contingent, but at different levels of community organization.