52 results for Hazard-Based Models

in CentAUR: Central Archive, University of Reading - UK


Relevance: 100.00%

Abstract:

This chapter introduces agent-based models (ABMs), their construction, and the pros and cons of their use. Although relatively new, ABMs have great potential for use in ecotoxicological research; their primary advantages are the realistic simulations that can be constructed and, in particular, their explicit handling of space and time. Examples of their use in ecotoxicology are provided, drawn primarily from different implementations of the ALMaSS system. The examples presented demonstrate how multiple stressors, landscape structure, toxicological detail, animal behavior, and socioeconomic effects can and should be taken into account when constructing simulations for risk assessment. As in ecological systems, the system-level behavior of an ABM is not simply the mean of the component responses but the sum of the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.

Relevance: 100.00%

Abstract:

Although discrete cell-based frameworks are now commonly used to simulate a whole range of biological phenomena, it is typically not obvious how the numerous different types of model are related to one another, nor which one is most appropriate in a given context. Here we demonstrate how individual cell movement on the discrete scale, modeled using nonlinear force laws, can be described by nonlinear diffusion coefficients on the continuum scale. A general relationship between nonlinear force laws and their respective diffusion coefficients is derived in one spatial dimension and, subsequently, a range of particular examples is considered. For each case, excellent agreement is observed between numerical solutions of the discrete and corresponding continuum models. Three case studies are considered in which we demonstrate how the derived nonlinear diffusion coefficients can be used to (a) relate different discrete models of cell behavior; (b) derive discrete, intercell force laws from previously posed diffusion coefficients; and (c) describe aggregative behavior in discrete simulations.
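
The continuum-scale description referred to here is a nonlinear diffusion equation. As a purely illustrative sketch (not the paper's derivation), the snippet below solves a one-dimensional equation of that form, u_t = (D(u) u_x)_x, by explicit finite differences; the particular choice D(u) = D0(1 + alpha*u) is a hypothetical example, not one of the force-law-derived coefficients in the paper.

```python
# Illustrative sketch: explicit finite-difference solution of the 1-D
# nonlinear diffusion equation u_t = (D(u) u_x)_x with a hypothetical
# density-dependent diffusivity D(u) = D0 * (1 + alpha * u).
import numpy as np

def D(u, D0=1.0, alpha=0.5):
    """Hypothetical density-dependent diffusion coefficient."""
    return D0 * (1.0 + alpha * u)

def step(u, dx, dt):
    """One explicit time step with zero-flux boundaries."""
    D_half = 0.5 * (D(u[1:]) + D(u[:-1]))        # diffusivity at cell interfaces
    flux = D_half * np.diff(u) / dx              # interior fluxes D(u) u_x
    flux = np.concatenate(([0.0], flux, [0.0]))  # zero flux at the boundaries
    return u + dt * np.diff(flux) / dx

nx, L, T = 200, 10.0, 1.0
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
u = np.where(np.abs(x - L / 2) < 1.0, 1.0, 0.0)  # initial column of cells
dt = 0.2 * dx**2 / D(u).max()                    # conservative stability limit

mass_before = u.sum() * dx
for _ in range(int(T / dt)):
    u = step(u, dx, dt)
print("mass before/after:", mass_before, u.sum() * dx)
```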

Relevance: 100.00%

Abstract:

Summary
1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.
2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If there is sufficient energy intake, an animal allocates the energy obtained in the order: maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically (a minimal allocation sketch follows this summary).
3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate.
4. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations.
5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shellfisheries, to assess environmental impacts of building proposals including wind farms and highways, and to assess the effects on non-target organisms of chemicals used to control agricultural pests.
Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
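
A minimal sketch of the allocation rule in point 2, with all demands and thresholds as hypothetical placeholder values rather than recommendations from the paper:

```python
# Sketch of the priority-based energy allocation described in point 2:
# maintenance first, then growth, then reproduction, then storage up to an
# optimal reserve level; below a critical reserve level everything goes to
# maintenance. All names and numbers are hypothetical placeholders.
def allocate_energy(intake, reserves, *, maint_demand, growth_demand,
                    repro_demand, optimal_reserves, critical_reserves):
    """Split available energy in the priority order maintenance, growth,
    reproduction, storage, drawing on reserves above the critical level."""
    if reserves <= critical_reserves:
        # Below the critical threshold everything goes to maintenance.
        return {"maintenance": min(intake, maint_demand),
                "growth": 0.0, "reproduction": 0.0, "storage": 0.0}
    available = intake + (reserves - critical_reserves)  # energy that may be spent
    allocation = {}
    for process, demand in (("maintenance", maint_demand),
                            ("growth", growth_demand),
                            ("reproduction", repro_demand)):
        allocation[process] = min(available, demand)
        available -= allocation[process]
    # Any surplus is stored, but only up to the optimal reserve level.
    allocation["storage"] = min(available, max(optimal_reserves - reserves, 0.0))
    return allocation

# Example: intake alone (10) cannot meet all demands (13), so reserves above
# the critical level are drawn down to cover the shortfall.
print(allocate_energy(10.0, 5.0, maint_demand=4.0, growth_demand=3.0,
                      repro_demand=6.0, optimal_reserves=8.0,
                      critical_reserves=2.0))
```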

Relevance: 100.00%

Abstract:

A great explanatory gap lies between the molecular pharmacology of psychoactive agents and the neurophysiological changes they induce, as recorded by neuroimaging modalities. Causally relating the cellular actions of psychoactive compounds to their influence on population activity is experimentally challenging. Recent developments in the dynamical modelling of neural tissue have attempted to span this explanatory gap between microscopic targets and their macroscopic neurophysiological effects via a range of biologically plausible dynamical models of cortical tissue. Such theoretical models allow exploration of neural dynamics, in particular their modification by drug action. The ability to bridge scales theoretically rests on a biologically plausible averaging of cortical tissue properties. In the resulting macroscopic neural field, individual neurons need not be explicitly represented (as they are in neural networks). This paper aims to provide a non-technical introduction to mean field population modelling of drug action and its recent successes in modelling anaesthesia.
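
For a concrete, if generic, picture of population-level modelling: the sketch below integrates a Wilson-Cowan-type pair of mean excitatory and inhibitory firing rates, with a hypothetical drug parameter that scales the strength of inhibitory coupling. It illustrates the mean field idea only; it is not one of the neural field models of anaesthesia reviewed in the paper, and all parameter values are arbitrary.

```python
# Generic Wilson-Cowan-type mean-field sketch: two population variables
# (excitatory E, inhibitory I) rather than individual neurons, with a
# hypothetical "drug_scale" parameter multiplying inhibitory coupling.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(drug_scale=1.0, t_max=1.0, dt=1e-3,
             w_ee=12.0, w_ei=10.0, w_ie=10.0, w_ii=2.0,
             tau_e=0.01, tau_i=0.02, p=1.5):
    """Euler integration of the mean firing rates E(t), I(t)."""
    E, I = 0.1, 0.1
    trace = []
    for _ in range(int(t_max / dt)):
        dE = (-E + sigmoid(w_ee * E - drug_scale * w_ei * I + p)) / tau_e
        dI = (-I + sigmoid(w_ie * E - w_ii * I)) / tau_i
        E, I = E + dt * dE, I + dt * dI
        trace.append(E)
    return np.array(trace)

baseline = simulate(drug_scale=1.0)
drugged = simulate(drug_scale=1.5)   # stronger effective inhibition
print("mean E activity:", baseline[-200:].mean(), "->", drugged[-200:].mean())
```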

Relevance: 100.00%

Abstract:

We evaluate the ability of process-based models to reproduce observed global mean sea-level change. When the models are forced by changes in natural and anthropogenic radiative forcing of the climate system and by anthropogenic changes in land-water storage, the average of the modelled sea-level change for the periods 1900–2010, 1961–2010 and 1990–2010 is about 80%, 85% and 90% of the observed rise, respectively. The modelled rate of rise is over 1 mm yr⁻¹ prior to 1950, decreases to less than 0.5 mm yr⁻¹ in the 1960s, and increases to 3 mm yr⁻¹ by 2000. When observed regional climate changes are used to drive a glacier model and an allowance is included for an ongoing adjustment of the ice sheets, the modelled sea-level rise is about 2 mm yr⁻¹ prior to 1950, similar to the observations. The model results encompass the observed rise, and the model average is within 20% of the observations (about 10% when the observed ice-sheet contributions since 1993 are added), increasing confidence in future projections for the 21st century. The increased rate of rise since 1990 is not part of a natural cycle but a direct response to increased radiative forcing (both anthropogenic and natural), which will continue to grow with ongoing greenhouse gas emissions.

Relevance: 100.00%

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we explore only the most accessible version, rejection-ABC. Rejection-ABC involves running the model a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than the literature values. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the available data. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
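
The rejection-ABC recipe described above (sample from the priors, simulate, keep the closest runs) can be written in a few lines. The sketch below uses a toy two-parameter model as a stand-in for the 14-parameter earthworm IBM; the priors, distance measure and acceptance fraction are all placeholder choices.

```python
# Rejection-ABC sketch: draw parameter sets from their priors, run the model,
# and retain the simulations whose outputs lie closest to the observations.
import numpy as np

rng = np.random.default_rng(1)

def model(params, n_obs=20):
    """Toy stochastic model standing in for an IBM run."""
    growth_rate, noise_sd = params
    t = np.arange(n_obs)
    return growth_rate * t + rng.normal(0.0, noise_sd, n_obs)

observed = model((0.5, 0.2))                 # pretend these are the field data

def rejection_abc(n_sims=20000, accept_fraction=0.01):
    draws, distances = [], []
    for _ in range(n_sims):
        theta = (rng.uniform(0.0, 2.0),      # prior on the growth rate
                 rng.uniform(0.01, 1.0))     # prior on the noise s.d.
        simulated = model(theta)
        draws.append(theta)
        distances.append(np.sqrt(np.mean((simulated - observed) ** 2)))
    draws, distances = np.array(draws), np.array(distances)
    keep = distances.argsort()[: int(n_sims * accept_fraction)]
    return draws[keep]                       # approximate posterior sample

posterior = rejection_abc()
print("posterior mean growth rate:", posterior[:, 0].mean())
```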

Relevance: 100.00%

Abstract:

Individual-based models (IBMs) simulate the actions of individual animals as they interact with one another and with the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and to assess the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles, together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to compare model outputs manually with observations from real landscapes and so obtain parameters that produce acceptable fits of model to data. However, this procedure can be convoluted and can lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
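
To make the ABC model-comparison idea concrete, the sketch below accepts simulations from two toy candidate models under a common tolerance and compares acceptance counts, whose ratio approximates a Bayes factor. The toy models and tolerance are hypothetical; they merely stand in for comparing a full and a simplified IBM.

```python
# ABC model selection sketch: simulate from two candidate models with equal
# prior weight, accept runs within a tolerance of the observations, and
# compare the acceptance counts.
import numpy as np

rng = np.random.default_rng(2)
observed = 0.8 * np.arange(10) ** 1.1        # pretend field observations

def full_model():
    a, b = rng.uniform(0, 2), rng.uniform(0.5, 1.5)
    return a * np.arange(10) ** b

def simple_model():
    a = rng.uniform(0, 2)
    return a * np.arange(10)                 # linear-only simplification

def abc_model_selection(n_sims=50000, tolerance=1.0):
    counts = {"full": 0, "simple": 0}
    for _ in range(n_sims):
        if rng.random() < 0.5:
            name, simulated = "full", full_model()
        else:
            name, simulated = "simple", simple_model()
        distance = np.sqrt(np.mean((simulated - observed) ** 2))
        if distance < tolerance:
            counts[name] += 1
    return counts

counts = abc_model_selection()
print(counts, "approximate Bayes factor:",
      counts["full"] / max(counts["simple"], 1))
```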

Relevance: 100.00%

Abstract:

Space weather effects on technological systems originate with energy carried from the Sun to the terrestrial environment by the solar wind. In this study, we present results of modeling solar corona-heliosphere processes to predict solar wind conditions at the L1 Lagrangian point upstream of Earth. In particular, we calculate performance metrics for (1) empirical, (2) hybrid empirical/physics-based, and (3) fully physics-based coupled corona-heliosphere models over an 8-year period (1995–2002). L1 measurements of the radial solar wind speed are the primary basis for validation of the coronal and heliosphere models studied, though other solar wind parameters are also considered. The models are from the Center for Integrated Space Weather Modeling (CISM), which has developed a coupled model of the whole Sun-to-Earth system, from the solar photosphere to the terrestrial thermosphere. Simple point-by-point analysis techniques, such as mean-square error and correlation coefficients, indicate that the empirical corona-heliosphere model currently gives the best forecast of solar wind speed at 1 AU. A more detailed analysis shows that errors in the physics-based models are predominantly the result of small timing offsets to solar wind structures and that the large-scale features of the solar wind are actually well modeled. We suggest that additional “tuning” of the coupling between the coronal and heliosphere models could lead to a significant improvement in their accuracy. Furthermore, we note that the physics-based models accurately capture dynamic effects at solar wind stream interaction regions, such as magnetic field compression, flow deflection, and density buildup, which the empirical scheme cannot.
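
The point-by-point skill metrics mentioned above are straightforward to compute; the sketch below applies a mean-square error and a linear correlation coefficient to a synthetic modelled-versus-observed solar wind speed series containing a small timing offset, mimicking the kind of error attributed to the physics-based models. The arrays are placeholders, not L1 observations or CISM output.

```python
# Point-by-point skill metrics (MSE and correlation) for a synthetic
# modelled-versus-observed solar wind speed time series.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 20, 500)
observed_speed = 400.0 + 100.0 * np.sin(t)            # km/s, synthetic "data"
# A "model" that captures the structure but with a small timing offset plus noise.
modelled_speed = 400.0 + 100.0 * np.sin(t - 0.3) + rng.normal(0.0, 20.0, t.size)

mse = np.mean((modelled_speed - observed_speed) ** 2)
correlation = np.corrcoef(modelled_speed, observed_speed)[0, 1]
print(f"MSE = {mse:.1f} (km/s)^2, r = {correlation:.2f}")
```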

Relevance: 100.00%

Abstract:

The UK has a target of an 80% reduction in CO2 emissions by 2050 from a 1990 baseline. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion owing to their higher resolution. A weakness of existing bottom-up models in capturing individual green-technology buying behaviour is identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods of incorporating buying behaviour within a domestic energy forecast model. Of the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is a promising means of predicting the effectiveness of various policy measures.
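
As a rough illustration of what an agent-based buying-behaviour model involves, the sketch below lets each household decide annually whether to adopt a green technology, with a purchase probability that rises with the overall adoption level (a simple proxy for peer influence) and with a policy subsidy. All parameters are invented for illustration and are not taken from the prototype model described here.

```python
# Schematic agent-based adoption sketch: households adopt a green technology
# with a probability driven by a base rate, the current adoption level, and a
# hypothetical subsidy; comparing runs with and without the subsidy mimics
# testing a policy measure.
import random

random.seed(4)

def simulate_adoption(n_households=1000, years=25,
                      base_prob=0.02, peer_weight=0.10, subsidy_boost=0.03):
    adopted = [False] * n_households
    trajectory = []
    for _ in range(years):
        adoption_rate = sum(adopted) / n_households
        for i in range(n_households):
            if adopted[i]:
                continue
            # Purchase probability rises with overall adoption and with a subsidy.
            prob = base_prob + peer_weight * adoption_rate + subsidy_boost
            if random.random() < prob:
                adopted[i] = True
        trajectory.append(sum(adopted) / n_households)
    return trajectory

with_subsidy = simulate_adoption(subsidy_boost=0.03)
without_subsidy = simulate_adoption(subsidy_boost=0.0)
print("final adoption:", round(with_subsidy[-1], 2),
      "with subsidy vs", round(without_subsidy[-1], 2), "without")
```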

Relevance: 100.00%

Abstract:

Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale, physically based models of rainfall-runoff and river routing, used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30-year (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maximum flows to derive a number of flood return periods. The return periods are calculated initially on a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and to estimate the flooded fractional area of the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 years are presented. The results compare reasonably well to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
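
The return-period step can be sketched as follows: fit a Gumbel distribution to a series of annual maximum flows and read off the (1 − 1/T) quantile for each return period T. The annual maxima below are synthetic placeholders rather than output from the ECMWF land surface model and river routing scheme.

```python
# Gumbel fit to annual maximum flows and derivation of return levels for a
# set of return periods (the post-processing step described in the abstract).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
annual_maxima = stats.gumbel_r.rvs(loc=1500.0, scale=400.0, size=32,
                                   random_state=rng)   # synthetic flows, m^3/s

# Fit location and scale of the Gumbel distribution to the annual maxima.
loc, scale = stats.gumbel_r.fit(annual_maxima)

# The return level for return period T is the (1 - 1/T) quantile.
for T in (2, 5, 10, 25, 50, 100, 500):
    level = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3} yr return period: {level:7.0f} m^3/s")
```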

Relevance: 90.00%

Abstract:

Despite the many models developed for predicting phosphorus concentrations at differing spatial and temporal scales, there has been little effort to quantify the uncertainty in their predictions. Quantification of model prediction uncertainty is desirable for informed decision-making in river-system management. An uncertainty analysis of the process-based Integrated Catchment model of Phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework, is presented. The framework is applied to the Lugg catchment (1,077 km²), a tributary of the River Wye on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total) observations, for a limited number of reaches, are used to assess the uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated through land-use-type fractional areas) can achieve higher model fits than a previously expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash–Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets which simultaneously describe all the observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there are insufficient data to support their many parameters.
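
A minimal sketch of the GLUE step described above: sample parameter sets from prior ranges, score each simulation with the Nash–Sutcliffe efficiency, and retain only "behavioural" sets exceeding the 0.3 threshold quoted in the abstract. The single-parameter linear-store model here is a toy stand-in for INCA-P.

```python
# GLUE sketch: Monte Carlo sampling of a parameter, Nash-Sutcliffe scoring,
# and retention of behavioural parameter sets above a threshold of 0.3.
import numpy as np

rng = np.random.default_rng(6)

def nash_sutcliffe(simulated, observed):
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / variance

def toy_model(k, rainfall):
    """Single linear store: a fraction k of storage drains each day."""
    storage, flow = 0.0, []
    for r in rainfall:
        storage += r
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

rainfall = rng.gamma(shape=0.8, scale=5.0, size=365)          # synthetic forcing
observed = toy_model(0.3, rainfall) + rng.normal(0.0, 0.5, 365)

behavioural = []
for _ in range(5000):
    k = rng.uniform(0.01, 0.99)               # prior range for the parameter
    e = nash_sutcliffe(toy_model(k, rainfall), observed)
    if e > 0.3:                               # behavioural threshold
        behavioural.append((k, e))

print(len(behavioural), "behavioural parameter sets; best E =",
      round(max(e for _, e in behavioural), 2))
```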

Relevance: 90.00%

Abstract:

We assessed the vulnerability of blanket peat to climate change in Great Britain using an ensemble of 8 bioclimatic envelope models. We used 4 published models that ranged from simple threshold models, based on total annual precipitation, to Generalised Linear Models (GLMs, based on mean annual temperature). In addition, 4 new models, which included measures of water deficit, were developed as threshold, classification-tree, GLM and generalised additive (GAM) models. Models that included measures of both hydrological conditions and maximum temperature provided a better fit to the mapped peat area than models based on hydrological variables alone. Under UKCIP02 projections for high (A1FI) and low (B1) greenhouse gas emission scenarios, 7 out of the 8 models showed a decline in the bioclimatic space associated with blanket peat. Eastern regions (Northumbria, North York Moors, Orkney) were shown to be more vulnerable than higher-altitude, western areas (Highlands, Western Isles and Argyll, Bute and The Trossachs). These results suggest a long-term decline in the distribution of actively growing blanket peat, especially under the high-emissions scenario, although it is emphasised that existing peatlands may well persist for decades under a changing climate. Observational data from long-term monitoring and manipulation experiments, in combination with process-based models, are required to explore the nature and magnitude of climate change impacts on these vulnerable areas more fully.
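
One member of such an ensemble, a logistic GLM relating peat presence to mean annual temperature and a water-deficit measure, might be sketched as follows; the data are synthetic and the climate perturbation is arbitrary, so this only illustrates the fit-then-project workflow, not the published models.

```python
# Logistic GLM sketch of a bioclimatic envelope: fit presence/absence against
# climate covariates, then re-evaluate the envelope under a perturbed climate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
temperature = rng.normal(8.0, 2.0, n)        # mean annual temperature, deg C
water_deficit = rng.normal(50.0, 30.0, n)    # hypothetical deficit measure, mm
# Synthetic "truth": peat more likely where it is cool and wet.
logit = 4.0 - 0.6 * temperature - 0.03 * water_deficit
peat_present = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([temperature, water_deficit])
envelope = LogisticRegression().fit(X, peat_present)

# Project the fitted envelope under a simple warmer, drier perturbation.
X_future = np.column_stack([temperature + 2.5, water_deficit + 20.0])
print("suitable fraction now:   ", envelope.predict(X).mean())
print("suitable fraction future:", envelope.predict(X_future).mean())
```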

Relevance: 90.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Relevance: 90.00%

Abstract:

Integrated simulation models can be useful tools in farming systems research. This chapter reviews three commonly used approaches: linear programming, system dynamics and agent-based models. Applications of each approach are presented and their strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help to unravel the complex and dynamic interactions and feedbacks among biophysical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research, and can support transdisciplinary research by functioning as learning platforms in participatory processes.

Relevance: 90.00%

Abstract:

Earthworms are important organisms in soil communities and are therefore used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies of limited ecological relevance, supplemented if necessary by site-specific field trials, which can be difficult to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm (Eisenia fetida) in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform, we developed methods to identify which of the physiological parameters in the energy budget model (the rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation. Although we only incorporate toxic effects at the individual level, we show how ABMs can readily extrapolate to larger scales, by providing good model fits to field population data. The ability of the presented model to fit the available field and laboratory data for E. fetida demonstrates the promise of the agent-based approach in ecology, by showing how biological knowledge can be used to make ecological inferences. Further work is required to extend the approach to populations of more ecologically relevant species studied at the field scale. Such a model could help extrapolate from laboratory to field conditions, from one set of field conditions to another, or from species to species.
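
The "four hypotheses" idea, that a toxicant acts primarily on a single physiological rate, can be illustrated schematically: scale one rate at a time and compare the resulting growth trajectories. The mass-update rule and all parameter values below are hypothetical placeholders (reproduction is omitted for brevity); this is not the calibrated E. fetida energy budget.

```python
# Schematic illustration of testing which physiological rate a toxicant acts
# on: apply a stress factor to one rate at a time and compare final body mass.
def grow(days=100, stress_on=None, stress_factor=0.5,
         ingestion_rate=2.0, maintenance_rate=0.8, growth_efficiency=0.6):
    rates = {"ingestion": ingestion_rate, "maintenance": maintenance_rate,
             "growth": growth_efficiency}
    if stress_on in rates:
        # Hypothesis: the toxicant impairs this one rate (raising the
        # maintenance cost, or lowering ingestion or growth).
        if stress_on == "maintenance":
            rates[stress_on] /= stress_factor
        else:
            rates[stress_on] *= stress_factor
    mass = 0.1                                               # starting mass, g
    for _ in range(days):
        assimilated = rates["ingestion"] * mass ** (2.0 / 3.0)  # surface-area scaling
        maintenance = rates["maintenance"] * mass                # volume scaling
        surplus = max(assimilated - maintenance, 0.0)
        mass += rates["growth"] * surplus * 0.01                 # growth increment
    return mass

for hypothesis in (None, "ingestion", "maintenance", "growth"):
    print(hypothesis, "->", round(grow(stress_on=hypothesis), 3))
```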