172 results for Emerging Modelling Paradigms and Model Coupling
Abstract:
Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess, but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last of these include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
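The benchmark-space inference described above can be sketched with a minimal Bayes update, assuming a toy benchmark of reference agents each characterised only by its probability of matching the benchmark-preferred move. The agent definitions and probabilities below are illustrative stand-ins, not the paper's actual benchmark space:

```python
import numpy as np

def posterior_over_agents(choices, agent_probs, prior=None):
    """Posterior over reference agents given binary outcomes per move
    (1 = player matched the benchmark-preferred option, 0 = did not)."""
    agent_probs = np.asarray(agent_probs, dtype=float)
    if prior is None:
        prior = np.full(len(agent_probs), 1.0 / len(agent_probs))
    log_post = np.log(prior)
    for c in choices:
        log_post += np.log(agent_probs if c else 1.0 - agent_probs)
    log_post -= log_post.max()            # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Three hypothetical reference agents: weak, club-level, engine-like.
post = posterior_over_agents([1, 1, 1, 1, 1, 1, 1, 1, 0, 0], [0.3, 0.6, 0.9])
```

With eight of ten moves matched, the posterior mass concentrates on the strongest reference agent, which is the basic mechanism behind both skill assessment and fraud screening in the paper.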
Abstract:
This paper presents the major characteristics of the Institut Pierre Simon Laplace (IPSL) coupled ocean–atmosphere general circulation model. The model components and the coupling methodology are described, as well as the main characteristics of the climatology and interannual variability. The results of the standard version, used for IPCC climate projections and for intercomparison projects such as the Paleoclimate Modeling Intercomparison Project (PMIP 2), are compared to those of a version with higher resolution in the atmosphere. A focus on the North Atlantic and on the tropics is used to address the impact of atmospheric resolution on processes and feedbacks. In the North Atlantic, the resolution change leads to an improved representation of the storm tracks and the North Atlantic Oscillation. The better representation of the wind structure increases the northward salt transport, deep-water formation and the Atlantic meridional overturning circulation. In the tropics, the ocean–atmosphere dynamical coupling, or Bjerknes feedback, improves with resolution. The amplitude of ENSO (El Niño-Southern Oscillation) consequently increases, as the damping processes are left unchanged.
Abstract:
Increased atmospheric concentrations of carbon dioxide (CO2) will benefit the yield of most crops. Two free-air CO2 enrichment (FACE) meta-analyses have shown yield increases of between 0 and 73% for C3 crops. Despite this large range, few crop modelling studies quantify the uncertainty inherent in the parameterisation of crop growth and development. We present a novel perturbed-parameter method of crop model simulation that does this, using some constraints from observations. The model used is the groundnut (i.e. peanut; Arachis hypogaea L.) version of the general large-area model for annual crops (GLAM). The conclusions are of relevance to C3 crops in general. The increases in yield simulated by GLAM for doubled CO2 were between 16 and 62%. The difference in mean percentage increase between well-watered and water-stressed simulations was 6.8 percentage points. These results were compared to FACE and controlled-environment studies, and to sensitivity tests on two other crop models of differing levels of complexity: CROPGRO, and the groundnut model of Hammer et al. [Hammer, G.L., Sinclair, T.R., Boote, K.J., Wright, G.C., Meinke, H., Bell, M.J., 1995. A peanut simulation model. I. Model development and testing. Agron. J. 87, 1085-1093]. The relationship between CO2 and water stress in the experiments and in the models was examined. From a physiological perspective, water-stressed crops are expected to show greater CO2 stimulation than well-watered crops, and this expectation has been cited in the literature. However, this result is not seen consistently in either the FACE studies or the crop models. In contrast, leaf-level models of assimilation do consistently show this result. An analysis of the evidence from these models and from the data suggests that scale (canopy versus leaf), model calibration, and model complexity are factors in determining the sign and magnitude of the interaction between CO2 and water stress.
We conclude from our study that the statement that 'water-stressed crops show greater CO2 stimulation than well-watered crops' cannot be held to be universally true. We also conclude, preliminarily, that the relationship between water stress and assimilation varies with scale. Accordingly, we provide some suggestions on how studies of a similar nature, using crop models of a range of complexity, could contribute further to understanding the roles of model calibration, model complexity and scale.
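The perturbed-parameter approach can be illustrated with a deliberately minimal sketch: parameter vectors are sampled from prior ranges, an observational constraint filters the ensemble, and the retained members are evaluated at doubled CO2. The `toy_yield` function and all numeric ranges are hypothetical stand-ins, not GLAM:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_yield(rue, co2_factor, water_stress):
    """Toy yield model: radiation-use efficiency scaled by a CO2 response
    factor and a water-stress reduction. Stands in for a full crop model."""
    return rue * co2_factor * (1.0 - water_stress)

# Perturbed-parameter ensemble: draw each parameter from an assumed prior range.
n = 5000
rue = rng.uniform(1.0, 3.0, n)        # radiation-use efficiency (assumed range)
co2_resp = rng.uniform(1.0, 1.8, n)   # yield multiplier at doubled CO2

baseline = toy_yield(rue, 1.0, 0.2)
# Observational constraint: retain members whose baseline yield is plausible.
obs, tol = 1.8, 0.2
keep = np.abs(baseline - obs) < tol

# Re-evaluate the constrained ensemble at doubled CO2.
doubled = toy_yield(rue[keep], co2_resp[keep], 0.2)
stimulation = 100.0 * (doubled / baseline[keep] - 1.0)   # % yield increase
```

The spread of `stimulation` across the retained ensemble is the analogue of the 16-62% range reported for GLAM: a parameter-uncertainty range rather than a single-point estimate.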
Abstract:
A mathematical growth model for the batch solid-state fermentation process for fungal tannase production was developed and tested experimentally. The unstructured model describes the uptake and growth kinetics of Penicillium glabrum in an impregnated polyurethane foam substrate system. In general, good agreement between the experimental data and model simulations was obtained. Biomass, tannase and spore production are described by logistic kinetics with a time delay between biomass production and tannase and spore formation. Possible induction mechanisms for the latter are proposed. Hydrolysis of tannic acid, the main carbon source in the substrate system, is reasonably well described with Michaelis-Menten kinetics with time-varying enzyme concentration but a more complex reaction mechanism is suspected. The metabolism of gallic acid, a tannase-hydrolysis product of tannic acid, was shown to be growth limiting during the main growth phase.
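The kinetic structure described (logistic growth, a time delay before product formation, Michaelis-Menten hydrolysis with a time-varying enzyme level) can be sketched as a simple Euler integration. All parameter values are illustrative, not fitted to the P. glabrum data, and growth is not coupled back to the substrate in this simplified sketch:

```python
def simulate(t_end=48.0, dt=0.01):
    """Euler integration of a logistic-growth / delayed-product sketch."""
    mu, Xmax = 0.25, 10.0    # 1/h, g/L: logistic growth parameters (assumed)
    lag = 8.0                # h: delay before tannase formation begins
    k_p = 0.05               # growth-associated product yield coefficient
    Vmax, Km = 0.8, 2.0      # Michaelis-Menten hydrolysis of tannic acid
    X, P, S = 0.1, 0.0, 20.0 # biomass, enzyme (tannase), tannic acid
    t = 0.0
    while t < t_end:
        dX = mu * X * (1.0 - X / Xmax)      # logistic biomass growth
        dP = k_p * X if t > lag else 0.0    # delayed, growth-associated enzyme
        dS = -Vmax * P * S / (Km + S)       # hydrolysis rate scales with enzyme
        X += dX * dt
        P += dP * dt
        S = max(S + dS * dt, 0.0)           # substrate cannot go negative
        t += dt
    return X, P, S

X, P, S = simulate()
```

The `t > lag` switch is the crude analogue of the observed delay between biomass production and tannase formation; the enzyme concentration `P` appearing in the hydrolysis term is what makes the Michaelis-Menten kinetics time-varying.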
Abstract:
Government targets for CO2 reductions are being progressively tightened: the Climate Change Act sets the UK target at an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and can model technological change. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
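The kind of ABM advocated here can be sketched in a few lines: each agent holds an adoption threshold standing in for its limited economic rationality, and adopts a technology when peer adoption plus a policy subsidy exceeds that threshold. The rule and all numbers are hypothetical, purely to illustrate how diffusion dynamics and policy levers enter such a model:

```python
import random

random.seed(3)

class Householder:
    """Agent with a simple adoption rule (illustrative, not calibrated)."""
    def __init__(self):
        self.adopted = False
        self.threshold = random.uniform(0.1, 0.9)  # economic reluctance

    def step(self, adoption_rate, subsidy):
        # Adopt when peer pressure plus the policy subsidy beat the threshold.
        if not self.adopted and adoption_rate + subsidy > self.threshold:
            self.adopted = True

agents = [Householder() for _ in range(1000)]
for year in range(20):
    rate = sum(a.adopted for a in agents) / len(agents)
    for a in agents:
        a.step(rate, subsidy=0.15)

final_rate = sum(a.adopted for a in agents) / len(agents)
```

With this rule set, a modest subsidy seeds a few early adopters whose example then cascades through the population, whereas with `subsidy=0.0` no agent ever adopts; this is exactly the kind of policy cost-effectiveness question the paper argues an ABM can address.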
Abstract:
A recent nonlinear system by Friston et al. (2000, NeuroImage 12: 466–477) links changes in the BOLD response to changes in neural activity. The system consists of five subsystems, linking: (1) neural activity to flow changes; (2) flow changes to oxygen delivery to tissue; (3) flow changes to changes in blood volume and venous outflow; (4) changes in flow, volume, and oxygen extraction fraction to deoxyhemoglobin changes; and finally (5) volume and deoxyhemoglobin changes to the BOLD response. Friston et al. exploit, in subsystem 2, a model by Buxton and Frank coupling flow changes to changes in oxygen metabolism that assumes the tissue oxygen concentration to be close to zero. We describe below a model of the coupling between flow and oxygen delivery which takes into account the modulatory effect of changes in tissue oxygen concentration. The major development has been to extend the original Buxton and Frank model for oxygen transport to a full dynamic capillary model, making the model applicable to both transient and steady-state conditions. Furthermore, our modification enables us to determine the time series of CMRO2 changes under different conditions, including CO2 challenges. We compare the differences in the performance of the “Friston system” using the original model of Buxton and Frank and that of our model. We also compare the data predicted by our model (with appropriate parameters) to data from a series of OIS studies. The qualitative differences in the behaviour of the models are exposed by different experimental simulations and by comparison with the results of OIS data from brief and extended stimulation protocols and from experiments using hypercapnia.
Abstract:
This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we explore only the most accessible version, rejection-ABC. Rejection-ABC involves running a model a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC's accepted values produced slightly better fits than literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm's movement and much of the energy budget. We were able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs.
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
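Rejection-ABC as described above (sample from the priors, simulate, retain the runs closest to the observations) is compact enough to sketch in full. The one-parameter growth model below is a hypothetical stand-in for the earthworm IBM, and the synthetic "observations" are generated from a known parameter so the recovered posterior can be checked:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(growth_rate, t):
    """Hypothetical stand-in for the IBM: a deterministic growth curve."""
    return 0.5 * np.exp(growth_rate * t)

t = np.linspace(0, 10, 20)
observed = model(0.3, t) + rng.normal(0, 0.1, t.size)  # synthetic 'data'

# Rejection-ABC: draw parameters from the prior, run the model,
# keep the draws whose output is closest to the observations.
n_draws = 20000
prior_draws = rng.uniform(0.0, 1.0, n_draws)   # uniform prior on growth rate
distances = np.array([np.sqrt(np.mean((model(g, t) - observed) ** 2))
                      for g in prior_draws])
accepted = prior_draws[distances <= np.quantile(distances, 0.01)]  # top 1%

posterior_mean = accepted.mean()
```

The accepted draws approximate the posterior: their spread gives the credible interval, and whether that interval is narrower than the prior is exactly the "narrowing" criterion applied to the seven estimable IBM parameters.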
Abstract:
Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, using either well-founded empirical relationships or process-based models with good predictive skill. A large variety of models exist today and it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international project to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we summarise the current state of the art in fire regime modelling and model evaluation, and outline what lessons may be learned from FireMIP.
Abstract:
This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever performs better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
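Two ingredients of the algorithm are easy to make concrete: the RLS update for a selected sub-model, and the closed-form sum-to-one combination of the M sub-model predictions over a recent data window (a linearly constrained least-squares problem solved via a Lagrange multiplier). This is a sketch of those two steps only, not the paper's full selection-and-switching logic:

```python
import numpy as np

def rls_update(w, P, x, y, lam=0.99):
    """One recursive-least-squares step for a linear sub-model y ≈ w·x,
    with forgetting factor lam."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    w = w + k * (y - w @ x)            # coefficient update
    P = (P - np.outer(k, Px)) / lam    # inverse-correlation matrix update
    return w, P

def combine(F, y):
    """Closed-form combination weights minimising ||y - F w||^2 subject to
    sum(w) = 1, where F holds the M sub-model predictions over the window
    (one column per sub-model)."""
    A = F.T @ F + 1e-8 * np.eye(F.shape[1])   # tiny ridge for invertibility
    b = F.T @ y
    ones = np.ones(F.shape[1])
    Ainv_b = np.linalg.solve(A, b)
    Ainv_1 = np.linalg.solve(A, ones)
    lam = (1.0 - ones @ Ainv_b) / (ones @ Ainv_1)  # Lagrange multiplier
    return Ainv_b + lam * Ainv_1
```

Because the constrained problem has this closed-form solution, the combination step costs only one small linear solve per window, which is the source of the computational efficiency claimed above.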
Abstract:
Reanalysis data obtained from data assimilation are increasingly used for diagnostic studies of the general circulation of the atmosphere, for the validation of modelling experiments and for estimating energy and water fluxes between the Earth's surface and the atmosphere. Because fluxes are not specifically observed, but determined by the data assimilation system, they are influenced not only by the utilized observations but also by model physics and dynamics and by the assimilation method. In order to better understand the relative importance of humidity observations for the determination of the hydrological cycle, in this paper we describe an assimilation experiment using the ERA40 reanalysis system in which all humidity data have been excluded from the observational database. The surprising result is that the model, driven by the time evolution of wind, temperature and surface pressure, is able to almost completely reconstitute the large-scale hydrological cycle of the control assimilation without the use of any humidity data. In addition, analysis of the individual weather systems in the extratropics and tropics using an objective feature-tracking analysis indicates that the humidity data have very little impact on these systems. We include a discussion of these results and possible consequences for the way moisture information is assimilated, as well as the potential consequences for the design of observing systems for climate monitoring. It is further suggested, with support from a simple assimilation study with another model, that model physics and dynamics play a decisive role in the hydrological cycle, stressing the need to better understand these aspects of model parametrization.
Abstract:
This paper reports the current state of work to simplify our previous model-based methods for visual tracking of vehicles for use in a real-time system intended to provide continuous monitoring and classification of traffic from a fixed camera on a busy multi-lane motorway. The main constraints of the system design were: (i) all low level processing to be carried out by low-cost auxiliary hardware, (ii) all 3-D reasoning to be carried out automatically off-line, at set-up time. The system developed uses three main stages: (i) pose and model hypothesis using 1-D templates, (ii) hypothesis tracking, and (iii) hypothesis verification, using 2-D templates. Stages (i) & (iii) have radically different computing performance and computational costs, and need to be carefully balanced for efficiency. Together, they provide an effective way to locate, track and classify vehicles.
Abstract:
Street-level mean flow and turbulence govern the dispersion of gases away from their sources in urban areas. A suitable reference measurement in the driving flow above the urban canopy is needed both to understand and to model complex street-level flow for pollutant dispersion or emergency-response purposes. In vegetation canopies, a reference at mean canopy height is often used, but it is unclear whether this is suitable for urban canopies. This paper presents an evaluation of the quality of reference measurements both at roof-top height (z = H) and at z = 9H = 190 m, and of their ability to explain mean and turbulent variations of street-level flow. Fast-response wind data were measured at street-canyon and reference sites during the six-week DAPPLE project field campaign in spring 2004 in central London, UK, and an averaging time of 10 min was used to distinguish recirculation-type mean flow patterns from turbulence. Flow distortion at each reference site was assessed by considering turbulence intensity and streamline deflection. Each reference was then used as the dependent variable in the model of Dobre et al. (2005), which decomposes street-level flow into channelling and recirculating components. The high reference explained more of the variability of the mean flow. Coupling of turbulent kinetic energy was also stronger between street level and the high reference than the roof-top. This coupling was weaker when overnight flow was stratified and turbulence was suppressed at the high reference site; however, such events were rare (<1% of data) over the six-week period. The potential usefulness of a centralised, high reference site in London was thus demonstrated, with application to emergency response and air-quality modelling.
Abstract:
The improved empirical understanding of silt facies in Holocene coastal sequences provided by techniques such as diatom, foraminifera, ostracode and testate amoebae analysis, combined with insights from quantitative stratigraphic and hydraulic simulations, has led to an inclusive, integrated model for the palaeogeomorphology, stratigraphy, lithofacies and biofacies of northwest European Holocene coastal lowlands in relation to sea-level behaviour. The model covers two general circumstances and is empirically supported by a range of field studies in the Holocene deposits of a number of British estuaries, particularly the Severn. Where deposition was continuous over periods of centuries to millennia, and sea level fluctuated about a rising trend, the succession consists of repeated cycles of silt and peat lithofacies and biofacies in which series of transgressive overlaps (submergence sequences) alternate with series of regressive overlaps (emergence sequences) in association with the waxing and waning of tidal creek networks. Environmental and sea-level change are closely coupled, and the equilibrium secular pattern is of the kind represented ideally by a closed limit cycle. In the second circumstance, characteristic of unstable wetland shores and generally affecting smaller areas, coastal erosion ensures that episodes of deposition in the high intertidal zone last no more than a few centuries. The typical response is a series of regressive overlaps (an emergence sequence) in erosively based high-mudflat and salt-marsh silts that record, commonly as annual banding, exceptionally high deposition rates and a state of strong disequilibrium. Environmental change, including creek development, and sea-level movement are uncoupled. Only if deposition proceeds for a sufficiently long period, so that marshes mature, are equilibrium and close coupling regained.
Abstract:
Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared-difference, and their correlation. However, when the model predictions are spatially distributed across a landscape the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface. 
The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent with a non-spatial validation.
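The central point, that model skill can differ by spatial scale, can be illustrated without the full REML machinery by comparing prediction-observation correlations before and after block-averaging to a coarser scale. The synthetic transect below is purely illustrative; the linear mixed model itself is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-D transect: a coarse-scale trend shared by model and data,
# plus fine-scale variation present only in the observations.
n = 400
x = np.arange(n)
trend = np.sin(2 * np.pi * x / n)          # coarse-scale component
obs = trend + 0.5 * rng.normal(size=n)     # fine-scale variation in the data
pred = trend + 0.1 * rng.normal(size=n)    # model misses the fine-scale part

def corr_at_scale(a, b, block):
    """Correlation after block-averaging both series to a coarser scale."""
    a = a[: n - n % block].reshape(-1, block).mean(axis=1)
    b = b[: n - n % block].reshape(-1, block).mean(axis=1)
    return np.corrcoef(a, b)[0, 1]

fine = corr_at_scale(obs, pred, 1)      # point-scale agreement
coarse = corr_at_scale(obs, pred, 40)   # coarse-scale agreement
```

Here the model agrees with the observations much better at the coarse scale than at the fine scale, which is the qualitative pattern the mixed-model analysis detected: a shared trend surface at coarse scales, but poorly correlated fine-scale variation.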
Abstract:
Answering many of the critical questions in conservation, development and environmental management requires integrating the social and natural sciences. However, understanding the array of available quantitative methods and their associated terminology presents a major barrier to successful collaboration. We provide an overview of quantitative socio-economic methods that distils their complexity into a simple taxonomy. We outline how each has been used in conjunction with ecological models to address questions relating to the management of socio-ecological systems. We review the application of social and ecological quantitative concepts to agro-ecology and classify the approaches used to integrate the two disciplines. Our review included all published integrated models from 2003 to 2008 in 27 journals that publish agricultural modelling research. Although our focus is on agro-ecology, many of the results are broadly applicable to other fields involving an interaction between human activities and ecology. We found 36 papers that integrated social and ecological concepts in a quantitative model. Four different approaches to integration were used, depending on the scale at which human welfare was quantified. Most models viewed humans as pure profit maximizers, both when calculating welfare and predicting behaviour. Synthesis and applications. We reached two main conclusions based on our taxonomy and review. The first is that quantitative methods that extend predictions of behaviour and measurements of welfare beyond a simple market value basis are underutilized by integrated models. The second is that the accuracy of prediction for integrated models remains largely unquantified. Addressing both problems requires researchers to reach a common understanding of modelling goals and data requirements during the early stages of a project.