31 results for REALISTIC MODELS
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and with the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design conservation and fisheries-exploitation strategies, and to assess the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life-history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible and then reproduce as fast as possible, while minimising the per capita death rate. Physiological and behavioural ecology are largely built on these principles, together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure.
A common solution is to manually compare model outputs with observations from real landscapes so as to obtain parameters that produce acceptable fits of model to data. However, this procedure can be convoluted and can lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
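The rejection flavour of ABC referred to above can be sketched in a few lines: draw parameters from a prior, simulate the stochastic model, and retain draws whose output falls within a tolerance of the observed summary statistic. The toy "model", prior, distance and tolerance below are illustrative assumptions, not taken from any of the studies discussed.

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_draws=2000):
    """Rejection ABC: draw parameters from the prior, simulate the model,
    and keep draws whose output lies within eps of the observed summary."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()               # draw from the prior
        sim = simulate(theta)                # run the stochastic model once
        if distance(sim, observed) <= eps:   # compare summary statistics
            accepted.append(theta)
    return accepted                          # approximate posterior sample

# Toy stochastic "IBM": a single parameter observed through unit Gaussian noise.
random.seed(1)
posterior = abc_rejection(
    observed=5.0,
    simulate=lambda th: th + random.gauss(0.0, 1.0),
    prior_sample=lambda: random.uniform(0.0, 10.0),
    distance=lambda s, o: abs(s - o),
    eps=0.5,
)
```

Shrinking `eps` tightens the posterior approximation at the cost of a lower acceptance rate, which is the basic trade-off ABC practitioners tune.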
An empirical study of process-related attributes in segmented software cost-estimation relationships
Abstract:
Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics when the historical database considered contains data coming from projects of a heterogeneous nature. Segmenting the input domain according to clusters obtained from the database of historical projects provides a tool for building more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without prior consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study using the ISBSG-8 database and the EM clustering algorithm that examines the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that including these attributes significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
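The idea of several local estimation relationships can be illustrated with a minimal sketch: fit a separate log-log effort-size regression per segment instead of one global curve. The segment labels, data and exponents below are invented for illustration; the paper's actual pipeline uses the ISBSG-8 database and EM clustering to derive the segments.

```python
import numpy as np

def fit_segmented_models(size, effort, labels):
    """Fit one log-log effort-size relationship per segment, rather than a
    single parametric curve over a heterogeneous project database."""
    models = {}
    for seg in np.unique(labels):
        mask = labels == seg
        slope, intercept = np.polyfit(np.log(size[mask]), np.log(effort[mask]), 1)
        models[seg] = (slope, intercept)
    return models

def predict_effort(models, seg, size):
    """Predict effort with the local relationship of the project's segment."""
    slope, intercept = models[seg]
    return np.exp(intercept) * size ** slope

# Two synthetic segments with different productivity exponents.
size = np.array([10, 20, 40, 80, 10, 20, 40, 80], dtype=float)
labels = np.array(["method", "method", "method", "method",
                   "case", "case", "case", "case"])
effort = np.where(labels == "method", 2.0 * size ** 0.8, 5.0 * size ** 1.2)
models = fit_segmented_models(size, effort, labels)
```

On this noiseless toy data each local fit recovers its segment's exponent exactly, whereas a single global fit would average the two regimes.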
Abstract:
We present a novel algorithm for joint state-parameter estimation using sequential three-dimensional variational data assimilation (3D Var) and demonstrate its application in the context of morphodynamic modelling using an idealised two-parameter 1D sediment transport model. The new scheme combines a static representation of the state background error covariances with a flow-dependent approximation of the state-parameter cross-covariances. For the case presented here, this involves calculating a local finite-difference approximation of the gradient of the model with respect to the parameters. The new method is easy to implement and computationally inexpensive to run. Experimental results are positive, with the scheme able to recover the model parameters to a high level of accuracy. We expect that there is potential for successful application of this new methodology to larger, more realistic models with more complex parameterisations.
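The local finite-difference approximation of the model's parameter sensitivity, from which the flow-dependent state-parameter cross-covariances are built, can be sketched as follows. The toy advection-like model and the perturbation step size are illustrative assumptions, not the sediment transport model of the paper.

```python
import numpy as np

def param_gradient(model, x, p, dp=1e-4):
    """Local finite-difference approximation of d(model)/d(parameters),
    evaluated at the current state x; column j is the sensitivity of the
    forecast to parameter j."""
    base = model(x, p)
    grad = np.empty((base.size, p.size))
    for j in range(p.size):
        p_pert = p.copy()
        p_pert[j] += dp                       # perturb one parameter at a time
        grad[:, j] = (model(x, p_pert) - base) / dp
    return grad

# Toy 1D model: state nudged by a parameter-scaled spatial gradient plus a bias.
def toy_model(x, p):
    return x + p[0] * np.gradient(x) + p[1]

x = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))
p = np.array([0.1, 0.05])
G = param_gradient(toy_model, x, p)
# The state-parameter cross-covariance block of the augmented background error
# covariance matrix can then be formed from G together with the static
# state background error covariances.
```

Because the scheme only needs one extra model run per parameter, this approximation stays cheap even as the state dimension grows.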
Abstract:
A new approach to the study of local organization in amorphous polymer materials is presented. The method couples neutron diffraction experiments that explore the structure on the spatial scale 1–20 Å with a reverse Monte Carlo fitting procedure to predict structures that accurately represent the experimental scattering results over the whole momentum-transfer range explored. Molecular mechanics and molecular dynamics techniques are also used to produce atomistic models independently of any experimental input, thereby providing a test of the viability of the reverse Monte Carlo method in generating realistic models of amorphous polymeric systems. An analysis of the obtained models in terms of single-chain properties and of orientational correlations between chain segments is presented. We show the viability of the method with data from molten polyethylene. The analysis derives a model with average C-C and C-H bond lengths of 1.55 Å and 1.1 Å respectively, an average backbone valence angle of 112°, a torsional angle distribution characterized by a fraction of trans conformers of 0.67 and, finally, a weak interchain orientational correlation at around 4 Å.
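The accept/reject core of a reverse Monte Carlo fit can be sketched as below: displace a random particle, recompute the χ² agreement with the experimental data, and accept improving moves (or worsening ones with a Metropolis-like probability). The one-dimensional toy "structure", the χ² form and the move size are illustrative assumptions; a real RMC run fits the measured structure factor over the full momentum-transfer range.

```python
import math
import random

def rmc_step(coords, target, compute_signal, chi2_old, sigma, max_move=0.1):
    """One reverse Monte Carlo move: displace a random particle, accept if
    the fit improves, or with probability exp(-delta_chi2 / 2) if it worsens."""
    i = random.randrange(len(coords))
    old = coords[i]
    coords[i] = old + random.uniform(-max_move, max_move)
    chi2_new = sum((s - t) ** 2
                   for s, t in zip(compute_signal(coords), target)) / sigma ** 2
    if chi2_new <= chi2_old or random.random() < math.exp(-(chi2_new - chi2_old) / 2.0):
        return chi2_new                      # accept the move
    coords[i] = old                          # reject: restore the old position
    return chi2_old

# Toy fit: pull five 1D particles onto ideal lattice positions.
random.seed(0)
target = [0.0, 1.0, 2.0, 3.0, 4.0]
coords = [0.5, 1.5, 2.5, 3.5, 4.5]
signal = lambda c: sorted(c)                 # stand-in for a computed pattern
sigma = 0.1
chi2 = sum((s - t) ** 2 for s, t in zip(signal(coords), target)) / sigma ** 2
chi2_start = chi2
for _ in range(500):
    chi2 = rmc_step(coords, target, signal, chi2, sigma)
```

The occasional acceptance of worsening moves is what keeps the fit from locking into a single, possibly unphysical, local minimum.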
Abstract:
The destructive environmental and socio-economic impacts of the El Niño/Southern Oscillation (ENSO) [1,2] demand an improved understanding of how ENSO will change under future greenhouse warming. Robust projected changes in certain aspects of ENSO have recently been established [3-5]. However, there is as yet no consensus on the change in the magnitude of the associated sea surface temperature (SST) variability [6-8], commonly used to represent ENSO amplitude [1,6], despite its strong effects on marine ecosystems and rainfall worldwide [1-4,9]. Here we show that the response of ENSO SST amplitude is time-varying, with an increasing trend in ENSO amplitude before 2040, followed by a decreasing trend thereafter. We attribute the previous lack of consensus to an expectation that the trend in ENSO amplitude over the entire twenty-first century is unidirectional, and to unrealistic model dynamics of tropical Pacific SST variability. We examine these complex processes across 22 models in the Coupled Model Intercomparison Project phase 5 (CMIP5) database [10], forced under historical and greenhouse warming conditions. The nine most realistic models identified show a strong consensus on the time-varying response and reveal that the non-unidirectional behaviour is linked to a longitudinal difference in the surface warming rate across the Indo-Pacific basin. Our results carry important implications for climate projections and climate adaptation pathways.
Abstract:
Accurate seasonal forecasts rely on the presence of low-frequency, predictable signals in the climate system which have a sufficiently well understood and significant impact on the atmospheric circulation. In the Northern European region, signals associated with seasonal-scale variability such as ENSO, North Atlantic SST anomalies and the North Atlantic Oscillation have not yet proven sufficient to enable satisfactorily skilful dynamical seasonal forecasts. The winter-time circulations of the stratosphere and troposphere are highly coupled. It is therefore possible that additional seasonal forecasting skill may be gained by including a realistic stratosphere in models. In this study we assess the ability of five seasonal forecasting models to simulate the Northern Hemisphere extra-tropical winter-time stratospheric circulation. Our results show that all of the models have a polar night jet which is too weak and displaced southward compared to re-analysis data. It is shown that the models underestimate the number, magnitude and duration of periods of anomalous stratospheric circulation. Despite the poor representation of the general circulation of the stratosphere, the results indicate that there may be a detectable tropospheric response following anomalous circulation events in the stratosphere. However, the models fail to exhibit any predictability in their forecasts. These results highlight some of the deficiencies of current seasonal forecasting models with a poorly resolved stratosphere. The combination of these results with other recent studies showing a tropospheric response to stratospheric variability demonstrates a real prospect for improving the skill of seasonal forecasts.
Abstract:
This chapter introduces ABMs, their construction, and the pros and cons of their use. Although relatively new, agent-based models (ABMs) have great potential for use in ecotoxicological research – their primary advantage being the realistic simulations that can be constructed, and particularly their explicit handling of space and time in simulations. Examples are provided of their use in ecotoxicology, primarily exemplified by different implementations of the ALMaSS system. The examples presented demonstrate how multiple stressors, landscape structure, details regarding toxicology, animal behavior, and socioeconomic effects can and should be taken into account when constructing simulations for risk assessment. As in ecological systems, in ABMs the behavior at the system level is not simply the mean of the component responses but the sum of the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.
Abstract:
Motivation: We compare phylogenetic approaches for inferring functional gene links. The approaches detect independent instances of the correlated gain and loss of pairs of genes from species' genomes. We investigate the effect on results of basing evidence of correlations on two phylogenetic approaches, Dollo parsimony and maximum likelihood (ML). We further examine the effect of constraining the ML model by fixing the rate of gene gain at a low value, rather than estimating it from the data. Results: We detect correlated evolution among a test set of pairs of yeast (Saccharomyces cerevisiae) genes, with a case study of 21 eukaryotic genomes and test data derived from known yeast protein complexes. If the rate at which genes are gained is constrained to be low, ML achieves by far the best results at detecting known functional links. The model then has fewer parameters but is more realistic, since it prevents genes from being gained more than once. Availability: BayesTraits by M. Pagel and A. Meade, and a script to configure and repeatedly launch it by D. Barker and M. Pagel, are available at http://www.evolution.reading.ac.uk .
Abstract:
Physiological evidence using infrared video microscopy during the uncaging of glutamate has proven the existence of excitable calcium ion channels in spine heads, highlighting the need for reliable models of spines. In this study we compare the three main methods of simulating excitable spines: Baer & Rinzel's Continuum (B&R) model, Coombes' Spike-Diffuse-Spike (SDS) model, and paired cable and ion channel equations (Cable model). Tests are performed to determine how well the models approximate each other in terms of the speeds and heights of travelling waves. Significant quantitative differences are found between the models: travelling waves in the SDS model in particular are found to travel at much lower speeds, and sometimes at much higher voltages, than in the Cable or B&R models. Meanwhile, qualitative differences are found between the B&R and SDS models over realistic parameter ranges. The causes of these differences are investigated and potential solutions proposed.
Abstract:
The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
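The Moran's I screening of subsamples described above can be sketched as follows. The inverse-distance weighting scheme and the toy residual fields are illustrative assumptions; the study applies the test to water-elevation residuals along the flood margin.

```python
import numpy as np

def morans_i(values, coords):
    """Moran's I spatial autocorrelation with inverse-distance weights:
    values near zero suggest no spatial dependency, while large positive
    values mean nearby points carry similar residuals."""
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.where(d > 0.0, 1.0 / np.where(d == 0.0, 1.0, d), 0.0)  # zero on diagonal
    n = len(values)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Toy residual fields on random points in the unit square.
rng = np.random.default_rng(42)
coords = rng.random((100, 2))
clustered = coords[:, 0]              # residuals that vary smoothly in space
independent = rng.standard_normal(100)  # spatially random residuals
```

A subsampling loop would then repeatedly draw random subsets of the data points and keep only those whose Moran's I is not significantly different from its null expectation, before calibrating against each retained subset.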
Abstract:
Tropical cyclones (TCs) are normally not studied at the individual level with Global Climate Models (GCMs), because the coarse grid spacing is often deemed insufficient for a realistic representation of the basic underlying processes. GCMs are indeed routinely deployed at low resolution in order to enable sufficiently long integrations, which means that only large-scale TC proxies are diagnosed. A new class of GCMs is emerging, however, which is capable of simulating TC-type vortices while retaining a horizontal resolution similar to that of operational NWP GCMs; running them on the latest supercomputers enables the completion of long-term integrations. The UK-Japan Climate Collaboration (UJCC) and UK-HiGEM projects have developed climate GCMs which can be run routinely for decades (with a grid spacing of 60 km) or centuries (with a grid spacing of 90 km); when coupled to the ocean GCM, a mesh of 1/3 degree provides eddy-permitting resolution. The 90 km resolution model has been developed entirely by the UK-HiGEM consortium (together with its 1/3 degree ocean component); the 60 km atmospheric GCM has been developed by UJCC, in collaboration with the Met Office Hadley Centre.
Abstract:
Current variability of precipitation (P) and its response to surface temperature (T) are analysed using coupled (CMIP5) and atmosphere-only (AMIP5) climate model simulations and compared with observational estimates. There is striking agreement between Global Precipitation Climatology Project (GPCP) observed and AMIP5-simulated P anomalies over land, both globally and in the tropics, suggesting that prescribed sea surface temperatures and realistic radiative forcings are sufficient for simulating the interannual variability in continental P. Differences between the observed and simulated P variability over the ocean originate primarily from the wet tropical regions, in particular the western Pacific, but are reduced slightly after 1995. All datasets show positive responses of P to T globally, of around 2%/K for simulations and 3-4%/K for GPCP observations, but model responses over the tropical oceans are around three times smaller than GPCP over the period 1988-2005. The observed anticorrelation between land and ocean P, linked with the El Niño Southern Oscillation, is captured by the simulations. All datasets over the tropical ocean show a tendency for wet regions to become wetter and dry regions drier with warming. Over the wet region (above the 75th percentile of precipitation), the precipitation response is ~13-15%/K for GPCP and ~5%/K for the models, while trends in P are 2.4%/decade for GPCP, 0.6%/decade for CMIP5 and 0.9%/decade for AMIP5, suggesting that the models are underestimating the precipitation response or that a deficiency exists in the satellite datasets.
Abstract:
The extra-tropical response to El Niño in configurations of a coupled model with increased horizontal resolution in the oceanic component is shown to be more realistic than in configurations with a low resolution oceanic component. This general conclusion is independent of the atmospheric resolution. Resolving small-scale processes in the ocean produces a more realistic oceanic mean state, with a reduced cold tongue bias, which in turn allows the atmospheric model component to be forced more realistically. A realistic atmospheric basic state is critical in order to represent Rossby wave propagation in response to El Niño, and hence the extra-tropical response to El Niño. Through the use of high and low resolution configurations of the forced atmospheric-only model component we show that, in isolation, atmospheric resolution does not significantly affect the simulation of the extra-tropical response to El Niño. It is demonstrated, through perturbations to the SST forcing of the atmospheric model component, that biases in the climatological SST field typical of coupled model configurations with low oceanic resolution can account for the erroneous atmospheric basic state seen in these coupled model configurations. These results highlight the importance of resolving small-scale oceanic processes in producing a realistic large-scale mean climate in coupled models, and suggest that it may be possible to “squeeze out” valuable extra performance from coupled models through increases to oceanic resolution alone.
Abstract:
Summary
1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.
2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If energy intake is sufficient, an animal allocates the energy obtained in the order maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all available energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically.
3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate.
4. If parameter values cannot be obtained directly, they may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations.
5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shell fisheries, to assess environmental impacts of building proposals including wind farms and highways, and to assess the effects on nontarget organisms of chemicals for the control of agricultural pests.
Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
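The priority ordering proposed in point 2 of the summary above can be sketched as a small allocation routine. The function signature, the "spendable reserves above a critical floor" convention and the numbers below are illustrative assumptions, not values or code from any particular ABM.

```python
def allocate_energy(intake, needs, reserves, reserve_target, reserve_critical):
    """Allocate assimilated energy in priority order
    (maintenance -> growth -> reproduction -> storage); once reserves fall
    to the critical threshold, everything available goes to maintenance."""
    allocated = {"maintenance": 0.0, "growth": 0.0, "reproduction": 0.0}
    if reserves <= reserve_critical:
        budget = intake + reserves            # emergency: maintenance only
        allocated["maintenance"] = min(budget, needs["maintenance"])
        return allocated, budget - allocated["maintenance"]
    budget = intake + (reserves - reserve_critical)  # reserves above the floor are spendable
    for item in ("maintenance", "growth", "reproduction"):
        spend = min(budget, needs[item])
        allocated[item] = spend
        budget -= spend
    # Any surplus tops reserves back up, capped at the optimal storage level.
    new_reserves = min(reserve_critical + budget, reserve_target)
    return allocated, new_reserves

# A plentiful day: all demands are met and the surplus is stored.
alloc, res = allocate_energy(
    intake=10.0,
    needs={"maintenance": 3.0, "growth": 2.0, "reproduction": 2.0},
    reserves=5.0, reserve_target=20.0, reserve_critical=1.0,
)
```

In a full ABM, `intake` would come from the foraging submodel and the demand terms would scale with body mass and temperature, as the summary recommends.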