126 results for Event-based Model
Abstract:
Multiscale modeling is emerging as one of the key challenges in mathematical biology. However, the recent rapid increase in the number of modeling methodologies being used to describe cell populations has raised a number of interesting questions. For example, at the cellular scale, how can the appropriate discrete cell-level model be identified in a given context? Additionally, how can the many phenomenological assumptions used in the derivation of models at the continuum scale be related to individual cell behavior? In order to begin to address such questions, we consider a discrete one-dimensional cell-based model in which cells are assumed to interact via linear springs. From the discrete equations of motion, the continuous Rouse [P. E. Rouse, J. Chem. Phys. 21, 1272 (1953)] model is obtained. This formalism readily allows the definition of a cell number density for which a nonlinear "fast" diffusion equation is derived. Excellent agreement is demonstrated between the continuum and discrete models. Subsequently, via the incorporation of cell division, we demonstrate that the derived nonlinear diffusion model is robust to the inclusion of more realistic biological detail. In the limit of stiff springs, where cells can be considered to be incompressible, we show that cell velocity can be directly related to cell production. This assumption is frequently made in the literature but our derivation places limits on its validity. Finally, the model is compared with a model of a similar form recently derived for a different discrete cell-based model and it is shown how the different diffusion coefficients can be understood in terms of the underlying assumptions about cell behavior in the respective discrete models.
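As a rough illustration of the discrete-to-continuum step summarised above (the notation here is assumed, not quoted from the paper): overdamped cell boundaries x_i joined by linear springs of stiffness k, with drag coefficient \eta, obey

    \eta \frac{\mathrm{d}x_i}{\mathrm{d}t} = k\,\left(x_{i+1} - 2x_i + x_{i-1}\right),

and, writing q(x,t) for the cell number density, the continuum limit takes the form of a nonlinear diffusion equation

    \frac{\partial q}{\partial t} = \frac{\partial}{\partial x}\!\left( D(q)\,\frac{\partial q}{\partial x} \right), \qquad D(q) \propto \frac{k}{\eta\, q^{2}},

whose diffusivity increases as the density falls, which is the sense in which the equation is of "fast" diffusion type.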
Abstract:
The effect of different sugars and glyoxal on the formation of acrylamide in low-moisture starch-based model systems was studied, and kinetic data were obtained. Glucose was more effective than fructose, tagatose, or maltose in acrylamide formation, and the importance of glyoxal as a key sugar fragmentation intermediate was confirmed. Glyoxal formation was greater in model systems containing asparagine and glucose than in those containing fructose. A solid-phase microextraction GC-MS method was employed to determine quantitatively the formation of pyrazines in model reaction systems. Substituted pyrazine formation was more evident in model systems containing fructose; however, the unsubstituted homologue, which was the only pyrazine identified in the headspace of glyoxal-asparagine systems, was formed at higher yields when aldoses were used as the reducing sugar. Highly significant correlations were obtained between pyrazine and acrylamide formation. The importance of the tautomerization of the asparagine-carbonyl decarboxylated Schiff base in the relative yields of pyrazines and acrylamide is discussed.
Abstract:
The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameter models with universal approximation capabilities has been intensively studied and widely used due to the availability of many linear learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameter models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data only. The important concepts used to achieve good model generalisation in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex optimisation-based model construction algorithms, including support vector regression, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
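The flavour of a linear-in-the-parameter model with a generalisation-oriented selection criterion can be sketched as follows; this is an illustrative Python example, not one of the algorithms reviewed in the article, and all basis functions, data and settings are assumptions.

# A linear-in-the-parameter model y ~ Phi(x) w built from Gaussian basis functions,
# with ridge-regularised weights and the regularisation parameter chosen by
# leave-one-out cross-validation via the PRESS residuals.
import numpy as np

def design_matrix(x, centres, width):
    """Gaussian radial basis functions evaluated at the inputs x."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 60)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(x.size)      # observed data only

Phi = design_matrix(x, centres=np.linspace(-1.0, 1.0, 12), width=0.3)

best = None
for lam in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
    # Hat matrix of the regularised least-squares fit for this candidate lambda.
    H = Phi @ np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T)
    residuals = y - H @ y
    press = np.mean((residuals / (1.0 - np.diag(H))) ** 2)   # leave-one-out error
    if best is None or press < best[1]:
        best = (lam, press)

print("selected regularisation parameter:", best[0])

The same structure carries over when the candidate terms come from a larger dictionary and the criterion is used to decide which terms to keep.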
Abstract:
Remote sensing is the only practicable means to observe snow at large scales. Measurements from passive microwave instruments have been used to derive snow climatology since the late 1970s, but the algorithms used were limited by the computational power of the era. Simplifications such as the assumption of constant snow properties enabled snow mass to be retrieved from the microwave measurements, but those assumptions, which are still used today, introduce large errors. A better approach is to perform retrievals within a data assimilation framework, in which a physically based model of the snow properties is used to produce the best estimate of the snow cover, in conjunction with multi-sensor observations such as grain size, surface temperature, and microwave radiation. We have extended an existing snow model, SNOBAL, to incorporate mass and energy transfer in the soil and to simulate the growth of snow grains. An evaluation of this model is presented and techniques for the development of new retrieval systems are discussed.
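The assimilation step referred to above can be illustrated generically with a scalar Kalman-style update; this sketch is not the SNOBAL-based retrieval system itself, and the numbers are placeholders.

# Combine a model forecast of snow water equivalent with an observation-based
# retrieval, weighting each by its error variance.
def assimilate(forecast, forecast_var, observation, observation_var):
    gain = forecast_var / (forecast_var + observation_var)
    analysis = forecast + gain * (observation - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Example: a forecast of 120 mm SWE versus a retrieved value of 95 mm.
print(assimilate(forecast=120.0, forecast_var=400.0,
                 observation=95.0, observation_var=225.0))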
Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target at an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and suited to modelling technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals’ behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
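A minimal Python sketch of the agent-based idea is given below; it is not a model from the paper, and every rule and number is an assumption chosen only to show the mechanism of diffusion through a population of rule-following agents.

import random

random.seed(1)

class Householder:
    """An agent representing one household; rules and parameters are illustrative."""
    def __init__(self):
        self.budget = random.uniform(0, 10_000)   # available capital
        self.green_attitude = random.random()     # 0 = indifferent, 1 = very green
        self.adopted = False

    def step(self, adoption_rate, price, subsidy):
        # Rule set: adopt if the measure is affordable and attitude plus
        # peer influence clears a threshold.
        if self.adopted:
            return
        affordable = self.budget >= price - subsidy
        willing = self.green_attitude + 0.5 * adoption_rate > 0.8
        if affordable and willing:
            self.adopted = True

agents = [Householder() for _ in range(1000)]
for year in range(10):
    rate = sum(a.adopted for a in agents) / len(agents)
    for agent in agents:
        agent.step(adoption_rate=rate, price=6000, subsidy=2000)
    print("year", year, "adopters:", sum(a.adopted for a in agents))

Varying the assumed price or subsidy and re-running the loop is the kind of policy experiment such a model makes possible.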
Abstract:
It is thought that speciation in phytophagous insects is often due to colonization of novel host plants, because radiations of plant and insect lineages are typically asynchronous. Recent phylogenetic comparisons have supported this model of diversification for both insect herbivores and specialized pollinators. An exceptional case where contemporaneous plant-insect diversification might be expected is the obligate mutualism between fig trees (Ficus species, Moraceae) and their pollinating wasps (Agaonidae, Hymenoptera). The ubiquity and ecological significance of this mutualism in tropical and subtropical ecosystems has long intrigued biologists, but the systematic challenge posed by >750 interacting species pairs has hindered progress toward understanding its evolutionary history. In particular, taxon sampling and analytical tools have been insufficient for large-scale co-phylogenetic analyses. Here, we sampled nearly 200 interacting pairs of fig and wasp species from across the globe. Two supermatrices were assembled: on average, wasps had sequences from 77% of six genes (5.6 kb), figs had sequences from 60% of five genes (5.5 kb), and overall 850 new DNA sequences were generated for this study. We also developed a new analytical tool, Jane 2, for event-based phylogenetic reconciliation analysis of very large data sets. Separate Bayesian phylogenetic analyses for figs and fig wasps under relaxed molecular clock assumptions indicate Cretaceous diversification of crown groups and contemporaneous divergence for nearly half of all fig and pollinator lineages. Event-based co-phylogenetic analyses further support the co-diversification hypothesis. Biogeographic analyses indicate that the present-day distribution of fig and pollinator lineages is consistent with a Eurasian origin and subsequent dispersal, rather than with Gondwanan vicariance. Overall, our findings indicate that the fig-pollinator mutualism represents an extreme case among plant-insect interactions of coordinated dispersal and long-term co-diversification.
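The event-based reconciliation idea can be stated very simply; the toy Python fragment below is not Jane 2, and the event costs are arbitrary placeholders, but it shows how a candidate mapping of the pollinator tree onto the fig tree is scored.

# A reconciliation is summarised by counts of event types; the analysis searches
# for the mapping that minimises the total weighted cost.
EVENT_COSTS = {"cospeciation": 0, "duplication": 1, "host_switch": 2, "loss": 1}

def reconciliation_cost(event_counts, costs=EVENT_COSTS):
    return sum(costs[event] * n for event, n in event_counts.items())

# A mapping dominated by cospeciation scores better than one requiring many host switches.
print(reconciliation_cost({"cospeciation": 90, "duplication": 3, "host_switch": 5, "loss": 10}))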
Abstract:
The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
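Of the three methods mentioned, the Markov-chain option is the most compact to illustrate; the Python sketch below uses wholly assumed states and transition probabilities and is not taken from the prototype model.

import numpy as np

states = ["no measure", "insulation only", "insulation + heat pump"]
# Assumed annual transition probabilities between stock states (rows sum to 1).
P = np.array([
    [0.90, 0.08, 0.02],
    [0.00, 0.93, 0.07],
    [0.00, 0.00, 1.00],
])

share = np.array([0.80, 0.18, 0.02])   # assumed current shares of the stock
for year in range(2025, 2031):
    share = share @ P
    print(year, dict(zip(states, np.round(share, 3))))

The limitation noted in the text is visible here: the transition probabilities are fixed inputs, whereas an agent-based model lets them emerge from individual buying decisions.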
Abstract:
Photoelectron spectroscopy and scanning tunneling microscopy have been used to investigate how the oxidation state of Ce in CeO2-x(111) ultrathin films is influenced by the presence of Pd nanoparticles. Pd induces an increase in the concentration of Ce3+ cations, which is interpreted as charge transfer from Pd to CeO2-x(111) on the basis of DFT+U calculations. Charge transfer from Pd to Ce4+ is found to be energetically favorable even for individual Pd adatoms. These results have implications for our understanding of the redox behavior of ceria-based model catalyst systems.
Abstract:
This paper introduces a new agent-based model, which incorporates the actions of individual homeowners in a long-term domestic stock model, and details how it was applied in energy policy analysis. The results indicate that current policies are likely to fall significantly short of the 80% target and suggest that current subsidy levels need re-examining. In the model, current subsidy levels appear to offer too much support to some technologies, which in turn leads to the suppression of other technologies that have a greater energy saving potential. The model can be used by policy makers to develop further scenarios to find alternative, more effective, sets of policy measures. The model is currently limited to the owner-occupied stock in England, although it can be expanded, subject to the availability of data.
Abstract:
We present a new, process-based model of soil and stream water dissolved organic carbon (DOC): the Integrated Catchments Model for Carbon (INCA-C). INCA-C is the first model of DOC cycling to explicitly include effects of different land cover types, hydrological flow paths, in-soil carbon biogeochemistry, and surface water processes on in-stream DOC concentrations. It can be calibrated using only routinely available monitoring data. INCA-C simulates daily DOC concentrations over a period of years to decades. Sources, sinks, and transformation of solid and dissolved organic carbon in peat and forest soils, wetlands, and streams as well as organic carbon mineralization in stream waters are modeled. INCA-C is designed to be applied to natural and seminatural forested and peat-dominated catchments in boreal and temperate regions. Simulations at two forested catchments showed that seasonal and interannual patterns of DOC concentration could be modeled using climate-related parameters alone. A sensitivity analysis showed that model predictions were dependent on the mass of organic carbon in the soil and that in-soil process rates were dependent on soil moisture status. Sensitive rate coefficients in the model included those for organic carbon sorption and desorption and DOC mineralization in the soil. The model was also sensitive to the amount of litter fall. Our results show the importance of climate variability in controlling surface water DOC concentrations and suggest the need for further research on the mechanisms controlling production and consumption of DOC in soils.
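A highly simplified sketch of the kind of source/sink bookkeeping described (not INCA-C's actual process equations; every rate below is a placeholder) is:

# Daily first-order exchange between solid organic carbon (SOC) and dissolved
# organic carbon (DOC) in the soil, with in-soil DOC mineralisation and export
# to the stream. Units are g C per m2 and per day.
soc, doc = 5000.0, 5.0
k_desorb, k_sorb = 1e-4, 1e-2     # desorption / sorption rate coefficients
k_mineral = 5e-3                  # in-soil DOC mineralisation
litter_fall = 0.5
runoff_fraction = 0.01            # fraction of soil-water DOC leaving per day

for day in range(365):
    desorption = k_desorb * soc
    sorption = k_sorb * doc
    mineralisation = k_mineral * doc
    export = runoff_fraction * doc
    soc += litter_fall + sorption - desorption
    doc += desorption - sorption - mineralisation - export

print(f"end-of-year SOC: {soc:.1f}  soil-water DOC: {doc:.2f}")

In the full model such coefficients are modulated by soil moisture and other climate-related drivers, which is where the climate sensitivity reported above enters.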
Abstract:
Increased atmospheric deposition of inorganic nitrogen (N) may lead to increased leaching of nitrate (NO3-) to surface waters. The mechanisms responsible for, and controls on, this leaching are matters of debate. An experimental N addition has been conducted at Gårdsjön, Sweden, to determine the magnitude and identify the mechanisms of N leaching from forested catchments within the EU-funded project NITREX. The ability of INCA-N, a simple process-based model of catchment N dynamics, to simulate catchment-scale inorganic N dynamics in soil and stream water during the course of the experimental addition is evaluated. Simulations were performed for 1990-2002. Experimental N addition began in 1991. INCA-N was able to successfully reproduce stream and soil water dynamics before and during the experiment. While INCA-N did not correctly simulate the lag between the start of N addition and NO3- breakthrough, the model was able to simulate the state change resulting from increased N deposition. Sensitivity analysis showed that model behaviour was controlled primarily by parameters related to hydrology and vegetation dynamics and secondarily by in-soil processes.
Abstract:
The aim of this three-year project funded by the Countryside Council for Wales (CCW) is to develop techniques, firstly, to refine and update existing targets for habitat restoration and re-creation at the landscape scale and, secondly, to develop a GIS-based model for the implementation of those targets at the local scale. Landscape Character Assessment (LCA) is being used to map Landscape Types across the whole of Wales as the first stage towards setting strategic habitat targets. The GIS habitat model uses data from the digital Phase I Habitat Survey for Wales to determine the suitability of individual sites for restoration to specific habitat types, including broadleaf woodland. The long-term aim is to develop a system that strengthens the character of Welsh landscapes and provides real biodiversity benefits based upon realistic targets, given limited resources for habitat restoration and re-creation.
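Purely to illustrate the shape of such a GIS-based suitability model (the project's actual criteria and codes are not reproduced here, and the attributes below are hypothetical), a per-polygon scoring rule might look like this in Python:

def woodland_suitability(polygon):
    """Score 0-3 for broadleaf woodland re-creation from assumed polygon attributes."""
    score = 0
    if polygon["phase1_code"] in {"B2", "C3"}:    # assumed 'suitable' habitat codes
        score += 1
    if polygon["adjacent_to_woodland"]:
        score += 1
    if polygon["slope_deg"] < 25:
        score += 1
    return score

site = {"phase1_code": "B2", "adjacent_to_woodland": True, "slope_deg": 12}
print("suitability score:", woodland_suitability(site))

Running such a rule over every polygon of the digital Phase I survey yields a ranked map of candidate restoration sites.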
Abstract:
A multi-scale framework for decision support is presented that uses a combination of experiments, models, communication, education and decision support tools to arrive at a realistic strategy to minimise diffuse pollution. Effective partnerships between researchers and stakeholders play a key part in successful implementation of this strategy. The Decision Support Matrix (DSM) is introduced as a set of visualisations that can be used at all scales, both to inform decision making and as a communication tool in stakeholder workshops. A demonstration farm is presented and one of its fields is taken as a case study. Hydrological and nutrient flow path models are used for event-based simulation (TOPCAT), catchment-scale modelling (INCA) and field-scale flow visualisation (TopManage). One of the DSMs, the Phosphorus Export Risk Matrix (PERM), is discussed in detail. The PERM was developed iteratively, as a point of discussion in stakeholder workshops, as a decision support and education tool. The resulting interactive PERM contains a set of questions and proposed remediation measures that reflect both expert and local knowledge. Education and visualisation tools such as GIS, risk indicators, TopManage and the PERM are found to be invaluable in communicating improved farming practice to stakeholders.
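As a toy illustration of the matrix idea only (this is not the published PERM, and the categories are invented), the risk for a field can be read off from a source rating and a transport rating:

RISK = {
    ("low", "low"): "low",       ("low", "high"): "moderate",
    ("high", "low"): "moderate", ("high", "high"): "high",
}

def perm_lookup(source_risk, transport_risk):
    """Combine a phosphorus-source rating with a hydrological-transport rating."""
    return RISK[(source_risk, transport_risk)]

print(perm_lookup("high", "low"))   # e.g. high soil P but a poorly connected field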
Predictive vegetation mapping in the Mediterranean context: Considerations and methodological issues
Abstract:
The need to map vegetation communities over large areas for nature conservation, and to predict the impact of environmental change on vegetation distributions, has stimulated the development of techniques for predictive vegetation mapping. Predictive vegetation studies start with the development of a model relating vegetation units to mapped physical data, followed by the application of that model to a geographic database over a wide range of spatial scales. This field is particularly important for identifying sites for rare and endangered species and locations of high biodiversity, such as many areas of the Mediterranean Basin. The potential of the approach is illustrated with a mapping exercise in the alti-mediterranean zone of Lefka Ori in Crete. The study established the nature of the relationship between vegetation communities and physical data including altitude, slope and geomorphology. In this way the knowledge of community distribution was improved, enabling a GIS-based model capable of predicting community distribution to be constructed. The paper describes the development of the spatial model and the methodological problems of predictive mapping for monitoring Mediterranean ecosystems. The paper concludes with a discussion of the role of predictive vegetation mapping and other spatial techniques, such as fuzzy mapping and geostatistics, in improving our understanding of the dynamics of Mediterranean ecosystems and in practical management of a region that is under increasing pressure from human impact.
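Schematically, the predictive step amounts to a rule relating mapped physical variables to a community, applied to every cell of the geographic database; the Python fragment below is illustrative only, and the thresholds and community names are assumptions rather than results from the Lefka Ori study.

def predict_community(altitude_m, slope_deg, geomorphology):
    """Toy decision rules relating physical data to a vegetation community."""
    if altitude_m > 2000 and geomorphology == "scree":
        return "open scree community"
    if altitude_m > 1800 and slope_deg < 20:
        return "dwarf shrub community"
    return "grassland / phrygana mosaic"

# Applying the model to a geographic database means mapping this function
# across the raster layers of altitude, slope and geomorphology.
cells = [(2150, 30, "scree"), (1900, 12, "doline"), (1500, 8, "plateau")]
print([predict_community(*cell) for cell in cells])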
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments, subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that, rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to separate effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than with implicit rule abstraction per se.
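The contrast between rule knowledge and similarity can be made concrete with a small Python sketch; the hidden rule and the similarity measure below are placeholders, not those of the original task, but they show how a test string can be favoured either because it obeys the rule or because it resembles studied strings.

def follows_rule(s):
    """Placeholder hidden rule: the digits sum to an even number."""
    return sum(int(c) for c in s) % 2 == 0

def similarity(s, studied):
    """Placeholder similarity: best position-wise digit match with any studied string."""
    return max(sum(a == b for a, b in zip(s, t)) for t in studied)

studied = ["4862", "2240", "6424"]   # learning-phase strings (all obey the placeholder rule)
test_pair = ("4826", "1930")         # one conforming string, one non-conforming

for s in test_pair:
    print(s, "rule:", follows_rule(s), "similarity:", similarity(s, studied))

When rule-conforming test strings also happen to be the more similar ones, choices alone cannot distinguish the two accounts, which is the confound the later experiments were designed to tease apart.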