890 results for Context Model
Abstract:
In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing practised to maintain high stocking rates is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research those objectives were elicited first and, from their ranking, two, 'asset value of cattle (representing cattle ownership)' and 'present value of economic returns', were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model, a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that benefits from holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or uncaring about, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple 'no overgrazing' rule is an insufficient strategy to maintain the long-term sustainability of the beef production systems in Central Brazil.
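For readers unfamiliar with the technique, a minimal sketch of bi-criteria Compromise Programming in the spirit of this abstract (all stocking rates, objective values and weights are invented for illustration; the paper's actual model is far richer):

```python
import numpy as np

# Illustrative data: candidate stocking rates (animal units/ha) and, for each,
# a hypothetical asset value of cattle and present value of returns (both to maximise).
stocking = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
asset_value = np.array([40.0, 75.0, 105.0, 130.0, 150.0])   # grows with herd size
pv_returns = np.array([60.0, 90.0, 100.0, 85.0, 55.0])      # peaks, then falls as recovery costs bite

# Ideal point: best achievable value of each objective considered separately.
ideal = np.array([asset_value.max(), pv_returns.max()])

# Normalised shortfalls from the ideal, weighted by the farmer's priorities.
w = np.array([0.5, 0.5])
d = np.stack([(ideal[0] - asset_value) / ideal[0],
              (ideal[1] - pv_returns) / ideal[1]], axis=1)

# Compromise solution for the L1 metric: minimise the weighted sum of shortfalls.
l1 = (w * d).sum(axis=1)
best = stocking[l1.argmin()]
```

With these invented numbers, the compromise stocking rate exceeds the rate that maximises economic returns alone, echoing the abstract's point that some overgrazing can be rational once cattle ownership matters in its own right.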
Abstract:
It is well established that crop production is inherently vulnerable to variations in the weather and climate. More recently the influence of vegetation on the state of the atmosphere has been recognized. The seasonal growth of crops can influence the atmosphere and have local impacts on the weather, which in turn affects the rate of seasonal crop growth and development. Considering the coupled nature of the crop-climate system, and the fact that a significant proportion of land is devoted to the cultivation of crops, important interactions may be missed when studying crops and the climate system in isolation, particularly in the context of land use and climate change. To represent the two-way interactions between seasonal crop growth and atmospheric variability, we integrate a crop model developed specifically to operate at large spatial scales (General Large Area Model for annual crops) into the land surface component of a global climate model (GCM; HadAM3). In the new coupled crop-climate model, the simulated environment (atmosphere and soil states) influences growth and development of the crop, while simultaneously the temporal variations in crop leaf area and height across its growing season alter the characteristics of the land surface that are important determinants of surface fluxes of heat and moisture, as well as other aspects of the land-surface hydrological cycle. The coupled model realistically simulates the seasonal growth of a summer annual crop in response to the GCM's simulated weather and climate. The model also reproduces the observed relationship between seasonal rainfall and crop yield. The integration of a large-scale single crop model into a GCM, as described here, represents a first step towards the development of fully coupled crop and climate models. Future development priorities and challenges related to coupling crop and climate models are discussed.
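The two-way coupling described above can be caricatured in a few lines (a toy sketch with invented parameters, not the GLAM/HadAM3 system): leaf area drives a surface moisture flux, and the resulting soil state feeds back on crop growth at the next step.

```python
# Toy sketch of two-way crop-atmosphere coupling (all parameters invented):
# leaf area raises transpiration, which draws down the soil water store,
# which in turn modulates crop growth at the next time step.

def run_coupled(days=100):
    lai = 0.1          # leaf area index
    soil_water = 50.0  # mm of plant-available water
    history = []
    for _ in range(days):
        # Land-surface step: transpiration scales with leaf area and available water
        et = 0.3 * lai * min(1.0, soil_water / 30.0)   # mm/day
        rain = 2.0                                     # constant forcing for the sketch
        soil_water = max(0.0, min(80.0, soil_water + rain - et))
        # Crop step: relative growth rate limited by water stress; senescence and a cap on LAI
        stress = min(1.0, soil_water / 30.0)
        lai = min(5.0, lai + 0.08 * lai * stress - 0.01 * lai)
        history.append((lai, soil_water, et))
    return history

history = run_coupled()
final_lai = history[-1][0]
```

In a real coupled system the "rain" line is where the GCM's simulated weather enters, and the transpiration term is where the crop's state alters the surface fluxes seen by the atmosphere.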
A hierarchical Bayesian model for predicting the functional consequences of amino-acid polymorphisms
Abstract:
Genetic polymorphisms in deoxyribonucleic acid coding regions may have a phenotypic effect on the carrier, e.g. by influencing susceptibility to disease. Detection of deleterious mutations via association studies is hampered by the large number of candidate sites; therefore methods are needed to narrow down the search to the most promising sites. For this, a possible approach is to use structural and sequence-based information of the encoded protein to predict whether a mutation at a particular site is likely to disrupt the functionality of the protein itself. We propose a hierarchical Bayesian multivariate adaptive regression spline (BMARS) model for supervised learning in this context and assess its predictive performance by using data from mutagenesis experiments on lac repressor and lysozyme proteins. In these experiments, about 12 amino-acid substitutions were performed at each native amino-acid position and the effect on protein functionality was assessed. The training data thus consist of repeated observations at each position, which the hierarchical framework is needed to account for. The model is trained on the lac repressor data and tested on the lysozyme mutations and vice versa. In particular, we show that the hierarchical BMARS model, by allowing for the clustered nature of the data, yields lower out-of-sample misclassification rates compared with both a BMARS and a frequentist MARS model, a support vector machine classifier and an optimally pruned classification tree.
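The hierarchical set-up can be sketched schematically (notation ours, not the paper's exact specification): let $y_{ij}$ indicate whether substitution $j$ at native position $i$ disrupts function, let $f$ be the adaptive spline surface shared across positions, and let $b_i$ be a position-level random effect absorbing the repeated observations at each position.

```latex
\begin{align*}
  y_{ij} \mid p_{ij} &\sim \operatorname{Bernoulli}(p_{ij}),\\
  \operatorname{logit}(p_{ij}) &= f(\mathbf{x}_{ij}) + b_i,\\
  b_i &\sim \mathcal{N}(0,\, \sigma_b^2),
\end{align*}
```

Ignoring $b_i$ treats the roughly 12 substitutions at one position as independent, which understates uncertainty; the random effect is what lets the hierarchical BMARS model allow for the clustering.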
Abstract:
Multiscale modeling is emerging as one of the key challenges in mathematical biology. However, the recent rapid increase in the number of modeling methodologies being used to describe cell populations has raised a number of interesting questions. For example, at the cellular scale, how can the appropriate discrete cell-level model be identified in a given context? Additionally, how can the many phenomenological assumptions used in the derivation of models at the continuum scale be related to individual cell behavior? In order to begin to address such questions, we consider a discrete one-dimensional cell-based model in which cells are assumed to interact via linear springs. From the discrete equations of motion, the continuous Rouse [P. E. Rouse, J. Chem. Phys. 21, 1272 (1953)] model is obtained. This formalism readily allows the definition of a cell number density for which a nonlinear "fast" diffusion equation is derived. Excellent agreement is demonstrated between the continuum and discrete models. Subsequently, via the incorporation of cell division, we demonstrate that the derived nonlinear diffusion model is robust to the inclusion of more realistic biological detail. In the limit of stiff springs, where cells can be considered to be incompressible, we show that cell velocity can be directly related to cell production. This assumption is frequently made in the literature but our derivation places limits on its validity. Finally, the model is compared with a model of a similar form recently derived for a different discrete cell-based model and it is shown how the different diffusion coefficients can be understood in terms of the underlying assumptions about cell behavior in the respective discrete models.
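The discrete-to-continuum step described above is of the following form (a sketch with our notation: drag coefficient $\eta$, spring constant $k$, cell number density $q$; the $q^{-2}$ dependence is what makes the diffusion "fast", since the coefficient grows as density falls):

```latex
\begin{align*}
  \eta \frac{\mathrm{d}x_i}{\mathrm{d}t}
      &= k\,\bigl(x_{i+1} - 2x_i + x_{i-1}\bigr)
      && \text{(overdamped linear springs)}\\[4pt]
  \frac{\partial q}{\partial t}
      &= \frac{k}{\eta}\,
         \frac{\partial}{\partial x}\!\left(\frac{1}{q^{2}}\,
         \frac{\partial q}{\partial x}\right)
      && \text{(continuum limit for the density } q\text{)}
\end{align*}
```

The nonlinear diffusion coefficient $D(q) = (k/\eta)\,q^{-2}$ is thus inherited directly from the spring law, which is how different assumptions about cell behaviour in the discrete model surface as different diffusion coefficients in the continuum one.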
Abstract:
Successful pest management is often hindered by the inherent complexity of the interactions of a pest with its environment. The use of genetically characterized model plants can allow investigation of chosen aspects of these interactions by limiting the number of variables during experimentation. However, it is important to study the generic nature of these model systems if the data generated are to be assessed in a wider context, for instance, with those systems of commercial significance. This study assesses the suitability of Arabidopsis thaliana (L.) Heynh. (Brassicaceae) as a model host plant to investigate plant-herbivore-natural enemy interactions, with Plutella xylostella (L.) (Lepidoptera: Plutellidae), the diamondback moth, and Cotesia plutellae (Kurdjumov) (Hymenoptera: Braconidae), a parasitoid of P. xylostella. The growth and development of P. xylostella and C. plutellae on an A. thaliana host plant (Columbia type) were compared to that on Brassica rapa var. pekinensis (L.) (Brassicaceae), a host crop that is widely cultivated and also commonly used as a laboratory host for P. xylostella rearing. The second part of the study investigated the potential effect of the different A. thaliana background lines, Columbia and Landsberg (used in wider scientific studies), on growth and development of P. xylostella and C. plutellae. Plutella xylostella life history parameters were found generally to be similar between the host plants investigated. However, C. plutellae were more affected by the differences in host plant. Fewer adult parasitoids resulted from development on A. thaliana compared to B. rapa, and those that did emerge were significantly smaller. Adult male C. plutellae developing on Columbia were also significantly smaller than those on Landsberg A. thaliana.
Abstract:
In survival analysis frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted and the deviance information criterion is used to compare models. As an illustration of the approach a bivariate data set of corneal graft survival times is analysed.
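As an illustration of how a shared frailty induces within-cluster correlation, here is a minimal simulation sketch (parameters invented; this is not the paper's bivariate corneal-graft analysis): each cluster draws one inverse Gaussian frailty that multiplies a constant baseline hazard for both of its members.

```python
import numpy as np

rng = np.random.default_rng(42)

# Shared inverse Gaussian frailty per cluster (mean 1) multiplies a constant
# baseline hazard; both cluster members inherit the same frailty value.
n_clusters, base_hazard = 2000, 0.1
frailty = rng.wald(1.0, 2.0, size=n_clusters)        # inverse Gaussian draws

# Two survival times per cluster: T ~ Exponential(rate = z * h0),
# i.e. scale = 1 / (z * h0) in NumPy's parameterisation.
t1 = rng.exponential(1.0 / (frailty * base_hazard))
t2 = rng.exponential(1.0 / (frailty * base_hazard))

# Positive association between cluster members, induced purely by the frailty.
corr = np.corrcoef(np.log(t1), np.log(t2))[0, 1]
```

Marginally each time is a continuous mixture of exponentials, which is the mixture-distribution point made in the abstract; the Bayesian analysis would then place priors on the frailty variance and baseline hazard and sample them by MCMC.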
Abstract:
Background: Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can consistently separate the highest accuracy models from the lowest. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results: The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers; however, further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post filter in order to re-rank models. Conclusion: MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available. Conversely, the true MQAP methods tested can often be used as effective post filters for re-ranking few models from individual fold recognition servers, and further improvements can be achieved using a consensus of these methods.
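A crude stand-in for the consensus idea (scores invented; ModFOLD itself combines true MQAP scores with an artificial neural network, not this simple rank average):

```python
import numpy as np

# Hypothetical quality scores for five candidate models from three different
# MQAPs (higher = better; numbers invented, and each method uses its own scale).
scores = np.array([
    [0.61, 0.55, 0.70, 0.42, 0.66],   # MQAP A
    [0.30, 0.28, 0.35, 0.20, 0.31],   # MQAP B
    [0.80, 0.72, 0.88, 0.60, 0.79],   # MQAP C
])

# Rank-normalise each method's scores so the scales become comparable,
# then average the ranks: a simple consensus for re-ranking the models.
ranks = scores.argsort(axis=1).argsort(axis=1)   # 0 = worst, 4 = best (no ties here)
consensus = ranks.mean(axis=0)
best_model = int(consensus.argmax())
```

Even this naive combination shows why a consensus can beat any single true MQAP: idiosyncratic errors of one scorer are diluted by agreement among the others.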
Abstract:
Purpose – The purpose of this paper is to propose a process model for knowledge transfer using theories relating to knowledge communication and knowledge translation. Design/methodology/approach – Most of what is put forward in this paper is based on a research project titled "Procurement for innovation and knowledge transfer (ProFIK)". The project is funded by a UK government research council – The Engineering and Physical Sciences Research Council (EPSRC). The discussions are mainly grounded on a thorough review of literature accomplished as part of the research project. Findings – The process model developed in this paper builds upon the theory of knowledge transfer and the theory of communication. Knowledge transfer, per se, is not a mere transfer of knowledge. It involves different stages of knowledge transformation. Depending on the context of knowledge transfer, it can also be influenced by many factors, some positive and some negative. The developed model of knowledge transfer attempts to encapsulate all these issues in order to create a holistic framework. Originality/value – An attempt has been made in the paper to combine some of the significant theories and findings relating to knowledge transfer, making the paper an original and valuable one.
Abstract:
Some of the most pressing problems currently facing chemical education throughout the world are rehearsed. It is suggested that if the notion of "context" is to be used as the basis for an address to these problems, it must enable a number of challenges to be met. Four generic models of "context" are identified that are currently used, or that may be used in some form, within chemical education as the basis for curriculum design. It is suggested that a model based on physical settings, together with their cultural justifications, and taught with a socio-cultural perspective on learning, is likely to meet those challenges most fully. A number of reasons why the relative efficacies of these four models cannot be evaluated from the existing research literature are suggested. Finally, an established model for the representation of the development of curricula is used to discuss the development and evaluation of context-based chemical curricula.
Abstract:
This paper proposes a conceptual model of a context-aware group support system (GSS) to assist local council employees to perform collaborative tasks in conjunction with inter- and intra-organisational stakeholders. Most discussions about e-government focus on the use of ICT to improve the relationship between government and citizen, not on the relationship between government and employees. This paper seeks to expose the unique culture of UK local councils and to show how a GSS could support local government employer and employee needs.
Abstract:
Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in building automation, healthcare and agriculture. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered at the outset. At the end of this project the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios, especially in the user domains of building automation, healthcare, and agriculture. This paper gives a short introduction to the Hydra project, its user domains and its approach to ensuring security by design. Based on the results of a focus group analysis of the building automation domain, typical threats are evaluated and their risks are assessed. Then, specific security requirements with respect to security, privacy, and trust are derived in order to incorporate them into the Hydra Security Meta Model. How concepts such as context security, semantic security, and virtualisation support the overall Hydra approach is introduced and illustrated on the basis of a technical building automation scenario.
Abstract:
We present a novel algorithm for joint state-parameter estimation using sequential three-dimensional variational data assimilation (3D-Var) and demonstrate its application in the context of morphodynamic modelling using an idealised two-parameter 1D sediment transport model. The new scheme combines a static representation of the state background error covariances with a flow-dependent approximation of the state-parameter cross-covariances. For the case presented here, this involves calculating a local finite difference approximation of the gradient of the model with respect to the parameters. The new method is easy to implement and computationally inexpensive to run. Experimental results are positive, with the scheme able to recover the model parameters to a high level of accuracy. We expect that there is potential for successful application of this new methodology to larger, more realistic models with more complex parameterisations.
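The augmented-state idea can be sketched on a toy linear model (operators, covariances and observations all invented for illustration; the paper applies the scheme to a 1D sediment transport model): the state and parameters are stacked into one vector, the state-parameter cross-covariance is approximated from a finite-difference model gradient, and a single 3D-Var analysis then updates the parameters even though only the state is observed.

```python
import numpy as np

def model(p):
    # Toy forward model: a 3-point "state" depending linearly on two parameters.
    A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.3]])
    return A @ p

p_b = np.array([1.0, 2.0])           # background parameters
x_b = model(p_b)                     # background state

B_ss = 0.5 * np.eye(3)               # static state background error covariance
B_pp = 0.2 * np.eye(2)               # parameter background error covariance

# Flow-dependent cross-covariance via a finite-difference model gradient.
eps = 1e-6
G = np.column_stack([(model(p_b + eps * e) - x_b) / eps for e in np.eye(2)])
B_sp = G @ B_pp

# Augmented background covariance; observations see the state only.
B = np.block([[B_ss, B_sp], [B_sp.T, B_pp]])
H = np.hstack([np.eye(3), np.zeros((3, 2))])
R = 0.1 * np.eye(3)

y = np.array([2.2, 2.5, 1.5])        # synthetic observations of the state
z_b = np.concatenate([x_b, p_b])
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
z_a = z_b + K @ (y - H @ z_b)
x_a, p_a = z_a[:3], z_a[3:]          # analysed state and parameters
```

The cross-covariance block `B_sp` is what carries observation information from the state into the parameter part of the update; with `B_sp = 0` the parameters would never change.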
Abstract:
During April-May 2010 volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe, causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracies of current state-of-the-art models is of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers, other remote-sensing instruments and in-situ collectors. From the air, sondes and multiple aircraft missions also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground and airplane-based measurements obtained during the initial 14-23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft. We find good quantitative agreement, with an error similar to the spread in the observations (depending, however, on the method used to estimate mass eruption rate) for both airborne and ground mass concentration. Such verification results help us understand and constrain the accuracy and reliability of ash transport models and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.
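One simple way to phrase "an error similar to the spread in the observations" as a concrete check (all concentrations invented for illustration; real verification would use the full lidar, photometer and aircraft datasets):

```python
import numpy as np

# Hypothetical ash mass concentrations (ug/m3): three independent instruments
# per station, and one model value per station.
obs = np.array([
    [210.0, 250.0, 190.0],   # station 1
    [120.0, 150.0, 135.0],   # station 2
    [ 80.0,  60.0,  70.0],   # station 3
])
model = np.array([230.0, 115.0, 75.0])

obs_mean = obs.mean(axis=1)
obs_spread = obs.std(axis=1)                 # instrument-to-instrument spread
model_error = np.abs(model - obs_mean)

# Agreement criterion: model-observation error within (roughly) the
# observational spread at every station.
within_spread = bool(np.all(model_error <= 2.0 * obs_spread))
```

Judging model error against observational spread rather than against a single instrument is what makes the comparison fair when the observations themselves disagree.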