38 results for Application of Data-driven Modelling in Water Sciences
in CentAUR: Central Archive University of Reading - UK
Abstract:
Second language acquisition researchers often face particular challenges when attempting to generalize study findings to the wider learner population. For example, language learners constitute a heterogeneous group, and it is not always clear how a study’s findings may generalize to other individuals who may differ in terms of language background and proficiency, among many other factors. In this paper, we provide an overview of how mixed-effects models can be used to help overcome these and other issues in the field of second language acquisition. We outline the benefits of mixed-effects models and give a practical example of how a mixed-effects analysis can be conducted. Mixed-effects models offer second language researchers a powerful statistical tool for the analysis of many different types of data.
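To make the approach concrete, here is a minimal sketch of the kind of mixed-effects analysis the abstract describes, fitted with statsmodels on simulated data; the dataset, variable names and effect sizes are hypothetical, not the paper's own example.

```python
# A hedged sketch: a mixed-effects model with a by-participant random
# intercept, fitted to simulated second-language response-time data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_participants, n_items = 30, 20

# One proficiency score and one random intercept per participant
proficiency = rng.normal(0.0, 1.0, n_participants)
intercepts = rng.normal(0.0, 0.5, n_participants)   # individual differences

participant = np.repeat(np.arange(n_participants), n_items)
log_rt = (6.0                                       # grand mean (log ms)
          - 0.3 * proficiency[participant]          # assumed fixed effect
          + intercepts[participant]
          + rng.normal(0.0, 0.4, n_participants * n_items))

data = pd.DataFrame({"participant": participant,
                     "proficiency": proficiency[participant],
                     "log_rt": log_rt})

# Random intercepts absorb participant-level variation that would otherwise
# violate the independence assumption of ordinary regression.
model = smf.mixedlm("log_rt ~ proficiency", data, groups=data["participant"])
print(model.fit().summary())
```

In a full analysis one would typically also model item-level variation (crossed random effects), which lme4 in R handles directly.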
Abstract:
There is little consensus on how agriculture will meet future food demands sustainably. Soils and their biota play a crucial role by mediating ecosystem services that support agricultural productivity. However, a multitude of site-specific environmental factors and management practices interact to affect the ability of soil biota to perform vital functions, confounding the interpretation of results from experimental approaches. Insights can be gained through models, which integrate the physiological, biological and ecological mechanisms underpinning soil functions. We present a powerful modelling approach for predicting how agricultural management practices (pesticide applications and tillage) affect soil functioning through earthworm populations. By combining energy budgets and individual-based simulation models, and integrating key behavioural and ecological drivers, we accurately predict population responses to pesticide applications in different climatic conditions. We use the model to analyse the ecological consequences of different weed management practices. Our results demonstrate that an important link between agricultural management (herbicide applications and zero, reduced and conventional tillage) and earthworms is the maintenance of soil organic matter (SOM). We show how zero and reduced tillage practices can increase crop yields while preserving natural ecosystem functions. This demonstrates that management practices aiming to sustain agricultural productivity should account for their effects on earthworm populations, as their proliferation stimulates agricultural productivity. Synthesis and applications. Our results indicate that conventional tillage practices have longer-term effects on soil biota than pesticide control, if the pesticide has a short dissipation time. The risk of earthworm populations becoming exposed to toxic pesticides will be reduced under dry soil conditions. Similarly, an increase in soil organic matter could increase the recovery rate of earthworm populations. However, effects are not necessarily additive, and the impact of different management practices on earthworms depends on their timing and the prevailing environmental conditions. Our model can be used to determine which combinations of crop management practices and climatic conditions pose the least overall risk to earthworm populations. Linking our model mechanistically to crop yield models would aid the optimization of crop management systems by exploring the trade-off between different ecosystem services.
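The following is a minimal, hypothetical sketch of the energy-budget logic that an individual-based earthworm model of this kind might use; all parameter names and values are illustrative, not taken from the paper.

```python
# A hedged sketch: daily energy-budget update for one earthworm, where food
# intake depends on soil organic matter (SOM) and pesticide stress.
from dataclasses import dataclass

# Illustrative parameters (not values from the paper)
INTAKE_RATE = 0.15        # g food per g body mass per day
ASSIMILATION_EFF = 0.45   # fraction of intake assimilated
MAINTENANCE_COST = 0.03   # g mass-equivalent per g body mass per day
GROWTH_EFF = 0.5          # fraction of surplus energy converted to mass

@dataclass
class Earthworm:
    mass: float  # body mass in g

def daily_step(worm: Earthworm, som: float, pesticide_stress: float) -> None:
    """Advance one worm by one day.

    som: relative soil organic matter availability in [0, 1]
    pesticide_stress: fraction of feeding suppressed by exposure in [0, 1]
    """
    intake = INTAKE_RATE * worm.mass * som * (1.0 - pesticide_stress)
    surplus = ASSIMILATION_EFF * intake - MAINTENANCE_COST * worm.mass
    # Grow on surplus energy; shrink when maintenance cannot be met
    worm.mass += GROWTH_EFF * surplus if surplus >= 0.0 else surplus

worm = Earthworm(mass=0.5)
for _ in range(30):          # e.g. tillage lowers som; spraying raises stress
    daily_step(worm, som=0.8, pesticide_stress=0.2)
print(f"mass after 30 days: {worm.mass:.3f} g")
```

In the full model, many such individuals feed, reproduce and die in a simulated soil environment, so population-level responses to tillage and pesticide timing emerge from these individual budgets.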
Abstract:
Ocean prediction systems are now able to analyse and predict temperature, salinity and velocity structures within the ocean by assimilating measurements of the ocean’s temperature and salinity into physically based ocean models. Data assimilation combines current estimates of state variables, such as temperature and salinity, from a computational model with measurements of the ocean and atmosphere in order to improve forecasts and reduce forecast uncertainty. Data assimilation generally works well with ocean models away from the equator but has been found to induce vigorous and unrealistic overturning circulations near the equator. A pressure correction method was developed at the University of Reading and the Met Office to control these circulations, using ideas from control theory and an understanding of equatorial dynamics. The method has been used for the last 10 years in seasonal forecasting and ocean prediction systems at the Met Office and the European Centre for Medium-Range Weather Forecasts (ECMWF). It has been an important element in recent re-analyses of ocean heat uptake, a process that mitigates climate change.
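For readers unfamiliar with the mechanics, the sketch below shows the generic linear analysis update at the heart of data assimilation; it is not the Met Office pressure correction scheme itself, and the covariances and observed values are illustrative.

```python
# A hedged sketch: combine a model background state with an observation,
# weighting each by its error covariance (the Kalman-style analysis update).
import numpy as np

x_b = np.array([20.0, 35.0])        # background: temperature (C), salinity (psu)
B = np.array([[0.5, 0.1],
              [0.1, 0.3]])          # background error covariance (illustrative)
H = np.array([[1.0, 0.0]])          # observation operator: temperature only
R = np.array([[0.2]])               # observation error covariance
y = np.array([21.2])                # observed temperature

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain: trust in the observation
x_a = x_b + K @ (y - H @ x_b)                  # analysis state

# Temperature is pulled toward the observation; salinity is adjusted too,
# via the background covariance between the two variables.
print(x_a)
```

Near the equator, repeated increments of this kind can project onto spurious overturning circulations, which is the problem the pressure correction method was designed to control.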
Abstract:
If the fundamental precepts of Farming Systems Research were taken literally, it would imply that a 'unique' solution should be sought for each farm. This is an unrealistic expectation, but it has led to the idea of a recommendation domain, which implies creating a taxonomy of farms in order to increase the general applicability of recommendations. Mathematical programming models are an established means of generating recommended solutions, but for such models to be effective they have to be constructed for 'truly' typical or representative situations. Multivariate statistical techniques provide a means of creating the required typologies, particularly when an exhaustive database is available. This paper illustrates the application of this methodology in two different studies that shared the common purpose of identifying types of farming systems in their respective study areas. The issues related to the use of factor and cluster analyses for farm typification, prior to building representative mathematical programming models for Chile and Pakistan, are highlighted.
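A minimal sketch of the typification workflow the paper describes (factor analysis followed by cluster analysis) is given below, applied to hypothetical farm survey data; the variables, sample size and number of clusters are assumptions for illustration.

```python
# A hedged sketch: reduce farm survey variables to factors, then cluster
# farms into types that can each be represented by one programming model.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical survey: farm size (ha), herd size, off-farm income share, labour units
X = rng.normal(size=(200, 4)) * [15.0, 20.0, 0.2, 1.5] + [40.0, 25.0, 0.3, 2.0]

X_std = StandardScaler().fit_transform(X)                  # common scale
scores = FactorAnalysis(n_components=2).fit_transform(X_std)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# Each cluster is a candidate 'representative farm' for which a
# mathematical programming model would then be built.
for k in range(3):
    members = scores[labels == k]
    print(f"type {k}: {len(members)} farms, centroid {members.mean(axis=0).round(2)}")
```

In practice the number of factors and clusters would be chosen from the data (scree plots, cluster validity indices) rather than fixed in advance.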
Abstract:
This paper investigates the application and use of development viability models in the formation of planning policies in the UK. Particular attention is paid to three key areas: the assumed development scheme in development viability models, the use of forecasts, and the debate concerning Threshold Land Value. The empirical section reports on the results of an interview survey involving the main producers of development viability models and appraisals. It is concluded that, although development viability models have intrinsic limitations associated with model composition and input uncertainties, the most significant limitations are related to the ways in which they have been adapted for use in the planning system. In addition, it is suggested that the contested nature of Threshold Land Value is an example of calculative practices providing a façade of technocratic rationality in the planning system.
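The calculative core of a development viability model is a residual appraisal, sketched below with entirely hypothetical figures; the profit and fee rates and the Threshold Land Value benchmark are assumptions for illustration.

```python
# A hedged sketch: residual land value = completed scheme value minus all
# costs and required developer profit; viability compares it to a benchmark.
def residual_land_value(gdv: float, build_costs: float, fees_rate: float,
                        profit_rate: float, planning_obligations: float) -> float:
    fees = fees_rate * build_costs      # professional fees as share of build cost
    profit = profit_rate * gdv          # required developer profit
    return gdv - build_costs - fees - profit - planning_obligations

rlv = residual_land_value(gdv=10_000_000, build_costs=5_500_000,
                          fees_rate=0.10, profit_rate=0.20,
                          planning_obligations=750_000)

# Threshold Land Value: the contested benchmark the landowner must clear
# before releasing the site (illustrative figure).
threshold_land_value = 1_800_000
print(f"residual land value: {rlv:,.0f}")
print("viable" if rlv >= threshold_land_value else "not viable")
```

The paper's point is that every input here, and especially the threshold, is a contestable assumption rather than a neutral technical fact.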
Abstract:
Over the last decade the English planning system has placed greater emphasis on the financial viability of development. ‘Calculative’ practices have been used to quantify and capture land value uplifts. Development viability appraisal (DVA) has become a key part of the evidence base used in planning decision-making and informs both ‘site-specific’ negotiations about the level of land value capture for individual schemes and ‘area-wide’ planning policy formation. This paper investigates how the implementation of DVA is governed in planning policy formation. It is argued that the increased use of DVA raises important questions about how planning decisions are made and operationalised, not least because DVA is often poorly understood by some key stakeholders. The paper uses the concept of governance to thematically analyse semi-structured interviews conducted with the producers of DVAs and considers key procedural issues, including (in)consistencies in appraisal practices, levels of stakeholder consultation and the potential for client and producer bias. Whilst stakeholder consultation is shown to be integral to the appraisal process, both to improve the quality of the appraisals and to legitimise the outputs, participation is restricted to industry experts and excludes some interest groups, including local communities. It is concluded that, largely because of its recent adoption and the knowledge asymmetries between local planning authorities and appraisers, DVA is a weakly governed process characterised by emerging and contested guidance and is therefore ‘up for grabs’.
Abstract:
The concentrations of dissolved noble gases in water are widely used as a climate proxy to determine noble gas temperatures (NGTs), i.e. the temperature of the water when gas exchange last occurred. In this paper we take a step toward applying this principle to fluid inclusions in stalagmites, in order to reconstruct the cave temperature prevailing at the time an inclusion was formed. We present an analytical protocol that allows us to accurately determine noble gas concentrations and isotope ratios in stalagmites, and which includes a precise manometric determination of the mass of water liberated from fluid inclusions. Most important for NGT determination is to reduce the amount of noble gases liberated from air inclusions, as they mask the temperature-dependent noble gas signal from the water inclusions. We demonstrate that offline pre-crushing in air, followed by extraction of the noble gases and water from the samples by heating, is appropriate for separating the gases released from air inclusions and water inclusions. Although a large fraction of the recent samples analysed by this technique yields NGTs close to present-day cave temperatures, the interpretation of measured noble gas concentrations in terms of NGTs is not yet feasible using the available least squares fitting models. This is because the noble gas concentrations in stalagmites are not composed solely of the two components these models are able to account for: air and air-saturated water (ASW). The observed enrichments in heavy noble gases are interpreted as being due to adsorption during sample preparation in air, whereas the excess in He and Ne is interpreted as an additional noble gas component bound in voids in the crystallographic structure of the calcite crystals. As a consequence of our study's findings, NGTs will in future have to be determined using the concentrations of Ar, Kr and Xe only. This needs to be achieved by further optimizing the sample preparation to minimize atmospheric contamination and to further reduce the amount of noble gases released from air inclusions.
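A minimal sketch of the two-component (air plus air-saturated water) least-squares inversion referred to above is shown below. The dry-air volume fractions are standard values, but the solubility function is a hypothetical placeholder; real NGT inversions rely on empirical solubility data.

```python
# A hedged sketch: fit a temperature and an air component so that a
# two-component model reproduces measured noble gas concentrations.
import numpy as np
from scipy.optimize import least_squares

GASES = ["Ar", "Kr", "Xe"]
C_AIR = np.array([9.34e-3, 1.14e-6, 8.7e-8])   # volume fractions in dry air

def asw_conc(temp_c):
    # Hypothetical placeholder: solubility falling smoothly with temperature
    c0 = np.array([4.0e-4, 1.0e-7, 1.5e-8])    # illustrative conc. at 0 C
    k = np.array([0.020, 0.025, 0.030])        # illustrative T sensitivities
    return c0 * np.exp(-k * temp_c)

def residuals(params, measured):
    temp_c, air = params
    model = asw_conc(temp_c) + air * C_AIR     # ASW component + air component
    return (model - measured) / measured       # relative misfit per gas

measured = asw_conc(9.0) + 2.0e-5 * C_AIR      # synthetic 'sample' formed at 9 C
fit = least_squares(residuals, x0=[15.0, 1.0e-5], args=(measured,))
print(f"NGT = {fit.x[0]:.1f} C, air component = {fit.x[1]:.2e}")
```

The study's finding is that a third component (adsorbed heavy gases, plus He and Ne bound in the calcite lattice) violates this two-component assumption, which is why only Ar, Kr and Xe remain usable.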
Abstract:
The development of high-throughput techniques ('chip' technology) for measuring gene expression and gene polymorphisms (genomics), together with techniques for measuring global protein expression (proteomics) and metabolite profiles (metabolomics), is revolutionising life science research, including research in human nutrition. In particular, the ability to undertake large-scale genotyping and to identify gene polymorphisms that determine risk of chronic disease (candidate genes) could enable definition of an individual's risk at an early age. However, the search for candidate genes has proven to be more complex, and their identification more elusive, than previously thought. This is largely because much of the variability in risk results from interactions between the genome and environmental exposures. Whilst the former is now very well defined via the Human Genome Project, the latter (e.g. diet, toxins, physical activity) are poorly characterised, resulting in an inability to account for their confounding effects in most large-scale candidate gene studies. The polygenic nature of most chronic diseases adds further complexity, requiring very large studies to disentangle the relatively weak impacts of large numbers of potential 'risk' genes. The efficacy of diet as a preventative strategy could also be considerably increased by better information concerning the gene polymorphisms that determine variability in responsiveness to specific diet and nutrient changes. Much of the limited available data are based on retrospective genotyping using stored samples from previously conducted intervention trials. Prospective studies are now needed to provide data that can be used as the basis for providing individualised dietary advice and developing food products that optimise disease prevention. Application of the new technologies in nutrition research offers considerable potential for the development of new knowledge and could greatly advance the role of diet as a disease prevention strategy in the 21st century. Given the potential economic and social benefits, funding for research in this area needs greater recognition, and a stronger strategic focus, than is presently the case. Application of genomics to human health poses considerable ethical and societal as well as scientific challenges. Economic determinants of health care provision are more likely to resolve such issues than scientific developments or altruistic concerns for human health.
Abstract:
Data augmentation is a powerful technique for estimating models with latent or missing data, but applications in agricultural economics have thus far been few. This paper showcases the technique in an application to data on milk market participation in the Ethiopian highlands. There, a key impediment to economic development is an apparently low rate of market participation. Consequently, economic interest centers on the “locations” of nonparticipants in relation to the market and their “reservation values” across covariates. These quantities are of policy interest because they provide measures of the additional inputs necessary for nonparticipants to enter the market. One quantity of primary interest is the minimum amount of surplus milk (the “minimum efficient scale of operations”) that the household must acquire before market participation becomes feasible. We estimate this quantity through routine application of data augmentation and Gibbs sampling applied to a random-censored Tobit regression. Incorporating random censoring markedly affects the estimated marketable-surplus requirements of the household but only slightly affects the covariate-requirement estimates and, generally, leads to more plausible policy estimates than those obtained from the zero-censored formulation.
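To make the estimation strategy concrete, below is a minimal sketch of data augmentation with Gibbs sampling for a zero-censored Tobit model on simulated data; the paper's random-censoring extension, which also draws the censoring thresholds, is omitted here for brevity.

```python
# A hedged sketch: Gibbs sampler for a Tobit model, treating the latent
# outcomes of censored observations as missing data to be drawn each sweep.
import numpy as np
from scipy.stats import truncnorm, invgamma

rng = np.random.default_rng(1)

# Simulated data: latent y* = X @ beta + noise, observed y = max(y*, 0)
n = 500
beta_true = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = np.maximum(X @ beta_true + rng.normal(size=n), 0.0)
cens = y <= 0.0                                   # censored observations

XtX_inv = np.linalg.inv(X.T @ X)
beta, sigma2, draws = np.zeros(2), 1.0, []
for it in range(2000):
    # Step 1 -- data augmentation: draw latent y* for censored cases from
    # a normal distribution truncated above at the censoring point (0).
    mu, sd = X[cens] @ beta, np.sqrt(sigma2)
    y_aug = y.copy()
    y_aug[cens] = truncnorm.rvs(a=-np.inf, b=(0.0 - mu) / sd,
                                loc=mu, scale=sd, random_state=rng)
    # Step 2 -- draw beta | sigma2, y* (flat prior gives a normal posterior)
    beta = rng.multivariate_normal(XtX_inv @ X.T @ y_aug, sigma2 * XtX_inv)
    # Step 3 -- draw sigma2 | beta, y* (inverse-gamma posterior)
    resid = y_aug - X @ beta
    sigma2 = invgamma.rvs(a=n / 2.0, scale=resid @ resid / 2.0, random_state=rng)
    if it >= 500:                                 # discard burn-in
        draws.append(beta)

print(np.mean(draws, axis=0))                     # posterior means near beta_true
```

Once the latent outcomes are filled in, each conditional draw is a standard regression step, which is what makes the augmented sampler 'routine' in the sense the abstract uses.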