77 results for literature-data integration
Abstract:
In the continuing debate over the impact of genetically modified (GM) crops on farmers of developing countries, it is important to accurately measure magnitudes such as farm-level yield gains from GM crop adoption. Yet most farm-level studies in the literature do not control for farmer self-selection, a potentially important source of bias in such estimates. We use farm-level panel data from Indian cotton farmers to investigate the yield effect of GM insect-resistant cotton. We explicitly take into account the fact that the choice of crop variety is an endogenous variable which might lead to bias from self-selection. A production function is estimated using a fixed-effects model to control for selection bias. Our results show that efficient farmers adopt Bacillus thuringiensis (Bt) cotton at a higher rate than their less efficient peers. This suggests that cross-sectional estimates of the yield effect of Bt cotton, which do not control for self-selection effects, are likely to be biased upwards. However, after controlling for selection bias, we still find that there is a significant positive yield effect from adoption of Bt cotton that more than offsets the additional cost of Bt seed.
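The self-selection argument above can be illustrated with a toy panel. This is a minimal sketch, not the study's actual model or data: all values are simulated, and the adoption probability, yield equation, and effect size of 0.3 are invented for illustration. Efficient farmers adopt Bt at higher rates, so a pooled cross-sectional regression overstates the yield effect, while within-farmer demeaning (the fixed-effects estimator) sweeps out the unobserved efficiency term.

```python
import numpy as np

rng = np.random.default_rng(0)
n_farmers, n_years = 200, 4
farmer_eff = rng.normal(0.0, 1.0, n_farmers)   # unobserved farmer efficiency
true_bt_effect = 0.3                           # assumed true yield gain

rows = []
for i in range(n_farmers):
    p_adopt = 1.0 / (1.0 + np.exp(-farmer_eff[i]))   # efficient farmers adopt more
    for _ in range(n_years):
        bt = float(rng.random() < p_adopt)
        yield_ = 5.0 + true_bt_effect * bt + farmer_eff[i] + rng.normal(0, 0.5)
        rows.append((i, bt, yield_))

data = np.array(rows)
ids = data[:, 0].astype(int)
bt, y = data[:, 1], data[:, 2]

def ols_slope(x, yv):
    xc, yc = x - x.mean(), yv - yv.mean()
    return float(xc @ yc / (xc @ xc))

# Pooled (cross-sectional) estimate: biased upward by self-selection
beta_pooled = ols_slope(bt, y)

# Fixed effects: demean within each farmer to remove farmer_eff
counts = np.bincount(ids)
bt_w = bt - (np.bincount(ids, weights=bt) / counts)[ids]
y_w = y - (np.bincount(ids, weights=y) / counts)[ids]
beta_fe = ols_slope(bt_w, y_w)

print(f"pooled OLS: {beta_pooled:.2f}, fixed effects: {beta_fe:.2f}")
```

On this simulated panel the pooled slope absorbs the efficiency premium of adopters, while the within estimator stays close to the true effect.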
Abstract:
Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information. Copyright © 2003 John Wiley & Sons, Ltd.
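The incremental net benefit described above is a function of costs and health benefits: INB = lambda * (mean effect difference) - (mean cost difference), positive when the new intervention is cost-effective at willingness-to-pay lambda. The sketch below is illustrative only, not the trial's data or the paper's model; the costs, effect sizes, sample sizes, and lambda = 20,000 are all invented, and a simple bootstrap stands in for the frequentist analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-patient costs and health benefits for two trial arms
cost_new, eff_new = rng.normal(1500, 300, 120), rng.normal(0.82, 0.1, 120)
cost_std, eff_std = rng.normal(1300, 300, 120), rng.normal(0.75, 0.1, 120)

lam = 20_000  # assumed willingness to pay per unit of health benefit

def net_benefit(c1, e1, c0, e0, lam):
    """Incremental net benefit: positive => new arm cost-effective."""
    return lam * (e1.mean() - e0.mean()) - (c1.mean() - c0.mean())

inb = net_benefit(cost_new, eff_new, cost_std, eff_std, lam)

# Bootstrap the sampling distribution of the INB estimator
boot = []
for _ in range(2000):
    i1 = rng.integers(0, 120, 120)
    i0 = rng.integers(0, 120, 120)
    boot.append(net_benefit(cost_new[i1], eff_new[i1],
                            cost_std[i0], eff_std[i0], lam))
boot = np.array(boot)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"INB = {inb:.0f}, 95% interval ({lo:.0f}, {hi:.0f})")
```

A Bayesian analogue would place priors on the arm means and summarise the posterior of the same net benefit quantity; the paper's extension further splits total cost into components.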
Abstract:
Population subdivision complicates analysis of molecular variation. Even if neutrality is assumed, three evolutionary forces need to be considered: migration, mutation, and drift. Simplification can be achieved by assuming that the process of migration among and drift within subpopulations occurs fast compared to mutation and drift in the entire population. This allows a two-step approach in the analysis: (i) analysis of population subdivision and (ii) analysis of molecular variation in the migrant pool. We model population subdivision using an infinite island model, where we allow the migration/drift parameter Theta to vary among populations. Thus, central and peripheral populations can be differentiated. For inference of Theta, we use a coalescence approach, implemented via a Markov chain Monte Carlo (MCMC) integration method that allows estimation of allele frequencies in the migrant pool. The second step of this approach uses the estimated allele frequencies in the migrant pool for the study of molecular variation. We apply this method to a Drosophila ananassae sequence data set. We find little indication of isolation by distance, but large differences in the migration parameter among populations. The population as a whole seems to be expanding. A population from Bogor (Java, Indonesia) shows the highest variation and seems closest to the species center.
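The MCMC machinery can be sketched in miniature. Under an infinite island model, subpopulation allele frequencies are approximately Beta(Theta * p, Theta * (1 - p)) distributed around the migrant-pool frequency p, with larger Theta meaning tighter clustering. The toy sampler below is not the paper's implementation: the five frequencies, the known p = 0.5, the flat prior on Theta, and the random-walk tuning are all assumptions for illustration.

```python
import math
import random

random.seed(2)

# Hypothetical allele frequencies in five subpopulations, assumed
# migrant-pool frequency p_bar, and a flat (improper) prior on theta
freqs = [0.42, 0.55, 0.48, 0.61, 0.38]
p_bar = 0.5

def log_beta_pdf(x, a, b):
    return ((a - 1) * math.log(x) + (b - 1) * math.log(1 - x)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))

def log_lik(theta):
    # Island model: frequencies ~ Beta(theta*p_bar, theta*(1-p_bar))
    return sum(log_beta_pdf(p, theta * p_bar, theta * (1 - p_bar))
               for p in freqs)

# Random-walk Metropolis on log(theta); the log(prop/theta) term is
# the Jacobian correction for proposing on the log scale
theta, samples = 10.0, []
for _ in range(20000):
    prop = theta * math.exp(random.gauss(0, 0.3))
    if math.log(random.random()) < (log_lik(prop) - log_lik(theta)
                                    + math.log(prop / theta)):
        theta = prop
    samples.append(theta)

post = samples[5000:]                      # discard burn-in
mean_theta = sum(post) / len(post)
print(f"posterior mean of theta ~ {mean_theta:.1f}")
```

The real method infers the migrant-pool frequencies jointly rather than fixing them, and allows Theta to differ among populations.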
Abstract:
In the context of the current debate about teaching reading, research to ascertain primary teachers' personal and professional reading practices was undertaken. The study explored teachers' reading habits and preferences, investigated their knowledge of children's literature, and documented their reported use of such texts and involvement with library services. Questionnaire responses were gathered from 1200 teachers. The data were analysed and connections made between the teachers' own reading habits and preferences, their knowledge of children's literature, their accessing practices and pedagogic use of literature in school. This paper reports on part of the dataset and focuses on teachers' knowledge of children's literature; it reveals that primary professionals lean on a narrow repertoire of authors, poets and picture fiction creators. It also discusses teachers' personal reading preferences and considers divergences and connections between these as well as the implications of the teachers' limited repertoires on the reading development of young learners.
Abstract:
Exact error estimates for evaluating multi-dimensional integrals are considered. An estimate is called exact if the rates of convergence for the lower- and upper-bound estimates coincide. An algorithm with such an exact rate is called optimal: it has an unimprovable rate of convergence. The problem of the existence of exact estimates and optimal algorithms is discussed for some functional spaces that define the regularity of the integrand. Data classes important for practical computations are considered: classes of functions with bounded derivatives and with Hölder-type conditions. The aim of the paper is to analyze the performance of two classes of optimal algorithms, deterministic and randomized, for computing multidimensional integrals. It is also shown how the smoothness of the integrand can be exploited to construct better randomized algorithms.
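One standard way smoothness improves a randomized algorithm is stratification: crude Monte Carlo converges at rate N^(-1/2) regardless of regularity, while placing one random point per cell of a grid improves the rate to roughly N^(-1/2 - 1/d) for smooth integrands in d dimensions. The sketch below is a generic illustration of that effect (the test integrand and parameters are my own choices, not from the paper), comparing RMSE over repeated trials at an equal sample budget.

```python
import numpy as np

rng = np.random.default_rng(3)
exact = (np.e - 1) ** 2              # integral of exp(x+y) over the unit square

def f(x, y):
    return np.exp(x + y)

m = 16
n = m * m                            # equal sample budget for both estimators
gx, gy = np.meshgrid(np.arange(m), np.arange(m))

crude_err, strat_err = [], []
for _ in range(50):
    # Crude Monte Carlo: rate O(N^{-1/2}) regardless of smoothness
    x, y = rng.random(n), rng.random(n)
    crude_err.append(f(x, y).mean() - exact)

    # Stratified: one random point per cell of an m x m grid; smoothness
    # of the integrand within each small cell drives the variance down
    sx = (gx.ravel() + rng.random(n)) / m
    sy = (gy.ravel() + rng.random(n)) / m
    strat_err.append(f(sx, sy).mean() - exact)

rmse_crude = float(np.sqrt(np.mean(np.square(crude_err))))
rmse_strat = float(np.sqrt(np.mean(np.square(strat_err))))
print(f"RMSE crude {rmse_crude:.4f} vs stratified {rmse_strat:.4f}")
```

At the same budget the stratified estimator's RMSE is roughly an order of magnitude smaller for this smooth integrand, which is the qualitative point the paper makes quantitatively.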
Abstract:
This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components: the use of Condor pools; Linux/Unix clusters with PBS and IBM's LoadLeveller job-handling tools; the use of Globus for security handling; Condor-G tools for wrapping Globus job-submission commands; Condor's DAGman tool for handling workflow; the Storage Resource Broker for handling data; and the CCLRC dataportal and associated tools both for archiving data with metadata and for making data available to other workers.
Abstract:
As integrated software solutions reshape project delivery, they alter the bases for collaboration and competition across firms in complex industries. This paper synthesises and extends literatures on strategy in project-based industries and digitally-integrated work to understand how project-based firms interact with digital infrastructures for project delivery. Four identified strategies are to: 1) develop and use capabilities to shape the integrated software solutions that are used in projects; 2) co-specialize, developing complementary assets to work repeatedly with a particular integrator firm; 3) retain flexibility by developing and maintaining capabilities in multiple digital technologies and processes; and 4) manage interfaces, translating work into project formats for coordination while hiding proprietary data and capabilities in internal systems. The paper articulates the strategic importance of digital infrastructures for delivery as well as product architectures. It concludes by discussing managerial implications of the identified strategies and areas for further research.
Abstract:
Current mathematical models in building research have, in most studies, been limited to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that a chaos-based model is valid and can handle the increasing complexity of building systems that have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology for linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I), published in the previous issue, reviews the current state of chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos-theory-driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and most relevant characteristics of chaos theory essential to (1) informing building simulation scientists and designers, (2) initiating a dialogue between scientists and engineers, and (3) stimulating future research on a wide range of issues involved in designing and managing building environmental systems.
Abstract:
This paper presents a new approach to modelling flash floods in dryland catchments by integrating remote sensing and digital elevation model (DEM) data in a geographical information system (GIS). The spectral reflectance of channels affected by recent flash floods exhibits a marked increase, due to the deposition of fine sediments in these channels as the flood recedes. This allows the parts of a catchment that have been affected by a recent flood event to be discriminated from unaffected parts, using a time series of Landsat images. Using images of the Wadi Hudain catchment in southern Egypt, the hillslope areas contributing flow were inferred for different flood events. The SRTM3 DEM was used to derive flow direction, flow length, active channel cross-sectional areas and slope. The Manning equation was used to estimate the channel flow velocities, and hence the time-area zones of the catchment. A channel reach that was active during a 1985 runoff event, and that does not receive any tributary flow, was used to estimate a transmission loss rate of 7.5 mm h−1, given the maximum peak discharge estimate. Runoff patterns resulting from different flood events are quite variable; however, the southern part of the catchment appears to have experienced more floods during the period of study (1984–2000), perhaps because the bedrock hillslopes in this area are more effective at runoff production than other parts of the catchment, which are underlain by unconsolidated Quaternary sands and gravels. Due to high transmission loss, runoff generated within the upper reaches is rarely delivered to the alluvial fan and Shalateen city situated at the catchment outlet. The synthetic GIS-based time-area zones, on their own, cannot be relied on to model the hydrographs reliably; physical parameters, such as rainfall intensity, distribution, and transmission loss, must also be considered.
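The Manning equation used for the velocity step is v = (1/n) * R^(2/3) * S^(1/2) in SI units, with n the roughness coefficient, R the hydraulic radius, and S the slope. The sketch below shows that calculation; the roughness, radius, slope, and reach length are illustrative values, not figures from the Wadi Hudain study.

```python
def manning_velocity(n, r_hydraulic, slope):
    """Mean flow velocity (m/s) from the Manning equation, SI units.

    n           -- Manning roughness coefficient
    r_hydraulic -- hydraulic radius = flow area / wetted perimeter (m)
    slope       -- channel slope (m/m)
    """
    return (1.0 / n) * r_hydraulic ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values for a sandy dryland channel (assumed, not the paper's)
v = manning_velocity(n=0.030, r_hydraulic=0.8, slope=0.005)
travel_time_s = 5000.0 / v   # time to traverse a hypothetical 5 km reach
print(f"velocity {v:.2f} m/s, reach travel time {travel_time_s / 60:.0f} min")
```

Travel times like this, computed per channel segment from the DEM-derived lengths and slopes, are what build the time-area zones described above.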
Abstract:
It is indisputable that climate is an important factor in many livestock diseases. Nevertheless, our knowledge of the impact of climate change on livestock infectious diseases is much less certain. Therefore, the aim of this article is to conduct a systematic review of the literature on the topic utilizing available retrospective data and information. Across a corpus of 175 formal publications, limited empirical evidence was offered to underpin many of the main arguments. The literature reviewed was highly polarized and often inconsistent regarding what the future may hold. Historical explorations were rare. However, identifying past drivers of livestock disease may not fully capture the extent to which new and unknown drivers will influence future change. As such, our current predictive capacity is low. We offer a number of recommendations to strengthen this capacity in the coming years. We conclude that our current approach to research on the topic is limiting and unlikely to yield sufficient, actionable evidence to inform future praxis. Therefore, we argue for the creation of a reflexive, knowledge-based system, underpinned by a collective intelligence framework, to support the drawing of inferences across the literature.
Abstract:
Wind generation’s contribution to meeting extreme peaks in electricity demand is a key concern for the integration of wind power. In Great Britain (GB), robustly assessing this contribution directly from power system data (i.e. metered wind-supply and electricity demand) is difficult as extreme peaks occur infrequently (by definition) and measurement records are both short and inhomogeneous. Atmospheric circulation-typing combined with meteorological reanalysis data is proposed as a means to address some of these difficulties, motivated by a case study of the extreme peak demand events in January 2010. A preliminary investigation of the physical and statistical properties of these circulation types suggests that they can be used to identify the conditions that are most likely to be associated with extreme peak demand events. Three broad cases are highlighted as requiring further investigation. The high-over-Britain anticyclone is found to be generally associated with very low winds but relatively moderate temperatures (and therefore moderate peak demands, somewhat in contrast to the classic low-wind cold snap that is sometimes apparent in the literature). In contrast, both longitudinally extended blocking over Scotland/Scandinavia and latitudinally extended troughs over western Europe appear to be more closely linked to the very cold GB temperatures (usually associated with extreme peak demands). In both of these latter situations, wind resource averaged across GB appears to be more moderate.
Abstract:
This paper investigates whether using natural logarithms (logs) of price indices for forecasting inflation rates is preferable to employing the original series. Univariate forecasts for annual inflation rates for a number of European countries and the USA based on monthly seasonal consumer price indices are considered. Stochastic seasonality and deterministic seasonality models are used. In many cases, the forecasts based on the original variables result in substantially smaller root mean squared errors than models based on logs. In turn, if forecasts based on logs are superior, the gains are typically small. This outcome casts doubt on the common practice in the academic literature of forecasting inflation rates based on differences of logs.
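The levels-versus-logs comparison can be made concrete on a toy series. Below, a simulated monthly CPI (invented trend, seasonality, and noise; not the paper's data or models) is forecast one year ahead with a seasonal random walk with drift, once in levels and once in logs, and the implied annual inflation forecasts are scored by RMSE. Which variant wins depends on the data-generating process, which is exactly the paper's empirical question.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative monthly CPI: linear trend + seasonality + noise (not real data)
t = np.arange(240)
cpi = 100 + 0.3 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.4, t.size)

split = 180
d_lvl = np.mean(cpi[12:split] - cpi[:split - 12])               # mean annual change
d_log = np.mean(np.log(cpi[12:split]) - np.log(cpi[:split - 12]))

# Seasonal random-walk-with-drift forecasts of the index for the test period
idx = np.arange(split, 240)
fc_lvl = cpi[idx - 12] + d_lvl             # forecast built in levels
fc_log = cpi[idx - 12] * np.exp(d_log)     # forecast built in logs

# Implied annual inflation forecasts vs realized annual inflation (%)
realized = 100 * (cpi[idx] / cpi[idx - 12] - 1)
inf_lvl = 100 * (fc_lvl / cpi[idx - 12] - 1)
inf_log = 100 * (fc_log / cpi[idx - 12] - 1)

def rmse(e):
    return float(np.sqrt(np.mean(e ** 2)))

rmse_lvl, rmse_log = rmse(inf_lvl - realized), rmse(inf_log - realized)
print(f"inflation RMSE: levels {rmse_lvl:.3f}, logs {rmse_log:.3f}")
```

The paper runs this kind of horse race with richer seasonal models on real CPI data and finds the levels-based forecasts often winning.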
Abstract:
This paper reviews the literature on the distribution of commercial real estate returns. There is growing evidence that the assumption of normality in returns is not safe. Distributions are found to be peaked, fat-tailed and, tentatively, skewed. There is some evidence of compound distributions and non-linearity. Publicly traded real estate assets (such as property company or REIT shares) behave in a fashion more similar to other common stocks. However, as in equity markets, it would be unwise to assume normality uncritically. Empirical evidence for UK real estate markets is obtained by applying distribution-fitting routines to IPD Monthly Index data for the aggregate index and selected sub-sectors. It is clear that normality is rejected in most cases. It is often argued that observed differences in real estate returns are a measurement issue resulting from appraiser behaviour. However, unsmoothing the series does not assist in modelling returns. A large proportion of returns are close to zero, which would be characteristic of a thinly-traded market where new information arrives infrequently. Analysis of quarterly data suggests that, over longer trading periods, return distributions may conform more closely to those found in other asset markets. These results have implications for the formulation and implementation of a multi-asset portfolio allocation strategy.
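A common way to test the peaked, fat-tailed pattern described above is the Jarque-Bera statistic, which combines sample skewness and excess kurtosis and rejects normality when large. The sketch below applies it to simulated returns (a Student-t series standing in for fat-tailed property returns and a Gaussian benchmark); the distributions and scales are illustrative, not the IPD data.

```python
import numpy as np

rng = np.random.default_rng(5)

def jarque_bera(r):
    """JB statistic: large values reject normality (chi-squared, 2 df)."""
    z = (r - r.mean()) / r.std()
    skew = np.mean(z ** 3)
    ex_kurt = np.mean(z ** 4) - 3          # excess kurtosis, 0 under normality
    return r.size / 6 * (skew ** 2 + ex_kurt ** 2 / 4), skew, ex_kurt

# Illustrative monthly returns: fat-tailed (Student-t, df=5) vs normal
fat = rng.standard_t(df=5, size=2000) * 0.02
gauss = rng.normal(0, 0.02, 2000)

jb_fat, _, k_fat = jarque_bera(fat)
jb_norm, _, _ = jarque_bera(gauss)
print(f"JB fat-tailed {jb_fat:.0f} (excess kurtosis {k_fat:.1f}) "
      f"vs normal {jb_norm:.1f}; 5% critical value ~ 5.99")
```

The fat-tailed series is decisively rejected while the Gaussian benchmark is not, mirroring the rejection of normality reported for most of the IPD index series.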