31 results for historical data
in CentAUR: Central Archive University of Reading - UK
Abstract:
This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p0. Next the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright (C) 2007 John Wiley & Sons, Ltd.
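By way of illustration, the following Python sketch shows a Bayesian sample size calculation of the general kind described above for the single binary stream: a conjugate Beta prior is updated, and the smallest n is sought for which, under an assumed true rate p0 + delta, the posterior is likely to give convincing evidence that the rate exceeds p0. The uniform prior, the 0.95 evidence threshold and the 0.9 assurance level are illustrative assumptions, not the paper's values.

```python
# A minimal sketch (not the paper's exact formulation) of Bayesian sample size
# determination for a single binary stream, using a conjugate Beta prior.
# The prior (a, b), evidence threshold and assurance level are assumed values.
from scipy.stats import beta, binom

def assurance(n, p_true, p0, a=1.0, b=1.0, threshold=0.95):
    """Probability, under p_true, that the posterior gives >= `threshold`
    evidence that the success rate exceeds p0."""
    total = 0.0
    for k in range(n + 1):
        # Posterior after k successes in n trials is Beta(a + k, b + n - k).
        post_prob = 1.0 - beta.cdf(p0, a + k, b + n - k)
        if post_prob >= threshold:
            total += binom.pmf(k, n, p_true)
    return total

def sample_size(p0, delta, power=0.9, n_max=1000):
    """Smallest n giving the required assurance when the true rate is p0 + delta."""
    for n in range(10, n_max):
        if assurance(n, p0 + delta, p0) >= power:
            return n
    return None

print(sample_size(p0=0.3, delta=0.15))
```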
Abstract:
Previous studies of the place of Property in the multi-asset portfolio have generally relied on historical data and have been concerned with the supposed risk-reduction effects that Property would have on such portfolios. In this paper a different approach is taken. Not only are expectations data used, but we also concentrate on the required return that Property would have to offer to achieve a holding of 15% in typical UK pension fund portfolios. Using two benchmark portfolios for pension funds, we show that Property's required return is less than that expected, and that it could therefore justify a 15% holding.
Abstract:
Acid mine drainage (AMD) is a widespread environmental problem associated with both working and abandoned mining operations. As part of an overall strategy to determine a long-term treatment option for AMD, a pilot passive treatment plant was constructed in 1994 at Wheal Jane Mine in Cornwall, UK. The plant consists of three separate systems, each containing aerobic reed beds, an anaerobic cell and rock filters, and represents the largest European experimental facility of its kind. The systems differ only by the type of pretreatment utilised to increase the pH of the influent minewater (pH <4): lime dosed (LD), anoxic limestone drain (ALD) and lime free (LF), which receives no form of pretreatment. Historical data (1994-1997) indicate median Fe reduction between 55% and 92%, sulphate removal in the range of 3-38% and removal of target metals (cadmium, copper and zinc) below detection limits, depending on pretreatment and flow rates through the system. A new model to simulate the processes and dynamics of the wetlands systems is described, as well as the application of the model to experimental data collected at the pilot plant. The model is process based, and utilises reaction kinetic approaches based on experimental microbial techniques rather than an equilibrium approach to metal precipitation. The model is dynamic and utilises numerical integration routines to solve a set of differential equations that describe the behaviour of 20 variables over the 17 pilot plant cells on a daily basis. The model outputs at each cell boundary are evaluated and compared with the measured data, and the model is demonstrated to provide a good representation of the complex behaviour of the wetland system for a wide range of variables. (C) 2004 Elsevier B.V. All rights reserved.
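The cell-by-cell kinetic structure lends itself to a compact sketch. The Python fragment below is an illustration rather than the paper's model: it chains a few well-mixed cells and numerically integrates first-order removal of Fe and sulphate. All rate constants, volumes and the reduced two-variable state are assumed values, not the 20-variable, 17-cell formulation.

```python
# A minimal sketch of the process-based, kinetic approach described above:
# each wetland cell is treated as a well-mixed reactor in which dissolved Fe
# and sulphate decay at first-order rates while water flows to the next cell.
import numpy as np
from scipy.integrate import solve_ivp

N_CELLS = 3            # the pilot plant uses 17 cells; 3 keeps the sketch short
V = 10.0               # cell volume, m^3 (assumed)
Q = 2.0                # flow rate, m^3/day (assumed)
K_FE, K_SO4 = 0.9, 0.1 # first-order removal rates, 1/day (assumed)
FE_IN, SO4_IN = 50.0, 300.0  # influent concentrations, mg/L (assumed)

def rhs(t, y):
    fe, so4 = y[:N_CELLS], y[N_CELLS:]
    dfe, dso4 = np.empty(N_CELLS), np.empty(N_CELLS)
    for i in range(N_CELLS):
        fe_up = FE_IN if i == 0 else fe[i - 1]      # upstream concentration
        so4_up = SO4_IN if i == 0 else so4[i - 1]
        dfe[i] = Q / V * (fe_up - fe[i]) - K_FE * fe[i]
        dso4[i] = Q / V * (so4_up - so4[i]) - K_SO4 * so4[i]
    return np.concatenate([dfe, dso4])

sol = solve_ivp(rhs, (0, 60), np.zeros(2 * N_CELLS), t_eval=[60])
print("steady-state Fe per cell:", sol.y[:N_CELLS, -1].round(2))
```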
Abstract:
Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (T max) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation through residual and correlation analyses of modelled and observed values, and through comparisons with other studies, indicates that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
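One common way to fit such a city-specific threshold model is a 'hockey-stick' regression: deaths are zero below the threshold and rise linearly above it, with the threshold chosen by grid search. The Python sketch below illustrates the idea on synthetic data; the functional form, the synthetic series and the grid are assumptions, not the paper's estimation procedure.

```python
# A hedged sketch of fitting a 'threshold temperature' model: heat-related
# deaths are zero below the threshold and rise linearly above it, with the
# threshold chosen by grid search over candidate values.
import numpy as np

rng = np.random.default_rng(0)
tmax = rng.uniform(18, 38, 500)                      # daily summer T_max, degC
deaths = np.maximum(0, 2.5 * (tmax - 29)) + rng.poisson(1, 500)

def fit_threshold(tmax, deaths, grid):
    best = None
    for thr in grid:
        x = np.maximum(0.0, tmax - thr)              # excess heat above threshold
        slope = (x @ deaths) / (x @ x)               # least-squares slope, no intercept
        sse = np.sum((deaths - slope * x) ** 2)
        if best is None or sse < best[0]:
            best = (sse, thr, slope)
    return best[1], best[2]

thr, slope = fit_threshold(tmax, deaths, np.arange(20, 35, 0.5))
print(f"estimated threshold = {thr:.1f} degC, slope = {slope:.2f} deaths/degC")
```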
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters, the planting date, and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can be easily extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
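The two skill measures quoted above are straightforward to compute. The Python sketch below shows both: the correlation between observed and simulated yields, and the root mean square error expressed as a percentage of the mean observed yield. The arrays are placeholders, not GLAM output.

```python
# A small sketch of the skill measures quoted above: correlation between
# observed and simulated yields, and RMSE as a percentage of the mean yield.
import numpy as np

def skill(observed, simulated):
    r = np.corrcoef(observed, simulated)[0, 1]
    rmse_pct = 100 * np.sqrt(np.mean((observed - simulated) ** 2)) / observed.mean()
    return r, rmse_pct

obs = np.array([820., 910., 760., 1005., 880.])   # e.g. kg/ha, illustrative values
sim = np.array([790., 950., 800., 980., 860.])
r, rmse_pct = skill(obs, sim)
print(f"r = {r:.2f}, RMSE = {rmse_pct:.1f}% of mean yield")
```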
Abstract:
Varroa destructor is a parasitic mite of the Eastern honeybee Apis cerana. Fifty years ago, two distinct evolutionary lineages (Korean and Japanese) invaded the Western honeybee Apis mellifera. This haplo-diploid parasite species reproduces mainly through brother-sister matings, a system which largely favors the fixation of new mutations. In a worldwide sample of 225 individuals from 21 locations collected on Western honeybees and analyzed at 19 microsatellite loci, a series of de novo mutations was observed. Using historical data concerning the invasion, this original biological system has been exploited to compare three mutation models with allele size constraints for microsatellite markers: stepwise (SMM) and generalized (GSM) mutation models, and a model with mutation rate increasing exponentially with microsatellite length (ESM). Posterior probabilities of the three models have been estimated for each locus individually using reversible jump Markov Chain Monte Carlo. The relative support of each model varies widely among loci, but the GSM is the only model that always receives at least 9% support, whatever the locus. The analysis also provides robust estimates of mutation parameters for each locus and of the divergence time of the two invasive lineages (67,000 generations with a 90% credibility interval of 35,000-174,000). With an average of 10 generations per year, this divergence time fits with the last post-glacial Korea-Japan land separation. (C) 2005 Elsevier Inc. All rights reserved.
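To make the contrast between the first two mutation models concrete, the Python sketch below simulates allele-size evolution under the stepwise model (SMM), where a mutation shifts allele size by exactly one repeat, and the generalised model (GSM), where the step size follows a geometric distribution. The mutation rate, geometric parameter and size bounds are assumed values, and this is not the paper's reversible jump MCMC analysis.

```python
# An illustrative simulation of two of the mutation models compared above,
# both with allele-size constraints: SMM (unit steps) and GSM (geometric steps).
import numpy as np

rng = np.random.default_rng(1)
MU, P_GEOM, A_MIN, A_MAX = 1e-4, 0.3, 5, 50   # assumed rate, step law and bounds

def mutate(allele, model):
    if rng.random() >= MU:
        return allele
    step = 1 if model == "SMM" else rng.geometric(P_GEOM)
    step *= rng.choice([-1, 1])                      # expand or contract
    return int(np.clip(allele + step, A_MIN, A_MAX)) # allele-size constraints

def evolve(generations, model, start=20):
    a = start
    for _ in range(generations):
        a = mutate(a, model)
    return a

# Evolve one lineage for the estimated 67,000 generations under each model.
print([evolve(67_000, m) for m in ("SMM", "GSM")])
```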
Abstract:
Prediction theories have been applied for many years in industries such as manufacturing, defence and aerospace. Although these theories are not new, they have not been widely adopted within the building services industry, which should take a deeper look at these approaches alongside the traditional deterministic approaches currently in use. This paper therefore seeks to provide the industry with an overview of how simplified stochastic modelling, coupled with availability and reliability predictions using historical data compiled from various sources, could enhance the quality of building services systems.
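As a flavour of what such predictions involve, the Python sketch below estimates steady-state availability and mission reliability from historical failure and repair records, using the standard MTBF/MTTR relations and an exponential failure model. The sample records are invented, and the paper's own modelling is more elaborate than this.

```python
# A minimal sketch of availability and reliability prediction from historical
# failure data: MTBF and MTTR are estimated from records, availability is
# MTBF / (MTBF + MTTR), and reliability assumes exponential times to failure.
import math

failure_hours = [4200.0, 3900.0, 5100.0, 4600.0]   # observed times between failures
repair_hours = [6.0, 9.0, 4.0, 7.0]                # observed times to repair

mtbf = sum(failure_hours) / len(failure_hours)     # mean time between failures
mttr = sum(repair_hours) / len(repair_hours)       # mean time to repair

availability = mtbf / (mtbf + mttr)                # steady-state availability
reliability_1yr = math.exp(-8760.0 / mtbf)         # exponential model, 1-year mission

print(f"MTBF = {mtbf:.0f} h, availability = {availability:.4f}, "
      f"R(8760 h) = {reliability_1yr:.3f}")
```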
Abstract:
Daily sunshine duration is commonly reported at weather stations. Beyond the basic duration report, more information is available from scorched cards of Campbell-Stokes sunshine recorders, such as the estimation of direct-beam solar irradiance. Sunshine cards therefore potentially provide information on sky state, as inferred from solar-radiation data. Some sites have been operational since the late 19th century, hence sunshine cards potentially provide underexploited historical data on sky state. Sunshine cards provide an example of an archive source yielding data beyond the measurements originally sought.
Abstract:
Changes in climate variability and, in particular, changes in extreme climate events are likely to be of far more significance for environmentally vulnerable regions than changes in the mean state. It is generally accepted that sea-surface temperatures (SSTs) play an important role in modulating rainfall variability. Consequently, SSTs can be prescribed in global and regional climate modelling in order to study the physical mechanisms behind rainfall and its extremes. Using a satellite-based daily rainfall historical data set, this paper describes the main patterns of rainfall variability over southern Africa, identifies the dates when extreme rainfall occurs within these patterns, and shows the effect of resolution in trying to identify the location and intensity of SST anomalies associated with these extremes in the Atlantic and southwest Indian Ocean. Derived from a Principal Component Analysis (PCA), the results also suggest that, for the spatial pattern accounting for the highest amount of variability, extremes extracted at a higher spatial resolution do give a clearer indication regarding the location and intensity of anomalous SST regions. As the amount of variability explained by each spatial pattern defined by the PCA decreases, it would appear that extremes extracted at a lower resolution give a clearer indication of anomalous SST regions.
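The PCA-and-extremes step described above can be sketched compactly. In the Python fragment below, daily rainfall fields (days × grid cells) are decomposed into spatial patterns via an SVD of the anomalies, and the days loading most strongly on the leading pattern are flagged as extremes. The array shapes, synthetic data and 95th-percentile cutoff are illustrative assumptions.

```python
# A brief sketch of a PCA of daily rainfall fields and extraction of extreme
# days on the leading spatial pattern, as a stand-in for the analysis above.
import numpy as np

rng = np.random.default_rng(2)
rain = rng.gamma(2.0, 3.0, size=(1000, 64))        # 1000 days x 64 grid cells

anom = rain - rain.mean(axis=0)                    # daily anomalies
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)                    # variance fraction per pattern

scores = anom @ vt[0]                              # loadings on leading pattern
extreme_days = np.where(scores > np.percentile(scores, 95))[0]
print(f"PC1 explains {explained[0]:.1%}; {extreme_days.size} extreme days flagged")
```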
Abstract:
Office returns in the City of London are more volatile than in other UK markets. This volatility may reflect fluctuations in capital flows associated with changing patterns of ownership and the growing linkage between real estate and financial markets in the City. Using current and historical data, patterns of ownership in the City are investigated. They reveal that overseas ownership has grown markedly since 1985, that owners are predominantly FIRE-sector firms and that there are strong links between ownership and occupation. This raises concerns about future volatility and systemic risk in a market strongly influenced by the cyclical behaviour and shocks of the international financial system.
Abstract:
This article looks at an important but neglected aspect of medieval sovereign debt, namely ‘accounts payable’ owed by the Crown to merchants and employees. It focuses on the unusually well-documented relationship between Henry III, King of England between 1216 and 1272, and Flemish merchants from the towns of Douai and Ypres, who provided cloth on credit to the royal wardrobe. From the surviving royal documents, we reconstruct the credit advanced to the royal wardrobe by the merchants of Ypres and Douai for each year between 1247 and 1270, together with the king's repayment history. The interactions between the king and the merchants are then analysed. The insights from this analysis are applied to the historical data to explain the trading decisions made by the merchants during this period, as well as why the strategies of the Yprois sometimes differed from those of the Douaissiens.
Abstract:
We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time-shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
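A hedged baseline for the similarity notion described above is shown in the Python sketch below: two load profiles are compared under every shift of up to max_shift time slots and the best alignment is kept. This brute-force version is for illustration only; the abstract's point is precisely that its graph-based algorithm computes such a measure far more efficiently.

```python
# A brute-force illustration of shift-tolerant profile similarity: the smallest
# mean absolute difference over all alignments within +/- max_shift slots.
import numpy as np

def shifted_similarity(a, b, max_shift=2):
    """Smallest mean absolute difference over shifts in [-max_shift, max_shift]."""
    best = np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            d = np.abs(a[s:] - b[:len(b) - s])       # a shifted forward by s
        else:
            d = np.abs(a[:s] - b[-s:])               # b shifted forward by -s
        best = min(best, d.mean())
    return best

day1 = np.array([0.2, 0.3, 1.5, 2.0, 0.4, 0.3])
day2 = np.array([0.3, 1.4, 2.1, 0.5, 0.3, 0.2])      # similar shape, shifted by one
print(shifted_similarity(day1, day2))
```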
Abstract:
We present a refined parametric model for forecasting electricity demand which performed particularly well in the recent Global Energy Forecasting Competition (GEFCom 2012). We begin by motivating and presenting a simple parametric model, treating the electricity demand as a function of the temperature and the day of the week. We then set out a series of refinements of the model, explaining the rationale for each, and using the competition scores to demonstrate that each successive refinement step increases the accuracy of the model’s predictions. These refinements include combining models from multiple weather stations, removing outliers from the historical data, and special treatments of public holidays.
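A toy version of the simple parametric starting point is sketched in Python below: demand is modelled as a quadratic function of temperature plus a day-of-week effect, fitted by ordinary least squares. The synthetic data and the choice of basis are assumptions for illustration, not the GEFCom 2012 model itself.

```python
# A toy parametric demand model: quadratic temperature response plus one dummy
# per day of week, fitted by ordinary least squares on synthetic daily data.
import numpy as np

rng = np.random.default_rng(3)
n = 730
temp = 12 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
dow = np.arange(n) % 7
demand = 50 - 1.2 * temp + 0.08 * temp**2 + 3 * (dow < 5) + rng.normal(0, 1, n)

# Design matrix: intercept, temperature, temperature squared, day-of-week dummies.
X = np.column_stack([np.ones(n), temp, temp**2] +
                    [(dow == d).astype(float) for d in range(1, 7)])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
fitted = X @ coef
print(f"in-sample RMSE = {np.sqrt(np.mean((demand - fitted) ** 2)):.2f}")
```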