22 results for PROBABILITY-DISTRIBUTIONS
in CentAUR: Central Archive University of Reading - UK
Abstract:
Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
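A minimal sketch of the probability integral transform (PIT) idea behind the whole-density evaluation, assuming hypothetical normal density forecasts and using a Kolmogorov-Smirnov test as a stand-in for the paper's formal tests: if the forecast densities are correct, the PIT values are i.i.d. uniform on [0, 1].

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: outcomes drawn from N(2, 1); the forecaster issues N(2, 1)
# densities, so the forecasts are correct by construction.
outcomes = rng.normal(2.0, 1.0, size=500)

# Probability integral transform: evaluate each forecast CDF at the outcome.
pits = stats.norm.cdf(outcomes, loc=2.0, scale=1.0)

# Correct density forecasts give uniform PITs; departures from uniformity
# signal miscalibration of the forecast distributions.
ks = stats.kstest(pits, "uniform")
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```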
Abstract:
We consider whether survey respondents’ probability distributions, reported as histograms, provide reliable and coherent point predictions, when viewed through the lens of a Bayesian learning model. We argue that a role remains for eliciting directly-reported point predictions in surveys of professional forecasters.
Abstract:
AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the methodologies available it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment is 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant and sequence dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
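The event-tree calculation with distributional inputs can be illustrated by a short Monte Carlo sketch; the branch structure and beta distributions below are hypothetical stand-ins, not the twelve distributions quantified in the assessment.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # Monte Carlo trials

# Hypothetical stand-ins for the assessment's input distributions (the real
# study quantified 12 of them). All branches are conditional on core melt
# having relocated to the lower head.
p_energetic = rng.beta(1.5, 40, n)   # in-vessel steam explosion occurs
p_vessel = rng.beta(1.0, 25, n)      # slug impact fails the upper head
p_contain = rng.beta(1.0, 15, n)     # resulting missile fails containment

# Chain the conditional branch probabilities along the event-tree path.
p_alpha = p_energetic * p_vessel * p_contain

# The output is itself a distribution; report summary statistics.
print(f"mean alpha-mode probability: {p_alpha.mean():.1e}")
print(f"95th percentile:             {np.percentile(p_alpha, 95):.1e}")
```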
Abstract:
A high-resolution record of sea-level change spanning the past 1000 years is derived from foraminiferal and chronological analyses of a 2 m thick salt-marsh peat sequence at Chezzetcook, Nova Scotia, Canada. Former mean tide level positions are reconstructed with a precision of +/- 0.055 m using a transfer function derived from distributions of modern salt-marsh foraminifera. Our age model for the core section older than 300 years is based on 19 AMS C-14 ages and takes into account the individual probability distributions of calibrated radiocarbon ages. The past 300 years is dated by pollen and the isotopes Pb-206, Pb-207, Pb-210, Cs-137 and Am-241. Between AD 1000 and AD 1800, relative sea level rose at a mean rate of 17 cm per century. Apparent pre-industrial rises of sea level dated at AD 1500-1550 and AD 1700-1800 cannot be clearly distinguished when radiocarbon age errors are taken into account. Furthermore, they may be an artefact of fluctuations in atmospheric C-14 production. In the 19th century sea level rose at a mean rate of 1.6 mm/yr. Between AD 1900 and AD 1920, sea-level rise accelerated to the modern mean rate of 3.2 mm/yr. This acceleration corresponds in time with global temperature rise and may therefore be associated with recent global warming.
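A sketch of why carrying full calibrated-age probability distributions, rather than point ages, matters for an age model: sample whole chronologies and propagate them to a target depth. The depths, ages and normal approximations below are invented for illustration (real calibrated ages are multimodal, but the principle is the same).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical calibrated-age distributions for three depths, approximated
# here as normals for simplicity.
depths_m = np.array([0.5, 1.0, 1.5])
cal_age_mean = np.array([400.0, 700.0, 950.0])   # cal yr BP
cal_age_sd = np.array([40.0, 55.0, 60.0])

n = 10_000
ages = rng.normal(cal_age_mean, cal_age_sd, size=(n, len(depths_m)))
ages.sort(axis=1)  # enforce stratigraphic order: deeper = older

# Interpolate each sampled chronology to a target depth, then summarise.
target_depth = 1.2
age_at_target = np.array([np.interp(target_depth, depths_m, a) for a in ages])
lo, hi = np.percentile(age_at_target, [2.5, 97.5])
print(f"age at {target_depth} m: {age_at_target.mean():.0f} cal yr BP "
      f"(95% interval {lo:.0f}-{hi:.0f})")
```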
Abstract:
A water quality model is used to assess the impact of possible climate change on dissolved oxygen (DO) in the Thames. The Thames catchment is densely populated and, typically, many pressures are anthropogenic. However, that same population also relies on the river for potable water supply and as a disposal route for treated wastewater. Thus, future water quality will be highly dependent on future activity. Dynamic and stochastic modelling has been used to assess the likely impacts on DO dynamics along the river system and the probability distributions associated with future variability. The modelling predictions indicate that warmer river temperatures and drought act to reduce dissolved oxygen concentrations in lowland river systems.
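A toy version of the stochastic approach: propagate hypothetical distributions for river temperature and flow through a standard empirical oxygen-saturation polynomial (Elmore-Hayes style) to obtain a probability distribution for DO. All numbers are illustrative, not taken from the Thames model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical future summer river temperatures (warmer, more variable).
temp_c = rng.normal(22.0, 2.5, n)

# DO saturation falls with temperature; a standard empirical fit (mg/L).
do_sat = (14.652 - 0.41022 * temp_c + 0.0079910 * temp_c**2
          - 7.7774e-5 * temp_c**3)

# Hypothetical oxygen deficit from respiration/BOD, worse in drought
# (i.e. at low relative discharge).
flow = rng.lognormal(mean=1.0, sigma=0.5, size=n)
deficit = rng.normal(2.0, 0.5, n) / np.clip(flow, 0.2, None)

do = np.clip(do_sat - deficit, 0.0, None)
print(f"P(DO < 5 mg/L) = {(do < 5.0).mean():.2%}")
```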
Abstract:
A suite of climate change indices derived from daily temperature and precipitation data, with a primary focus on extreme events, was computed and analyzed. By setting an exact formula for each index and using specially designed software, analyses done in different countries have been combined seamlessly. This has enabled the presentation of the most up-to-date and comprehensive global picture of trends in extreme temperature and precipitation indices, using results from a number of workshops held in data-sparse regions and high-quality station data supplied by numerous scientists worldwide. Seasonal and annual indices for the period 1951-2003 were gridded. Trends in the gridded fields were computed and tested for statistical significance. Results showed widespread significant changes in temperature extremes associated with warming, especially for those indices derived from daily minimum temperature. Over 70% of the global land area sampled showed a significant decrease in the annual occurrence of cold nights and a significant increase in the annual occurrence of warm nights. Some regions experienced a more than doubling of these indices. This implies a positive shift in the distribution of daily minimum temperature throughout the globe. Daily maximum temperature indices showed similar changes but with smaller magnitudes. Precipitation changes showed a widespread and significant increase, but the changes are much less spatially coherent compared with temperature change. Probability distributions of indices derived from approximately 200 temperature and 600 precipitation stations, with near-complete data for 1901-2003 and covering a very large region of the Northern Hemisphere midlatitudes (and parts of Australia for precipitation), were analyzed for the periods 1901-1950, 1951-1978 and 1979-2003. Results indicate a significant warming throughout the 20th century. Differences in temperature indices distributions are particularly pronounced between the most recent two periods and for those indices related to minimum temperature. An analysis of those indices for which seasonal time series are available shows that these changes occur for all seasons, although they are generally least pronounced for September to November. Precipitation indices show a tendency toward wetter conditions throughout the 20th century.
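A sketch of an ETCCDI-style percentile index of the kind analysed here (warm nights, TN90p; cold nights, TN10p), computed on synthetic daily minimum temperatures with thresholds taken from a base period, as in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily minimum temperatures: a base period and a later, warmer one.
tmin_base = rng.normal(8.0, 4.0, size=365 * 30)   # stand-in for 1961-1990
tmin_late = rng.normal(8.8, 4.0, size=365 * 25)   # stand-in for a later period

# "Warm nights" (TN90p): percentage of days with Tmin above the base-period
# 90th percentile; "cold nights" (TN10p) uses the 10th percentile.
p90 = np.percentile(tmin_base, 90)
p10 = np.percentile(tmin_base, 10)

print(f"warm nights, base period:  {(tmin_base > p90).mean():.1%}")  # ~10%
print(f"warm nights, later period: {(tmin_late > p90).mean():.1%}")
print(f"cold nights, later period: {(tmin_late < p10).mean():.1%}")
```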
Abstract:
Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring to detect disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies that were included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements [incremental cost per quality-adjusted life-year (QALY) of £9500] and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders. It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring given that short stature is not a disease in itself, but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal children to determine sensitivity and specificity of growth monitoring.
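The "cost-effective 100% of the time at £30,000 per QALY" statement is a cost-effectiveness acceptability calculation. A minimal sketch with hypothetical cost and QALY distributions follows; only the roughly £9500 per QALY figure echoes the review, and the spreads are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Hypothetical draws from the model's incremental cost and QALY-gain
# distributions (chosen so the mean ICER is ~GBP 9,500 per QALY).
incr_cost = rng.normal(950.0, 150.0, n)   # incremental cost, GBP
incr_qaly = rng.normal(0.10, 0.03, n)     # incremental QALYs

# At willingness-to-pay threshold L, monitoring is cost-effective when the
# net monetary benefit L * dQALY - dCost is positive.
threshold = 30_000.0
nmb = threshold * incr_qaly - incr_cost
print(f"P(cost-effective at GBP {threshold:,.0f}/QALY) = {(nmb > 0).mean():.1%}")
```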
Abstract:
We describe a Bayesian approach to analyzing multilocus genotype or haplotype data to assess departures from gametic (linkage) equilibrium. Our approach employs a Markov chain Monte Carlo (MCMC) algorithm to approximate the posterior probability distributions of disequilibrium parameters. The distributions are computed exactly in some simple settings. Among other advantages, posterior distributions can be presented visually, which allows the uncertainties in parameter estimates to be readily assessed. In addition, background knowledge can be incorporated, where available, to improve the precision of inferences. The method is illustrated by application to previously published datasets; implications for multilocus forensic match probabilities and for simple association-based gene mapping are also discussed.
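A minimal random-walk Metropolis sketch for the posterior of a two-locus gametic disequilibrium coefficient D, with hypothetical gamete counts and the allele frequencies held fixed at their estimates; the paper's MCMC handles richer parameterisations, but the mechanics are the same.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical gamete counts for two biallelic loci: AB, Ab, aB, ab.
counts = np.array([55, 20, 15, 10])
n = counts.sum()

# Fix marginal allele frequencies at their estimates; sample only D.
pA = (counts[0] + counts[1]) / n
pB = (counts[0] + counts[2]) / n

def log_lik(D):
    # Gamete frequencies under disequilibrium coefficient D.
    f = np.array([pA * pB + D, pA * (1 - pB) - D,
                  (1 - pA) * pB - D, (1 - pA) * (1 - pB) + D])
    if np.any(f <= 0):
        return -np.inf  # outside the admissible range of D
    return np.sum(counts * np.log(f))

# Random-walk Metropolis with a flat prior over the admissible range.
samples, D = [], 0.0
for _ in range(20_000):
    prop = D + rng.normal(0.0, 0.02)
    if np.log(rng.uniform()) < log_lik(prop) - log_lik(D):
        D = prop
    samples.append(D)

post = np.array(samples[5_000:])  # drop burn-in
print(f"posterior mean D = {post.mean():.3f}, 95% interval "
      f"({np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f})")
```

The posterior can then be plotted directly, which is the visual-presentation advantage the abstract highlights.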
Abstract:
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of a different outcome to the one expected (modelled) the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or other spreadsheet) and to work with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
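A sketch of the kind of correlated Monte Carlo appraisal that Crystal Ball automates over an Excel model, here with two hypothetical correlated inputs (sale price and build cost) drawn from a bivariate normal; all figures are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Hypothetical appraisal inputs (GBP). Sale price and build cost are
# positively correlated: a strong market tends to coincide with higher
# construction costs.
mean = [5_000_000, 3_000_000]   # expected sale price, build cost
sd = [600_000, 250_000]
corr = 0.6

cov = np.array([[sd[0]**2, corr * sd[0] * sd[1]],
                [corr * sd[0] * sd[1], sd[1]**2]])
price, cost = rng.multivariate_normal(mean, cov, size=n).T

land_and_fees = 1_200_000       # treated as fixed for the sketch
profit = price - cost - land_and_fees

print(f"expected profit:        {profit.mean():,.0f}")
print(f"P(loss):                {(profit < 0).mean():.1%}")
print(f"5th percentile profit:  {np.percentile(profit, 5):,.0f}")
```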
Abstract:
An efficient method of combining neutron diffraction data over an extended Q range with detailed atomistic models is presented. A quantitative and qualitative mapping of the organization of the chain conformation in both glass and liquid phase has been performed. The proposed structural refinement method is based on the exploitation of the intrachain features of the diffraction pattern by the use of internal coordinates for bond lengths, valence angles and torsion rotations. Models are built stochastically by assignment of these internal coordinates from probability distributions with limited variable parameters. Variation of these parameters is used in the construction of models that minimize the differences between the observed and calculated structure factors. A series of neutron scattering data of 1,4-polybutadiene over the region 20-320 K is presented. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 and 1.35 Å respectively. Valence angles of the backbone were found to be at 112° and 122.8° for the CCC and CC=C respectively. Three torsion angles, corresponding to the double bond and the adjacent α and β bonds, were found to occupy cis and trans, s± and trans, and g± and trans states, respectively. We compare our results with theoretical predictions, computer simulations, RIS models, and previously reported experimental results.
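A toy illustration of the stochastic build-and-fit loop: sample a torsion angle from a three-state distribution with one variable parameter and keep the parameter value that minimises the misfit to a target pattern. The real method fits the neutron structure factor S(Q) and varies many coordinates at once; here the "observed" data are synthetic and the target is a simple histogram.

```python
import numpy as np

rng = np.random.default_rng(8)

# Sample torsions from a hypothetical three-state (trans/gauche+/gauche-)
# distribution with one variable parameter, the trans population; bond
# lengths and valence angles would be sampled analogously.
def sample_torsions(p_trans, n):
    means = rng.choice([180.0, 60.0, -60.0], size=n,
                       p=[p_trans, (1 - p_trans) / 2, (1 - p_trans) / 2])
    return means + rng.normal(0.0, 10.0, n)  # librational broadening

def histo(angles):
    bins = np.linspace(-180, 180, 37)
    return np.histogram(angles, bins=bins, density=True)[0]

# Synthetic "observed" data generated with trans population 0.7.
obs = histo(sample_torsions(0.7, 50_000))

# Stochastic refinement: vary the parameter, keep the model that minimises
# the misfit between observed and calculated patterns.
misfit, best_p = min(
    (np.sum((histo(sample_torsions(p, 50_000)) - obs) ** 2), p)
    for p in np.linspace(0.4, 0.9, 26))
print(f"best-fit trans population: {best_p:.2f} (expect ~0.70)")
```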
Abstract:
Risk and uncertainty are, to say the least, poorly considered by most individuals involved in real estate analysis - in both development and investment appraisal. Surveyors continue to express 'uncertainty' about the value (risk) of using relatively objective methods of analysis to account for these factors. These methods attempt to identify the risk elements more explicitly. Conventionally this is done by deriving probability distributions for the uncontrolled variables in the system. A suggested 'new' way of "being able to express our uncertainty or slight vagueness about some of the qualitative judgements and not entirely certain data required in the course of the problem..." uses the application of fuzzy logic. This paper discusses and demonstrates the terminology and methodology of fuzzy analysis. In particular it attempts a comparison of the procedures with those used in 'conventional' risk analysis approaches and critically investigates whether a fuzzy approach offers an alternative to the use of probability based analysis for dealing with aspects of risk and uncertainty in real estate analysis.
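For concreteness, a small sketch of the fuzzy alternative discussed here: triangular fuzzy numbers and alpha-cut interval arithmetic in place of probability distributions. The appraisal figures are invented.

```python
import numpy as np

# A triangular fuzzy number (a, b, c): lowest possible, most likely, highest
# possible. Fuzzy arithmetic on alpha-cuts replaces probability distributions
# with possibility intervals at each membership level.
def alpha_cut(tfn, alpha):
    a, b, c = tfn
    return a + alpha * (b - a), c - alpha * (c - b)

# Hypothetical appraisal inputs as fuzzy numbers (GBP millions).
sale_price = (4.2, 5.0, 5.6)
build_cost = (2.7, 3.0, 3.5)

# Interval subtraction at each alpha level:
# [p_lo, p_hi] - [c_lo, c_hi] = [p_lo - c_hi, p_hi - c_lo].
for alpha in (0.0, 0.5, 1.0):
    p_lo, p_hi = alpha_cut(sale_price, alpha)
    c_lo, c_hi = alpha_cut(build_cost, alpha)
    print(f"alpha={alpha:.1f}: profit in [{p_lo - c_hi:.2f}, {p_hi - c_lo:.2f}]")
```

At alpha = 1 the intervals collapse to the most-likely values; at alpha = 0 they span the full range of possibility, which is the sense in which the approach expresses "slight vagueness" without a probability measure.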
Abstract:
From its modern origins, associated with the urbanising effect of industrialisation, walking has remained a popular form of outdoor recreation. It has, furthermore, remained an important site of class struggle, with the 'landless' seeking to establish their moral 'citizen' right to roam over open country in contradistinction to the 'landed', who have successfully limited this right to legally-defined public rights of way. In the face of declining farm incomes, however, farmers and landowners have, apparently, modified their attitudes towards public access, but only in return for compensation and management payments under grant schemes such as Countryside Stewardship and the Countryside Premium Scheme. With the Ministry of Agriculture, Fisheries and Food now seeking to extend paid access arrangements to other grant schemes, as part of its response to the European Union's Agri-Environment Regulations, access 'rights' are assuming an increasingly commodified form, thereby questioning, if not undermining, the former citizen claims. For rather than being a benefit of citizenship, the existence of limited, often poorly maintained and inadequately signposted, public rights of way has tied inextricably the extension of legally-enforceable access to the needs of the landowners and farmers. At a time of falling prosperity in agriculture, therefore, they have now exercised their discretion by annexing the populism of consumer culture to reproduce the bourgeois liberal values of the market as a principal determinant of the extension of citizen rights of access to the countryside.
Abstract:
During the 20th century, solar activity increased in magnitude to a so-called grand maximum. It is probable that this high level of solar activity is at or near its end. It is of great interest whether any future reduction in solar activity could have a significant impact on climate that could partially offset the projected anthropogenic warming. Observations and reconstructions of solar activity over the last 9000 years are used as a constraint on possible future variations to produce probability distributions of total solar irradiance over the next 100 years. Using this information, with a simple climate model, we present results of the potential implications for future projections of climate on decadal to multidecadal timescales. Using one of the most recent reconstructions of historic total solar irradiance, the likely reduction in the warming by 2100 is found to be between 0.06 and 0.1 K, a very small fraction of the projected anthropogenic warming. However, if past total solar irradiance variations are larger and climate models substantially underestimate the response to solar variations, then there is a potential for a reduction in solar activity to mitigate a small proportion of the future warming, a scenario we cannot totally rule out. While the Sun is not expected to provide substantial delays in the time to reach critical temperature thresholds, any small delays it might provide are likely to be greater for lower anthropogenic emissions scenarios than for higher-emissions scenarios.
Abstract:
A comparison of the point forecasts and the probability distributions of inflation and output growth made by individual respondents to the US Survey of Professional Forecasters indicates that the two sets of forecasts are sometimes inconsistent. We evaluate a number of possible explanations, and find that not all forecasters update their histogram forecasts as new information arrives. This is supported by the finding that the point forecasts are more accurate than the histograms in terms of first-moment prediction.
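A minimal sketch of the consistency check implied here: compare a reported point forecast with the first moment of the reported histogram, assuming probability mass sits at bin midpoints (a common, if crude, convention). The histogram and point forecast are hypothetical, not SPF data.

```python
import numpy as np

# A hypothetical SPF-style inflation histogram: bin edges (%) and
# probabilities attached to each bin.
edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
probs = np.array([0.05, 0.25, 0.40, 0.25, 0.05])

# Histogram-implied first moment under the bin-midpoint convention.
midpoints = (edges[:-1] + edges[1:]) / 2
implied_mean = np.sum(probs * midpoints)

# Consistency check against the directly reported point forecast.
point_forecast = 3.4
bin_idx = np.digitize(point_forecast, edges) - 1
print(f"histogram-implied mean: {implied_mean:.2f}%")
print(f"point forecast {point_forecast}% falls in bin {bin_idx} "
      f"(p = {probs[bin_idx]:.2f})")
print(f"gap exceeds half a bin: {abs(point_forecast - implied_mean) > 0.5}")
```

A forecaster who updates the point forecast but not the histogram, as the abstract suggests some do, would show exactly this kind of gap.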