996 results for "Probabilities"
Abstract:
The intensity and distribution of daily precipitation is predicted to change under scenarios of increased greenhouse gases (GHGs). In this paper, we analyse the ability of HadCM2, a general circulation model (GCM), and a high-resolution regional climate model (RCM), both developed at the Met Office's Hadley Centre, to simulate extreme daily precipitation by reference to observations. A detailed analysis of daily precipitation is made at two UK grid boxes, where probabilities of reaching daily thresholds in the GCM and RCM are compared with observations. We find that the RCM generally overpredicts probabilities of extreme daily precipitation but that, when the GCM and RCM simulated values are scaled to have the same mean as the observations, the RCM captures the upper-tail distribution more realistically. To compare regional changes in daily precipitation in the GHG-forced period 2080-2100 in the GCM and the RCM, we develop two methods. The first considers the fractional changes in probability of local daily precipitation reaching or exceeding a fixed 15 mm threshold in the anomaly climate compared with the control. The second method uses the upper one-percentile of the control at each point as the threshold. Agreement between the models is better in both seasons with the latter method, which we suggest may be more useful when considering larger scale spatial changes. On average, the probability of precipitation exceeding the 1% threshold increases by a factor of 2.5 (GCM and RCM) in winter and by 1.7 (GCM) or 1.3 (RCM) in summer.
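As a rough illustration of the two threshold methods described in this abstract, the sketch below compares exceedance probabilities between a control and a scenario series for a fixed 15 mm/day threshold and for the control's upper one-percentile. The gamma-distributed series, sample sizes, and parameters are invented stand-ins for model grid-box output, not values from the paper.

```python
# Sketch of the two threshold-exceedance comparisons described above, using
# synthetic daily precipitation series as a stand-in for GCM/RCM grid-box
# output; distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
control = rng.gamma(shape=0.7, scale=6.0, size=20 * 90)   # ~20 seasons of daily totals (mm)
scenario = rng.gamma(shape=0.7, scale=8.0, size=20 * 90)  # forced period, wetter tail

def exceedance_prob(x, threshold):
    """Probability that daily precipitation reaches or exceeds the threshold."""
    return np.mean(x >= threshold)

# Method 1: fixed absolute threshold (15 mm/day everywhere).
fixed = 15.0
ratio_fixed = exceedance_prob(scenario, fixed) / exceedance_prob(control, fixed)

# Method 2: threshold set to the upper one-percentile of the control at this point.
pct99 = np.percentile(control, 99.0)
ratio_pct = exceedance_prob(scenario, pct99) / exceedance_prob(control, pct99)

print(f"fixed 15 mm threshold: probability changes by factor {ratio_fixed:.2f}")
print(f"control 99th-percentile threshold ({pct99:.1f} mm): factor {ratio_pct:.2f}")
```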
Abstract:
Empirical studies using satellite data and radiosondes have shown that precipitation increases with column water vapor (CWV) in the tropics, and that this increase is much steeper above some critical CWV value. Here, eight years of 1-min-resolution microwave radiometer and optical gauge data at Nauru Island are analyzed to better understand the relationships among CWV, column liquid water (CLW), and precipitation at small time scales. CWV is found to have large autocorrelation times compared with CLW and precipitation. Before precipitation events, CWV increases on both a synoptic-scale time period and a subsequent shorter time period consistent with mesoscale convective activity; the latter period is associated with the highest CWV levels. Probabilities of precipitation increase greatly with CWV. Given initial high CWV, this increased probability of precipitation persists at least 10-12 h. Even in periods of high CWV, however, probabilities of initial precipitation in a 5-min period remain low enough that there tends to be a lag before the start of the next precipitation event. This is consistent with precipitation occurring stochastically within environments containing high CWV, with the latter being established by a combination of synoptic-scale and mesoscale forcing.
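A minimal sketch of the conditional-probability bookkeeping this abstract describes: bin a column water vapor series and estimate, within each bin, the probability that precipitation occurs in a short interval. The synthetic series and the logistic rain model below are illustrative assumptions, not the Nauru radiometer and gauge data.

```python
# Sketch of a conditional-probability analysis: P(precipitation | CWV bin).
# All series are synthetic and only illustrate the bookkeeping.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
cwv = rng.normal(55, 8, size=n)                 # column water vapor (mm), synthetic
p_rain = 1 / (1 + np.exp(-(cwv - 65) / 3))      # assumed: higher CWV -> higher rain chance
raining = rng.random(n) < p_rain                # whether precipitation occurs in the interval

bins = np.arange(30, 85, 5)
idx = np.digitize(cwv, bins)
for b in range(1, len(bins)):
    in_bin = idx == b
    if in_bin.sum() > 100:                      # skip sparsely populated bins
        print(f"CWV {bins[b-1]}-{bins[b]} mm: P(precip) = {raining[in_bin].mean():.3f}")
```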
Abstract:
The consistency of ensemble forecasts from three global medium-range prediction systems with the observed transition behaviour of a three-cluster model of the North Atlantic eddy-driven jet is examined. The three clusters consist of a mid jet cluster taken to represent an undisturbed jet and south and north jet clusters representing southward and northward shifts of the jet. The ensemble forecasts span a period of three extended winters (October-February), from October 2007 to February 2010. The mean probabilities of transitions between the clusters calculated from the ensemble forecasts are compared with those calculated from a 23-extended-winter climatology taken from the European Centre for Medium-Range Weather Forecasts 40-Year Re-analysis (ERA40) dataset. No evidence of a drift with increasing lead time of the ensemble forecast transition probabilities towards values inconsistent with the 23-extended-winter climatology is found. The ensemble forecasts of transition probabilities are found to have positive Brier skill at 15-day lead times. It is found that for the three-extended-winter forecast set, probabilistic forecasts initialized in the north jet cluster are generally less skilful than those initialized in the other clusters. This is consistent with the shorter persistence time-scale of the north jet cluster observed in the ERA40 23-extended-winter climatology. Copyright 2011 Royal Meteorological Society
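The sketch below illustrates, with invented data, the two quantities this abstract works with: a row-normalised transition matrix estimated from a sequence of daily cluster labels, and a Brier skill score of transition forecasts measured against the climatological base rate. The random label sequence and cluster coding are hypothetical.

```python
# Estimating transition probabilities between three jet clusters from a
# label sequence, and scoring probabilistic transition forecasts with a
# Brier skill score against climatology. Labels here are random.
import numpy as np

rng = np.random.default_rng(1)
labels = rng.integers(0, 3, size=500)  # 0 = south, 1 = mid, 2 = north jet cluster

def transition_matrix(seq, k=3):
    """Row-normalised matrix of observed cluster-to-cluster transition frequencies."""
    counts = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def brier_skill(forecast_probs, outcomes, base_rate):
    """Brier skill score relative to a constant climatological probability."""
    bs = np.mean((forecast_probs - outcomes) ** 2)
    bs_ref = np.mean((base_rate - outcomes) ** 2)
    return 1.0 - bs / bs_ref

P = transition_matrix(labels)

# Example: probability of being in the north cluster tomorrow, forecast from
# today's cluster via the fitted transition matrix.
outcomes = (labels[1:] == 2).astype(float)
forecasts = P[labels[:-1], 2]
climatology = outcomes.mean()
print("transition matrix:\n", np.round(P, 2))
print("Brier skill vs climatology:", round(brier_skill(forecasts, outcomes, climatology), 3))
```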
Abstract:
The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach, which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but also that normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture-recapture studies. In practice, however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution, which lead to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
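A small sketch of the ratio-plot idea: for a homogeneous Poisson distribution with mean lambda, (x+1)p(x+1)/p(x) = lambda for every count x, so the ratios computed from observed frequencies scatter around a horizontal line, while a gamma mixture of Poissons (a negative binomial) gives a linearly increasing pattern, the "structured heterogeneity" mentioned above. The simulated frequencies are illustrative only.

```python
# Ratio-plot values from observed count frequencies: constant for a
# homogeneous Poisson sample, increasing linearly for a gamma mixture.
import numpy as np

rng = np.random.default_rng(2)

def ratio_plot_values(counts):
    """Return x and r_x = (x+1) * f(x+1) / f(x) from observed frequencies f(x)."""
    xs, rs = [], []
    for x in range(len(counts) - 1):
        if counts[x] > 0 and counts[x + 1] > 0:
            xs.append(x)
            rs.append((x + 1) * counts[x + 1] / counts[x])
    return np.array(xs), np.array(rs)

# Homogeneous Poisson sample: ratios scatter around the constant mean (2.0 here).
f_pois = np.bincount(rng.poisson(2.0, size=5000))
print("Poisson ratios:", np.round(ratio_plot_values(f_pois)[1], 2))

# Gamma-mixed (negative binomial) sample: ratios increase roughly linearly.
f_nb = np.bincount(rng.negative_binomial(2, 0.4, size=5000))
print("Gamma-mixture ratios:", np.round(ratio_plot_values(f_nb)[1], 2))
```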
Abstract:
Preference reversals are frequently observed in the lab, but almost all designs use completely transparent prospects, which are rarely features of decision making elsewhere. This raises questions of external validity. We test the robustness of the phenomenon to gambles that incorporate realistic ambiguity in both payoffs and probabilities. In addition, we test a recent explanation of preference reversals by loss aversion, which would also restrict the incidence of reversals outside the lab. According to this account, reversals occur largely because the valuation task endows subjects with a gamble, activating loss aversion. This contrasts with the choice task, where the reference point is pre-experiment wealth. We test this explanation by holding the reference point constant. Our evidence suggests that reversals are only slightly diminished with ambiguity. We find no evidence supporting the loss-aversion explanation.
Abstract:
This paper constructs a housing market model to analyse conditions for different generations of households in the UK. Previous policy work has suggested that baby-boomers have benefitted at the expense of younger generations. The model relies on a form of financial accelerator in which existing homeowners reinvest a proportion of the capital gains on moving home. The model is extended to look at homeownership probabilities. It also explains why an increasing share of mortgages has gone to existing owners, despite market liberalisation and securitisation. In addition, the model contributes to the explanation of volatility.
Abstract:
An ensemble forecast is a collection of runs of a numerical dynamical model, initialized with perturbed initial conditions. In modern weather prediction, for example, ensembles are used to retrieve probabilistic information about future weather conditions. In this contribution, we are concerned with ensemble forecasts of a scalar quantity (say, the temperature at a specific location). We consider the event that the verification is smaller than the smallest, or larger than the largest ensemble member. We call these events outliers. If a K-member ensemble accurately reflected the variability of the verification, outliers should occur with a base rate of 2/(K + 1). In operational forecast ensembles though, this frequency is often found to be higher. We study the predictability of outliers and find that, exploiting information available from the ensemble, forecast probabilities for outlier events can be calculated which are more skilful than the unconditional base rate. We prove this analytically for statistically consistent forecast ensembles. Further, the analytical results are compared to the predictability of outliers in an operational forecast ensemble by means of model output statistics. We find the analytical and empirical results to agree both qualitatively and quantitatively.
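The 2/(K+1) base rate quoted in this abstract can be checked with a few lines of simulation: if the K ensemble members and the verification are exchangeable draws from the same distribution (statistical consistency), the verification falls outside the ensemble range with exactly that probability. The Gaussian draws and K = 10 below are arbitrary illustrative choices.

```python
# Empirical check of the 2/(K+1) outlier base rate for a statistically
# consistent ensemble. All numbers are simulated for illustration.
import numpy as np

rng = np.random.default_rng(3)
K, n_cases = 10, 200_000

ensemble = rng.normal(size=(n_cases, K))  # K exchangeable members per forecast case
verification = rng.normal(size=n_cases)   # verification drawn from the same distribution

outlier = (verification < ensemble.min(axis=1)) | (verification > ensemble.max(axis=1))
print("empirical outlier frequency:", round(outlier.mean(), 4))
print("theoretical base rate 2/(K+1):", round(2 / (K + 1), 4))
```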
Abstract:
To retain competitiveness, succeed and flourish, organizations are forced to continuously innovate. This drive for innovation is not solely limited to product/process innovation but more profoundly relates to a continuous process of improving how organizations work internally, requiring a constant stream of ideas and suggestions from motivated employees. In this chapter we investigate some recent developments and propose a conceptual framework for creative participation as a personality-driven interface between creativity and innovation. Under the assumption that employees' intrinsic willingness to contribute novel ideas and solutions requires a set of personal characteristics and necessary skills that might well be unique to each organizational unit, the chapter then explores personal characteristics associated with creativity, innovation and innovative behavior. Various studies on the correlation between creativity and personality types are also reviewed. The chapter provides a discussion of solutions and future development, together with recommendations for future research.
Abstract:
This study presents the first global-scale multi-sectoral regional assessment of the magnitude and uncertainty in the impacts of climate change avoided by emissions policies. The analysis suggests that the most stringent emissions policy considered here - which gives a 50% chance of remaining below a 2°C temperature rise target - reduces impacts by 20-65% by 2100 relative to a business-as-usual pathway (A1B) which reaches 4°C, and can delay impacts by several decades. Effects vary between sector and region, and there are few noticeable effects of mitigation policy by 2030. The impacts avoided by 2100 are more strongly influenced by the date and level at which emissions peak than the rate of decline of emissions, with an earlier and lower emissions peak avoiding more impacts. The estimated proportion of impacts avoided at the global scale is relatively robust despite uncertainty in the spatial pattern of climate change, but the absolute amount of avoided impacts is considerably more variable and therefore uncertain.
Abstract:
Glacier fluctuations exclusively due to internal variations in the climate system are simulated using downscaled integrations of the ECHAM4/OPYC coupled general circulation model (GCM). A process-based modeling approach using a mass balance model of intermediate complexity and a dynamic ice flow model considering simple shearing flow and sliding are applied. Multimillennial records of glacier length fluctuations for Nigardsbreen (Norway) and Rhonegletscher (Switzerland) are simulated using autoregressive processes determined by statistically downscaled GCM experiments. Return periods and probabilities of specific glacier length changes using GCM integrations excluding external forcings such as solar irradiation changes, volcanic, or anthropogenic effects are analyzed and compared to historical glacier length records. Preindustrial fluctuations of the glaciers as far as observed or reconstructed, including their advance during the Little Ice Age, can be explained by internal variability in the climate system as represented by a GCM. However, fluctuations comparable to the present-day glacier retreat exceed any variation simulated by the GCM control experiments and must be caused by external forcing, with anthropogenic forcing being a likely candidate.
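As a toy version of the statistical machinery mentioned here, the sketch below generates a multimillennial AR(1) series standing in for internally generated forcing and estimates the empirical return period of a large excursion. The AR coefficient, series length, and threshold are arbitrary assumptions, not values from the downscaled GCM experiments.

```python
# A multimillennial AR(1) series and an empirical return period estimate for
# large excursions; all parameters are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(4)
n_years, phi = 10_000, 0.7

# AR(1) process: x_t = phi * x_{t-1} + noise
x = np.zeros(n_years)
for t in range(1, n_years):
    x[t] = phi * x[t - 1] + rng.normal()

threshold = 3.0  # an "extreme" excursion in standardised units
exceed_years = np.count_nonzero(x > threshold)
return_period = n_years / exceed_years if exceed_years else np.inf
print(f"exceedances of {threshold}: {exceed_years}, empirical return period ~{return_period:.0f} yr")
```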
Abstract:
A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950-2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
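A compact sketch of the spatial reliability check described above: at each grid point the observed trend is ranked within the ensemble of modelled trends, and for a reliable ensemble the observation should fall outside the ensemble range at only about 2/(K+1) of the points. The synthetic trends below deliberately make the ensemble overconfident; all numbers are illustrative.

```python
# Rank-based reliability check over the spatial dimension with synthetic
# trends; the real world is given more spread than the ensemble, so the
# "outside the ensemble" fraction exceeds the consistent base rate.
import numpy as np

rng = np.random.default_rng(5)
n_points, n_models = 2000, 40

model_trends = rng.normal(loc=0.1, scale=0.05, size=(n_points, n_models))
observed_trends = rng.normal(loc=0.1, scale=0.10, size=n_points)  # wider than the ensemble

ranks = (model_trends < observed_trends[:, None]).sum(axis=1)  # 0..n_models
hist = np.bincount(ranks, minlength=n_models + 1)
outside = (hist[0] + hist[-1]) / n_points
print("fraction of points where observation lies outside the ensemble:", round(outside, 3))
print("expected for a reliable ensemble:", round(2 / (n_models + 1), 3))
```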
Abstract:
In this paper, we develop a method, termed the Interaction Distribution (ID) method, for analysis of quantitative ecological network data. In many cases, quantitative network data sets are under-sampled, i.e. many interactions are poorly sampled or remain unobserved. Hence, the output of statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks. The ID method can support assessment and inference of under-sampled ecological network data. In the current paper, we illustrate and discuss the ID method based on the properties of plant-animal pollination data sets of flower visitation frequencies. However, the ID method may be applied to other types of ecological networks. The method can supplement existing network analyses based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1) p(i,j), the probability for a visit made by the ith pollinator species to take place on the jth plant species; (2) q(i,j), the probability for a visit received by the jth plant species to be made by the ith pollinator. The method applies the Dirichlet distribution to estimate these two probabilities, based on a given empirical data set. The estimated mean values for p(i,j) and q(i,j) reflect the relative differences between recorded numbers of visits for different pollinator and plant species, and the estimated uncertainty of p(i,j) and q(i,j) decreases with higher numbers of recorded visits.
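A minimal sketch of the two probability definitions in this abstract, using a symmetric Dirichlet prior so that the posterior mean estimates of p(i,j) and q(i,j) are smoothed relative frequencies of a visitation count matrix. The 3x3 count matrix and the concentration alpha = 1 are hypothetical choices for illustration, not the paper's data or settings.

```python
# Dirichlet posterior mean estimates of p(i,j) (per pollinator) and q(i,j)
# (per plant) from a small made-up visitation count matrix.
import numpy as np

visits = np.array([[12, 3, 0],
                   [ 1, 8, 2],
                   [ 0, 0, 5]])   # hypothetical counts: rows = pollinators, cols = plants
alpha = 1.0                       # symmetric Dirichlet concentration (illustrative)

# p[i, j]: probability that a visit made by pollinator i is to plant j
p = (visits + alpha) / (visits + alpha).sum(axis=1, keepdims=True)

# q[i, j]: probability that a visit received by plant j is made by pollinator i
q = (visits + alpha) / (visits + alpha).sum(axis=0, keepdims=True)

print("p (row-wise, per pollinator):\n", np.round(p, 2))
print("q (column-wise, per plant):\n", np.round(q, 2))
```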