951 results for Probabilities.
Abstract:
In this paper, we evaluate the Probabilistic Occupancy Map (POM) pedestrian detection algorithm on the PETS 2009 benchmark dataset. POM is a multi-camera generative detection method, which estimates ground plane occupancy from multiple background subtraction views. Occupancy probabilities are iteratively estimated by fitting a synthetic model of the background subtraction to the binary foreground motion. Furthermore, we test the integration of this algorithm into a larger framework designed for understanding human activities in real environments. We demonstrate accurate detection and localization on the PETS dataset, despite suboptimal calibration and foreground motion segmentation input.
Abstract:
We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider the distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant.
Abstract:
Bayesian Model Averaging (BMA) is used for testing for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including stationary, stationary-around-trend, and unit root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks on 14 annual macroeconomic series and 11 natural resource price series. The results indicate that there are structural breaks in all of the natural resource series and most of the macroeconomic series. Many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
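A minimal sketch of the break-date posterior idea, assuming a single mean-shift break and a BIC approximation to the marginal likelihood rather than the paper's conjugate normal-gamma evidence; the series and all settings are illustrative:

```python
# Toy sketch (not the paper's conjugate normal-gamma setup): posterior
# probabilities for a single mean-shift break date, with model evidence
# approximated by exp(-BIC/2).
import numpy as np

def break_posteriors(y, min_seg=5):
    """Posterior probability of a mean-shift break at each date."""
    n = len(y)
    log_ev = np.full(n, -np.inf)
    for t in range(min_seg, n - min_seg):      # candidate break dates
        rss = ((y[:t] - y[:t].mean())**2).sum() + ((y[t:] - y[t:].mean())**2).sum()
        k = 3                                   # two segment means + variance
        bic = n * np.log(rss / n) + k * np.log(n)
        log_ev[t] = -0.5 * bic
    w = np.exp(log_ev - log_ev.max())
    return w / w.sum()                          # normalize over break dates

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 60), rng.normal(2, 1, 40)])
post = break_posteriors(y)
print("most probable break date:", post.argmax())  # near 60
```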
Abstract:
Assigning probabilities to alleged relationships, given DNA profiles, requires, among other things, calculation of a likelihood ratio (LR). Such calculations usually assume independence of genes: this assumption is not appropriate when the tested individuals share recent ancestry due to population substructure. Adjusted LR formulae, incorporating the coancestry coefficient F_ST, are presented here for various two-person relationships, and the issue of mutations in parentage testing is also addressed.
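As a hedged illustration, the standard theta-corrected (F_ST-adjusted) single-locus match probabilities of Balding and Nichols, on which conditional genotype probabilities of this kind are built; the allele frequencies and theta values below are illustrative:

```python
# Theta-corrected single-locus match probabilities (Balding & Nichols form).
def match_prob_homozygote(p, theta):
    """P(offender is A_i A_i | suspect is A_i A_i) under coancestry theta."""
    return ((2*theta + (1-theta)*p) * (3*theta + (1-theta)*p)) / ((1+theta) * (1+2*theta))

def match_prob_heterozygote(p_i, p_j, theta):
    """P(offender is A_i A_j | suspect is A_i A_j) under coancestry theta."""
    return 2 * (theta + (1-theta)*p_i) * (theta + (1-theta)*p_j) / ((1+theta) * (1+2*theta))

# With theta = 0 these collapse to the independence values p**2 and 2*p_i*p_j;
# positive theta gives a larger match probability, hence a more conservative LR.
print(match_prob_homozygote(0.1, 0.0))    # 0.010
print(match_prob_homozygote(0.1, 0.03))   # > 0.010
```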
Abstract:
A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
Abstract:
We describe a Bayesian approach to analyzing multilocus genotype or haplotype data to assess departures from gametic (linkage) equilibrium. Our approach employs a Markov chain Monte Carlo (MCMC) algorithm to approximate the posterior probability distributions of disequilibrium parameters. The distributions are computed exactly in some simple settings. Among other advantages, posterior distributions can be presented visually, which allows the uncertainties in parameter estimates to be readily assessed. In addition, background knowledge can be incorporated, where available, to improve the precision of inferences. The method is illustrated by application to previously published datasets; implications for multilocus forensic match probabilities and for simple association-based gene mapping are also discussed.
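A minimal Metropolis sketch of the same idea (not the paper's algorithm): approximating the posterior of the two-locus disequilibrium coefficient D from haplotype counts, with allele frequencies held fixed for simplicity and an implicit flat prior on the admissible range; the counts are hypothetical:

```python
# Metropolis sampler for the disequilibrium coefficient D of two biallelic loci.
import numpy as np

counts = np.array([55, 15, 20, 10])          # hypothetical AB, Ab, aB, ab counts
pA, pB = 0.70, 0.75                          # allele frequencies, treated as known

def log_lik(D):
    """Multinomial log-likelihood of the haplotype counts given D."""
    probs = np.array([pA*pB + D, pA*(1-pB) - D, (1-pA)*pB - D, (1-pA)*(1-pB) + D])
    if (probs <= 0).any():
        return -np.inf                        # outside the admissible range for D
    return (counts * np.log(probs)).sum()

rng = np.random.default_rng(1)
D, samples = 0.0, []
for _ in range(20000):
    prop = D + rng.normal(0, 0.01)            # random-walk proposal
    if np.log(rng.uniform()) < log_lik(prop) - log_lik(D):
        D = prop
    samples.append(D)
post = np.array(samples[5000:])               # drop burn-in
print(f"posterior mean D = {post.mean():.3f}, 95% interval = "
      f"({np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f})")
```

The posterior draws can be plotted directly, which is the visual presentation of uncertainty the abstract emphasizes.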
Abstract:
The intensity and distribution of daily precipitation is predicted to change under scenarios of increased greenhouse gases (GHGs). In this paper, we analyse the ability of HadCM2, a general circulation model (GCM), and a high-resolution regional climate model (RCM), both developed at the Met Office's Hadley Centre, to simulate extreme daily precipitation by reference to observations. A detailed analysis of daily precipitation is made at two UK grid boxes, where probabilities of reaching daily thresholds in the GCM and RCM are compared with observations. We find that the RCM generally overpredicts probabilities of extreme daily precipitation but that, when the GCM and RCM simulated values are scaled to have the same mean as the observations, the RCM captures the upper-tail distribution more realistically. To compare regional changes in daily precipitation in the GHG-forced period 2080-2100 in the GCM and the RCM, we develop two methods. The first considers the fractional changes in probability of local daily precipitation reaching or exceeding a fixed 15 mm threshold in the anomaly climate compared with the control. The second method uses the upper one-percentile of the control at each point as the threshold. Agreement between the models is better in both seasons with the latter method, which we suggest may be more useful when considering larger scale spatial changes. On average, the probability of precipitation exceeding the 1% threshold increases by a factor of 2.5 (GCM and RCM) in winter and by 1.7 (GCM) or 1.3 (RCM) in summer.
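A sketch of the two threshold comparisons described above, using synthetic gamma-distributed daily series as stand-ins for control and GHG-forced model output:

```python
# Two ways to compare exceedance probabilities between control and forced runs.
import numpy as np

rng = np.random.default_rng(2)
control = rng.gamma(shape=0.8, scale=4.0, size=20*90)   # 20 seasons x 90 days (mm/day)
forced  = rng.gamma(shape=0.8, scale=5.0, size=20*90)

# Method 1: fractional change in probability of exceeding a fixed 15 mm threshold
p_ctl = (control >= 15.0).mean()
p_frc = (forced  >= 15.0).mean()
print("fixed-threshold factor:", p_frc / p_ctl)

# Method 2: threshold set to the upper one-percentile of the control at this point
q99 = np.quantile(control, 0.99)
print("control-q99 threshold factor:", (forced >= q99).mean() / 0.01)
```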
Abstract:
Empirical studies using satellite data and radiosondes have shown that precipitation increases with column water vapor (CWV) in the tropics, and that this increase is much steeper above some critical CWV value. Here, eight years of 1-min-resolution microwave radiometer and optical gauge data at Nauru Island are analyzed to better understand the relationships among CWV, column liquid water (CLW), and precipitation at small time scales. CWV is found to have large autocorrelation times compared with CLW and precipitation. Before precipitation events, CWV increases on both a synoptic-scale time period and a subsequent shorter time period consistent with mesoscale convective activity; the latter period is associated with the highest CWV levels. Probabilities of precipitation increase greatly with CWV. Given initial high CWV, this increased probability of precipitation persists at least 10–12 h. Even in periods of high CWV, however, probabilities of initial precipitation in a 5-min period remain low enough that there tends to be a lag before the start of the next precipitation event. This is consistent with precipitation occurring stochastically within environments containing high CWV, with the latter being established by a combination of synoptic-scale and mesoscale forcing.
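A sketch of the conditional statistic underlying this analysis: the probability that a short interval is precipitating, binned by CWV. The data and the pickup curve below are synthetic stand-ins, not the Nauru observations:

```python
# Conditional precipitation probability as a function of column water vapor.
import numpy as np

rng = np.random.default_rng(3)
cwv = rng.uniform(30, 75, size=100000)                 # hypothetical CWV (mm)
# Toy pickup: precipitation becomes sharply more likely above a critical CWV
p_rain = 0.002 + 0.10 / (1.0 + np.exp(-(cwv - 65.0)))
raining = rng.uniform(size=cwv.size) < p_rain

bins = np.arange(30, 80, 5)
idx = np.digitize(cwv, bins) - 1
for b in range(len(bins) - 1):
    sel = idx == b
    print(f"CWV {bins[b]:2.0f}-{bins[b+1]:2.0f} mm: "
          f"P(precip) = {raining[sel].mean():.4f}")
```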
Abstract:
The consistency of ensemble forecasts from three global medium-range prediction systems with the observed transition behaviour of a three-cluster model of the North Atlantic eddy-driven jet is examined. The three clusters consist of a mid jet cluster taken to represent an undisturbed jet and south and north jet clusters representing southward and northward shifts of the jet. The ensemble forecasts span a period of three extended winters (October–February) from October 2007–February 2010. The mean probabilities of transitions between the clusters calculated from the ensemble forecasts are compared with those calculated from a 23-extended-winter climatology taken from the European Centre for Medium-Range Weather Forecasts 40-Year Re-analysis (ERA40) dataset. No evidence of a drift with increasing lead time of the ensemble forecast transition probabilities towards values inconsistent with the 23-extended-winter climatology is found. The ensemble forecasts of transition probabilities are found to have positive Brier skill at 15-day lead times. It is found that for the three-extended-winter forecast set, probabilistic forecasts initialized in the north jet cluster are generally less skilful than those initialized in the other clusters. This is consistent with the shorter persistence time-scale of the north jet cluster observed in the ERA40 23-extended-winter climatology.
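A sketch of the two ingredients described above: maximum-likelihood transition probabilities estimated from a daily cluster sequence, and a Brier skill score against a climatological base rate. The label sequence is synthetic:

```python
# Cluster transition matrix and Brier skill score for a three-regime model.
import numpy as np

rng = np.random.default_rng(4)
seq = rng.integers(0, 3, size=3000)       # stand-in labels: 0 mid, 1 south, 2 north

T = np.zeros((3, 3))
for a, b in zip(seq[:-1], seq[1:]):
    T[a, b] += 1
T /= T.sum(axis=1, keepdims=True)         # row-normalize transition counts
print("transition matrix:\n", T.round(3))

def brier_skill(p_fcst, outcome, p_clim):
    """BSS = 1 - BS_forecast / BS_climatology for a binary event."""
    bs  = np.mean((p_fcst - outcome)**2)
    ref = np.mean((p_clim - outcome)**2)
    return 1.0 - bs / ref

# e.g. forecast probability 0.4 for "north jet tomorrow", climatology 1/3:
print("BSS:", brier_skill(np.array([0.4]), np.array([1.0]), 1/3))
```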
Abstract:
The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, improving on a recently published method. The new method assumes that low similarity should be found in lesion regions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. Using a normalized similarity measurement enables the method to fine-tune the threshold for lesion detection, maximizing the possibility of reaching high detection accuracy. Importantly, the approach includes an extra cleaning step that removes enlarged ventricles from the detected lesions. A performance investigation using simulated lesions demonstrated that the majority of lesions were well detected and that normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
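A toy per-voxel sketch of the stated idea, assuming simple 2-D arrays: flag voxels where an intensity-based fuzzy membership disagrees with a location-based tissue prior, then apply the ventricle cleaning step. The similarity measure and threshold here are illustrative, not the paper's:

```python
# Disagreement between fuzzy segmentation and tissue prior as a lesion cue.
import numpy as np

def lesion_candidates(fuzzy_gm, prior_gm, ventricle_mask, thresh=0.4):
    similarity = 1.0 - np.abs(fuzzy_gm - prior_gm)       # 1 = perfect agreement
    similarity /= similarity.max()                        # normalized similarity
    candidates = similarity < thresh                      # low likeness -> candidate
    return candidates & ~ventricle_mask                   # cleaning step

rng = np.random.default_rng(8)
fuzzy = rng.uniform(size=(64, 64))
prior = fuzzy.copy()
prior[20:30, 20:30] = 0.0                                 # a patch of disagreement
vent = np.zeros((64, 64), dtype=bool)
print("flagged voxels:", lesion_candidates(fuzzy, prior, vent).sum())
```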
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture-recapture studies. In practice, however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution which leads to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
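A minimal sketch of the ratio plot property: for Poisson frequencies, r_x = (x + 1) f_{x+1} / f_x is flat at the Poisson mean, while a gamma mixture (negative binomial) produces the linear pattern called structured heterogeneity above:

```python
# Ratio plot from observed count frequencies: flat for Poisson, linear for
# a gamma-mixed Poisson.
import numpy as np

rng = np.random.default_rng(5)

def ratio_plot(counts, x_max=8):
    f = np.bincount(counts, minlength=x_max + 2).astype(float)
    return [(x + 1) * f[x + 1] / f[x] for x in range(x_max) if f[x] > 0]

poisson = rng.poisson(2.0, size=50000)
mixed = rng.poisson(rng.gamma(2.0, 1.0, size=50000))   # gamma-mixed Poisson
print("Poisson ratios (flat):", np.round(ratio_plot(poisson), 2))
print("mixture ratios (linear):", np.round(ratio_plot(mixed), 2))
```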
Abstract:
Preference reversals are frequently observed in the lab, but almost all designs use completely transparent prospects, which are rarely features of decision making elsewhere. This raises questions of external validity. We test the robustness of the phenomenon to gambles that incorporate realistic ambiguity in both payoffs and probabilities. In addition, we test a recent explanation of preference reversals by loss aversion, which would also restrict the incidence of reversals outside the lab. According to this account, reversals occur largely because the valuation task endows subjects with a gamble, activating loss aversion. This contrasts with the choice task, where the reference point is pre-experiment wealth. We test this explanation by holding the reference point constant. Our evidence suggests that reversals are only slightly diminished with ambiguity. We find no evidence supporting the loss-aversion explanation.
Abstract:
This paper constructs a housing market model to analyse conditions for different generations of households in the UK. Previous policy work has suggested that baby-boomers have benefitted at the expense of younger generations. The model relies on a form of financial accelerator in which existing homeowners reinvest a proportion of the capital gains on moving home. The model is extended to look at homeownership probabilities. It also explains why an increasing share of mortgages has gone to existing owners, despite market liberalisation and securitisation. In addition, the model contributes to the explanation of volatility.
Abstract:
An ensemble forecast is a collection of runs of a numerical dynamical model, initialized with perturbed initial conditions. In modern weather prediction, for example, ensembles are used to retrieve probabilistic information about future weather conditions. In this contribution, we are concerned with ensemble forecasts of a scalar quantity (say, the temperature at a specific location). We consider the event that the verification is smaller than the smallest, or larger than the largest ensemble member. We call these events outliers. If a K-member ensemble accurately reflected the variability of the verification, outliers should occur with a base rate of 2/(K + 1). In operational forecast ensembles though, this frequency is often found to be higher. We study the predictability of outliers and find that, exploiting information available from the ensemble, forecast probabilities for outlier events can be calculated which are more skilful than the unconditional base rate. We prove this analytically for statistically consistent forecast ensembles. Further, the analytical results are compared to the predictability of outliers in an operational forecast ensemble by means of model output statistics. We find the analytical and empirical results to agree both qualitatively and quantitatively.
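A quick simulation check of the 2/(K + 1) base rate stated above: when the verification and the K members are exchangeable draws (a statistically consistent ensemble), the verification falls outside the ensemble range with exactly that probability:

```python
# Empirical outlier rate of a statistically consistent K-member ensemble.
import numpy as np

rng = np.random.default_rng(6)
K, trials = 15, 200000
ens = rng.normal(size=(trials, K))
verif = rng.normal(size=trials)                        # exchangeable with members
outlier = (verif < ens.min(axis=1)) | (verif > ens.max(axis=1))
print(f"empirical rate = {outlier.mean():.4f}, 2/(K+1) = {2/(K+1):.4f}")
```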
Abstract:
Glacier fluctuations exclusively due to internal variations in the climate system are simulated using downscaled integrations of the ECHAM4/OPYC coupled general circulation model (GCM). A process-based modeling approach using a mass balance model of intermediate complexity and a dynamic ice flow model considering simple shearing flow and sliding is applied. Multimillennial records of glacier length fluctuations for Nigardsbreen (Norway) and Rhonegletscher (Switzerland) are simulated using autoregressive processes determined by statistically downscaled GCM experiments. Return periods and probabilities of specific glacier length changes using GCM integrations excluding external forcings such as solar irradiation changes, volcanic, or anthropogenic effects are analyzed and compared to historical glacier length records. Preindustrial fluctuations of the glaciers as far as observed or reconstructed, including their advance during the "Little Ice Age," can be explained by internal variability in the climate system as represented by a GCM. However, fluctuations comparable to the present-day glacier retreat exceed any variation simulated by the GCM control experiments and must be caused by external forcing, with anthropogenic forcing being a likely candidate.
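A hedged sketch of the statistical core of this approach: simulating a multimillennial AR(1) proxy for internally generated length fluctuations and estimating the return period of a given length change over a fixed window. The coefficients are illustrative, not the downscaled GCM values:

```python
# AR(1) proxy for internal glacier length variability and return periods
# of windowed length changes (overlapping windows; a rough estimate).
import numpy as np

rng = np.random.default_rng(7)
n, phi, sigma = 10000, 0.9, 30.0               # years, persistence, noise (m)
length = np.zeros(n)
for t in range(1, n):
    length[t] = phi * length[t-1] + rng.normal(0, sigma)

window = 100                                    # 100-year length changes
changes = length[window:] - length[:-window]
threshold = -200.0                              # a 200 m retreat in 100 years
exceed = (changes < threshold).mean()
print("return period (yr):", np.inf if exceed == 0 else 1.0 / exceed)
```

Raising the threshold toward a present-day-scale retreat drives the exceedance fraction to zero in such control simulations, which mirrors the abstract's conclusion that the observed retreat exceeds internal variability.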