951 results for exceedance probabilities


Relevance: 10.00%

Publisher:

Abstract:

The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
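To make the computational saving concrete, here is a minimal Python sketch (my illustration, not the authors' algorithm): with n homogeneous on/off sources per class, the per-class demand distribution follows directly from a binomial form (the single-class case of the multinomial), classes are aggregated by one convolution, and removing a departing connection amounts to recomputing one cheap closed-form factor rather than undoing a chain of convolutions.

import numpy as np
from math import comb

# Minimal sketch, not the paper's exact algorithm: with n homogeneous
# on/off sources, each active with probability p and demanding b units
# when on, the number of active sources is Binomial(n, p), so the
# per-class demand distribution is available in closed form instead of
# by convolving n two-point distributions.

def class_demand_pmf(n, p, b):
    pmf = np.zeros(n * b + 1)
    for k in range(n + 1):
        pmf[k * b] = comb(n, k) * p**k * (1 - p)**(n - k)
    return pmf

def aggregate(pmf_a, pmf_b):
    # the convolution step: combine two independent traffic classes
    return np.convolve(pmf_a, pmf_b)

def overload_probability(pmf, capacity):
    # CAC criterion: probability that instantaneous demand exceeds the link
    return pmf[capacity + 1:].sum()

# "Deconvolving" a departing class-1 connection just means recomputing
# class_demand_pmf with n - 1 sources, which is cheap in this form.
link = aggregate(class_demand_pmf(20, 0.2, 2), class_demand_pmf(10, 0.5, 3))
print(overload_probability(link, capacity=30))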

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Estimates of the incidence of drug resistance to modern first-line combination antiretroviral therapies against human immunodeficiency virus (HIV) type 1 are complicated by the limited availability of genotypic drug resistance tests (GRTs) and the uncertain timing of resistance emergence. METHODS: Five first-line combinations were studied (all paired with lamivudine or emtricitabine): efavirenz (EFV) plus zidovudine (AZT) (n = 524); EFV plus tenofovir (TDF) (n = 615); lopinavir (LPV) plus AZT (n = 573); LPV plus TDF (n = 301); and ritonavir-boosted atazanavir (ATZ/r) plus TDF (n = 250). Virological treatment outcomes were classified into 3 risk strata for emergence of resistance, based on whether undetectable HIV RNA levels were maintained during therapy and, if not, whether viral loads were >500 copies/mL during treatment. Probabilities for the presence of resistance mutations were estimated from GRTs (n = 2876) according to risk stratum and therapy received at time of testing. On the basis of these data, events of resistance emergence were imputed for each individual and were assessed using survival analysis. Imputation was repeated 100 times, and results were summarized by median values (2.5th-97.5th percentile range). RESULTS: Six years after treatment initiation, EFV plus AZT showed the highest cumulative resistance incidence (16%); all other regimens remained below 11%. Confounder-adjusted Cox regression confirmed that first-line EFV plus AZT (reference) was associated with a higher median hazard for resistance emergence, compared with the other treatments: EFV plus TDF (hazard ratio [HR], 0.57; range, 0.42-0.76), LPV plus AZT (HR, 0.63; range, 0.45-0.89), LPV plus TDF (HR, 0.55; range, 0.33-0.83), ATZ/r plus TDF (HR, 0.43; range, 0.17-0.83). Two-thirds of resistance events were associated with detectable HIV RNA levels ≤500 copies/mL during treatment, and only one-third with virological failure (HIV RNA level >500 copies/mL). CONCLUSIONS: The inclusion of TDF instead of AZT, and of ATZ/r, was correlated with lower rates of resistance emergence, most likely because of improved tolerability and pharmacokinetics resulting from once-daily dosing.
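The repeated-imputation idea in the methods can be sketched as follows (illustrative numbers, not the study's data, and simplified to a crude incidence proportion rather than imputed event times with survival analysis): resistance events are sampled from stratum-specific probabilities, the incidence is computed, and the exercise is repeated 100 times and summarized by median and percentile range.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each patient falls into a virological risk stratum
# with an estimated probability that resistance mutations are present.
n_patients = 500
strata = rng.choice([0, 1, 2], size=n_patients, p=[0.7, 0.2, 0.1])
p_resistance = np.array([0.02, 0.15, 0.45])  # hypothetical per-stratum probabilities

incidences = []
for _ in range(100):  # 100 imputations, as in the abstract
    events = rng.random(n_patients) < p_resistance[strata]
    incidences.append(events.mean())

incidences = np.array(incidences)
print(np.median(incidences), np.percentile(incidences, [2.5, 97.5]))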

Relevance: 10.00%

Publisher:

Abstract:

A novel test of spatial independence of the distribution of crystals or phases in rocks based on compositional statistics is introduced. It improves and generalizes the common joins-count statistics known from map analysis in geographic information systems. Assigning phases independently to objects in R^D is modelled by a single-trial multinomial random function Z(x), where the probabilities of phases add to one and are explicitly modelled as compositions in the K-part simplex S^K. Thus, apparent inconsistencies of the tests based on the conventional joins-count statistics and their possibly contradictory interpretations are avoided. In practical applications we assume that the probabilities of phases do not depend on the location but are identical everywhere in the domain of definition. Thus, the model involves the sum of r independent, identically multinomially distributed 1-trial random variables, which is an r-trial multinomially distributed random variable. The probabilities of the distribution of the r counts can be considered as a composition in the Q-part simplex S^Q. They span the so-called Hardy-Weinberg manifold H, which is proved to be a (K-1)-affine subspace of S^Q. This is a generalisation of the well-known Hardy-Weinberg law of genetics. If the assignment of phases accounts for some kind of spatial dependence, then the r-trial probabilities do not remain on H. This suggests using the Aitchison distance from the observed probabilities to H to test dependence. Moreover, when there is a spatial fluctuation of the multinomial probabilities, the observed r-trial probabilities move on H. This shift can be used to check for these fluctuations. A practical procedure and an algorithm to perform the test have been developed. Some cases applied to simulated and real data are presented.
Key words: spatial distribution of crystals in rocks, spatial distribution of phases, joins-count statistics, multinomial distribution, Hardy-Weinberg law, Hardy-Weinberg manifold, Aitchison geometry
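For readers unfamiliar with the Aitchison geometry invoked here, the distance used by the test is the Euclidean distance between centred log-ratio (clr) transforms of compositions. The sketch below (my illustration, with hypothetical compositions) shows only this building block, not the full projection onto the manifold H.

import numpy as np

def clr(x):
    # centred log-ratio transform of a strictly positive composition
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))  # geometric mean of the parts
    return np.log(x / g)

def aitchison_distance(x, y):
    return np.linalg.norm(clr(x) - clr(y))

observed = [0.62, 0.28, 0.10]   # hypothetical observed r-trial probabilities
expected = [0.55, 0.35, 0.10]   # hypothetical reference point on H
print(aitchison_distance(observed, expected))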

Relevance: 10.00%

Publisher:

Abstract:

Barraclough and co-workers, in a paper published in 1996, observed a significant positive correlation between the rate of evolution of the rbcL chloroplast gene within families of flowering plants and the number of species in those families. We tested three additional data sets of our own (based on both plastid and nuclear genes) and used methods designed specifically for the comparison of sister families (based on random speciation and extinction). We show that, over all sister groups, the correlation between the rate of gene evolution and increased diversity is not always present. Although the association tends to be positive, the individual probabilities show a U-shaped distribution (i.e. the association can be either significantly positive or significantly negative). We discuss the influence of both phylogenetic sampling and applied taxonomies on the results.
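A minimal version of a sister-group comparison is a sign test (illustrative data, not the study's): across sister pairs, one asks whether the more species-rich family carries the higher substitution rate more often than chance would predict.

from scipy.stats import binomtest

# Hypothetical outcomes for 20 sister pairs: True if the more diverse
# family of the pair shows the higher rbcL substitution rate.
faster_in_diverse = [True] * 13 + [False] * 7

k = sum(faster_in_diverse)
result = binomtest(k, n=len(faster_in_diverse), p=0.5)
print(result.pvalue)  # two-sided p-value under the null of no association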

Relevance: 10.00%

Publisher:

Abstract:

This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
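One standard way to 'learn' such probabilities from data, shown here as a hedged sketch rather than the paper's exact procedure, is Dirichlet-categorical updating: prior pseudo-counts are combined with observed counts to give posterior expected probabilities for a node of a Bayesian network.

import numpy as np

# Hypothetical node with three states (e.g., three toner characteristics).
prior = np.array([1.0, 1.0, 1.0])    # uniform Dirichlet prior pseudo-counts
counts = np.array([42.0, 7.0, 1.0])  # hypothetical observed counts

# Posterior Dirichlet parameters and the posterior expected probabilities
# that would be entered into the network's conditional probability table.
posterior = prior + counts
expected_probs = posterior / posterior.sum()
print(expected_probs)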

Relevance: 10.00%

Publisher:

Abstract:

We study the social, demographic and economic origins of social security. The data for the U.S. and for a cross section of countries suggest that urbanization and industrialization are associated with the rise of social insurance. We describe an OLG model in which demographics, technology, and social security are linked together in a political economy equilibrium. In the model economy, there are two locations (sectors), the farm (agricultural) and the city (industrial), and the decision to migrate from rural to urban locations is endogenous and linked to productivity differences between the two locations and survival probabilities. Farmers rely on land inheritance for their old age and do not support a pay-as-you-go social security system. With structural change, people migrate to the city, land loses its importance, and support for social security arises. We show that a calibrated version of this economy, where social security taxes are determined by majority voting, is consistent with the historical transformation in the United States.

Relevance: 10.00%

Publisher:

Abstract:

The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and the relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, the number of grains grows monotonically as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and the driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the large-velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(−c^n), with n ≈ 1.2, regardless of the fragmentation mechanism.
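The shattering versus slow-growth dichotomy can be caricatured with a toy stochastic simulation (my illustration, far simpler than the paper's DSMC model): in a homogeneous gas the total collision rate scales like n^2, and each collision spawns an extra grain with some fragmentation probability.

import numpy as np

rng = np.random.default_rng(1)

# Toy sketch: constant p_frag gives dn/dt ~ n**2, so n diverges at a
# finite time (shattering-like); a p_frag that vanishes as grains
# proliferate gives slow, power-law-like growth instead.

def simulate(p_frag, n0=100, t_max=5.0, n_cap=10**5, c=0.01):
    n, t = n0, 0.0
    while t < t_max and n < n_cap:
        rate = c * n * n                  # hypothetical total collision rate
        t += rng.exponential(1.0 / rate)  # Gillespie-style waiting time
        if rng.random() < p_frag(n):
            n += 1                        # one grain breaks in two
    return n, round(t, 3)

print(simulate(lambda n: 0.5))            # hits the grain cap at finite time
print(simulate(lambda n: 50.0 / n**2))    # slow, roughly linear growth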

Relevance: 10.00%

Publisher:

Abstract:

PURPOSE: Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. METHODS: Of the 540 randomly selected 6- to 13-year-old children, 502 participated in a baseline health assessment, and 64% were assessed again after 4 years. Measures included anthropometry, fasting blood samples, and a health assessment questionnaire. Participants scored one point for each negative lifestyle factor at baseline: overweight; physical inactivity; high media consumption; little outdoor time; skipping breakfast; and having a parent who has ever smoked, is inactive, or is overweight. A CVR score at follow-up was constructed by averaging sex- and age-related z-scores of waist circumference, blood pressure, glucose, inverted high-density lipoprotein, and triglycerides. RESULTS: The age-, sex-, pubertal stage-, and social class-adjusted probabilities (95% confidence interval) for being in the highest CVR score tertile at follow-up for children who had at most one (n = 48), two (n = 64), three (n = 56), four (n = 41), or five or more (n = 14) negative lifestyle factors were 15.4% (8.9-25.3), 24.3% (17.4-32.8), 36.0% (28.6-44.2), 49.8% (38.6-61.0), and 63.5% (47.2-77.2), respectively. CONCLUSIONS: Even in childhood, an accumulation of negative lifestyle factors is associated with higher CVR scores after 4 years. These negative lifestyle factors are easy to assess in clinical practice and allow early detection and prevention of CVR in childhood.
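A minimal sketch of the composite score construction described above (illustrative arrays, not the study's data; the study standardizes z-scores by sex and age, which is omitted here): the CVR score is the mean of standardized markers, with HDL inverted so that higher always means worse.

import numpy as np

def zscore(x):
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

# Hypothetical follow-up values for four children.
waist = [62, 70, 58, 75]
systolic_bp = [104, 118, 99, 121]
glucose = [4.8, 5.4, 4.6, 5.6]
hdl = [1.6, 1.1, 1.8, 1.0]
triglycerides = [0.7, 1.3, 0.6, 1.5]

cvr = np.mean(
    [zscore(waist), zscore(systolic_bp), zscore(glucose),
     -zscore(hdl), zscore(triglycerides)],  # HDL inverted
    axis=0,
)
print(cvr)  # one composite CVR score per child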

Relevance: 10.00%

Publisher:

Abstract:

Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al, 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be interpreted neurophysiologically, because a difference in scalp topography indicates a different configuration of brain generators. An above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al, submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al, 2004). Four subjects per experiment were analyzed, using approximately 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis, we estimated the statistical distribution of voltage topographies with a Mixture of Gaussians (MofGs), which reduces the original dataset to a small number of representative voltage topographies. We then evaluated statistically the degree of presence of these template maps across trials, and whether and when this differed across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the area under the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 and 0.85 for upper and lower quadrants, respectively, and was in all cases significantly higher than chance (unpaired t-test, p<0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. The posterior probabilities revealed significant differences starting at 250 ms post-stimulus onset. Classification accuracy rates with single-trial test data were at chance level; we therefore considered sub-averages based on five single trials and found that, for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p<0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiological lines. As these maps are normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset in ten shuffles of the data. Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level, providing a novel tool to compare normal electrophysiological responses against single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
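The pipeline can be caricatured with standard tools (synthetic data, not the study's recordings; the original model is more elaborate than a plain Mixture of Gaussians over trials, and the train/test shuffling is omitted): topographies are summarized by a small set of template maps, and the posterior presence of those maps is used to score trials.

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic single-trial topographies: 200 trials per condition, 64 electrodes,
# with a small mean shift standing in for a condition-specific generator change.
n_trials, n_electrodes = 200, 64
cond_a = rng.normal(0.0, 1.0, (n_trials, n_electrodes))
cond_b = rng.normal(0.3, 1.0, (n_trials, n_electrodes))

X = np.vstack([cond_a, cond_b])
y = np.array([0] * n_trials + [1] * n_trials)

# Reduce the data to a few representative template maps.
gmm = GaussianMixture(n_components=4, covariance_type="diag",
                      random_state=0).fit(X)
resp = gmm.predict_proba(X)  # degree of presence of each template per trial

# Score trials by how strongly condition-B-favoring templates respond.
template_bias = resp[y == 1].mean(axis=0) - resp[y == 0].mean(axis=0)
scores = resp @ template_bias
print("ROC AUC:", roc_auc_score(y, scores))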

Relevance: 10.00%

Publisher:

Abstract:

Osteoporosis is a serious worldwide epidemic. FRAX® is a web-based tool developed by the Sheffield WHO Collaborating Center team that integrates clinical risk factors and femoral neck BMD and calculates the 10-year fracture probability, in order to help health care professionals identify patients who need treatment. However, only 31 countries have a FRAX® calculator. In the absence of a FRAX® model for a particular country, it has been suggested to use a surrogate country for which the epidemiology of osteoporosis most closely approximates the index country. More specific recommendations for clinicians in these countries are not available. In North America, concerns have also been raised regarding the assumptions used to construct the US ethnicity-specific FRAX® calculators, with respect to the correction factors applied to derive fracture probabilities in Blacks, Asians, and Hispanics in comparison to Whites. In addition, questions were raised about calculating fracture risk in other ethnic groups, e.g., Native Americans and First Canadians. The International Society for Clinical Densitometry (ISCD), in conjunction with the International Osteoporosis Foundation (IOF), assembled an international panel of experts that ultimately developed joint Official Positions of the ISCD and IOF advising clinicians regarding FRAX® usage. As part of the process, the charge of the FRAX® International Task Force was to review and synthesize data regarding geographic and racial/ethnic variability in hip fractures and non-hip osteoporotic fractures, and to make recommendations about the use of FRAX® in ethnic groups and countries without a FRAX® calculator. This synthesis was presented to the expert panel and constitutes the data on which the subsequent Official Positions are predicated. A summary of the International Task Force composition and charge is presented here.

Relevance: 10.00%

Publisher:

Abstract:

We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
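For reference, one classical explicit formula of this kind (a standard Cramér-Lundberg result, not one of the paper's new dependent-claims formulas): in the compound Poisson model with independent exponential claims of mean \mu and premium loading \theta, the probability of ruin from initial surplus u is

\psi(u) = \frac{1}{1+\theta}\, \exp\!\left(-\frac{\theta u}{(1+\theta)\mu}\right).

The mixing construction described above extends this type of closed form to models where claim sizes, or inter-occurrence times, are dependent.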

Relevance: 10.00%

Publisher:

Abstract:

We present a theory of choice among lotteries in which the decision maker's attention is drawn to (precisely defined) salient payoffs. This leads the decision maker to a context-dependent representation of lotteries in which true probabilities are replaced by decision weights distorted in favor of salient payoffs. By endogenizing decision weights as a function of payoffs, our model provides a novel and unified account of many empirical phenomena, including frequent risk-seeking behavior, invariance failures such as the Allais paradox, and preference reversals. It also yields new predictions, including some that distinguish it from Prospect Theory, which we test.
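A hedged sketch of salience-distorted decision weights (my reconstruction using one salience function and rank-based discounting common in this literature; the parameters are hypothetical and the paper's full model differs in its details):

# A payoff is salient in a state when it contrasts sharply with the
# alternative lottery's payoff in that state.
def salience(x, y, theta=0.1):
    return abs(x - y) / (abs(x) + abs(y) + theta)

def salience_weighted_value(payoffs, others, probs, delta=0.7):
    # Rank states by salience, discount less salient states by delta**rank,
    # then renormalize the distorted probabilities into decision weights.
    ranks = sorted(range(len(payoffs)),
                   key=lambda i: -salience(payoffs[i], others[i]))
    weights = [0.0] * len(payoffs)
    for rank, i in enumerate(ranks):
        weights[i] = probs[i] * delta ** rank
    total = sum(weights)
    return sum(w / total * x for w, x in zip(weights, payoffs))

# A long shot versus its expected value for sure: the large gain is the
# salient payoff, so its weight is inflated above its true probability.
lottery = [10.0, 0.0]
sure = [0.1, 0.1]
probs = [0.01, 0.99]
print(salience_weighted_value(lottery, sure, probs))  # ~0.14 > EV of 0.1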

Relevance: 10.00%

Publisher:

Abstract:

Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter comments on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al. (2011), published recently in this journal. Contrary to Budowle et al. (2011), we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation, and (iii) does not require new guidelines edited by the forensic DNA community, as long as probability is properly considered as an expression of personal belief. Please see related article: http://www.investigativegenetics.com/content/3/1/3

Relevance: 10.00%

Publisher:

Abstract:

The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessments of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of posterior probability distributions generated across space; no assumption about the underlying distribution is made, and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps based on the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, both of which are useful features for the Chernobyl fallout study.
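Why soft data help can be illustrated with a one-point toy calculation (my illustration, not the BME machinery): suppose hard data alone give a Gaussian predictive distribution at an unsampled location, and a soft interval datum from a local survey is folded in by truncating and renormalizing the posterior on a grid.

import numpy as np

x = np.linspace(0, 10, 2001)                    # grid of contamination levels
prior = np.exp(-0.5 * ((x - 4.0) / 1.5) ** 2)   # hard-data prediction, N(4, 1.5^2)
prior /= np.trapz(prior, x)

soft = (x >= 3.5) & (x <= 5.0)                  # soft datum: value lies in [3.5, 5.0]
posterior = prior * soft
posterior /= np.trapz(posterior, x)

mean_prior = np.trapz(x * prior, x)
mean_post = np.trapz(x * posterior, x)
var_post = np.trapz((x - mean_post) ** 2 * posterior, x)
print(mean_prior, mean_post, var_post)          # the soft datum shrinks the uncertainty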

Relevance: 10.00%

Publisher:

Abstract:

One of the assumptions of the Capacitated Facility Location Problem (CFLP) is that demand is known and fixed. Most often, this is not the case when managers take some strategic decisions such as locating facilities and assigning demand points to those facilities. In this paper we consider demand as stochastic and we model each of the facilities as an independent queue. Stochastic models of manufacturing systems and deterministic location models are put together in order to obtain a formula for the backlogging probability at a potential facility location. Several solution techniques have been proposed to solve the CFLP. One of the most recently proposed heuristics, a Reactive Greedy Adaptive Search Procedure, is implemented in order to solve the model formulated. We present some computational experiments in order to evaluate the heuristic's performance and to illustrate the use of this new formulation for the CFLP. The paper finishes with a simple simulation exercise.
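As a toy version of the backlogging idea (my illustration; the paper derives its own formula), model a candidate facility as an M/M/1 queue and compute the stationary probability that more than K orders are in the system.

import math

def backlog_probability(lam, mu, K):
    # For a stable M/M/1 queue with arrival rate lam and service rate mu,
    # P(N >= n) = rho**n with rho = lam/mu, so P(N > K) = rho**(K+1).
    rho = lam / mu
    if rho >= 1:
        return 1.0  # unstable queue: backlog is certain in the long run
    return rho ** (K + 1)

# Hypothetical screening of candidate sites by the demand rate assigned
# to them, with a common service rate and backlog threshold.
for lam in (6.0, 8.0, 9.5):
    print(lam, backlog_probability(lam, mu=10.0, K=5))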