34 results for Cumulative probability distribution functions


Relevance:

100.00%

Publisher:

Abstract:

We describe several simulation algorithms that yield random probability distributions with given values of risk measures. In the case of vanilla risk measures, the algorithms involve combining and transforming random cumulative distribution functions or random Lorenz curves obtained by simulating rather general random probability distributions on the unit interval. A new algorithm based on the simulation of a weighted barycentres array is suggested to generate random probability distributions with a given value of the spectral risk measure.
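As a minimal illustration of the kind of object involved (not the paper's algorithms), the sketch below simulates a random discrete probability distribution on the unit interval and reads one vanilla risk measure, Expected Shortfall, off its cumulative distribution function; the grid size, Dirichlet weights and confidence level are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)

def random_distribution(n_atoms=100):
    # A random probability distribution on [0, 1]: a fixed grid of atoms
    # carrying Dirichlet-distributed random weights.
    support = np.linspace(0.0, 1.0, n_atoms)
    weights = rng.dirichlet(np.ones(n_atoms))
    return support, weights

def expected_shortfall(support, weights, alpha=0.95):
    # Mean of the upper (1 - alpha) tail, computed from the cumulative distribution.
    order = np.argsort(support)
    x, w = support[order], weights[order]
    cdf = np.cumsum(w)
    idx = np.searchsorted(cdf, alpha)            # atom straddling the alpha level
    tail = x[idx] * (cdf[idx] - alpha) + np.sum(x[idx + 1:] * w[idx + 1:])
    return tail / (1.0 - alpha)

x, w = random_distribution()
print("ES at 95%:", expected_shortfall(x, w))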

Relevance:

100.00%

Publisher:

Abstract:

We analyse the variability of the probability distribution of daily wind speed in wintertime over Northern and Central Europe in a series of global and regional climate simulations covering the last centuries, and in reanalysis products covering approximately the last 60 years. The focus of the study lies on identifying the link between the variations in the wind speed distribution and the regional near-surface temperature, the meridional temperature gradient and the North Atlantic Oscillation. Our main result is that the link between the daily wind distribution and the regional climate drivers is strongly model dependent. The global models tend to behave similarly, although they show some discrepancies. The two regional models also tend to behave similarly to each other, but surprisingly the results derived from each regional model deviate strongly from the results derived from its driving global model. In addition, considering multi-centennial timescales, we find in two global simulations a long-term tendency for the probability distribution of daily wind speed to widen through the last centuries. The cause of this widening is likely the deforestation prescribed in these simulations. We conclude that no clear systematic relationship between the daily wind speed statistics and the mean temperature, the temperature gradient and/or the North Atlantic Oscillation can be inferred from these simulations. Understanding past and future changes in the distribution of wind speeds, and thus of wind speed extremes, will require a detailed analysis of the representation of the interaction between large-scale and small-scale dynamics.
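A hedged sketch of one way such a link could be quantified (the study itself works with climate-model and reanalysis fields): compute a width measure of each winter's daily wind speed distribution and correlate it across winters with a winter NAO index. The synthetic wind speeds and NAO values below are placeholders, not data from the study.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: 60 winters, 90 daily wind speeds (m/s) per winter,
# and one NAO index value per winter.
n_years, n_days = 60, 90
wind = rng.weibull(2.0, size=(n_years, n_days)) * 7.0
nao = rng.normal(size=n_years)

# Width of each winter's daily wind speed distribution: 95th minus 5th percentile.
width = np.percentile(wind, 95, axis=1) - np.percentile(wind, 5, axis=1)

# Correlation between distribution width and the NAO index across winters.
r = np.corrcoef(width, nao)[0, 1]
print(f"corr(width, NAO) = {r:+.2f}")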

Relevance:

100.00%

Publisher:

Abstract:

Objectives  To determine the improvement in positive predictive value of immunological failure criteria for identifying virological failure in HIV-infected children on antiretroviral therapy (ART) when a single targeted viral load measurement is performed in children identified as having immunological failure. Methods  Analysis of data from children (<16 years at ART initiation) at South African ART sites at which CD4 count/per cent and HIV-RNA monitoring are performed 6-monthly. Immunological failure was defined according to both WHO 2010 and United States Department of Health and Human Services (DHHS) 2008 criteria. Confirmed virological failure was defined as HIV-RNA >5000 copies/ml on two consecutive occasions <365 days apart in a child on ART for ≥18 months. Results  Among 2798 children on ART for ≥18 months [median (IQR) age 50 (21-84) months at ART initiation], the cumulative probability of confirmed virological failure by 42 months on ART was 6.3%. Using targeted viral load after meeting DHHS immunological failure criteria rather than DHHS immunological failure criteria alone increased positive predictive value from 28% to 82%. Targeted viral load improved the positive predictive value of WHO 2010 criteria for identifying confirmed virological failure from 49% to 82%. Conclusion  The addition of a single viral load measurement in children identified as failing immunologically will prevent most switches to second-line treatment in virologically suppressed children.
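A worked toy example of the gain in positive predictive value described above; the counts are hypothetical, chosen only to reproduce the reported 28% and 82% figures.

# PPV = true positives / all test positives.
def ppv(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

# Step 1: immunological failure criteria alone flag 100 children,
# of whom only 28 truly have virological failure.
print("criteria alone:", ppv(28, 72))          # 0.28

# Step 2: a single targeted viral load in those 100 children removes most of the
# virologically suppressed ones; say 34 remain flagged, 28 of them true failures.
print("criteria + targeted VL:", ppv(28, 6))   # ~0.82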

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Opportunistic screening for genital chlamydia infection is being introduced in England, but evidence for the effectiveness of this approach is lacking. There are insufficient data about young people's use of primary care services to determine the potential coverage of opportunistic screening in comparison with a systematic population-based approach. AIM: To estimate use of primary care services by young men and women, and to compare the potential coverage of opportunistic chlamydia screening with a systematic postal approach. DESIGN OF STUDY: Population-based cross-sectional study. SETTING: Twenty-seven general practices around Bristol and Birmingham. METHOD: A random sample of patients aged 16-24 years was posted a chlamydia screening pack. We collected details of face-to-face consultations from general practice records. Survival and person-time methods were used to estimate the cumulative probability of attending general practice in 1 year and the coverage achieved by opportunistic and systematic postal chlamydia screening. RESULTS: Of 12 973 eligible patients, an estimated 60.4% (95% confidence interval [CI] = 58.3 to 62.5%) of men and 75.3% (73.7 to 76.9%) of women aged 16-24 years attended their practice at least once in a 1-year period. During this period, an estimated 21.3% of patients would not attend their general practice but would be reached by postal screening, 9.2% would not receive a postal invitation but would attend their practice, and 11.8% would be missed by both methods. CONCLUSIONS: Opportunistic and population-based approaches to chlamydia screening would both fail to contact a substantial minority of the target group if used alone. A pragmatic approach combining both strategies might achieve higher coverage.
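The "survival and person-time methods" mentioned above can be illustrated with a minimal Kaplan-Meier estimate of the cumulative probability of attending within one year. This sketch and its toy data are illustrative only, not the study's analysis code.

import numpy as np

def km_cumulative_probability(time_to_event, observed, horizon=365.0):
    # Returns 1 - S(horizon) from the Kaplan-Meier estimator.
    # time_to_event : days to first attendance, or days of follow-up if censored
    # observed      : 1 if the patient attended, 0 if censored
    time_to_event = np.asarray(time_to_event, dtype=float)
    observed = np.asarray(observed, dtype=int)
    surv = 1.0
    for t in np.unique(time_to_event[observed == 1]):
        if t > horizon:
            break
        at_risk = np.sum(time_to_event >= t)
        events = np.sum((time_to_event == t) & (observed == 1))
        surv *= 1.0 - events / at_risk
    return 1.0 - surv

# Hypothetical toy data: five patients, three attended, two censored early.
print(km_cumulative_probability([30, 200, 400, 90, 150],
                                [1,   1,   0,   0,  1]))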

Relevance:

100.00%

Publisher:

Abstract:

Let P be a probability distribution on q-dimensional space. The so-called Diaconis-Freedman effect means that, for a fixed dimension d, most d-dimensional projections of P look like a scale mixture of spherically symmetric normal distributions. The present paper provides necessary and sufficient conditions for this phenomenon in a suitable asymptotic framework with increasing dimension q. It turns out that the conditions formulated by Diaconis and Freedman (1984) are not only sufficient but necessary as well. Moreover, letting P̂ be the empirical distribution of n independent random vectors with distribution P, we investigate the behavior of the empirical process √n(P̂ − P) under random projections, conditional on P̂.
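An illustrative sketch of the Diaconis-Freedman effect (the paper studies necessary and sufficient conditions as q grows; this only demonstrates the phenomenon numerically): a random one-dimensional projection of a high-dimensional, clearly non-Gaussian sample looks approximately Gaussian. The dimensions and sample size are arbitrary.

import numpy as np

rng = np.random.default_rng(2)
n, q = 20000, 500

# Non-Gaussian coordinates: independent, uniform on [-sqrt(3), sqrt(3)]
# (mean 0, variance 1).
X = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n, q))

# Random unit direction in R^q and the 1-dimensional projection of the sample.
u = rng.normal(size=q)
u /= np.linalg.norm(u)
proj = X @ u

# Skewness ~ 0 and excess kurtosis ~ 0 indicate an approximately Gaussian profile.
z = (proj - proj.mean()) / proj.std()
print("skewness:", np.mean(z**3), "excess kurtosis:", np.mean(z**4) - 3.0)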

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), to develop a weighted health care index (HCI) score based on commonly used interventions, and to compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern Europe, Southern Europe, Eastern Europe and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was constructed from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0 to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcomes for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
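A hedged sketch of a weighted index of the kind described above. The three components come from the abstract; the weights here are hypothetical placeholders (the study derived its weights from multivariable Cox models).

def hci_score(dst_performed, rifamycin_inh_pza_regimen, cart_started):
    # Component weights are illustrative only; with these choices the score ranges 0-5.
    weights = {
        "dst": 1,        # TB drug susceptibility testing performed
        "regimen": 2,    # initial regimen with a rifamycin, isoniazid and pyrazinamide
        "cart": 2,       # combination antiretroviral treatment started
    }
    return (weights["dst"] * dst_performed
            + weights["regimen"] * rifamycin_inh_pza_regimen
            + weights["cart"] * cart_started)

print(hci_score(dst_performed=1, rifamycin_inh_pza_regimen=1, cart_started=0))  # 3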

Relevance:

100.00%

Publisher:

Abstract:

We describe a technique for interactive rendering of diffraction effects produced by biological nanostructures such as snake skin surface gratings. Our approach uses imagery from atomic force microscopy that accurately captures the nanostructures responsible for structural coloration, that is, coloration due to wave interference, in a variety of animals. We develop a rendering technique that constructs bidirectional reflection distribution functions (BRDFs) directly from the measured data and leverages precomputation to achieve interactive performance. We demonstrate results of our approach using various shapes of the surface grating nanostructures. Finally, we evaluate the accuracy of our precomputation-based technique and compare to a reference BRDF construction technique.
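A generic sketch of the precomputation idea (not the paper's BRDF construction): tabulate an expensive BRDF over a grid of angles once, then answer render-time queries by cheap lookup. The placeholder BRDF below is purely illustrative; the paper builds its BRDFs from atomic force microscopy data.

import numpy as np

def expensive_brdf(theta_in, theta_out, wavelength_nm):
    # Placeholder with an interference-like dependence on angle and wavelength.
    phase = 2e3 * np.pi * (np.sin(theta_in) + np.sin(theta_out)) / wavelength_nm
    return 0.5 * (1.0 + np.cos(phase))

# Precompute: sample the BRDF on a regular (theta_in, theta_out) grid per wavelength.
thetas = np.linspace(0.0, np.pi / 2, 64)
wavelengths = np.array([450.0, 550.0, 650.0])          # nm, one table per channel
table = np.array([[[expensive_brdf(ti, to, wl) for wl in wavelengths]
                   for to in thetas]
                  for ti in thetas])

def brdf_lookup(theta_in, theta_out):
    # Render-time query: nearest-neighbour lookup into the precomputed table.
    i = np.argmin(np.abs(thetas - theta_in))
    j = np.argmin(np.abs(thetas - theta_out))
    return table[i, j]                                  # one value per wavelength

print(brdf_lookup(0.3, 0.7))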

Relevance:

100.00%

Publisher:

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one as suggested by Loewer, and a localist one as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance:

100.00%

Publisher:

Abstract:

The inclusive jet cross-section has been measured in proton-proton collisions at √s = 2.76 TeV in a dataset corresponding to an integrated luminosity of 0.20 pb⁻¹ collected with the ATLAS detector at the Large Hadron Collider in 2011. Jets are identified using the anti-kt algorithm with two radius parameters of 0.4 and 0.6. The inclusive jet double-differential cross-section is presented as a function of the jet transverse momentum pT and jet rapidity y, covering the range 20 ≤ pT < 430 GeV and |y| < 4.4. The ratio of the cross-section to the inclusive jet cross-section measurement at √s = 7 TeV, published by the ATLAS Collaboration, is calculated as a function of both transverse momentum and the dimensionless quantity xT = 2pT/√s, in bins of jet rapidity. The systematic uncertainties on the ratios are significantly reduced due to the cancellation of correlated uncertainties in the two measurements. Results are compared to the prediction from next-to-leading-order perturbative QCD calculations corrected for non-perturbative effects, and to next-to-leading-order Monte Carlo simulation. Furthermore, the ATLAS jet cross-section measurements at √s = 2.76 TeV and √s = 7 TeV are analysed within a framework of next-to-leading-order perturbative QCD calculations to determine parton distribution functions of the proton, taking into account the correlations between the measurements.
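A small sketch of the dimensionless variable used in the ratio measurement: xT = 2 pT / √s, so the same xT corresponds to different transverse momenta at the two centre-of-mass energies. The pT values below are illustrative only.

def x_t(pt_gev, sqrt_s_gev):
    return 2.0 * pt_gev / sqrt_s_gev

for pt in (20.0, 100.0, 430.0):
    print(f"pT = {pt:6.1f} GeV:  x_T(2.76 TeV) = {x_t(pt, 2760.0):.3f},"
          f"  x_T(7 TeV) = {x_t(pt, 7000.0):.3f}")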

Relevance:

100.00%

Publisher:

Abstract:

We describe a technique for interactive rendering of diffraction effects produced by biological nanostructures, such as snake skin surface gratings. Our approach uses imagery from atomic force microscopy that accurately captures the geometry of the nanostructures responsible for structural colouration, that is, colouration due to wave interference, in a variety of animals. We develop a rendering technique that constructs bidirectional reflection distribution functions (BRDFs) directly from the measured data and leverages pre-computation to achieve interactive performance. We demonstrate results of our approach using various shapes of the surface grating nanostructures. Finally, we evaluate the accuracy of our pre-computation-based technique and compare to a reference BRDF construction technique.

Relevance:

100.00%

Publisher:

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two Boltzmannian accounts of the Second Law, viz. a globalist and a localist one. In both cases, the probabilities fail to be chances because they have rivals that are roughly equally good. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance:

100.00%

Publisher:

Abstract:

Double-differential dijet cross-sections measured in pp collisions at the LHC with a 7 TeV centre-of-mass energy are presented as functions of dijet mass and half the rapidity separation of the two highest-pT jets. These measurements are obtained using data corresponding to an integrated luminosity of 4.5 fb⁻¹, recorded by the ATLAS detector in 2011. The data are corrected for detector effects so that cross-sections are presented at the particle level. Cross-sections are measured up to 5 TeV dijet mass using jets reconstructed with the anti-kt algorithm for values of the jet radius parameter of 0.4 and 0.6. The cross-sections are compared with next-to-leading-order perturbative QCD calculations by NLOJet++ corrected to account for non-perturbative effects. Comparisons with POWHEG predictions, using a next-to-leading-order matrix element calculation interfaced to a parton-shower Monte Carlo simulation, are also shown. Electroweak effects are accounted for in both cases. The quantitative comparison of data and theoretical predictions obtained using various parameterizations of the parton distribution functions is performed using a frequentist method. In general, good agreement with data is observed for the NLOJet++ theoretical predictions when using the CT10, NNPDF2.1 and MSTW 2008 PDF sets. Disagreement is observed when using the ABM11 and HERAPDF1.5 PDF sets for some ranges of dijet mass and half the rapidity separation. An example setting a lower limit on the compositeness scale for a model of contact interactions is presented, showing that the unfolded results can be used to constrain contributions to dijet production beyond that predicted by the Standard Model.
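A minimal sketch of the two observables used above, computed from the two highest-pT jets: the dijet invariant mass m_jj and half the rapidity separation y* = |y1 − y2| / 2. The jet kinematics below are made up for illustration.

import math

def four_momentum(pt, y, phi, m=0.0):
    # (E, px, py, pz) from transverse momentum, rapidity, azimuth and mass.
    mt = math.sqrt(m * m + pt * pt)            # transverse mass
    return (mt * math.cosh(y), pt * math.cos(phi), pt * math.sin(phi), mt * math.sinh(y))

def dijet_mass(p1, p2):
    e, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def rapidity(p):
    e, _, _, pz = p
    return 0.5 * math.log((e + pz) / (e - pz))

j1 = four_momentum(pt=800.0, y=1.2, phi=0.1)    # GeV
j2 = four_momentum(pt=750.0, y=-0.8, phi=3.0)
print("m_jj =", dijet_mass(j1, j2), "GeV")
print("y*   =", abs(rapidity(j1) - rapidity(j2)) / 2.0)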

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Physicians traditionally treat ulcerative colitis (UC) using a step-up approach. Given the paucity of data, we aimed to assess the cumulative probability of UC-related need for step-up therapy and to identify escalation-associated risk factors. METHODS: Patients with UC enrolled into the Swiss IBD Cohort Study were analyzed. The following steps from the bottom to the top of the therapeutic pyramid were examined: (1) 5-aminosalicylic acid (5-ASA) and/or rectal corticosteroids, (2) systemic corticosteroids, (3) immunomodulators (IM) (azathioprine, 6-mercaptopurine, methotrexate), (4) TNF antagonists, (5) calcineurin inhibitors, and (6) colectomy. RESULTS: Data on 996 patients with UC with a median disease duration of 9 years were examined. The point estimates of cumulative use of the different treatments at years 1, 5, 10, and 20 after UC diagnosis were 91%, 96%, 96%, and 97% for 5-ASA and/or rectal corticosteroids; 63%, 69%, 72%, and 79% for systemic corticosteroids; 43%, 57%, 59%, and 64% for IM; 15%, 28%, and 35% (up to year 10 only) for TNF antagonists; 5%, 9%, 11%, and 12% for calcineurin inhibitors; and 1%, 5%, 9%, and 18% for colectomy. The presence of extraintestinal manifestations and extended disease location (at least left-sided colitis) were identified as risk factors for step-up in therapy with systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and surgery. Cigarette smoking at diagnosis was protective against surgery. CONCLUSIONS: The presence of extraintestinal manifestations, left-sided colitis, and extensive colitis/pancolitis at the time of diagnosis were associated with use of systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and colectomy during the disease course.

Relevance:

100.00%

Publisher:

Abstract:

We resolve the real-time dynamics of a purely dissipative s=1/2 quantum spin or, equivalently, hard-core boson model on a hypercubic d-dimensional lattice. The considered quantum dissipative process drives the system to a totally symmetric macroscopic superposition in each of the S3 sectors. Different characteristic time scales are identified for the dynamics and we determine their finite-size scaling. We introduce the concept of cumulative entanglement distribution to quantify multiparticle entanglement and show that the considered protocol serves as an efficient method to prepare a macroscopically entangled Bose-Einstein condensate.

Relevance:

100.00%

Publisher:

Abstract:

High ³⁷Ar activity concentration in soil gas is proposed as key evidence for the detection of underground nuclear explosions under the Comprehensive Nuclear-Test-Ban Treaty. However, such detection is challenged by the natural background of ³⁷Ar in the subsurface, due mainly to Ca activation by cosmic rays. A better understanding of, and improved capability to predict, ³⁷Ar activity concentration in the subsurface and its spatial and temporal variability are thus required. A numerical model integrating ³⁷Ar production and transport in the subsurface is developed, including variable soil water content and water infiltration at the surface. A parameterized equation for ³⁷Ar production in the first 15 m below the surface is studied, taking into account the major production reactions and the moderation effect of soil water content. Using sensitivity analysis and uncertainty quantification, a realistic and comprehensive probability distribution of natural ³⁷Ar activity concentrations in soil gas is proposed, including the effects of water infiltration. Site location and soil composition are identified as the parameters allowing for the most effective reduction of the possible range of ³⁷Ar activity concentrations. The influence of soil water content on ³⁷Ar production is shown to be negligible to first order, while ³⁷Ar activity concentration in soil gas and its temporal variability appear to be strongly influenced by transient water infiltration events. These results will be used as a basis for practical CTBTO concepts of operation during an on-site inspection (OSI).
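A hedged sketch of the production-decay balance that underlies such background estimates (not the paper's transport model): for a constant production rate P, the number of ³⁷Ar atoms follows dN/dt = P − λN, so the activity λN(t) saturates towards P. The production rate below is an arbitrary placeholder; the ³⁷Ar half-life of about 35 days is a known physical constant.

import math

HALF_LIFE_DAYS = 35.0                      # approximate 37Ar half-life
LAM = math.log(2.0) / HALF_LIFE_DAYS       # decay constant, 1/day

def activity(t_days, production_rate, n0=0.0):
    # Activity (decays per day) after t_days, starting from n0 atoms,
    # for a constant production rate (atoms per day).
    n = production_rate / LAM + (n0 - production_rate / LAM) * math.exp(-LAM * t_days)
    return LAM * n

for t in (10, 35, 100, 300):
    print(t, "days:", activity(t, production_rate=1000.0))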