978 results for Risk-Neutral Probability
Abstract:
PURPOSE Rapid assessment and intervention is important for the prognosis of acutely ill patients admitted to the emergency department (ED). The aim of this study was to prospectively develop and validate a model predicting the risk of in-hospital death based on all information available at the time of ED admission, and to compare its discriminative performance with a non-systematic risk estimate made by the first triaging health-care provider. METHODS Prospective cohort analysis based on a multivariable logistic regression for the probability of death. RESULTS A total of 8,607 consecutive admissions of 7,680 patients admitted to the ED of a tertiary care hospital were analysed. The most frequent APACHE II diagnostic categories at the time of admission were neurological (2,052, 24%), trauma (1,522, 18%), infection [1,328, 15%; including sepsis (357, 4.1%), severe sepsis (249, 2.9%), septic shock (27, 0.3%)], cardiovascular (1,022, 12%), gastrointestinal (848, 10%) and respiratory (449, 5%). The predictors in the final model were age, prolonged capillary refill time, blood pressure, mechanical ventilation, oxygen saturation index, Glasgow coma score and APACHE II diagnostic category. The model showed good discriminative ability, with an area under the receiver operating characteristic curve of 0.92, and good internal validity. The model performed significantly better than non-systematic triaging of the patient. CONCLUSIONS The prediction model can facilitate the identification of ED patients at higher mortality risk. It performs better than a non-systematic assessment and may enable more rapid identification and treatment of patients at risk of an unfavourable outcome.
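For orientation, a multivariable logistic model of the kind described above has the generic form below; the coefficients and the exact coding of the predictors are not given in the abstract, so the terms simply name the listed variables (a sketch, not the published model):

```latex
\[
\operatorname{logit}(p) = \ln\frac{p}{1-p}
  = \beta_0 + \beta_1\,\text{age} + \beta_2\,\text{capillary refill}
  + \beta_3\,\text{blood pressure} + \beta_4\,\text{mechanical ventilation}
  + \beta_5\,\text{oxygen saturation index} + \beta_6\,\text{GCS}
  + \sum_k \gamma_k\,\text{diagnostic category}_k ,
\]
```

where p is the predicted probability of in-hospital death; discrimination of the fitted model is then summarised by the area under the ROC curve (0.92 here).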
Abstract:
Assessing and managing risks relating to the consumption of foodstuffs by humans and to the environment has been one of the most complex legal issues in WTO law ever since the Agreement on Sanitary and Phytosanitary Measures was adopted at the end of the Uruguay Round and entered into force in 1995. The problem was expounded in a number of cases. Panels and the Appellate Body adopted different philosophies in interpreting the Agreement and the basic concept of risk assessment as defined in Annex A para. 4 of the Agreement. Risk assessment entails fundamental questions of law and science, and different interpretations reflect different underlying perceptions of science and its relationship to the law. The present thesis, supported by the Swiss National Research Foundation, undertakes an in-depth analysis of these underlying perceptions. The author expounds the essence of, and differences between, positivism and relativism in philosophy and the natural sciences, and clarifies the relationship between fundamental concepts such as risk, hazard and probability. This investigation is a remarkable effort on the part of a lawyer keen to learn more about the fundamentals upon which the law – often unconsciously – is operated by the legal profession and the trade community. Based upon these insights, he turns to a critical assessment of the jurisprudence of both panels and the Appellate Body. Extensively referring to and discussing the literature, he deconstructs findings and decisions in light of the implied and assumed underlying philosophies and perceptions of the relationship of law and science, in particular in the field of food standards. Finding that neither positivism nor relativism provides adequate answers, the author turns to critical rationalism and applies the methodology of falsification developed by Karl R. Popper. Critical rationalism allows discourse in science and law to be combined and helps prepare the ground for a new approach to risk assessment and risk management. Linking the problem to the doctrine of multilevel governance, the author develops a theory allocating risk assessment to international fora while leaving the matter of risk management to national and democratically accountable governments. While the author throughout the thesis questions the possibility of separating risk assessment and risk management, the thesis offers new avenues which may assist in structuring a complex and difficult problem.
Abstract:
Off-site effects of soil erosion are becoming increasingly important, particularly the pollution of surface waters. In order to develop environmentally efficient and cost-effective mitigation options, it is essential to identify areas that bear both a high erosion risk and high connectivity to surface waters. This paper introduces a simple risk assessment tool that allows the delineation of potential critical source areas (CSA) of sediment input into surface waters for the agricultural areas of Switzerland. The basis is the erosion risk map with a 2 m resolution (ERM2) and the drainage network, which is extended by drained roads, farm tracks, and slope depressions. The probability of hydrological and sedimentological connectivity is assessed by combining soil erosion risk and the extended drainage network with flow distance calculations. A GIS environment with multiple-flow accumulation algorithms is used for routing runoff generation and flow pathways. The result is a high-resolution connectivity map of the agricultural area of Switzerland (888,050 ha). Fifty-five percent of the computed agricultural area is potentially connected with surface waters; 45% is not connected. Surprisingly, the larger share, 34% (62% of the connected area), is only indirectly connected with surface waters through drained roads, and only 21% is directly connected. The reason is the topographic complexity and patchiness of the landscape due to a dense road and drainage network. A total of 24% of the connected area, or 13% of the computed agricultural area, is rated with a high connectivity probability. On these CSA an adapted land use is recommended, supported by vegetated buffer strips preventing sediment load. Even areas that are far away from open water bodies can be indirectly connected and need to be included in the planning of mitigation measures. The connectivity map presented is thus an important decision-making tool for policy-makers and extension services. The map is published on the web and is thus available for application.
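The connectivity assessment routes runoff with flow-accumulation algorithms over a digital elevation model. As a rough illustration of the principle only, here is a minimal single-direction (D8) accumulation sketch in Python; the study itself uses multiple-flow algorithms within a GIS, so this is a simplification under stated assumptions, not the authors' tool.

```python
import numpy as np

def d8_flow_accumulation(dem: np.ndarray) -> np.ndarray:
    """Single-direction (D8) flow accumulation on a DEM grid (simplified sketch)."""
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)            # each cell contributes its own area
    order = np.argsort(dem, axis=None)[::-1]        # highest cells first, so upslope cells finish first
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        r, c = divmod(int(idx), cols)
        best_drop, target = 0.0, None
        for dr, dc in neighbours:                   # steepest-descent neighbour
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and dem[r, c] - dem[rr, cc] > best_drop:
                best_drop, target = dem[r, c] - dem[rr, cc], (rr, cc)
        if target is not None:                      # pass accumulated area downslope
            acc[target] += acc[r, c]
    return acc

dem = np.array([[5., 4., 3.],
                [4., 3., 2.],
                [3., 2., 1.]])
print(d8_flow_accumulation(dem))                    # the lowest cell collects all 9 cells
```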
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20% or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk-chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian, and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
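For readers less familiar with these accuracy measures, the sketch below shows how positive predictive value and sensitivity are obtained from a 2x2 cross-tabulation of risk-chart-triggered VL tests against true virologic failure; the counts are illustrative only and are not taken from the study.

```python
def ppv_and_sensitivity(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """tp: flagged and failing; fp: flagged but suppressed; fn: failing but not flagged."""
    ppv = tp / (tp + fp)           # P(virologic failure | flagged for VL testing)
    sensitivity = tp / (tp + fn)   # P(flagged for VL testing | virologic failure)
    return ppv, sensitivity

# Illustrative counts only:
print(ppv_and_sensitivity(tp=68, fp=18, fn=32))   # -> approx. (0.79, 0.68)
```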
Abstract:
We investigated the neural mechanisms and the autonomic and cognitive responses associated with visual avoidance behavior in spider phobia. Spider phobic and control participants imagined visiting different forest locations with the possibility of encountering spiders, snakes, or birds (neutral reference category). In each experimental trial, participants saw a picture of a forest location followed by a picture of a spider, snake, or bird, and then rated their personal risk of encountering these animals in this context, as well as their fear. The greater the visual avoidance of spiders that a phobic participant demonstrated (as measured by eye tracking), the higher were her autonomic arousal and neural activity in the amygdala, orbitofrontal cortex (OFC), anterior cingulate cortex (ACC), and precuneus at picture onset. Visual avoidance of spiders in phobics also went hand in hand with subsequently reduced cognitive risk estimates for encounters. Control participants, in contrast, displayed a positive relationship between gaze duration toward spiders, on the one hand, and autonomic responding, as well as OFC, ACC, and precuneus activity, on the other hand. In addition, they showed reduced encounter risk estimates when they looked longer at the animal pictures. Our data are consistent with the idea that one reason for phobics to avoid phobic information may be grounded in heightened activity in the fear circuit, which signals potential threat. In the absence of alternative efficient regulation strategies, visual avoidance may then function to down-regulate cognitive risk evaluations for threatening information about the phobic stimuli. Control participants, in contrast, may be characterized by a different coping style, whereby paying visual attention to potentially threatening information may help them to actively down-regulate cognitive evaluations of risk.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk, with emphasis on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yields a value for the fatality risk within the observed time frame for the studied road segment. The long-term fatality risk due to snow avalanches as well as the short-term fatality risk was compared to the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analyses of short-term avalanche-induced fatality risk revealed risk peaks that were 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the three-day sum of precipitation is a simplified model; further research is therefore needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may contribute as a conceptual step towards risk-based decision-making in risk management.
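As a rough illustration of the structure of such a calculation, the short-term fatality risk for one hazard scenario can be expressed as the product of avalanche occurrence probability, the number of persons exposed (derived from traffic density), and a lethality factor. The sketch below is a simplification with made-up parameter names and values; it is not the authors' model.

```python
def fatality_risk(p_avalanche_per_day: float,
                  vehicles_per_hour: float,
                  speed_kmh: float,
                  exposed_length_km: float,
                  persons_per_vehicle: float,
                  lethality: float) -> float:
    """Expected fatalities per day on one road segment for one hazard scenario (sketch)."""
    vehicles_exposed = vehicles_per_hour * exposed_length_km / speed_kmh  # mean vehicles inside the endangered stretch
    persons_at_risk = vehicles_exposed * persons_per_vehicle
    return p_avalanche_per_day * persons_at_risk * lethality

# Illustrative values only:
print(fatality_risk(0.02, 300, 60, 0.2, 1.6, 0.3))
```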
Abstract:
Elicitability has recently been discussed as a desirable property for risk measures. Kou and Peng (2014) showed that an elicitable distortion risk measure is either a Value-at-Risk or the mean. We give a concise alternative proof of this result, and discuss the conflict between comonotonic additivity and elicitability.
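For context, a functional of a distribution is elicitable if it minimises the expected value of some scoring function of the forecast x and the realisation y; the standard examples below are taken from the general elicitability literature, not reproduced from the paper. Value-at-Risk at level α, i.e. the α-quantile, is elicited by the pinball score, and the mean by the squared error:

```latex
\[
S_{\mathrm{VaR}_\alpha}(x, y) = \bigl(\mathbf{1}\{y \le x\} - \alpha\bigr)(x - y),
\qquad
S_{\mathrm{mean}}(x, y) = (x - y)^2 ,
\]
```

and the risk measure is the minimiser of E[S(x, Y)] over the forecast x.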
Abstract:
OBJECTIVE In patients with a long life expectancy and high-risk (HR) prostate cancer (PCa), the chance of dying from PCa is not negligible and may change significantly with the time elapsed from surgery. The aim of this study was to evaluate long-term survival patterns in young patients treated with radical prostatectomy (RP) for HRPCa. MATERIALS AND METHODS Within a multi-institutional cohort, 600 young patients (≤59 years) treated with RP between 1987 and 2012 for HRPCa (defined as at least one of the following adverse characteristics: prostate-specific antigen >20, cT3 or higher, biopsy Gleason sum 8-10) were identified. Smoothed cumulative incidence plots were used to assess cancer-specific mortality (CSM) and other-cause mortality (OCM) rates at 10, 15, and 20 years after RP. The same analyses were performed to assess the 5-year probability of CSM and OCM in patients who survived 5, 10, and 15 years after RP. A multivariable competing-risk regression model was fitted to identify predictors of CSM and OCM. RESULTS The 10-, 15- and 20-year CSM and OCM rates were 11.6% and 5.5%, 15.5% and 13.5%, and 18.4% and 19.3%, respectively. The 5-year probabilities of CSM and OCM among patients who survived 5, 10, and 15 years after RP were 6.4% and 2.7%, 4.6% and 9.6%, and 4.2% and 8.2%, respectively. Year of surgery, pathological stage and Gleason score, surgical margin status and lymph node invasion were the major determinants of CSM (all P≤0.03). Conversely, none of the covariates was significantly associated with OCM (all P≥0.09). CONCLUSIONS Very long-term cancer control in young high-risk patients after RP is highly satisfactory. PCa is the leading cause of death during the first 10 years of survivorship after RP in these young patients; thereafter, mortality not related to PCa becomes the main cause of death. Consequently, surgery should be considered in young patients with high-risk disease, and strict PCa follow-up should be enforced during the first 10 years of survivorship after RP.
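CSM and OCM figures of this kind are typically estimated as cause-specific cumulative incidence functions that account for the competing cause of death. The sketch below (hypothetical data, not the authors' code) implements such an estimator directly:

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Aalen-Johansen-type cumulative incidence for one cause under competing risks.

    times  : follow-up times
    events : 0 = censored, 1 = cause of interest (e.g. CSM), 2 = competing cause (e.g. OCM)
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, cif = 1.0, 0.0                      # overall event-free survival, cumulative incidence
    out_t, out_cif = [], []
    for t in np.unique(times[events > 0]):
        n_at_risk = np.sum(times >= t)
        d_cause = np.sum((times == t) & (events == cause))
        d_any = np.sum((times == t) & (events > 0))
        cif += surv * d_cause / n_at_risk     # cause-specific hazard weighted by prior survival
        surv *= 1.0 - d_any / n_at_risk
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)

# Tiny hypothetical example (times in years):
t = [2, 5, 5, 8, 12, 15, 20]
e = [1, 0, 2, 1, 0, 2, 1]
print(cumulative_incidence(t, e, cause=1))
```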
Abstract:
An accurate detection of individuals at clinical high risk (CHR) for psychosis is a prerequisite for effective preventive interventions. Several psychometric interviews are available, but their prognostic accuracy is unknown. We conducted a prognostic accuracy meta-analysis of psychometric interviews used to examine referrals to high-risk services. The index test was an established CHR psychometric instrument used to identify subjects with and without CHR (CHR+ and CHR-). The reference index was psychosis onset over time in both CHR+ and CHR- subjects. Data were analyzed with MIDAS (STATA 13). Area under the curve (AUC), summary receiver operating characteristic curves, quality assessment, likelihood ratios, Fagan's nomogram and probability-modified plots were computed. Eleven independent studies were included, with a total of 2,519 help-seeking, predominantly adult subjects (CHR+: N=1,359; CHR-: N=1,160) referred to high-risk services. The mean follow-up duration was 38 months. The AUC was excellent (0.90; 95% CI: 0.87-0.93) and comparable to other tests in preventive medicine, suggesting clinical utility in subjects referred to high-risk services. Meta-regression analyses revealed an effect of exposure to antipsychotics and no effects of type of instrument, age, gender, follow-up time, sample size, quality assessment, or proportion of CHR+ subjects in the total sample. Fagan's nomogram indicated a low positive predictive value (5.74%) in the general non-help-seeking population. Despite the clear need to further improve prediction of psychosis, these findings support the use of psychometric prognostic interviews for CHR as clinical tools for indicated prevention in subjects seeking help at high-risk services worldwide.
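The contrast between the excellent AUC and the low positive predictive value in the general population follows directly from Bayes' theorem: PPV depends on the pre-test probability (prevalence) π as well as on sensitivity and specificity,

```latex
\[
\mathrm{PPV} = \frac{\mathrm{sens}\cdot \pi}{\mathrm{sens}\cdot \pi + (1-\mathrm{spec})\,(1-\pi)} ,
\]
```

so with the very low prevalence of incident psychosis among non-help-seeking individuals, even an accurate interview yields a low PPV.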
Abstract:
Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When CHR instruments are used for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly yet practical guide summarising the key concepts needed to support clinicians in probabilistic prognostic reasoning in the CHR state. We review the risk or cumulative incidence of psychosis, the person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes’ theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment.
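As a compact illustration of two of the concepts listed above, likelihood ratios and Bayes' theorem in odds form, the following sketch uses purely illustrative numbers (no values are taken from the paper):

```python
def likelihood_ratios(sensitivity: float, specificity: float) -> tuple[float, float]:
    lr_pos = sensitivity / (1.0 - specificity)   # LR+: how much a positive CHR assessment raises the odds
    lr_neg = (1.0 - sensitivity) / specificity   # LR-: how much a negative assessment lowers the odds
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob: float, lr: float) -> float:
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr                    # Bayes' theorem in odds form
    return post_odds / (1.0 + post_odds)

lr_pos, _ = likelihood_ratios(sensitivity=0.8, specificity=0.7)
print(post_test_probability(pre_test_prob=0.15, lr=lr_pos))  # help-seeking sample: post-test probability ~0.32
print(post_test_probability(pre_test_prob=0.01, lr=lr_pos))  # general population: post-test probability ~0.03
```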
Abstract:
Trabecular bone score (TBS) is a grey-level textural index of bone microarchitecture derived from lumbar spine dual-energy X-ray absorptiometry (DXA) images. TBS is a BMD-independent predictor of fracture risk. The objective of this meta-analysis was to determine whether TBS predicted fracture risk independently of FRAX probability and to examine their combined performance by adjusting the FRAX probability for TBS. We utilized individual-level data from 17,809 men and women in 14 prospective population-based cohorts. Baseline evaluation included TBS and the FRAX risk variables, and outcomes during follow-up (mean 6.7 years) comprised major osteoporotic fractures. The association between TBS, FRAX probabilities and the risk of fracture was examined using an extension of the Poisson regression model in each cohort and for each sex, and expressed as the gradient of risk (GR; hazard ratio per 1 SD change in the risk variable in the direction of increased risk). FRAX probabilities were adjusted for TBS using an adjustment factor derived from an independent cohort (the Manitoba Bone Density Cohort). Overall, the GR of TBS for major osteoporotic fracture was 1.44 (95% CI: 1.35-1.53) when adjusted for age and time since baseline, and was similar in men and women (p > 0.10). When additionally adjusted for the FRAX 10-year probability of major osteoporotic fracture, TBS remained a significant, independent predictor of fracture (GR 1.32, 95% CI: 1.24-1.41). The adjustment of FRAX probability for TBS resulted in a small increase in the GR (1.76, 95% CI: 1.65-1.87 vs. 1.70, 95% CI: 1.60-1.81). A smaller change in GR was observed for hip fracture (FRAX hip fracture probability GR 2.25 vs. 2.22). TBS is a significant predictor of fracture risk independently of FRAX. The findings support the use of TBS as a potential adjustment for FRAX probability, though the impact of the adjustment remains to be determined in the context of clinical assessment guidelines.
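For reference, and as a generic formulation not quoted from the paper: in a Poisson or proportional-hazards model whose log hazard is linear in the risk variable x, the gradient of risk is the hazard ratio associated with a 1 SD change of x in the direction of increased risk,

```latex
\[
\mathrm{GR} = \exp\!\bigl(|\beta|\,\sigma_x\bigr),
\]
```

where β is the regression coefficient per unit of x and σ_x its standard deviation; a GR of 1.44 for TBS thus corresponds to a 44% higher fracture hazard per 1 SD change of TBS towards higher risk (i.e. lower TBS).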
Abstract:
BACKGROUND Physicians traditionally treat ulcerative colitis (UC) using a step-up approach. Given the paucity of data, we aimed to assess the cumulative probability of UC-related need for step-up therapy and to identify risk factors associated with escalation. METHODS Patients with UC enrolled in the Swiss IBD Cohort Study were analyzed. The following steps from the bottom to the top of the therapeutic pyramid were examined: (1) 5-aminosalicylic acid (5-ASA) and/or rectal corticosteroids, (2) systemic corticosteroids, (3) immunomodulators (IM) (azathioprine, 6-mercaptopurine, methotrexate), (4) TNF antagonists, (5) calcineurin inhibitors, and (6) colectomy. RESULTS Data on 996 patients with UC with a median disease duration of 9 years were examined. The point estimates of cumulative use of the different treatments at years 1, 5, 10, and 20 after UC diagnosis were, respectively: 91%, 96%, 96%, and 97% for 5-ASA and/or rectal corticosteroids; 63%, 69%, 72%, and 79% for systemic corticosteroids; 43%, 57%, 59%, and 64% for IM; 15%, 28%, and 35% (up to year 10 only) for TNF antagonists; 5%, 9%, 11%, and 12% for calcineurin inhibitors; and 1%, 5%, 9%, and 18% for colectomy. The presence of extraintestinal manifestations and extended disease location (at least left-sided colitis) were identified as risk factors for step-up in therapy with systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and surgery. Cigarette smoking at diagnosis was protective against surgery. CONCLUSIONS The presence of extraintestinal manifestations, left-sided colitis, and extensive colitis/pancolitis at the time of diagnosis were associated with the use of systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and colectomy during the disease course.
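Cumulative-use estimates of this kind are typically obtained from time-to-event analysis as one minus the Kaplan-Meier "escalation-free" survival at each time point. A minimal sketch with hypothetical data (not the cohort's) using the lifelines package:

```python
from lifelines import KaplanMeierFitter

# Hypothetical times (years) from UC diagnosis to first use of a given step, e.g. TNF antagonists
years_to_escalation = [0.5, 1, 2, 3, 4, 6, 8, 10, 12, 15]
escalated           = [1,   1, 0, 1, 1, 0, 1, 0,  1,  0]   # 1 = escalated, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(years_to_escalation, event_observed=escalated)
cumulative_use = 1 - kmf.survival_function_["KM_estimate"]  # cumulative probability of escalation by time t
print(cumulative_use)
```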
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk-chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in the Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
Abstract:
BACKGROUND Ductal carcinoma in situ (DCIS) is a noninvasive breast lesion with uncertain risk for invasive progression. Usual care (UC) for DCIS consists of treatment upon diagnosis, thus potentially overtreating patients with a low propensity for progression. One strategy to reduce overtreatment is active surveillance (AS), whereby DCIS is treated only upon detection of invasive disease. Our goal was to perform a quantitative evaluation of outcomes following an AS strategy for DCIS. METHODS Age-stratified, 10-year disease-specific cumulative mortality (DSCM) for AS was calculated using a computational risk projection model based upon published estimates for natural history parameters and Surveillance, Epidemiology, and End Results data for outcomes. AS projections were compared with the DSCM for patients who received UC. To quantify the propagation of parameter uncertainty, a 95% projection range (PR) was computed, and sensitivity analyses were performed. RESULTS Under the assumption that AS cannot outperform UC, the projected median differences in 10-year DSCM between AS and UC when diagnosed at ages 40, 55, and 70 years were 2.6% (PR = 1.4%-5.1%), 1.5% (PR = 0.5%-3.5%), and 0.6% (PR = 0.0%-2.4%), respectively. The corresponding median numbers of patients needed to treat to avert one breast cancer death were 38.3 (PR = 19.7-69.9), 67.3 (PR = 28.7-211.4), and 157.2 (PR = 41.1-3872.8), respectively. Sensitivity analyses showed that the parameter with the greatest impact on DSCM was the probability of understaging invasive cancer at diagnosis. CONCLUSION AS could be a viable management strategy for carefully selected DCIS patients, particularly among older age groups and those with substantial competing mortality risks. The effectiveness of AS could be markedly improved by reducing the rate of understaging.
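The numbers needed to treat quoted above follow the usual reciprocal relationship with the absolute difference in 10-year disease-specific cumulative mortality; for example, for diagnosis at age 40,

```latex
\[
\mathrm{NNT} \approx \frac{1}{\Delta \mathrm{DSCM}} = \frac{1}{0.026} \approx 38 ,
\]
```

consistent with the reported median of 38.3 obtained from the full projection model.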
Abstract:
Astronauts performing extravehicular activities (EVA) are at risk for occupational hazards due to a hypobaric environment, in particular decompression sickness (DCS). DCS results from nitrogen gas bubble formation in body tissues and venous blood. Denitrogenation achieved through lengthy staged decompression protocols has been the mainstay of prevention of DCS in space. Due to the greater number and duration of EVAs scheduled for construction and maintenance of the International Space Station, more efficient alternatives to accomplish missions without compromising astronaut safety are desirable. This multi-center, multi-phase study (NASA Prebreathe Reduction Protocol study, or PRP) was designed to identify a shorter denitrogenation protocol that can be implemented before an EVA, based on the combination of adynamia and exercise-enhanced oxygen prebreathe. Human volunteers recruited at three sites (Texas, North Carolina and Canada) underwent three different combinations (“PRP phases”) of intense and light exercise prior to decompression in an altitude chamber. The outcome variables were detection of venous gas embolism (VGE) by precordial Doppler ultrasound and clinical manifestations of DCS. Independent variables included age, gender, body mass index, peak oxygen consumption, peak heart rate, and PRP phase. Data analysis was performed both by pooling results from all study sites and by examining each site separately. Ten percent of the subjects developed DCS and 20% showed evidence of high-grade VGE. No cases of DCS occurred in the PRP phase that used the combination of dual-cycle ergometry (10 minutes at 75% of VO2 peak) plus 24 minutes of light EVA exercise (p = 0.04). No significant effects were found for the remaining independent variables on the occurrence of DCS. High-grade VGE showed a strong correlation with subsequent development of DCS (sensitivity, 88.2%; specificity, 87.2%). In the presence of high-grade VGE, the relative risk for DCS ranged from 7.52 to 35.0. In summary, a good safety level can be achieved with exercise-enhanced oxygen denitrogenation and can be generalized to the astronaut population. Exercise is beneficial in preventing DCS if a specific schedule is followed, with an individualized VO2 prescription that provides a safety level that can then be applied to space operations. Furthermore, VGE Doppler detection is a useful clinical tool for prediction of altitude DCS. Because of the small number of high-grade VGE episodes, the identification of a high-probability DCS situation based on the presence of high-grade VGE seems justified in astronauts.
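The accuracy and relative-risk figures quoted above come from cross-tabulating high-grade VGE against DCS. The sketch below shows the computation with illustrative counts only, not the study's actual table:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float, float]:
    """Sensitivity, specificity and relative risk of high-grade VGE as a predictor of DCS (sketch)."""
    sensitivity = tp / (tp + fn)        # P(high-grade VGE | DCS)
    specificity = tn / (tn + fp)        # P(no high-grade VGE | no DCS)
    risk_exposed = tp / (tp + fp)       # DCS risk when high-grade VGE is present
    risk_unexposed = fn / (fn + tn)     # DCS risk when it is absent
    return sensitivity, specificity, risk_exposed / risk_unexposed

# Illustrative 2x2 counts only:
print(diagnostic_metrics(tp=15, fp=20, fn=2, tn=133))
```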