871 results for Conditional-value-at-risk assessment
Abstract:
BACKGROUND/AIM: Parallel investigation, in a matched case-control study, of the association of different first-trimester markers with the risk of subsequent pre-eclampsia (PE). METHOD: The levels of different first-trimester serum markers and fetal nuchal translucency thickness were compared between 52 cases of PE and 104 control women by non-parametric two-group comparisons and by calculating matched odds ratios. RESULTS: In univariable analysis, increased concentrations of inhibin A and activin A were associated with subsequent PE (p < 0.02). Multivariable conditional logistic regression models revealed an association between increased risk of PE and increased inhibin A and nuchal translucency thickness, and reduced pregnancy-associated plasma protein A (PAPP-A) and placental lactogen, respectively. However, these associations varied with the gestational age at sample collection. For blood samples taken in pregnancy weeks 12 and 13 only, increased levels of activin A, inhibin A and nuchal translucency thickness, and lower levels of placental growth factor and PAPP-A, were associated with an increased risk of PE. CONCLUSIONS: Members of the inhibin family, and to some extent PAPP-A and placental growth factor, are superior to other serum markers, and their predictive value depends on the gestational age at blood sampling. The availability of a single, early-pregnancy 'miracle' serum marker for PE risk assessment seems unlikely in the near future.
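The matched odds ratios used here come from conditional analysis of case-control sets; for the simplest 1:1 matched design they reduce to the ratio of discordant pairs. A minimal sketch with invented counts (the study itself used 1:2 matching and conditional logistic regression, so this shows only the principle):

```python
# Minimal sketch: matched-pair odds ratio for a 1:1 matched case-control design.
# Only discordant pairs are informative:
# OR = (pairs with only the case exposed) / (pairs with only the control exposed).
# Counts are invented for illustration, not taken from the study.

def matched_odds_ratio(case_exposed_only: int, control_exposed_only: int) -> float:
    """Conditional odds ratio from discordant 1:1 matched pairs."""
    return case_exposed_only / control_exposed_only

# e.g. 18 pairs where only the case exceeds a marker cut-off, 6 where only the control does
print(matched_odds_ratio(18, 6))  # -> 3.0: exposure triples the odds of PE
```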
Abstract:
There is a need to validate risk assessment tools for hospitalised medical patients at risk of venous thromboembolism (VTE). We investigated whether a predefined cut-off of the Geneva Risk Score, as compared to the Padua Prediction Score, accurately distinguishes low-risk from high-risk patients regardless of the use of thromboprophylaxis. In the multicentre, prospective Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE) cohort study, 1,478 hospitalised medical patients were enrolled, of whom 637 (43%) did not receive thromboprophylaxis. The primary endpoint was symptomatic VTE or VTE-related death at 90 days. The study is registered at ClinicalTrials.gov, number NCT01277536. According to the Geneva Risk Score, the cumulative rate of the primary endpoint was 3.2% (95% confidence interval [CI] 2.2-4.6%) in 962 high-risk vs 0.6% (95% CI 0.2-1.9%) in 516 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.5% vs 0.8% (p=0.029), respectively. In comparison, the Padua Prediction Score yielded a cumulative rate of the primary endpoint of 3.5% (95% CI 2.3-5.3%) in 714 high-risk vs 1.1% (95% CI 0.6-2.3%) in 764 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.2% vs 1.5% (p=0.130), respectively. The negative likelihood ratio was 0.28 (95% CI 0.10-0.83) for the Geneva Risk Score and 0.51 (95% CI 0.28-0.93) for the Padua Prediction Score. In conclusion, among hospitalised medical patients, the Geneva Risk Score predicted VTE and VTE-related mortality and compared favourably with the Padua Prediction Score, particularly in its accuracy in identifying low-risk patients who do not require thromboprophylaxis.
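The negative likelihood ratios reported above follow from standard 2×2 arithmetic, NLR = (1 − sensitivity) / specificity. The sketch below reconstructs approximate counts from the Geneva Risk Score figures quoted in the abstract; the rounding is ours, so the result only approximates the published 0.28:

```python
# Sketch: negative likelihood ratio from a risk score's 2x2 classification.
# NLR = (1 - sensitivity) / specificity; values < 1 mean a "low risk" call
# lowers the odds of the outcome. Counts are rounded from the abstract's
# percentages, so the result is approximate.

high_risk_n, high_risk_event_rate = 962, 0.032   # Geneva Risk Score, high risk
low_risk_n,  low_risk_event_rate  = 516, 0.006   # Geneva Risk Score, low risk

events_high = round(high_risk_n * high_risk_event_rate)  # ~31 true positives
events_low  = round(low_risk_n * low_risk_event_rate)    # ~3 false negatives
nonevents_high = high_risk_n - events_high               # ~931 false positives
nonevents_low  = low_risk_n - events_low                 # ~513 true negatives

sensitivity = events_high / (events_high + events_low)
specificity = nonevents_low / (nonevents_low + nonevents_high)
nlr = (1 - sensitivity) / specificity
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} NLR={nlr:.2f}")
# Close to the published 0.28; the exact value differs because counts are
# rounded here and the study used time-to-event (cumulative incidence) methods.
```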
Abstract:
BACKGROUND Heart failure with preserved ejection fraction (HFpEF) represents a growing health burden associated with substantial mortality and morbidity. Consequently, risk prediction is of the highest importance. Endothelial dysfunction has recently been shown to play an important role in the complex pathophysiology of HFpEF. We therefore aimed to assess von Willebrand factor (vWF), a marker of endothelial damage, as a potential biomarker for risk assessment in patients with HFpEF. METHODS AND RESULTS Concentrations of vWF were assessed in 457 patients with HFpEF enrolled as part of the LUdwigshafen Risk and Cardiovascular Health (LURIC) study. All-cause mortality was observed in 40% of patients during a median follow-up of 9.7 years. vWF significantly predicted mortality with a hazard ratio (HR) per 1-SD increase of 1.45 (95% confidence interval, 1.26-1.68; P<0.001) and remained a significant predictor after adjustment for age, sex, body mass index, N-terminal pro-B-type natriuretic peptide (NT-proBNP), renal function, and frequent HFpEF-related comorbidities (adjusted HR per 1 SD, 1.22; 95% confidence interval, 1.05-1.42; P=0.001). Most notably, vWF showed additional prognostic value beyond that achievable with NT-proBNP, as indicated by improvements in the C-statistic (vWF×NT-proBNP: 0.65 versus NT-proBNP: 0.63; P for comparison, 0.004) and the category-free net reclassification index (37.6%; P<0.001). CONCLUSIONS vWF is an independent predictor of long-term outcome in patients with HFpEF, which is in line with endothelial dysfunction as a potential mediator in the pathophysiology of HFpEF. In particular, combined assessment of vWF and NT-proBNP improved risk prediction in this vulnerable group of patients.
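A hazard ratio "per increase of 1 SD" means the biomarker was standardised before model fitting, so the exponentiated Cox coefficient is directly interpretable on the SD scale. A sketch on synthetic data using the lifelines package (our choice of library; the study does not name its software):

```python
# Sketch: hazard ratio per 1 SD of a biomarker, on synthetic data.
# Standardizing the predictor makes exp(coefficient) directly
# interpretable as the HR per one-SD increase.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
vwf = rng.lognormal(mean=0.0, sigma=0.5, size=n)            # synthetic biomarker
hazard = 0.05 * np.exp(0.4 * (vwf - vwf.mean()) / vwf.std())  # true per-SD log-HR = 0.4
time = rng.exponential(1 / hazard)                           # event times
df = pd.DataFrame({
    "vwf_sd": (vwf - vwf.mean()) / vwf.std(),                # z-scored predictor
    "time": np.minimum(time, 10),                            # administrative censoring at 10
    "event": (time < 10).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)  # HR per 1 SD of vWF; ~exp(0.4) by construction
```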
Abstract:
Colombia is one of the largest per capita mercury polluters in the world as a consequence of its artisanal gold mining activities. The severity of this problem in terms of potential health effects was evaluated by means of a probabilistic risk assessment carried out in the twelve departments (or provinces) in Colombia with the largest gold production. The two exposure pathways included in the risk assessment were inhalation of elemental Hg vapors and ingestion of fish contaminated with methylmercury. Exposure parameters for the adult population (especially rates of fish consumption) were obtained from nation-wide surveys, and concentrations of Hg in air and of methylmercury in fish were gathered from previous scientific studies. Fish consumption varied between departments and ranged from 0 to 0.3 kg d⁻¹. Average concentrations of total mercury in fish (70 data points) ranged from 0.026 to 3.3 µg g⁻¹. A total of 550 individual measurements of Hg in workshop air (ranging from <DL to 1 mg m⁻³) and 261 measurements of Hg in outdoor air (ranging from <DL to 0.652 mg m⁻³) were used to generate the probability distributions used as concentration terms in the calculation of risk. All but two of the distributions of Hazard Quotients (HQ) associated with ingestion of Hg-contaminated fish for the twelve regions evaluated presented median values higher than the threshold value of 1, and the 95th percentiles ranged from 4 to 90. In the case of exposure to Hg vapors, minimum values of HQ for the general population exceeded 1 in all the towns included in this study, and the HQs for miner-smelters burning the amalgam were two orders of magnitude higher, reaching values of 200 for the 95th percentile. Even acknowledging the conservative assumptions included in the risk assessment and the uncertainties associated with it, its results clearly reveal the exorbitant levels of risk endured not only by miner-smelters but also by the general population of artisanal gold mining communities in Colombia.
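The hazard quotients above are the standard dose-to-reference-dose ratio, HQ = (C × IR) / (BW × RfD), propagated through Monte Carlo sampling of the exposure parameters. A minimal sketch with illustrative distribution parameters (not the study's fitted values):

```python
# Sketch: probabilistic hazard quotient (HQ) for fish ingestion of methylmercury.
# HQ = (C * IR) / (BW * RfD); HQ > 1 flags potential non-carcinogenic risk.
# Distribution parameters below are illustrative, not the study's fitted values.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

C  = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)   # Hg in fish, ug/g
IR = rng.triangular(0.0, 0.1, 0.3, size=n) * 1000         # fish intake, g/day
BW = rng.normal(70, 10, size=n).clip(40, 120)             # body weight, kg
RfD = 0.1                                                 # reference dose, ug/kg-day

hq = (C * IR) / (BW * RfD)
print(f"median HQ = {np.median(hq):.1f}, 95th percentile = {np.percentile(hq, 95):.0f}")
print(f"fraction of population with HQ > 1: {(hq > 1).mean():.2f}")
```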
Abstract:
Fundamental principles of precaution are legal maxims that call for preventive action, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: how can society manage potentially severe or irreversible environmental outcomes when variability, uncertainty, and limited causal knowledge characterize decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives, defined as a choice that makes preferred consequences more likely, requires accounting for the costs, benefits and change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), which shows the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as linear, non-threshold models. This increase in the number of defaults is an important improvement because most variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
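The expected value of information that the paper links to precaution can be made concrete with its simplest member, the expected value of perfect information (EVPI): the gap between the payoff of deciding after uncertainty resolves and deciding before. A toy two-action, two-state sketch with invented payoffs:

```python
# Sketch: expected value of perfect information (EVPI) for a precautionary choice.
# EVPI = E_theta[ max_a u(a, theta) ] - max_a E_theta[ u(a, theta) ].
# States, actions, and payoffs are illustrative, not from the paper.
import numpy as np

p = np.array([0.2, 0.8])              # P(hazard is real), P(hazard is not)
# payoff[action, state]: rows = {ban, allow}, columns = {real, not real}
payoff = np.array([[ -10.0, -10.0],   # ban: fixed economic cost either way
                   [-100.0,   0.0]])  # allow: severe loss only if hazard is real

value_without_info = payoff @ p                  # expected payoff of each action
best_now = value_without_info.max()              # decide under uncertainty: ban, -10
best_with_info = (payoff.max(axis=0) * p).sum()  # decide knowing the state: -2
evpi = best_with_info - best_now                 # research worth up to 8 units
print(f"act now: {best_now:.1f}; with perfect info: {best_with_info:.1f}; EVPI = {evpi:.1f}")
```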
Abstract:
This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods, whereby synthetic contaminant source terms were generated from the same distribution as historically occurring pollution events or from an a priori probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer were then simulated from repeated realisations using integrated mathematical models. The number of times that user-defined ranges of concentration magnitudes were exceeded is quantified as risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources based on the historical frequency of their occurrence proved to be a major asset of the method, and a clear advantage over contemporary methods.
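The core computation is easy to sketch: draw synthetic source terms from the assumed historical distribution, push each realisation through the transport model, and count threshold exceedances at each monitoring point. In the toy version below the integrated groundwater model is replaced by a trivial exponential-dilution stand-in, and all parameters are invented:

```python
# Sketch: exceedance-frequency risk at monitoring points via Monte Carlo.
# Synthetic source terms are drawn from an assumed historical distribution;
# "transport" is a toy dilution-with-distance stand-in for the real
# integrated groundwater model. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n_realisations = 10_000
distances = np.array([50.0, 100.0, 250.0, 500.0])  # boreholes, metres from source

source = rng.lognormal(mean=np.log(20.0), sigma=1.0, size=n_realisations)  # mg/L
# toy attenuation: concentration decays exponentially with distance
conc = source[:, None] * np.exp(-distances[None, :] / 150.0)

threshold = 1.0  # user-defined concentration limit, mg/L
risk = (conc > threshold).mean(axis=0)  # exceedance frequency per borehole
for d, r in zip(distances, risk):
    print(f"borehole at {d:>5.0f} m: risk of exceeding {threshold} mg/L = {r:.2f}")
```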
Abstract:
Strategic sourcing has increased in importance in recent years, and now plays an important role in companies' planning. The current volatility in supply markets means companies face multiple challenges involving lock-in situations, supplier bankruptcies or supply security issues. In addition, their exposure can increase due to natural disasters and other disruptions, as witnessed recently in the form of bird flu, volcanic ash and tsunamis. Therefore, the primary focus of this study is risk management in the context of strategic sourcing. The study presents a literature review on sourcing covering the 15 years from 1998 to 2012, and considers 131 academic articles. The literature describes strategic sourcing as a strategic, holistic process for managing supplier relationships, with a long-term focus on adding value to the company and realising competitive advantage. Few studies have examined the real impact of risk and the status of risk management in strategic sourcing, evaluation across countries and industries has been limited, and the construction sector is particularly under-researched. The methodology is founded on a qualitative study of twenty cases from the construction and electronics manufacturing industries across Germany and the United Kingdom. While considering risk management in the context of strategic sourcing, the thesis takes into account six dimensions that cover trends in strategic sourcing, theoretical and practical sourcing models, risk management, supply and demand management, critical success factors and strategic supplier evaluation. The study contributes in several ways. First, recent trends are traced and future needs identified across the research dimensions of countries, industries and companies. Second, it evaluates critical success factors in contemporary strategic sourcing. Third, it explores the application of theoretical and practical sourcing models in terms of effectiveness and sustainability. Fourth, based on the case study findings, a risk-oriented strategic sourcing framework and a model for strategic sourcing are developed. These are based on the validation of contemporary requirements and a critical evaluation of the existing situation; they draw on the empirical findings and lead to a structured process for managing risk in strategic sourcing. The risk-oriented framework considers areas such as trends, corporate and sourcing strategy, critical success factors, strategic supplier selection criteria, risk assessment, strategy alignment and reporting. The proposed model highlights the essential dimensions of strategic sourcing and leads to a new definition of strategic sourcing supported by this empirical study.
Abstract:
Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH with or without T2D genetic testing. Measurements included weight, BMI, and fasting glucose at baseline and 12 months, and surveys of behavior and cognitive precursors (T2D risk perception and control over disease development) at baseline, 3, and 12 months. 391 subjects enrolled, of whom 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception, but was correlated with an increase in perception of "serious" risk for moderate (pKendall = 0.04) and average FHH risk subjects (pKendall = 0.01), though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on the cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk was able to alter perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.
Abstract:
AIMS: Our aims were to evaluate the distribution of troponin I concentrations in population cohorts across Europe, to characterize the association with cardiovascular outcomes, to determine the predictive value beyond the variables used in the ESC SCORE, to test a potentially clinically relevant cut-off value, and to retrospectively evaluate improved eligibility for statin therapy based on elevated troponin I concentrations.
METHODS AND RESULTS: Based on the Biomarkers for Cardiovascular Risk Assessment in Europe (BiomarCaRE) project, we analysed individual-level data from 10 prospective population-based studies including 74 738 participants. We investigated the value of adding troponin I levels to conventional risk factors for the prediction of cardiovascular disease by calculating measures of discrimination (C-index) and net reclassification improvement (NRI). We further tested the clinical implications of statin therapy based on troponin concentration in 12 956 individuals free of cardiovascular disease in the JUPITER study. Troponin I remained an independent predictor, with a hazard ratio of 1.37 for cardiovascular mortality, 1.23 for cardiovascular disease, and 1.24 for total mortality. The addition of troponin I information to a prognostic model for cardiovascular death constructed from ESC SCORE variables increased the C-index discrimination measure by 0.007 and yielded an NRI of 0.048, whereas its addition to prognostic models for cardiovascular disease and total mortality yielded smaller C-index and NRI increments. In individuals with troponin I above 6 ng/L, a concentration near the upper quintile in BiomarCaRE (5.9 ng/L) and JUPITER (5.8 ng/L), rosuvastatin therapy resulted in a higher absolute risk reduction than in individuals with troponin I below 6 ng/L, whereas the relative risk reduction was similar.
CONCLUSION: In individuals free of cardiovascular disease, the addition of troponin I to the variables of an established risk score improves the prediction of cardiovascular death and cardiovascular disease.
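The category-free (continuous) NRI used above compares predicted risks from the models with and without the new marker: NRI = [P(up | event) − P(down | event)] + [P(down | non-event) − P(up | non-event)], where "up" means the new model raises the prediction. A sketch on synthetic predictions (invented data, illustrating the formula only):

```python
# Sketch: category-free net reclassification improvement (NRI).
# "Up" means the new model (e.g. + troponin I) raises the predicted risk.
# Predictions are synthetic; the formula is the standard continuous NRI.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000
events = rng.random(n) < 0.1                    # 10% outcome rate
old = np.clip(rng.normal(0.10, 0.05, n), 0, 1)  # old-model predicted risks
# new model shifts events up and non-events down a little, on average
new = np.clip(old + rng.normal(0.02, 0.03, n) * np.where(events, 1, -1), 0, 1)

up = new > old  # ties are counted as "down"; negligible for continuous predictions
nri_events = up[events].mean() - (~up[events]).mean()
nri_nonevents = (~up[~events]).mean() - up[~events].mean()
print(f"continuous NRI = {nri_events + nri_nonevents:.3f}")
```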
Abstract:
This paper reviews the literature on construction risk modelling and assessment, as well as the real-world practice of risk assessment. The review yielded significant results, summarised as follows. There has been a major shift in risk perception from an estimation variance to a project attribute. Although the Probability–Impact risk model prevails, substantial efforts are being made to improve it, reflecting the increasing complexity of construction projects. The literature lacks a comprehensive assessment approach capable of capturing the impact of risk on different project objectives. Obtaining a realistic project risk level demands an effective mechanism for aggregating individual risk assessments. The various assessment tools suffer from low take-up; professionals typically rely on their experience instead. It is concluded that a simple analytical tool that uses risk cost as a common scale and utilises professional experience could be a viable option for closing the gap between the theory and practice of risk assessment.
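The "risk cost as a common scale" proposal follows directly from the Probability–Impact model: each risk's expected cost is probability × monetary impact, and the project risk level is their aggregate. A minimal sketch with invented register entries:

```python
# Sketch: aggregating a project risk register on a common risk-cost scale.
# Risk cost = probability * monetary impact (the Probability-Impact model);
# entries are invented for illustration.
risks = [
    {"name": "ground conditions", "prob": 0.30, "impact_gbp": 250_000},
    {"name": "steel price spike", "prob": 0.15, "impact_gbp": 400_000},
    {"name": "design change",     "prob": 0.50, "impact_gbp":  80_000},
]

for r in risks:
    r["risk_cost"] = r["prob"] * r["impact_gbp"]

total = sum(r["risk_cost"] for r in risks)
for r in sorted(risks, key=lambda r: -r["risk_cost"]):  # rank for mitigation priority
    print(f'{r["name"]:<18} expected cost £{r["risk_cost"]:>9,.0f}')
print(f'project risk level   £{total:>9,.0f}')
```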
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The Short Term Assessment of Risk and Treatability (START) is a structured judgement tool used to inform risk estimation for multiple adverse outcomes. In research, risk estimates outperform the tool's strength and vulnerability scales for violence prediction. Little is known about what its component parts contribute to the assignment of risk estimates, and how those estimates fare in the prediction of non-violent adverse outcomes compared with the structured components. START assessment and outcomes data from a secure mental health service (N=84) were collected. Binomial and multinomial regression analyses determined the contribution of selected elements of the START structured domain and recent adverse risk events to risk estimates and outcome prediction for violence, self-harm/suicidality, victimisation, and self-neglect. START vulnerabilities and lifetime history of violence predicted the violence risk estimate; self-harm and victimisation estimates were predicted only by corresponding recent adverse events. Recent adverse events uniquely predicted all corresponding outcomes, with the exception of self-neglect, which was predicted by the strength scale. Only for victimisation did the risk estimate outperform prediction based on the START components and recent adverse events. In the absence of recent corresponding risk behaviour, restrictions imposed on the basis of START-informed risk estimates could be unwarranted and may be unethical.
Abstract:
Protective factors are neglected in risk assessment in adult psychiatric and criminal justice populations. This review investigated the predictive efficacy of selected tools that assess protective factors. Five databases were searched using comprehensive terms for records up to June 2014, resulting in 17 studies (n = 2,198). Results were combined in a multilevel meta-analysis using the metafor package (Viechtbauer, Journal of Statistical Software, 2010, 36, 1) in R (R Core Team, R: A Language and Environment for Statistical Computing, Vienna, Austria: R Foundation for Statistical Computing, 2015). Prediction of outcomes was poor relative to a reference category of violent offending, with the exception of the prediction of discharge from secure units. There were no significant differences between the predictive efficacy of risk scales, protective scales, and summary judgments. Protective factor assessment may be clinically useful, but more development is required. Claims that the use of these tools is therapeutically beneficial require testing.
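For readers unfamiliar with the pooling step such packages perform, the sketch below implements the simpler DerSimonian-Laird random-effects model (the review itself fitted a multilevel model, which additionally handles dependent effect sizes); effect sizes and variances are invented:

```python
# Sketch: DerSimonian-Laird random-effects meta-analytic pooling, the kind of
# computation R's metafor performs. Effect sizes and variances are invented.
import numpy as np

yi = np.array([0.65, 0.58, 0.72, 0.61, 0.55])        # per-study effect sizes
vi = np.array([0.010, 0.020, 0.015, 0.008, 0.025])   # within-study variances

wi = 1 / vi                                  # fixed-effect weights
y_fe = (wi * yi).sum() / wi.sum()            # fixed-effect pooled estimate
q = (wi * (yi - y_fe) ** 2).sum()            # Cochran's Q heterogeneity statistic
df = len(yi) - 1
c = wi.sum() - (wi ** 2).sum() / wi.sum()
tau2 = max(0.0, (q - df) / c)                # between-study variance (DL estimator)

wi_re = 1 / (vi + tau2)                      # random-effects weights
y_re = (wi_re * yi).sum() / wi_re.sum()
se = np.sqrt(1 / wi_re.sum())
print(f"pooled effect = {y_re:.3f} "
      f"(95% CI {y_re - 1.96 * se:.3f} to {y_re + 1.96 * se:.3f}), tau^2 = {tau2:.4f}")
```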
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in their diversity, but are integral to the establishment of classes of risk exposure and to the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for the risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best-practice knowledge, and lastly to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interviews and surveys, extensive content analysis, practical case studies, and iterative software and ontology development. We build on a robust foundation: the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively, these imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and their mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, and how taking risk as the focus of analysis safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements.
Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision-making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
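The deconstruction of risks into atomic, related units with delegated responsibility, which PORRO formalises as an ontology, can be illustrated as a minimal risk-register data model; the field names below are our own and purely illustrative, not PORRO's actual vocabulary:

```python
# Sketch: an atomic risk entry of the kind a preservation risk register
# (or an ontology such as PORRO) relates to repository and object properties.
# Field names are illustrative, not PORRO's actual vocabulary.
from dataclasses import dataclass, field

@dataclass
class Risk:
    identifier: str
    description: str
    likelihood: float          # 0..1, probability of the negative outcome
    impact: float              # 0..1, severity if it occurs
    owner: str                 # delegated responsibility for resolution
    mitigations: list[str] = field(default_factory=list)

    @property
    def exposure(self) -> float:
        """Simple likelihood x impact score for prioritisation."""
        return self.likelihood * self.impact

r = Risk("R-042", "Obsolete rendering environment for CAD objects",
         likelihood=0.4, impact=0.9, owner="Collections team",
         mitigations=["migrate to an open format", "maintain emulation stack"])
print(f"{r.identifier}: exposure = {r.exposure:.2f}")
```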