25 results for Downside Risk Management
Abstract:
OBJECTIVES: To report briefly the conclusions of a conference on the next 10 years in the management of peripheral artery disease (PAD). DESIGN OF THE CONFERENCE: International participation, invited presentations and open discussion were based on the following issues: Why is PAD under-recognised? The health economic impact of PAD; funding of PAD research; changes in treatment options; aspects of clinical trials and regulatory views; and the role of guidelines. RESULTS AND CONCLUSIONS: A relative lack of knowledge about cardiovascular risk and the optimal management of PAD patients exists not only among the public but also in parts of the health-care system. Specialists are required to act to improve information. More specific PAD research is needed for risk management and to apply the best possible evaluation of evidence for treatment strategies. Better funding strategies are required, based, for example, on public/private initiatives. The proportion of endovascular treatments is steadily increasing, more frequently based on observational studies than on randomised controlled trials. The role of guidelines is therefore important in guiding the profession in the assessment of the most relevant treatment.
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. Such a biopharmaceutical process FMEA has a wide range of applications. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
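The ranking step described in this abstract is commonly performed by combining the three ratings into a risk priority number (RPN). The abstract does not reproduce the proposed rating table, so the sketch below simply assumes the conventional RPN = severity × occurrence × detectability product on 1-to-10 scales; the process parameters and ratings are hypothetical and for illustration only.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One failure mode of a process parameter, rated on 1-to-10 scales."""
    parameter: str      # hypothetical process parameter name
    severity: int       # impact on product quality / patient safety (1-10)
    occurrence: int     # likelihood of the failure occurring (1-10)
    detectability: int  # 10 = hardest to detect, 1 = easiest to detect

    @property
    def rpn(self) -> int:
        # Conventional risk priority number: S x O x D (range 1-1000)
        return self.severity * self.occurrence * self.detectability

# Hypothetical ratings, not taken from the cited guideline
modes = [
    FailureMode("bioreactor pH excursion", severity=8, occurrence=4, detectability=3),
    FailureMode("chromatography load overshoot", severity=6, occurrence=3, detectability=5),
    FailureMode("buffer mix-up", severity=9, occurrence=2, detectability=2),
]

# Rank failure modes by descending RPN to prioritise process
# development, characterization, or validation effort.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.parameter:35s} RPN = {m.rpn}")
```

In practice the cut-off for "important" parameters and any extra rules (e.g. escalating any failure mode with maximum severity regardless of RPN) would follow the guideline's own rating table rather than this simplified product.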
Abstract:
OBJECTIVES: To validate the Probability of Repeated Admission (Pra) questionnaire, a widely used self-administered tool for predicting future healthcare use in older persons, in three European healthcare systems. DESIGN: Prospective study with 1-year follow-up. SETTING: Hamburg, Germany; London, United Kingdom; Canton of Solothurn, Switzerland. PARTICIPANTS: Nine thousand seven hundred thirteen independently living community-dwelling people aged 65 and older. MEASUREMENTS: Self-administered eight-item Pra questionnaire at baseline. Self-reported number of hospital admissions and physician visits during 1 year of follow-up. RESULTS: In the combined sample, areas under the receiver operating characteristic curves (AUCs) were 0.64 (95% confidence interval (CI)=0.62-0.66) for the prediction of one or more hospital admissions and 0.68 (95% CI=0.66-0.69) for the prediction of more than six physician visits during the following year. AUCs were similar between sites. In comparison, prediction models based on a person's age and sex alone exhibited poor predictive validity (AUC
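For readers unfamiliar with the AUC figures quoted above, the short sketch below shows how an area under the receiver operating characteristic curve is typically computed for a risk score against a binary outcome such as "one or more hospital admissions". The outcome labels and scores are invented for illustration and are not data from the Pra study.

```python
# Minimal illustration of computing an AUC for a risk score
# against a binary outcome; values are made up, not study data.
from sklearn.metrics import roc_auc_score

# 1 = at least one hospital admission during follow-up, 0 = none
outcome    = [0, 0, 1, 0, 1, 1, 0, 1, 0, 0]
# Hypothetical predicted probabilities from a questionnaire-based model
risk_score = [0.10, 0.25, 0.40, 0.15, 0.70, 0.55, 0.30, 0.35, 0.20, 0.45]

auc = roc_auc_score(outcome, risk_score)
print(f"AUC = {auc:.2f}")  # 0.5 = chance level, 1.0 = perfect discrimination
```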
Abstract:
This paper surveys the currency risk management practices of Swiss industrial corporations. We find that these industrial firms do not quantify their currency risk exposure, and we investigate possible reasons. One possibility is that firms do not think they need to know because they use on-balance-sheet instruments to protect themselves before and after currency rates reach troublesome levels. This is puzzling because a rough estimate of at least cash flow exposure is not a prohibitive task and could be helpful. It is also puzzling that firms use currency derivatives to hedge/insure individual short-term transactions, without apparently trying to estimate aggregate transaction exposure.
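The "rough estimate" the authors refer to is not specified in the abstract; one simple version, sketched below under that assumption, nets expected foreign-currency receivables and payables by currency and translates the result into the home currency to obtain an aggregate transaction exposure. All cash-flow figures and exchange rates are hypothetical.

```python
from collections import defaultdict

# Hypothetical expected cash flows over the hedging horizon:
# (currency, amount), positive = receivable, negative = payable
cash_flows = [
    ("USD", +2_500_000), ("USD", -900_000),
    ("EUR", +1_200_000), ("EUR", -1_800_000),
    ("JPY", +150_000_000),
]

# Hypothetical spot rates into the home currency (CHF per unit of foreign currency)
spot_chf = {"USD": 0.88, "EUR": 0.95, "JPY": 0.0059}

# Net exposure per currency, then translate into CHF
net = defaultdict(float)
for ccy, amount in cash_flows:
    net[ccy] += amount

for ccy, amount in net.items():
    print(f"{ccy}: net {amount:+,.0f} = {amount * spot_chf[ccy]:+,.0f} CHF")
```

Even such a crude aggregation reveals natural offsets between receivables and payables in the same currency, which is exactly the information lost when individual short-term transactions are hedged one by one.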
Abstract:
The paper deals with the development of a general, integrative, and holistic framework to systematize and assess vulnerability, risk and adaptation. The framework is a thinking tool meant as a heuristic that outlines key factors and different dimensions that need to be addressed when assessing vulnerability in the context of natural hazards and climate change. The approach underlines that the key factors of such a common framework are the exposure of a society or system to a hazard or stressor, the susceptibility of the system or community exposed, and its resilience and adaptive capacity. Additionally, it underlines the necessity to consider key factors and multiple thematic dimensions when assessing vulnerability in the context of natural and socio-natural hazards. In this regard, it shows key linkages between the different concepts used within disaster risk management (DRM) and climate change adaptation (CCA) research and helps to illustrate the strong relationships between them. The framework is also a tool for communicating complexity, and it stresses the need for societal change in order to reduce risk and to promote adaptation. With regard to this, the policy relevance of the framework and first results of its application are outlined. Overall, the framework presented enhances the discussion on how to frame and link vulnerability, disaster risk, risk management and adaptation concepts.
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to the changing, complex ecological and socio-economic causes of land degradation. The WOCAT tools are designed to reflect and capture this capacity of SLM. In order to take account of new challenges and meet the emerging needs of WOCAT users, the tools are continually being developed and adapted further. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to the UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national level, for example advisory services and implementation projects.
Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
OBJECTIVE: Hunger strikers resuming nutritional intake may develop a life-threatening refeeding syndrome (RFS). Consequently, hunger strikers represent a core challenge for medical staff. The objective of the study was to test the effectiveness and safety of evidence-based recommendations for the prevention and management of RFS during the refeeding phase. METHODS: This was a retrospective, observational analysis of 37 consecutive, unselected cases of prisoners on hunger strike during a 5-year period. The sample consisted of 37 cases representing 33 individual patients. RESULTS: In seven cases (18.9%) the hunger strike was continued during the hospital stay; in 16 episodes (43.2%) cessation of the hunger strike occurred immediately after admission to the security ward, and in 14 episodes (37.9%) during the hospital stay. In the refed cases (n = 30), nutritional replenishment occurred orally, and in 25 (83.3%) micronutrient substitution was given based on the recommendations. Gradual refeeding with fluid restriction occurred over 10 days. Uncomplicated dyselectrolytemia was documented in 12 cases (40%) within the refeeding phase. One case (3.3%) presented with bilateral ankle edema as a clinical manifestation of moderate RFS. Intensive medical treatment was not necessary and none of the patients died. Seven episodes of continued hunger strike were observed during the entire hospital stay without medical complications. CONCLUSIONS: Our data suggested that the severity and rate of medical complications during the refeeding phase can be kept to a minimum in a hunger strike population. This study supported the use of the recommendations to optimize risk management and to improve treatment quality and patient safety in this vulnerable population.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. The long-term fatality risk due to snow avalanches, as well as the short-term fatality risk, was compared to the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analyses of short-term avalanche-induced fatality risk revealed risk peaks that were 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the sum of precipitation within three days is a simplified model; thus, further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may contribute as a conceptual step towards risk-based decision-making in risk management.
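The study's exact model is not reproduced in the abstract; the sketch below only illustrates the general structure of such a calculation under simple assumptions: an avalanche occurrence probability derived from the three-day new-snow sum, the expected number of exposed road users derived from traffic density, and a conditional fatality probability. All functions, parameter values, and thresholds are hypothetical.

```python
import math

def avalanche_probability(new_snow_3d_cm: float) -> float:
    """Hypothetical logistic relation between the 3-day new-snow sum (cm)
    and the probability that an avalanche reaches the road segment."""
    return 1.0 / (1.0 + math.exp(-(new_snow_3d_cm - 80.0) / 15.0))

def persons_exposed(vehicles_per_hour: float, occupancy: float,
                    segment_km: float, speed_kmh: float) -> float:
    """Expected number of persons inside the endangered road segment
    at any instant, derived from traffic density."""
    vehicles_on_segment = vehicles_per_hour * segment_km / speed_kmh
    return vehicles_on_segment * occupancy

# Hypothetical scenario: heavy snowfall, moderate traffic
p_avalanche = avalanche_probability(new_snow_3d_cm=95.0)           # per day, assumed
exposed = persons_exposed(vehicles_per_hour=300, occupancy=1.6,
                          segment_km=0.4, speed_kmh=60)
p_death_given_hit = 0.2  # assumed conditional fatality probability

daily_fatality_risk = p_avalanche * exposed * p_death_given_hit
print(f"P(avalanche) = {p_avalanche:.2f}, exposed persons = {exposed:.2f}")
print(f"Expected fatalities per day in this scenario: {daily_fatality_risk:.3f}")
```

The key point mirrored from the abstract is that the same hazard level produces very different risk depending on the instantaneous traffic density, which is why the short-term risk peaks exceed the long-term mean so strongly.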
Abstract:
The Chakhama Valley, a remote area in Pakistan-administered Kashmir, was badly damaged by the magnitude 7.6 earthquake that struck India and Pakistan on 8 October 2005. More than 5% of the population lost their lives, and about 90% of the existing housing was irreparably damaged or completely destroyed. In early 2006, the Aga Khan Development Network (AKDN) initiated a multisector, community-driven reconstruction program in the Chakhama Valley on the premise that the scale of the disaster required a response addressing all aspects of people's lives. One important aspect was the promotion of disaster risk management for sustainable recovery in a safe environment. Accordingly, prevailing hazards (rockfalls, landslides, and debris flows, in addition to earthquake hazards) and existing risks were thoroughly assessed, and the information was incorporated into the main planning processes. Hazard maps, detailed site investigations, and proposals for precautionary measures assisted engineers in supporting the reconstruction of private homes in safe locations to render investments disaster-resilient. The information was also used for community-based land use decisions and for disaster mitigation and preparedness. The work revealed three main problems: (1) thorough assessment of hazards and incorporation of this assessment into planning processes is time-consuming and often poorly understood by the population directly affected, but it pays off in the long run; (2) relocating people out of dangerous places is a highly sensitive issue that requires the support of clear and forceful government policies; and (3) the involvement of local communities is essential for the success of mitigation and preparedness.
Abstract:
Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When CHR instruments are used for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly, practical guide summarising the key concepts needed to support clinicians in probabilistic prognostic reasoning in the CHR state. We review the risk or cumulative incidence of psychosis, the person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes’ theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may influence risk management, especially as regards initiating or withholding treatment.
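As a concrete illustration of the kind of probabilistic reasoning such a guide covers, the sketch below converts sensitivity, specificity, and the pre-test risk (prevalence) of transition to psychosis into positive and negative predictive values via Bayes’ theorem. The numbers are invented for illustration and are not taken from CHR studies.

```python
def predictive_values(sensitivity: float, specificity: float,
                      prevalence: float) -> tuple[float, float]:
    """Positive and negative predictive values via Bayes' theorem."""
    tp = sensitivity * prevalence                # true positives
    fp = (1 - specificity) * (1 - prevalence)    # false positives
    fn = (1 - sensitivity) * prevalence          # false negatives
    tn = specificity * (1 - prevalence)          # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical CHR instrument: sensitivity 0.80, specificity 0.60,
# pre-test probability of transition 0.20
ppv, npv = predictive_values(0.80, 0.60, 0.20)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")

# The same instrument applied where the pre-test risk is only 0.05
ppv_low, npv_low = predictive_values(0.80, 0.60, 0.05)
print(f"At 5% pre-test risk: PPV = {ppv_low:.2f}, NPV = {npv_low:.2f}")
```

The comparison between the two scenarios shows why predictive values, unlike sensitivity and specificity, depend strongly on the pre-test risk of the population in which the instrument is applied.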