33 results for Hazard Risk Management
Abstract:
Risk management today has moved from being the topic of top-level conferences and media discussions to being a permanent item on the board and top management agenda. Several new directives and regulations in Switzerland, Germany and the EU make it obligatory for firms to have a risk management strategy and to transparently disclose the risk management process to their stakeholders. Shareholders, insurance providers, banks, media, analysts, employees, suppliers and other stakeholders expect board members to be proactive in knowing the critical risks facing their organization and to provide them with reasonable assurance regarding the management of those risks. In this environment, however, the lack of standards and training opportunities makes this task difficult for board members. With the help of real-life examples, analysis of drivers, interpretation of the Swiss legal requirements, and information based on international benchmarks, this book tries to reach out to the forward-looking leaders of today's businesses. The authors have brought together their years of scientific and practical experience in risk management, Swiss law and board membership to provide board members with practical solutions in risk management. The hope is that this book will dispel the fear of risk management from the minds of company leadership and help them make risk-savvy decisions in their quest to achieve their strategic objectives.
Abstract:
In natural hazard research, risk is defined as a function of (1) the probability of occurrence of a hazardous process and (2) the assessment of the related extent of damage, defined by the value of the exposed elements at risk and their physical vulnerability. Until now, various studies have been undertaken to determine vulnerability values for objects exposed to geomorphic hazards such as mountain torrents. Yet many studies only provide rough estimates of vulnerability values based on proxies for process intensities. Moreover, the vulnerability functions proposed in the literature show a wide range, in particular with respect to medium and high process magnitudes. In our study, we compare vulnerability functions for torrent processes derived from studies in test sites located in the Austrian Alps and in Taiwan. Based on this comparison, we identify needs for future research in order to enhance mountain hazard risk management, with a particular focus on the question of vulnerability at the catchment scale.
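As a worked illustration of the risk relation described above, the following minimal Python sketch assumes the common multiplicative form in which risk equals the probability of occurrence of the hazardous process times the value of the exposed element times its physical vulnerability; the function name and the numbers are purely illustrative and are not taken from the study.

```python
def natural_hazard_risk(p_occurrence: float, value_at_risk: float, vulnerability: float) -> float:
    """Expected loss for a single element at risk: occurrence probability of the
    hazardous process times the exposed value times the physical vulnerability
    (expressed as a degree of loss between 0 and 1)."""
    if not 0.0 <= vulnerability <= 1.0:
        raise ValueError("vulnerability is a degree of loss between 0 and 1")
    return p_occurrence * value_at_risk * vulnerability

# Illustrative example: a building worth 500,000 exposed to a torrent process
# with an annual occurrence probability of 0.01 and a vulnerability of 0.3
# at the corresponding process intensity.
expected_annual_loss = natural_hazard_risk(0.01, 500_000, 0.3)  # 1500.0
```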
Abstract:
Critical limb ischaemia (CLI) is a particularly severe manifestation of lower limb atherosclerosis, posing a major threat to both the limb and the life of affected patients. Besides arterial revascularisation, risk-factor modification and administration of antiplatelet therapy are major goals in the treatment of CLI patients. Key elements of cardiovascular risk management are smoking cessation and treatment of hyperlipidaemia with dietary modification or statins. Moreover, arterial hypertension and diabetes mellitus should be adequately treated. In CLI patients not suitable for arterial revascularisation, or subsequent to unsuccessful revascularisation, parenteral prostanoids may be considered. CLI patients undergoing surgical revascularisation should be treated with beta blockers. At present, neither gene nor stem-cell therapy can be recommended outside clinical trials. Of note, walking exercise is contraindicated in CLI patients because of the risk of worsening pre-existing ischaemic wounds or causing new ones. CLI patients are often medically frail and exhibit significant comorbidities. Co-existing coronary heart disease as well as carotid and renal artery disease should be managed according to current guidelines. Given the above-mentioned treatment goals, interdisciplinary treatment approaches for CLI patients are warranted. The aim of the present manuscript is to discuss the currently existing evidence for both the management of cardiovascular risk factors and the treatment of co-existing disease, and to deduce specific treatment recommendations.
Abstract:
The paper deals with the development of a general as well as integrative and holistic framework to systematize and assess vulnerability, risk and adaptation. The framework is a thinking tool, meant as a heuristic that outlines key factors and different dimensions that need to be addressed when assessing vulnerability in the context of natural hazards and climate change. The approach underlines that the key factors of such a common framework are the exposure of a society or system to a hazard or stressor, the susceptibility of the system or community exposed, and its resilience and adaptive capacity. It also underlines the necessity to consider these key factors and multiple thematic dimensions when assessing vulnerability in the context of natural and socio-natural hazards. In this regard, it shows key linkages between the different concepts used within disaster risk management (DRM) and climate change adaptation (CCA) research and illustrates the strong relationships between them. The framework is also a tool for communicating complexity, and it stresses the need for societal change in order to reduce risk and to promote adaptation. In this respect, the policy relevance of the framework and first results of its application are outlined. Overall, the framework presented enhances the discussion on how to frame and link the concepts of vulnerability, disaster risk, risk management and adaptation.
Abstract:
The potential effects of climate change on natural risks are widely discussed. However, formulating strategies for adapting risk management practice to climate change requires knowledge of the related risks to people and economic assets. The main goals of this work were (1) the development of a method for analysing and comparing risks induced by different natural hazard types, (2) the identification of the most relevant natural hazard processes and related damages, (3) the development of an information system for monitoring the temporal development of natural hazard risk, and (4) the visualisation of the resulting information for the wider public. A comparative exposure analysis provides the basis for pointing out the hot spots of natural hazard risk in the province of Carinthia, Austria. An analysis of flood risk in all municipalities provides the basis for setting priorities in the planning of flood protection measures. Together, these methods form the basis for a monitoring system that periodically observes the temporal development of natural hazard risks. This makes it possible, firstly, to identify situations in which natural hazard risks are rising and, secondly, to differentiate between the factors most responsible for the increase. In this way, the factors that most strongly influence natural hazard risks can be made evident.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yields a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analysis of the short-term avalanche-induced fatality risk, however, revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with low traffic density and from high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the sum of precipitation within three days is a simplification, so further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may serve as a conceptual step towards risk-based decision-making in avalanche risk management.
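As a rough illustration of the short-term risk calculation sketched in this abstract, the following Python snippet combines an avalanche release probability for a given new-snow scenario with the expected number of persons present on the endangered road segment, derived from traffic density; the function, parameter names and values are hypothetical and not taken from the study.

```python
def avalanche_fatality_risk(p_release: float, traffic_veh_per_h: float,
                            segment_length_km: float, speed_km_per_h: float,
                            occupants_per_vehicle: float, p_death_if_hit: float) -> float:
    """Short-term fatality risk on a road segment: release probability of the
    scenario times the expected number of persons inside the endangered
    segment times the lethality given a hit."""
    # Expected number of vehicles inside the segment at any instant.
    vehicles_in_segment = traffic_veh_per_h * segment_length_km / speed_km_per_h
    persons_at_risk = vehicles_in_segment * occupants_per_vehicle
    return p_release * persons_at_risk * p_death_if_hit

# Illustrative scenario: release probability 0.05 for a heavy new-snow class,
# 200 vehicles/h on a 0.5 km endangered segment at 60 km/h,
# 1.6 occupants per vehicle, lethality 0.2 if a vehicle is hit.
risk = avalanche_fatality_risk(0.05, 200, 0.5, 60, 1.6, 0.2)
```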
Abstract:
The Chakhama Valley, a remote area in Pakistan-administered Kashmir, was badly damaged by the 7.6-magnitude earthquake that struck India and Pakistan on 8 October 2005. More than 5% of the population lost their lives, and about 90% of the existing housing was irreparably damaged or completely destroyed. In early 2006, the Aga Khan Development Network (AKDN) initiated a multisector, community-driven reconstruction program in the Chakhama Valley on the premise that the scale of the disaster required a response that would address all aspects of people's lives. One important aspect was the promotion of disaster risk management for sustainable recovery in a safe environment. Accordingly, prevailing hazards (rockfalls, landslides, and debris flows, in addition to earthquake hazards) and existing risks were thoroughly assessed, and the resulting information was incorporated into the main planning processes. Hazard maps, detailed site investigations, and proposals for precautionary measures assisted engineers in supporting the reconstruction of private homes in safe locations and in rendering investments disaster resilient. The information was also used for community-based land use decisions and for disaster mitigation and preparedness. The work revealed three main lessons: (1) thorough assessment of hazards and its incorporation into planning processes is time-consuming and often poorly understood by the population directly affected, but it pays off in the long run; (2) relocating people out of dangerous places is a highly sensitive issue that requires the support of clear and forceful government policies; and (3) the involvement of local communities is essential for the success of mitigation and preparedness.
Abstract:
OBJECTIVES: To briefly report the conclusions of a conference on the next 10 years in the management of peripheral artery disease (PAD). DESIGN OF THE CONFERENCE: International participation; invited presentations and open discussion were based on the following issues: why PAD is under-recognised; the health economic impact of PAD; funding of PAD research; changes in treatment options; aspects of clinical trials and regulatory views; and the role of guidelines. RESULTS AND CONCLUSIONS: A relative lack of knowledge about cardiovascular risk and the optimal management of PAD patients exists not only among the public but also in parts of the health-care system. Specialists are required to act to improve information. More specific PAD research is needed for risk management and to allow the best possible evaluation of evidence for treatment strategies. Better funding strategies are required, based, for example, on public/private initiatives. The proportion of endovascular treatments is steadily increasing, more frequently based on observational studies than on randomised controlled trials. The role of guidelines is therefore important in guiding the profession in the assessment of the most relevant treatment.
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. The potential applications of such a biopharmaceutical process FMEA are wide-ranging. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been designed especially for typical biopharmaceutical processes. With the help of this guideline, different parts of the manufacturing process can be ranked according to their potential risks, which can help pharmaceutical companies identify aspects with high potential risk and react accordingly to improve the safety of medicines.
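The abstract mentions ranking process parameters by their risk ratings; a common way to combine the three 1-to-10 ratings is the risk priority number (RPN), i.e. the product of occurrence, severity, and detectability. The Python sketch below assumes that convention; the failure modes, class names, and ratings are purely illustrative and not taken from the guideline itself.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One process failure mode rated on the 1-to-10 FMEA scales."""
    name: str
    occurrence: int     # 1 = very unlikely       ... 10 = almost certain
    severity: int       # 1 = negligible impact   ... 10 = critical impact
    detectability: int  # 1 = always detected     ... 10 = practically undetectable

    def rpn(self) -> int:
        # Risk priority number: product of the three ratings (common FMEA convention).
        return self.occurrence * self.severity * self.detectability

# Hypothetical failure modes for a cell-culture step; ratings are illustrative only.
modes = [
    FailureMode("pH excursion in bioreactor", occurrence=4, severity=7, detectability=3),
    FailureMode("Filter integrity failure", occurrence=2, severity=9, detectability=5),
]
for m in sorted(modes, key=lambda m: m.rpn(), reverse=True):
    print(f"{m.name}: RPN = {m.rpn()}")
```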
Abstract:
OBJECTIVES: To validate the Probability of Repeated Admission (Pra) questionnaire, a widely used self-administered tool for predicting future healthcare use in older persons, in three European healthcare systems. DESIGN: Prospective study with 1-year follow-up. SETTING: Hamburg, Germany; London, United Kingdom; Canton of Solothurn, Switzerland. PARTICIPANTS: Nine thousand seven hundred thirteen independently living community-dwelling people aged 65 and older. MEASUREMENTS: Self-administered eight-item Pra questionnaire at baseline. Self-reported number of hospital admissions and physician visits during 1 year of follow-up. RESULTS: In the combined sample, areas under the receiver operating characteristic curves (AUCs) were 0.64 (95% confidence interval (CI)=0.62-0.66) for the prediction of one or more hospital admissions and 0.68 (95% CI=0.66-0.69) for the prediction of more than six physician visits during the following year. AUCs were similar between sites. In comparison, prediction models based on a person's age and sex alone exhibited poor predictive validity (AUC
Abstract:
To compare the prediction of hip fracture risk by several quantitative bone ultrasound devices (QUS), 7062 Swiss women ≥70 years of age were measured with three QUS devices (two of the heel, one of the phalanges). Both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS was not. INTRODUCTION: As the number of hip fractures is expected to increase during the next decades, it is important to develop strategies to detect subjects at risk. Quantitative bone ultrasound (QUS), a transportable method free of ionizing radiation, could be interesting for this purpose. MATERIALS AND METHODS: The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk (SEMOF) study is a multicenter cohort study that compared three QUS devices for the assessment of hip fracture risk in a sample of 7609 elderly ambulatory women ≥70 years of age. Two QUS devices measured the heel (Achilles+; GE-Lunar and Sahara; Hologic), and one measured the phalanges (DBM Sonic 1200; IGEA). Cox proportional hazards regression was used to estimate the hazard of the first hip fracture, adjusted for age, BMI, and center, and the areas under the ROC curves were calculated to compare the devices and their parameters. RESULTS: Of the 7609 women included in the study, 7062 women aged 75.2 ± 3.1 (SD) years were prospectively followed for 2.9 ± 0.8 years. Eighty women reported a hip fracture. A decrease of 1 SD in the QUS variables corresponded to an increase in hip fracture risk of 2.3 (95% CI, 1.7, 3.1) to 2.6 (95% CI, 1.9, 3.4) for the three variables of the Achilles+ and of 2.2 (95% CI, 1.7, 3.0) to 2.4 (95% CI, 1.8, 3.2) for the three variables of the Sahara. Risk gradients did not differ significantly among the variables of the two heel QUS devices. On the other hand, the phalanges QUS device (DBM Sonic 1200) was not predictive of hip fracture risk, with an adjusted hazard ratio of 1.2 (95% CI, 0.9, 1.5), even after reanalysis of the digitalized data and using different cut-off levels (1700 or 1570 m/s). CONCLUSIONS: In this population of elderly women, both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS device was not.
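As an illustration of the Cox proportional hazards approach described in the methods, the following Python sketch (using the lifelines package) estimates the hazard ratio per 1-SD decrease of a QUS parameter, adjusted for age and BMI; the data frame, column names, and values are hypothetical and do not reproduce the SEMOF analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "followup_years": rng.uniform(0.5, 4.0, n),   # observed follow-up time
    "hip_fracture":   rng.binomial(1, 0.02, n),   # event indicator (1 = fracture)
    "qus_stiffness":  rng.normal(75, 12, n),      # hypothetical QUS parameter
    "age":            rng.normal(75, 3, n),
    "bmi":            rng.normal(26, 4, n),
})

# Express the QUS parameter as a negative z-score so the fitted hazard ratio
# refers to a decrease of one standard deviation.
df["qus_neg_z"] = -(df["qus_stiffness"] - df["qus_stiffness"].mean()) / df["qus_stiffness"].std()
df = df.drop(columns="qus_stiffness")

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="hip_fracture")
cph.print_summary()  # exp(coef) for qus_neg_z is the hazard ratio per 1-SD decrease
```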
Abstract:
This paper surveys the currency risk management practices of Swiss industrial corporations. We find that industrial firms do not quantify their currency risk exposure, and we investigate possible reasons. One possibility is that firms do not think they need to know their exposure because they use on-balance-sheet instruments to protect themselves before and after currency rates reach troublesome levels. This is puzzling, because a rough estimate of at least cash flow exposure is not a prohibitive task and could be helpful. It is also puzzling that firms use currency derivatives to hedge or insure individual short-term transactions without apparently trying to estimate their aggregate transaction exposure.
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to the changing and complex ecological and socio-economic causes of land degradation. The WOCAT tools are designed to reflect and capture this capacity of SLM. In order to take account of new challenges and meet the emerging needs of WOCAT users, the tools are continually developed further and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to the UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national level, for example, advisory services and implementation projects. Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
One of the main problems of flood hazard assessment in ungauged or poorly gauged basins is the lack of runoff data. In an attempt to overcome this problem, we have combined archival records, dendrogeomorphic time series and instrumental data (daily rainfall and discharge) from four ungauged and poorly gauged mountain basins in Central Spain with the aim of reconstructing and compiling information on 41 flash flood events since the end of the 19th century. Estimation of historical discharges and the incorporation of uncertainty into the at-site and regional flood frequency analyses were performed with an empirical rainfall–runoff assessment as well as stochastic and Bayesian Markov chain Monte Carlo (MCMC) approaches. Results for each of the ungauged basins include flood frequency, severity, seasonality and triggers (synoptic meteorological situations). The reconstructed data series clearly demonstrate how uncertainty can be reduced by including historical information, but they also point to the considerable influence of the chosen approach on quantile estimation. This uncertainty should be taken into account when these data are used for flood risk management.
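As a highly simplified illustration of at-site flood frequency analysis with historical information (not the stochastic or Bayesian MCMC procedures used in the study), the following Python sketch pools systematic annual maxima with reconstructed historical peak discharges, fits a Gumbel distribution, and estimates a return-period quantile; all discharge values are hypothetical, and a rigorous treatment would model the historical data as censored observations.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum discharges (m^3/s) from the gauged period.
systematic_peaks = np.array([42., 55., 38., 61., 49., 73., 58., 44., 66., 51.])
# Hypothetical peak discharges reconstructed from archives and dendrogeomorphic evidence.
historical_peaks = np.array([120., 95., 140.])

# Naive pooling of both samples (a proper analysis would treat historical
# floods as threshold-exceedance / censored data).
peaks = np.concatenate([systematic_peaks, historical_peaks])
loc, scale = stats.gumbel_r.fit(peaks)

return_period = 100  # years
q100 = stats.gumbel_r.ppf(1 - 1 / return_period, loc=loc, scale=scale)
print(f"Estimated {return_period}-year flood: {q100:.0f} m^3/s")
```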