801 results for risk assessment scale
Abstract:
Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was used innovatively to address the problem of insufficient information by enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk impact on the various project objectives, and the Evidential Reasoning algorithm is suggested as a novel alternative for aggregating individual assessments. A spreadsheet-based decision support system (DSS) was devised to facilitate the proposed approach. Four case studies were conducted to examine the approach's viability. Senior managers in four British construction companies tried the DSS and gave very promising feedback. The paper concludes that the proposed methodology may contribute to bridging the gap between the theory and practice of construction risk assessment.
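The Evidential Reasoning algorithm proposed in the paper builds on DST's core aggregation operator, Dempster's rule of combination. As a rough illustration only (this is the classic rule, not the paper's ER algorithm, and the mass values are invented), the sketch below combines two analysts' incomplete assessments; mass assigned to the whole frame expresses ignorance, which is how DST admits incomplete information:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over a common frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                  # mass assigned to disjoint sets
    k = 1.0 - conflict                         # normalisation constant
    return {s: v / k for s, v in combined.items()}

# Two hypothetical analysts assess a risk's impact; the second leaves 20%
# of belief uncommitted (assigned to the whole frame): an incomplete assessment.
LOW, HIGH = frozenset({"low"}), frozenset({"high"})
FRAME = LOW | HIGH
m1 = {LOW: 0.6, FRAME: 0.4}
m2 = {LOW: 0.3, HIGH: 0.5, FRAME: 0.2}
m = combine(m1, m2)
```

The combined masses can then be converted to belief and plausibility bounds for each impact level, giving an interval rather than a point estimate.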
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
An increasingly older population will most likely lead to greater demands on the health care system, as older age is associated with an increased risk of acute and chronic conditions. The number of diseases or disabilities is not the only marker of the amount of health care utilized, as persons may seek hospitalization without a disease or illness that requires hospital care. Hospitalization may pose a severe risk to older persons, as exposure to the hospital environment may lead to increased risks of iatrogenic disorders, confusion, falls and nosocomial infections, i.e., disorders that may involve unnecessary suffering and lead to serious consequences. Aims: The overall aim of this thesis was to describe and explore individual trajectories of cognitive development in relation to hospitalization and risk factors for hospitalization among older persons living in different accommodations in Sweden, and to explore older persons' reasons for being transferred to a hospital. Methods: The study designs were longitudinal, prospective and descriptive, and both quantitative and qualitative methods were used. Specifically, latent growth curve modelling was used to assess the association of cognitive development with hospitalization. The Cox proportional hazards regression model was used to analyse factors associated with hospitalization risk over time. In addition, an explorative descriptive design was used to explore how home health care patients experienced and perceived their decision to seek hospital care. Results: The most common reasons for hospitalization were cardiovascular diseases, which caused more than one-quarter of first hospitalizations among the persons living in ordinary housing and nursing home residents (NHRs). The persons who had been hospitalized had a lower mean level of cognitive performance in general cognition, verbal, spatial/fluid, memory and processing speed abilities compared to those who had not been hospitalized.
Significantly steeper declines in general cognition, spatial/fluid and processing speed abilities were observed among the persons who had been hospitalized. Cox proportional hazards regression analysis showed that the number of diseases, number of drugs used, having experienced a fall and being assessed as malnourished according to the Mini Nutritional Assessment scale were related to an increased hospitalization risk among the NHRs. Among the older persons living in ordinary housing, the risk factors for hospitalization were related to marital status, i.e., unmarried persons and widows/widowers had a decreased hospitalization risk. In addition, among social factors, receipt of support from relatives was related to an increased hospitalization risk, while receipt of support from friends was related to a decreased risk. The number of illnesses was not associated with the hospitalization risk for older persons in any age group or for those of either sex, when controlling for other variables. The older persons who received home health care described different reasons for their decisions to seek hospital care. The underlying theme of the home health care patients’ perceptions of their transfer to a hospital involved trust in hospitals. This trust was shared by the home health care patients, their relatives and the home health care staff, according to the patients. Conclusions: This thesis revealed that middle-aged and older persons who had been hospitalized exhibited a steeper decline in cognition. Specifically, spatial/fluid, processing speed, and general cognitive abilities were affected. The steeper decline in cognition among those who had been hospitalized remained even after controlling for comorbidities. The most common causes of hospitalization among the older persons living in ordinary housing and in nursing homes were cardiovascular diseases, tumours and falls. 
Not only health-related factors, such as the number of diseases, number of drugs used, and being assessed as malnourished, but also social factors and marital status were related to the hospitalization risk among the older persons living in ordinary housing and in nursing homes. Some risk factors associated with hospitalization differed not only between men and women but also among the different age groups. The information provided in this thesis could be applied in care settings by professionals who interact with older persons before they decide to seek hospital care. To meet the needs of an older population, health care systems need to offer the proper health care at the most appropriate level, and they need to increase integration and coordination among health care delivered by different care services.
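A Cox proportional hazards model is read through hazard ratios, HR = exp(β·x). The sketch below uses invented coefficients purely to show the mechanics; the thesis reports only which factors raised or lowered risk, not these numbers:

```python
import math

# Invented coefficients for illustration; the signs follow the thesis's
# findings for nursing home residents (more diseases, more drugs, a fall
# and malnutrition all increased hospitalization risk).
coefs = {"n_diseases": 0.12, "n_drugs": 0.05, "fall": 0.40, "malnourished": 0.55}

def hazard_ratio(profile, reference):
    """Relative hazard of one covariate profile versus a reference profile:
    exp of the difference in linear predictors."""
    lp = sum(coefs[k] * profile.get(k, 0) for k in coefs)
    lp_ref = sum(coefs[k] * reference.get(k, 0) for k in coefs)
    return math.exp(lp - lp_ref)

# A resident with 5 diseases, 8 drugs, a recorded fall and malnutrition,
# compared against a baseline resident with none of these.
hr = hazard_ratio({"n_diseases": 5, "n_drugs": 8, "fall": 1, "malnourished": 1}, {})
```

Under the proportional hazards assumption this ratio is constant over follow-up, which is why a single number per factor summarises the model.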
Abstract:
Protective factors are neglected in risk assessment in adult psychiatric and criminal justice populations. This review investigated the predictive efficacy of selected tools that assess protective factors. Five databases were searched using comprehensive terms for records up to June 2014, resulting in 17 studies (n = 2,198). Results were combined in a multilevel meta-analysis using the R (R Core Team, R: A Language and Environment for Statistical Computing, Vienna, Austria: R Foundation for Statistical Computing, 2015) metafor package (Viechtbauer, Journal of Statistical Software, 2010, 36, 1). Prediction of outcomes was poor relative to a reference category of violent offending, with the exception of prediction of discharge from secure units. There were no significant differences between the predictive efficacy of risk scales, protective scales, and summary judgments. Protective factor assessment may be clinically useful, but more development is required. Claims that use of these tools is therapeutically beneficial require testing.
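The pooling step behind such a meta-analysis can be sketched with the DerSimonian-Laird random-effects estimator, one common choice (the review fitted a multilevel model with the metafor package, which this simple sketch does not reproduce); the effect sizes and variances below are invented:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect using the DerSimonian-Laird estimator
    of the between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # truncated at zero
    w_star = [1.0 / (v + tau2) for v in variances]  # re-weight with tau^2
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Invented study-level effect sizes and sampling variances
pooled, se = dersimonian_laird([0.30, 0.10, 0.25, 0.15], [0.02, 0.03, 0.025, 0.04])
```

A multilevel model additionally accounts for dependence between effect sizes drawn from the same study, which matters when, as here, studies contribute several outcomes.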
Abstract:
BACKGROUND: Risk assessment is fundamental in the management of acute coronary syndromes (ACS), enabling estimation of prognosis. AIMS: To evaluate whether the combined use of GRACE and CRUSADE risk stratification schemes in patients with myocardial infarction outperforms each of the scores individually in terms of mortality and haemorrhagic risk prediction. METHODS: Observational retrospective single-centre cohort study including 566 consecutive patients admitted for non-ST-segment elevation myocardial infarction. The CRUSADE model increased GRACE discriminatory performance in predicting all-cause mortality, ascertained by Cox regression, demonstrating CRUSADE independent and additive predictive value, which was sustained throughout follow-up. The cohort was divided into four different subgroups: G1 (GRACE<141; CRUSADE<41); G2 (GRACE<141; CRUSADE≥41); G3 (GRACE≥141; CRUSADE<41); G4 (GRACE≥141; CRUSADE≥41). RESULTS: Outcomes and variables estimating clinical severity, such as admission Killip-Kimball class and left ventricular systolic dysfunction, deteriorated progressively throughout the subgroups (G1 to G4). Survival analysis differentiated three risk strata (G1, lowest risk; G2 and G3, intermediate risk; G4, highest risk). The GRACE+CRUSADE model revealed higher prognostic performance (area under the curve [AUC] 0.76) than GRACE alone (AUC 0.70) for mortality prediction, further confirmed by the integrated discrimination improvement index. Moreover, GRACE+CRUSADE combined risk assessment seemed to be valuable in delineating bleeding risk in this setting, identifying G4 as a very high-risk subgroup (hazard ratio 3.5; P<0.001). CONCLUSIONS: Combined risk stratification with GRACE and CRUSADE scores can improve the individual discriminatory power of GRACE and CRUSADE models in the prediction of all-cause mortality and bleeding. This combined assessment is a practical approach that is potentially advantageous in treatment decision-making.
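The four subgroups follow mechanically from the two published cut-offs (GRACE 141, CRUSADE 41), so the stratification can be expressed as a small classifier; the three-stratum mapping mirrors the survival analysis reported in the abstract:

```python
def risk_subgroup(grace, crusade):
    """G1-G4 subgroup from the study's cut-offs: GRACE >= 141, CRUSADE >= 41."""
    if grace < 141:
        return "G1" if crusade < 41 else "G2"
    return "G3" if crusade < 41 else "G4"

def risk_stratum(grace, crusade):
    """Three risk strata from the survival analysis: G1 lowest,
    G2/G3 intermediate, G4 highest."""
    return {"G1": "lowest", "G2": "intermediate",
            "G3": "intermediate", "G4": "highest"}[risk_subgroup(grace, crusade)]
```

For example, risk_stratum(150, 30) places a patient with a high GRACE but low CRUSADE score in the intermediate stratum, which is the main practical gain of the combined scheme over either score alone.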
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, the increasing complexity and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. These collectively imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and an expression of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity, a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and mitigation. They have the capacity to be deconstructed into their atomic units, and responsibility for their resolution can be delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and, as the focus of our analysis, risk safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements. To do so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
Abstract:
The activity of Fuego volcano during the 1999 - 2013 eruptive episode is studied through field, remote sensing and observatory records. Mapping of the deposits allows quantifying the erupted volumes and areas affected by the largest eruptions during this period. A wide range of volcanic processes results in a diversity of products and associated deposits, including minor airfall tephra, rockfall avalanches, lava flows, and pyroclastic flows. The activity can be characterized by long-term, low-level background activity, and sporadic larger explosive eruptions. Although the background activity erupts lava and ash at a low rate (~ 0.1 m3/s), the persistence of such activity over time results in a significant contribution (~ 30%) to the eruption budget during the studied period. Larger eruptions produced the majority of the volume of products during the studied period, mainly during three large events (May 21, 1999, June 29, 2003, and September 13, 2012), mostly in the form of pyroclastic flows. A total volume of ~ 1.4 x 108 m3 was estimated from the mapped deposits and the estimated background eruption rate. Posterior remobilization of pyroclastic flow material by stream erosion in the highly confined Barranca channels leads to lahar generation, either by normal rainfall, or by extreme rainfall events. A reassessment of the types of products and volumes erupted during the 1970s allows comparing the activity happening since 1999 with the older activity, and suggests that many of the eruptive phenomena at Fuego may have similar mechanisms, despite the differences in scale between them. The deposits of large pyroclastic flows erupted during the 1970s are remarkably similar in appearance to the deposits of pyroclastic flows from the 1999 - 2013 period, despite their much larger volume; this is also the case for prehistoric eruptions.
Radiocarbon dating of pyroclastic flow deposits suggests that Fuego has produced large eruptions many times during the last ~ 2 ka, including larger eruptions during the last 500 years, which has important hazard implications. A survey was conducted among the local residents living near the volcano about their expectations of possible future crises. The results show that people are aware of the risk they could face in case of a large eruption, and therefore they are willing to evacuate in such a case. However, their decision to evacuate may also be influenced by the conditions in which the evacuation could take place. If the evacuation represents a potential loss of their livelihood or property, they will be more hesitant to leave their villages during a large eruption. The prospect of facing hardship conditions during the evacuation and in the shelters may further cause reluctance to evacuate. A short discussion of some of the issues regarding risk assessment and management through an early warning system is presented in the last chapter.
Abstract:
Some polycyclic aromatic hydrocarbons (PAHs) are ubiquitous in air and have been implicated as carcinogenic materials. Therefore, the literature is replete with studies that are focused on their occurrence and profiles in indoor and outdoor air samples. However, because the relative potencies of individual PAHs vary widely, health risks associated with the presence of PAHs in a particular environment cannot be extrapolated directly from the concentrations of individual PAHs in that environment. In addition, data on the potency of PAH mixtures are currently limited. In this paper, we have utilized multi-criteria decision making methods (MCDMs) to simultaneously correlate PAH-related health risk in some microenvironments to the concentration levels, ethoxyresorufin-O-deethylase (EROD) activity induction equivalency factors and toxic equivalency factors (TEFs) of PAHs found in those microenvironments. The results showed that the relative risk associated with PAHs in different air samples depends on the index used. Nevertheless, this approach offers a promising tool that could help identify microenvironments of concern and assist the prioritisation of control strategies.
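One standard way to collapse a PAH mixture into a single index is the benzo[a]pyrene-equivalent concentration, the weighted sum of each concentration times its TEF. This is a simpler aggregation than the paper's MCDM approach, and the TEF values below are illustrative only (loosely modelled on published schemes, not authoritative):

```python
# Illustrative TEFs relative to benzo[a]pyrene (TEF = 1.0); a real
# assessment should take values from an authoritative published scheme.
TEF = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1,
       "chrysene": 0.01, "naphthalene": 0.001}

def bap_equivalent(conc_ng_m3):
    """Benzo[a]pyrene-equivalent concentration: sum of C_i * TEF_i (ng/m3)."""
    return sum(c * TEF[pah] for pah, c in conc_ng_m3.items())

# Hypothetical microenvironment sample (ng/m3): low-potency naphthalene
# dominates by mass but contributes little to the equivalent concentration.
baq = bap_equivalent({"benzo[a]pyrene": 0.5, "chrysene": 2.0, "naphthalene": 100.0})
```

This illustrates the abstract's point that risk cannot be read off raw concentrations: the ranking of microenvironments can change entirely once potency weights are applied.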
Abstract:
The historical challenge of environmental impact assessment (EIA) has been to predict project-based impacts accurately. Both EIA legislation and the practice of EIA have evolved over the last three decades in Canada, and the development of the discipline and science of environmental assessment has improved how we apply environmental assessment to complex projects. The practice of environmental assessment integrates the social and natural sciences and relies on an eclectic knowledge base from a wide range of sources. EIA methods and tools provide a means to structure and integrate knowledge in order to evaluate and predict environmental impacts.----- This Chapter will provide a brief overview of how impacts are identified and predicted. How do we determine what aspect of the natural and social environment will be affected when a mine is excavated? How does the practitioner determine the range of potential impacts, assess whether they are significant, and predict the consequences? There are no standard answers to these questions, but there are established methods to provide a foundation for scoping and predicting the potential impacts of a project.----- Of course, the community and publics play an important role in this process, and this will be discussed in subsequent chapters. In the first part of this chapter, we will deal with impact identification, which involves applying scoping to critical issues and determining impact significance, baseline ecosystem evaluation techniques, and how to communicate environmental impacts. In the second part of the chapter, we discuss the prediction of impacts in relation to the complexity of the environment, ecological risk assessment, and modelling.
Abstract:
Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure such as for maintenance, rehabilitation and construction works can pose risks, and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concept of risk and uncertainty, and describes the three main methodology approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts.
For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
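The statistical probability approach the report ranks highest is usually implemented as Monte Carlo simulation: propagate distributions for uncertain cost components into a distribution of total cost, then read off percentiles. A minimal sketch with invented distributions and figures:

```python
import random

def simulate_total_cost(n=100_000, seed=42):
    """Monte Carlo sketch: sample uncertain cost components (illustrative
    distributions, $m) and return the P50 and P90 of total project cost."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        construction = rng.triangular(80, 140, 100)   # low, high, mode
        maintenance = rng.gauss(20, 3)                # ongoing costs
        delay_penalty = rng.expovariate(1 / 5)        # occasional long delays
        totals.append(construction + maintenance + delay_penalty)
    totals.sort()
    return totals[n // 2], totals[int(0.9 * n)]

p50, p90 = simulate_total_cost()
```

The gap between P50 and P90 is itself a risk measure: scenario and sensitivity approaches give point estimates, whereas the probability approach exposes the whole distribution, which is why the report ranks it highest in merit and complexity.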
Abstract:
Purpose: The purpose of this paper is to analyse the risk management process conducted by some private and not-for-profit affordable housing providers in South East Queensland, and draw conclusions about the relationship between risk assessments/responses and past experiences.----- Design/methodology/approach: In-depth interviews of selected non-government housing providers have been conducted to facilitate an understanding of their approach to risk assessment in developing and in managing affordable housing projects. Qualitative data are analysed using thematic analysis to find emerging themes suggested by interview participants.----- Findings: The paper finds that an informal risk management process is used as part of normal business process in accordance with industry standards. Many interviewees agree that the recognition of financial risk and the fear of community rejection of such housing projects have restrained them from committing to such investment projects. The levels of acceptance of risk are not always consistent across housing providers, which creates opportunities for multi-stakeholder partnerships to reduce overall risk.----- Research limitations/implications: The paper has implications for developers or investors who seek to include affordable housing as part of their portfolio. However, the data collected in the study are a cross-section of interviews and do not capture the impact of recent tax incentives offered by the Australian Commonwealth Government.----- Practical implications: The study suggests that implementing improvements to the risk mitigation and management framework may assist in promoting the supply of affordable housing by non-government providers.----- Originality/value: The focus of the study is the interaction between partnerships and risk management in the development and management of affordable rental housing.
Abstract:
Quantitative Microbial Risk Assessment (QMRA) analysis was used to quantify the risk of infection associated with the exposure to pathogens from potable and non-potable uses of roof-harvested rainwater in South East Queensland (SEQ). A total of 84 rainwater samples were analysed for the presence of faecal indicators (using culture based methods) and zoonotic bacterial and protozoan pathogens using binary and quantitative PCR (qPCR). The concentrations of Salmonella invA and Giardia lamblia β-giardin genes ranged from 65-380 genomic units/1000 mL and 9-57 genomic units/1000 mL of water, respectively. After converting gene copies to cell/cyst numbers, the risk of infection from G. lamblia and Salmonella spp. associated with the use of rainwater for bi-weekly garden hosing was calculated to be below the threshold value of 1 extra infection per 10,000 persons per year. However, the estimated risk of infection from drinking the rainwater daily was 44-250 (for G. lamblia) and 85-520 (for Salmonella spp.) infections per 10,000 persons per year. Since this health risk seems higher than that expected from the reported incidences of gastroenteritis, the assumptions used to estimate these infection risks are critically discussed. Nevertheless, it would seem prudent to disinfect rainwater for potable use.
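The QMRA step from measured organism counts to an annual infection risk is typically a dose-response model followed by annualisation. A minimal sketch using the exponential dose-response model, with an invented daily dose and parameter r (not the paper's fitted values):

```python
import math

def daily_infection_prob(dose, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily, exposures=365):
    """Probability of at least one infection over repeated independent exposures."""
    return 1.0 - (1.0 - p_daily) ** exposures

THRESHOLD = 1e-4   # the 1-extra-infection-per-10,000-persons-per-year benchmark

# Invented numbers: 0.02 organisms ingested per day, illustrative r value;
# the paper derived doses from qPCR counts converted to cells/cysts.
p_day = daily_infection_prob(0.02, r=0.02)
p_year = annual_risk(p_day)
```

Annualisation is what makes daily drinking so much riskier than bi-weekly hosing in the abstract: a tiny per-event probability compounds over 365 exposures but not over 26.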
Abstract:
Road curves are an important feature of road infrastructure and many serious crashes occur on road curves. In Queensland, the number of fatalities on curves is twice that on straight roads. Therefore, there is a need to reduce drivers’ exposure to crash risk on road curves. Road crashes in Australia and in the Organisation for Economic Co-operation and Development (OECD) have plateaued in the last five years (2004 to 2008) and the road safety community is desperately seeking innovative interventions to reduce the number of crashes. However, designing an innovative and effective intervention may prove to be difficult, as it relies on providing a theoretical foundation, coherence, understanding, and structure to both the design and validation of the efficiency of the new intervention. Researchers from multiple disciplines have developed various models to determine the contributing factors for crashes on road curves with a view towards reducing the crash rate. However, most of the existing methods are based on statistical analysis of contributing factors described in government crash reports. In order to further explore the contributing factors related to crashes on road curves, this thesis designs a novel method to analyse and validate these contributing factors. The use of crash claim reports from an insurance company is proposed for analysis using data mining techniques. To the best of our knowledge, this is the first attempt to use data mining techniques to analyse crashes on road curves. A text mining technique is employed, as the reports consist of thousands of textual descriptions; text mining is thus able to identify the contributing factors. Besides identifying the contributing factors, limited studies to date have investigated the relationships between these factors, especially for crashes on road curves. Thus, this study proposes the use of the rough set analysis technique to determine these relationships.
The results from this analysis are used to assess the effect of these contributing factors on crash severity. The findings obtained through the use of data mining techniques presented in this thesis have been found to be consistent with existing identified contributing factors. Furthermore, this thesis has identified new contributing factors towards crashes and the relationships between them. A significant pattern related to crash severity is the time of day, where severe road crashes occur more frequently in the evening or night time. Tree collision is another common pattern, where crashes that occur in the morning and involve hitting a tree are likely to have a higher crash severity. Another factor that influences crash severity is the age of the driver. Most age groups face a high crash severity except for drivers between 60 and 100 years old, who have the lowest crash severity. The significant relationship identified between contributing factors consists of the time of the crash, the manufactured year of the vehicle, the age of the driver and hitting a tree. Having identified new contributing factors and relationships, a validation process is carried out using a traffic simulator in order to determine their accuracy. The validation process indicates that the results are accurate. This demonstrates that data mining techniques are a powerful tool in road safety research, and can be usefully applied within the Intelligent Transport System (ITS) domain. The research presented in this thesis provides an insight into the complexity of crashes on road curves. The findings of this research have important implications for both practitioners and academics. For road safety practitioners, the results from this research illustrate practical benefits for the design of interventions for road curves that will potentially help in decreasing related injuries and fatalities.
For academics, this research opens up a new research methodology to assess crash severity, related to road crashes on curves.
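The rough set technique proposed in the thesis rests on lower and upper approximations of a concept under an indiscernibility relation over condition attributes. A minimal sketch with invented crash records (attribute names echo the thesis's factors; the data are not from the study):

```python
from collections import defaultdict

def approximations(rows, condition_attrs, decision_attr, target_value):
    """Rough-set lower/upper approximations of the set of rows whose
    decision attribute equals target_value."""
    classes = defaultdict(set)        # indiscernibility classes
    for i, row in enumerate(rows):
        classes[tuple(row[a] for a in condition_attrs)].add(i)
    target = {i for i, row in enumerate(rows) if row[decision_attr] == target_value}
    lower, upper = set(), set()
    for members in classes.values():
        if members <= target:
            lower |= members          # certainly in the concept
        if members & target:
            upper |= members          # possibly in the concept
    return lower, upper

# Invented records: time of day and tree involvement vs. crash severity
crashes = [
    {"time": "night",   "tree": True,  "severity": "high"},
    {"time": "night",   "tree": True,  "severity": "high"},
    {"time": "morning", "tree": True,  "severity": "high"},
    {"time": "morning", "tree": False, "severity": "low"},
    {"time": "night",   "tree": False, "severity": "high"},
    {"time": "night",   "tree": False, "severity": "low"},
]
lower, upper = approximations(crashes, ["time", "tree"], "severity", "high")
```

Records 4 and 5 share the same condition attributes yet differ in severity, so they fall in the boundary region (upper minus lower): exactly the kind of ambiguity between factors and severity that rough set analysis is used to expose.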