272 results for Risk Classification
in University of Queensland eSpace - Australia
Abstract:
Systematic protocols that use decision rules or scores are seen to improve consistency and transparency in classifying the conservation status of species. When applying these protocols, assessors are typically required to decide on estimates for attributes that are inherently uncertain. Input data and resulting classifications are usually treated as though they are exact and hence without operator error. We investigated the impact of data interpretation on the consistency of extinction risk classifications and diagnosed causes of discrepancies when they occurred. We tested three widely used systematic classification protocols employed by the World Conservation Union, NatureServe, and the Florida Fish and Wildlife Conservation Commission. We provided 18 assessors with identical information for 13 different species to infer estimates for each of the required parameters for the three protocols. The threat classification of several of the species varied from low risk to high risk, depending on who did the assessment. This occurred across the three protocols investigated. Assessors tended to agree on their placement of species in the highest (50-70%) and lowest (20-40%) risk categories, but there was poor agreement on which species should be placed in the intermediate categories. Furthermore, the correspondence between the three classification methods was unpredictable, with large variation among assessors. These results highlight the importance of peer review and consensus among multiple assessors in species classifications and the need to be cautious with assessments carried out by a single assessor. Greater consistency among assessors requires wide use of training manuals and formal methods for estimating parameters that allow uncertainties to be represented, carried through chains of calculations, and reported transparently.
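To make the notion of inter-assessor agreement concrete, here is a minimal sketch (not from the paper) of how agreement among multiple assessors over risk categories could be quantified with Fleiss' kappa; the rating counts below are hypothetical.

```python
# Minimal sketch of Fleiss' kappa for multi-assessor agreement.
# Rows are species, columns are risk categories, entries are how many
# assessors placed that species in that category (hypothetical counts).
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """counts: N items x k categories, each row summing to n raters."""
    n = counts.sum(axis=1)[0]                  # raters per item
    p_j = counts.sum(axis=0) / counts.sum()    # overall category proportions
    p_i = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))  # per-item agreement
    p_bar, p_e = p_i.mean(), (p_j ** 2).sum()
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical data: 4 species rated by 6 assessors into 3 risk levels.
ratings = np.array([[6, 0, 0],   # unanimous "low risk"
                    [0, 1, 5],
                    [2, 3, 1],   # intermediate category: poor agreement
                    [1, 4, 1]])
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.2f}")
```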
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where non-parametric methods such as decision trees and generalized additive models are promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
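As a rough illustration of the two-stage template (a sketch under assumptions, using synthetic data rather than the paper's cardiac dataset), one could screen variables with a non-parametric decision tree and then fit a parametric predictive model on the retained subset:

```python
# Sketch of the two-stage idea: a non-parametric screen (decision tree)
# flags important variables, then a parametric model is fitted on them.
# Data and variable indices are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1710, 8))                        # 8 candidate predictors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=1710) > 0).astype(int)

# Stage 1: exploratory tree to rank candidate variables by importance.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
keep = np.argsort(tree.feature_importances_)[::-1][:3]  # parsimonious subset
print("retained variables:", keep)

# Stage 2: parametric predictive model on the retained variables.
model = LogisticRegression(max_iter=1000).fit(X[:, keep], y)
print("training accuracy: %.2f" % model.score(X[:, keep], y))
```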
Abstract:
Promoted as the key policy response to unemployment, the Job Network constitutes an array of interlocking processes that position unemployed people as 'problems' in need of remediation. Unemployment is presented as a primary risk threatening society, and unemployed people are presented as displaying various degrees of riskiness. The Job Seeker Classification Instrument (JSCI) is a 'technology' employed by Centrelink to assess 'risk' and to determine the type of interaction that unemployed people have with the Job Network. In the first instance, we critically examine the development of the JSCI and expose issues that erode its credibility and legitimacy. Second, employing the analytical tools of discourse analysis, we show how the JSCI both assumes and imposes particular subject identities on unemployed people. The purpose of this latter analysis is to illustrate the consequences of the sorts of technologies and interventions used within the Job Network.
Abstract:
Objectives: Resternotomy is a common part of cardiac surgical practice. Associated with resternotomy are the risks of cardiac injury and catastrophic hemorrhage and the subsequent elevated morbidity and mortality in the operating room or during the postoperative period. The technique of direct vision resternotomy is safe and has few, if any, serious cardiac injuries. The technique, the reduced need for groin cannulation and the overall low operative mortality and morbidity are the focus of this retrospective analysis. Methods: The records of 495 patients undergoing 546 resternotomies over a 21-year period to January 2000 were reviewed. All consecutive reoperations by the one surgeon comprised patients over the age of 20 at first resternotomy: M:F 343:203, mean age 57 years (range 20 to 85, median age 60). The mean NYHA grade was 2.3 (67 patients grade I, 273 grade II, 159 grade III, 43 grade IV, and 4 grade V), with elective reoperation in 94.6%. Cardiac injury was graded into five groups, and the incidence of and reasons for groin cannulation were estimated. The morbidity and mortality as a result of the reoperation and resternotomy were assessed. Results: The hospital/30-day mortality was 2.9% (95% CI: 1.6%-4.4%; 16 deaths) over the 21 years. First (481), second (53), and third (12) resternotomies produced 307 uncomplicated technical reopenings, 203 slower but uncomplicated procedures, 9 minor superficial cardiac lacerations, and no moderate or severe cardiac injuries. Direct vision resternotomy is crystallized into the principle that only adhesions that are visualized from below are divided and only sternal bone that is freed of adhesions is sawn. Groin exposure was never performed prophylactically for resternotomy. Fourteen patients (2.6%) had such cannulation for aortic dissection/aneurysm (9 patients), excessive sternal adherence of cardiac structures (3 patients), presurgery cardiac arrest (1 patient), and high aortic cannulation desired and not possible (1 patient). The average postoperative blood loss was 594 mL (95% CI: 558-631) in the first 12 hours. The need to return to the operating room for control of excessive bleeding was 2% (11 patients). Blood transfusion was given in 65% of the resternotomy procedures over the 21 years (mean 854 mL, 95% CI: 765-945 mL) and in 41% over the last 5 years. Conclusions: The technique of direct vision resternotomy has been associated with zero moderate or major cardiac injuries/catastrophic hemorrhage at reoperation. Few patients have required groin cannulation. In the postoperative period, there were acceptable blood loss and transfusion rates, reduced morbidity, and low mortality for this potentially high-risk group.
Abstract:
In studies assessing the trends in coronary events, such as the World Health Organization (WHO) MONICA Project (multinational MONItoring of trends and determinants of CArdiovascular disease), the main emphasis has been on coronary deaths and non-fatal definite myocardial infarctions (MI). It is, however, possible that the proportion of milder MIs may be increasing because of improvements in treatment and reductions in levels of risk factors. We used the MI register data of the WHO MONICA Project to investigate several definitions for mild non-fatal MIs that would be applicable in various settings and could be used to assess trends in milder coronary events. Of 38 populations participating in the WHO MONICA MI register study, more than half registered a sufficiently wide spectrum of events that it was possible to identify subsets of milder cases. The event rates and case fatality rates of MI are clearly dependent on the spectrum of non-fatal MIs that are included. On clinical grounds we propose that the original MONICA category "non-fatal possible MI" could be divided into two groups: "non-fatal probable MI" and "prolonged chest pain." Non-fatal probable MIs are cases which, in addition to "typical symptoms," have electrocardiogram (ECG) or enzyme changes suggesting cardiac ischemia, but not severe enough to fulfil the criteria for non-fatal definite MI. In more than half of the MONICA Collaborating Centers, the registration of MI covers these milder events reasonably well. Proportions of non-fatal probable MIs vary less between populations than do proportions of non-fatal possible MIs. Also, rates of non-fatal probable MI are somewhat more highly correlated with rates of fatal events and non-fatal definite MI. These findings support the validity of the category of non-fatal probable MI. In each center the increase in event rates and the decrease in case fatality due to the inclusion of non-fatal probable MI was larger for women than for men. For the WHO MONICA Project and other epidemiological studies the proposed category of non-fatal probable MIs can be used for assessing trends in rates of milder MI. Copyright (C) 1997 Elsevier Science Inc.
Abstract:
In this study, we examine an important factor that affects consumers' acceptance of business-to-consumer (B2C) electronic commerce - perceived risk. The objective of this paper is to examine the definition of perceived risk in the context of B2C electronic commerce. The paper highlights the importance of perceived risk and the interwoven relation between perceived risk and trust. It discusses the problem of defining perceived risk in prior B2C research. This study proposes a new classification of consumers' perceived risk based on sources. It highlights the importance of identifying the sources of consumers' risk perceptions in addition to the consequence dimensions. Two focus group discussion sessions were conducted to verify the proposed classification. Results indicate that Internet consumers perceive three sources of risk in B2C electronic commerce: technology, vendor, and product. © 2003 Elsevier B.V. All rights reserved.
Abstract:
Risk assessment systems for introduced species are being developed and applied globally, but methods for rigorously evaluating them are still in their infancy. We explore classification and regression tree models as an alternative to the current Australian Weed Risk Assessment system, and demonstrate how the performance of screening tests for unwanted alien species may be quantitatively compared using receiver operating characteristic (ROC) curve analysis. The optimal classification tree model for predicting weediness included just four out of a possible 44 attributes of introduced plants examined, namely: (i) intentional human dispersal of propagules; (ii) evidence of naturalization beyond native range; (iii) evidence of being a weed elsewhere; and (iv) a high level of domestication. Intentional human dispersal of propagules in combination with evidence of naturalization beyond a plant's native range led to the strongest prediction of weediness. A high level of domestication in combination with no evidence of naturalization mitigated the likelihood of an introduced plant becoming a weed resulting from intentional human dispersal of propagules. Unlikely intentional human dispersal of propagules combined with no evidence of being a weed elsewhere led to the lowest predicted probability of weediness. The failure to include intrinsic plant attributes in the model suggests that either these attributes are not useful general predictors of weediness, or the data and analysis were inadequate to elucidate the underlying relationship(s). This concurs with the historical pessimism about whether we will ever be able to accurately predict invasive plants. Given the apparent importance of propagule pressure (the number of individuals of a species released), future attempts at evaluating screening model performance for identifying unwanted plants need to account for propagule pressure when collating and/or analysing datasets. The classification tree had a cross-validated sensitivity of 93.6% and specificity of 36.7%. Based on the area under the ROC curve, the performance of the classification tree in correctly classifying plants as weeds or non-weeds was slightly inferior (area under ROC curve = 0.83 ± 0.021 (± SE)) to that of the current risk assessment system in use (area under ROC curve = 0.89 ± 0.018 (± SE)), although it requires many fewer questions to be answered.
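The kind of analysis described can be sketched in a few lines; the block below is a minimal illustration with synthetic data and hypothetical binary attributes (human dispersal, naturalized elsewhere, weed elsewhere, domesticated), not the study's 44-attribute dataset: fit a classification tree and score its screening performance with a cross-validated ROC AUC.

```python
# Sketch: classification tree on binary plant attributes, evaluated by
# cross-validated ROC AUC. All data are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n = 300
# Columns: human dispersal, naturalized elsewhere, weed elsewhere, domesticated
X = rng.integers(0, 2, size=(n, 4))
logit = 1.5 * X[:, 0] + 1.0 * X[:, 1] + 1.0 * X[:, 2] - 1.0 * X[:, 3] - 1.0
weed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
# Cross-validated probabilities, echoing the paper's use of cross-validation.
prob = cross_val_predict(tree, X, weed, cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC: %.2f" % roc_auc_score(weed, prob))
```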
Abstract:
BACKGROUND: Coronary heart disease (CHD) has been a major cause of mortality in Australian adults, but the rate had declined by 83% from its 1968 peak by the year 2000. The study objective is to determine the contribution of changes in population risk factors - mean serum cholesterol and diastolic blood pressure and tobacco smoking prevalence - to the decline in coronary heart disease mortality in Australia over three decades. METHODS: Coronary heart disease deaths (International Classification of Diseases, 9th revision, codes 410-414) and population by year, age group and sex were obtained from the Australian Bureau of Statistics. Risk factor levels were obtained from population surveys, and estimated average annual changes by period were used to calculate average annual 'attributable' proportional declines in CHD mortality by period (age 35-64 years). RESULTS: Over the period 1968-2000, 74% of the male decline and 81% of the female decline in coronary heart disease mortality rate was accounted for by the combined effect of reductions in the three risk factors. In males, 36% of the decline was contributed by reductions in diastolic blood pressure, 22% by cholesterol and 16% by smoking. For females, 56% was from diastolic blood pressure reduction, 20% from cholesterol and 5% from smoking. Effects of reductions in serum cholesterol on coronary heart disease mortality occurred mainly in the 1970s. Declines in diastolic blood pressure had effects on coronary heart disease mortality over the three decades, and declines in tobacco smoking had a significant effect in males in the 1980s. CONCLUSION: Most of the spectacular decline in coronary heart disease mortality over the last three decades in Australia can be ascribed to reductions in population risk factors from primary and secondary prevention.
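As a toy check of the arithmetic (using only numbers stated in the abstract; the apportioning method itself is not reproduced here), the per-factor contributions sum to the reported combined effects:

```python
# Per-factor contributions to the CHD mortality decline, from the abstract.
contributions = {
    "males":   {"diastolic BP": 36, "cholesterol": 22, "smoking": 16},
    "females": {"diastolic BP": 56, "cholesterol": 20, "smoking": 5},
}
for sex, parts in contributions.items():
    print(sex, "combined: %d%%" % sum(parts.values()))
# males combined: 74%, females combined: 81%, matching the abstract.
```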
Abstract:
Background: Children engage in various physical activities that pose different injury risks. However, the lack of adequate data on exposure has meant that these risks have not been quantified or compared in young children aged 5-12 years. Objectives: To measure exposure to popular activities among Australian primary school children and to quantify the associated injury risks. Method: The Childhood Injury Prevention Study prospectively followed up a cohort of randomly selected Australian primary and preschool children aged 5-12 years. Time (min) engaged in various physical activities was measured using a parent-completed 7-day diary. All injuries over 12 months were reported to the study. All data on exposure and injuries were coded using the International Classification of External Causes of Injury. Injury rates per 1000 h of exposure were calculated for the most popular activities. Results: Complete diaries and data on injuries were available for 744 children. Over 12 months, 314 injuries relating to physical activity outside of school were reported. The highest injury risks per exposure time occurred for tackle-style football (2.18/1000 h), wheeled activities (1.72/1000 h) and tennis (1.19/1000 h). Overall, boys were injured more often than girls; however, the differences were non-significant or reversed for some activities, including soccer, trampolining and team ball sports. Conclusion: Although the overall injury rate was low in this prospective cohort, the safety of some popular childhood activities can be improved so that the benefits may be enjoyed with fewer negative consequences.
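The exposure-based rate the abstract reports is straightforward to compute; here is a minimal sketch with hypothetical counts, adding an exact Poisson 95% confidence interval (the interval method is an assumption, not stated in the abstract):

```python
# Injuries per 1000 hours of exposure, with an exact Poisson 95% CI
# via the chi-square relation. Counts below are made up.
from scipy.stats import chi2

def rate_per_1000h(injuries: int, hours: float):
    rate = 1000 * injuries / hours
    lo = chi2.ppf(0.025, 2 * injuries) / 2 / hours * 1000 if injuries else 0.0
    hi = chi2.ppf(0.975, 2 * (injuries + 1)) / 2 / hours * 1000
    return rate, lo, hi

# e.g. 24 injuries over 11000 hours of a given activity (hypothetical)
print("%.2f (95%% CI %.2f-%.2f) per 1000 h" % rate_per_1000h(24, 11_000))
```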
Abstract:
Stochastic simulation is a recognised tool for quantifying the spatial distribution of geological uncertainty and risk in earth science and engineering. Metals mining is an area where simulation technologies are extensively used; however, applications in the coal mining industry have been limited. This is largely due to the lack of a systematic demonstration of the capabilities these techniques offer for problem solving in coal mining. This paper presents two broad and technically distinct areas of application in coal mining. The first deals with the use of simulation in the quantification of uncertainty in coal seam attributes and risk assessment to assist coal resource classification, and drillhole spacing optimisation to meet pre-specified risk levels at a required confidence. The second application presents the use of stochastic simulation in the quantification of fault risk, an area of particular interest to underground coal mining, and documents the performance of the approach. The examples presented demonstrate the advantages and positive contribution stochastic simulation approaches bring to the coal mining industry.
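Conceptually, simulation-based resource classification works by summarising many equally probable realizations per block. The sketch below is a toy illustration under assumptions, not the paper's geostatistical workflow: the realizations are placeholder random draws (in practice they would come from conditional simulation, e.g. sequential Gaussian simulation of seam thickness), and the ±15% precision threshold and the "measured"/"indicated" labels are illustrative.

```python
# Toy sketch: classify blocks by whether the 95% interval of simulated
# seam-thickness realizations stays within +/-15% of the block mean.
import numpy as np

rng = np.random.default_rng(2)
n_blocks, n_real = 5, 200
# Placeholder realizations standing in for conditional simulation output.
sims = rng.normal(loc=2.5, scale=rng.uniform(0.05, 0.5, n_blocks)[:, None],
                  size=(n_blocks, n_real))

mean = sims.mean(axis=1)
lo, hi = np.percentile(sims, [2.5, 97.5], axis=1)
within = np.maximum(hi - mean, mean - lo) / mean <= 0.15
for b, ok in enumerate(within):
    print(f"block {b}: mean {mean[b]:.2f} m ->", "measured" if ok else "indicated")
```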
Abstract:
In recent years, the phrase 'genomic medicine' has increasingly been used to describe a new development in medicine that holds great promise for human health. This new approach to health care uses the knowledge of an individual's genetic make-up to identify those who are at higher risk of developing certain diseases and to intervene at an earlier stage to prevent these diseases. Identifying genes that are involved in disease aetiology will provide researchers with tools to develop better treatments and cures. A major role within this field is attributed to 'predictive genomic medicine', which proposes screening healthy individuals to identify those who carry alleles that increase their susceptibility to common diseases, such as cancers and heart disease. Physicians could then intervene even before the disease manifests and advise individuals with a higher genetic risk to change their behaviour - for instance, to exercise or to eat a healthier diet - or offer drugs or other medical treatment to reduce their chances of developing these diseases. These promises have fallen on fertile ground among politicians, health-care providers and the general public, particularly in light of the increasing costs of health care in developed societies. Various countries have established databases on the DNA and health information of whole populations as a first step towards genomic medicine. Biomedical research has also identified a large number of genes that could be used to predict someone's risk of developing a certain disorder. But it would be premature to assume that genomic medicine will soon become reality, as many problems remain to be solved. Our knowledge about most disease genes and their roles is far from sufficient to make reliable predictions about a patient's risk of actually developing a disease. In addition, genomic medicine will create new political, social, ethical and economic challenges that will have to be addressed in the near future.
Abstract:
Parkinson's disease (PD) is a progressive, degenerative, neurological disease. The progressive disability associated with PD results in substantial burdens for those with the condition, their families and society in terms of increased health resource use, earnings loss of affected individuals and family caregivers, poorer quality of life, caregiver burden, disrupted family relationships, decreased social and leisure activities, and deteriorating emotional well-being. Currently, no cure is available and the efficacy of available treatments, such as medication and surgical interventions, decreases with longer duration of the disease. Whilst the cause of PD is unknown, genetic and environmental factors are believed to contribute to its aetiology. Descriptive and analytical epidemiological studies have been conducted in a number of countries in an effort to elucidate the cause, or causes, of PD. Rural residency, farming, well water consumption, pesticide exposure, metals and solvents have been implicated as potential risk factors for PD in some previous epidemiological studies. However, there is substantial disagreement between the results of existing studies. Therefore, the role of environmental exposures in the aetiology of PD remains unclear. The main component of this thesis consists of a case-control study that assessed the contribution of environmental exposures to the risk of developing PD. An existing, previously unanalysed dataset from a local case-control study was analysed to inform the design of the new case-control study. The analysis results suggested that regular exposure to pesticides and head injury were important risk factors for PD. However, due to the substantial limitations of this existing study, further confirmation of these results was desirable with a more robustly designed epidemiological study. A new exposure measurement instrument (a structured interviewer-delivered questionnaire) was developed for the new case-control study to obtain data on demographic, lifestyle, environmental and medical factors. Prior to its use in the case-control study, the questionnaire was assessed for test-retest repeatability in a series of 32 PD cases and 29 healthy sex-, age- and residential suburb-matched electoral roll controls. High repeatability was demonstrated for lifestyle exposures, such as smoking and coffee/tea consumption (kappas 0.70-1.00). The majority of environmental exposures, including use of pesticides, solvents and exposure to metal dusts and fumes, also showed high repeatability (kappas >0.78). A consecutive series of 163 PD case participants was recruited from a neurology clinic in Brisbane. One hundred and fifty-one (151) control participants were randomly selected from the Australian Commonwealth Electoral Roll and individually matched to the PD cases on age (± 2 years), sex and current residential suburb. Participants ranged in age from 40-89 years (mean age 67 years). Exposure data were collected in face-to-face interviews. Odds ratios and 95% confidence intervals were calculated using conditional logistic regression for matched sets in SAS version 9.1. Consistent with previous studies, ever having been a regular smoker or coffee drinker was inversely associated with PD, with dose-response relationships evident for pack-years smoked and number of cups of coffee drunk per day. Passive smoking from ever having lived with a smoker or worked in a smoky workplace was also inversely related to PD.
Ever having been a regular tea drinker was associated with decreased odds of PD. Hobby gardening was inversely associated with PD. However, use of fungicides in the home garden or occupationally was associated with increased odds of PD. Exposure to welding fumes, cleaning solvents, or thinners occupationally was associated with increased odds of PD. Ever having resided in a rural or remote area was inversely associated with PD. Ever having resided on a farm was only associated with moderately increased odds of PD. Whilst the current study's results suggest that environmental exposures on their own are only modest contributors to overall PD risk, the possibility that interaction with genetic factors may additively or synergistically increase risk should be considered. The results of this research support the theory that PD has a multifactorial aetiology and that environmental exposures are among a number of factors that contribute to PD risk. There was also evidence of interaction between some factors (e.g. smoking and welding) to moderate PD risk.
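The odds ratios and 95% confidence intervals reported throughout such studies can be sketched as follows. This is a minimal illustration with a hypothetical, unmatched 2x2 table and a Wald interval; the thesis itself used conditional logistic regression for matched sets, which this does not reproduce.

```python
# Odds ratio with Wald 95% CI from a 2x2 table (hypothetical counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed/unexposed cases; c,d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical exposure split among 163 cases and 151 controls.
print("OR %.2f (95%% CI %.2f-%.2f)" % odds_ratio_ci(40, 123, 22, 129))
```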
Abstract:
Developing a unified classification system to replace four of the systems currently used in disability athletics (i.e., track and field) has been widely advocated. The diverse impairments to be included in a unified system require several assessment methods, the results of which cannot be meaningfully compared. Therefore, the taxonomic basis of current classification systems is invalid in a unified system. Biomechanical analysis establishes that force, a vector described in terms of magnitude and direction, is a key determinant of success in all athletic disciplines. It is posited that all impairments to be included in a unified system may be classified as either force magnitude impairments (FMI) or force control impairments (FCI). This framework would provide a valid taxonomic basis for a unified system, creating the opportunity to decrease the number of classes and enhance the viability of disability athletics.