849 results for Risk factor
Abstract:
Background Older people have higher rates of hospital admission than the general population and higher rates of readmission due to complications and falls. During hospitalisation, older people experience significant functional decline which impairs their future independence and quality of life. Acute hospital services comprise the largest section of health expenditure in Australia and prevention or delay of disease is known to produce more effective use of services. Current models of discharge planning and follow-up care, however, do not address the need to prevent deconditioning or functional decline. This paper describes the protocol of a randomised controlled trial which aims to evaluate innovative transitional care strategies to reduce unplanned readmissions and improve functional status, independence, and psycho-social well-being of community-based older people at risk of readmission. Methods/Design The study is a randomised controlled trial. Within 72 hours of hospital admission, a sample of older adults fitting the inclusion/exclusion criteria (aged 65 years and over, admitted with a medical diagnosis, able to walk independently for 3 meters, and at least one risk factor for readmission) are randomised into one of four groups: 1) the usual care control group, 2) the exercise and in-home/telephone follow-up intervention group, 3) the exercise only intervention group, or 4) the in-home/telephone follow-up only intervention group. The usual care control group receive usual discharge planning provided by the health service. In addition to usual care, the exercise and in-home/telephone follow-up intervention group receive an intervention consisting of a tailored exercise program, in-home visit and 24 week telephone follow-up by a gerontic nurse. The exercise only and in-home/telephone follow-up only intervention groups, in addition to usual care receive only the exercise or gerontic nurse components of the intervention respectively. Data collection is undertaken at baseline within 72 hours of hospital admission, 4 weeks following hospital discharge, 12 weeks following hospital discharge, and 24 weeks following hospital discharge. Outcome assessors are blinded to group allocation. Primary outcomes are emergency hospital readmissions and health service use, functional status, psychosocial well-being and cost effectiveness. Discussion The acute hospital sector comprises the largest component of health care system expenditure in developed countries, and older adults are the most frequent consumers. There are few trials to demonstrate effective models of transitional care to prevent emergency readmissions, loss of functional ability and independence in this population following an acute hospital admission. This study aims to address that gap and provide information for future health service planning which meets client needs and lowers the use of acute care services.
Quantity of documentation of maltreatment risk factors in injury-related paediatric hospitalisations
Abstract:
Background While child maltreatment is recognised as a global problem, solid epidemiological data on the prevalence of child maltreatment and its associated risk factors are lacking in Australia and internationally. There have been recent calls for action to improve the evidence base capturing and describing child abuse, particularly the data captured within the health sector. This paper describes the quantity of documentation of maltreatment risk factors in injury-related paediatric hospitalisations in Queensland, Australia. Methods This study involved a retrospective medical record review, text extraction and coding methodology to assess the quantity of documentation of risk factors and the subsequent utility of data in hospital records for describing child maltreatment and for data linkage to Child Protection Service (CPS) records. Results There were 433 children in the maltreatment group and 462 in the unintentional injury group for whom medical records could be reviewed. Almost 93% of the sample with any maltreatment code, but only 11% of the unintentional injury sample, had documentation indicating the presence of any of 20 risk factors. In the maltreatment group the most commonly documented risk factor was history of abuse (41%). In those with an unintentional injury, the most commonly documented risk factor was alcohol abuse of the child or family (3%). More than 93% of the maltreatment sample also linked to a child protection record. Of concern, 16% of the children who linked to child protection had no documented risk factors in the medical record. Conclusion Given the importance of the medical record as a source of information about children presenting to hospital for treatment, and as a potential source of evidence for legal action, this lack of documentation is concerning. The details surrounding the injury admission and consideration of any maltreatment-related risk factors, both identifying their presence and ruling them out, are required in every case. This highlights the need for additional training for clinicians to understand the importance of their documentation in child injury cases.
Abstract:
Academic pressure among adolescents is a major risk factor for poor mental health, suicide and other harmful behaviours. While this is a worldwide phenomenon, it appears to be especially pronounced in China and other East Asian countries. Despite a growing body of research into adolescent mental health in recent years, the multiple constructs within the ‘educational stress’ phenomenon have not been clearly articulated in Chinese contexts. Further, the individual, family, school and peer factors influencing educational stress, and its associations with adolescent mental health, are not well understood. An in-depth investigation may provide important information for the ongoing educational reform in Mainland China, with a special focus on students’ mental health and wellbeing. The primary goal of this study was to examine the relative contribution of educational stress to poor mental health, in comparison to other well-known individual, family, school and peer factors. Another important task was to identify significant risk factors for educational stress. In addition, due to the lack of a culturally suitable instrument for educational stress in this population, a new tool, the Educational Stress Scale for Adolescents (ESSA), was developed in this study and tested for reliability and validity. A self-administered questionnaire was used to collect information from convenience samples of secondary school students in Shandong, China. The pilot survey was conducted with 347 students (grades 8 and 11) to test the psychometric properties of the ESSA and the other scales and questions in the questionnaire. Based on factor analysis and reliability and validity testing, the 16-item, five-factor ESSA showed adequate to good internal consistency, 2-week test-retest reliability, and satisfactory concurrent and predictive validity. Its factor structure was further supported in the main survey, where a confirmatory factor analysis indicated a good fit of the proposed model. The reliabilities of the other scales and questions were also adequate for use in this study. The main survey was subsequently conducted with a sample of 1627 secondary school students (grades 7-12) to examine the factors influencing educational stress and its associations with mental health outcomes, including depression, happiness and suicidal behaviours. A wide range of individual, family, school and peer factors were found to have a significant association with the total ESSA and subscale scores. Most of the strong factors for academic stress were school or study related, including rural school location, low school connectedness, perceived poor academic grades and frequent emotional conflicts with teachers and peers. Unexpectedly, family and parental factors, such as parental bonding, family connectedness and conflicts with parents, were found to have little or no association with educational stress. Educational stress was the most predictive variable for depression, but was not strongly associated with happiness. It had a strong association with suicidal ideation but not with suicide attempts. Among the five subscales of the ESSA, the ‘Study despondency’ score had the strongest associations with these mental health measures. Surprisingly, two subscales, ‘Self-expectation’ and ‘Worry about grades’, showed a protective effect on suicidal behaviours.
An additional analysis revealed that although academic pressure was the most commonly reported reason for suicidal thinking, problems in peer relationships, such as peer teasing and bullying, and romantic problems had a much stronger relationship with actual attempts. This study provides some insights into the nature and health implications of educational stress among Chinese adolescents. The findings suggest that interventions for educational stress should focus on the school environment and academic factors. Intervention programs focused on educational stress may have a high impact on the prevalence of common mental disorders such as depression. Efforts to increase perceived happiness, however, should cover a wider range of individual, family and school factors. The importance of healthy peer relationships should be adequately emphasised in suicide prevention. In addition, the newly developed scale (the ESSA) demonstrates sound psychometric properties and is expected to be used in future research into academic stress among secondary school adolescents.
Abstract:
Background: Cardiovascular disease (CVD) is more prevalent in regional and remote Australia than in metropolitan areas. The aim of Healthy Hearts was to determine age- and sex-specific CVD risk factor levels and the potential value of national risk clinics. Methods: Healthy Hearts was an observational research study conducted in four purposefully selected higher-risk communities in regional Victoria, Australia. The main outcome measures were the proportion of participants with CVD risk factors, with group comparisons to determine the adjusted likelihood of elevated risk factor levels. Trained personnel used a standardized protocol over four weeks per community to measure CVD risk factor levels, estimate absolute CVD risk and provide feedback and advice. Results: A total of 2125 self-selected participants were assessed (mean age 58 ± 15 years, 57% women). Overall, CVD risk factors were highly prevalent. More men than women had ≥ 2 modifiable CVD risk factors (76% vs. 68%, p < .001), pre-existing CVD (20% vs. 15%, p < .01) and a major ECG abnormality requiring follow-up (15% vs. 7%, p < .001). Fewer men reported depressive symptoms compared to women (28% vs. 22%, p < .01). A higher proportion of women were obese (adjusted OR 1.36, 95% CI 1.13 to 1.63) and physically inactive (adjusted OR 1.32, 95% CI 1.07 to 1.63). Conclusions: High CVD risk factor levels were confirmed for regional Victoria. Close engagement with individuals and communities provides scope for the application of regional risk management clinics to reduce the burden of CVD risk in regional Australia.
Abstract:
Objective: To establish risk factors for moderate and severe microbial keratitis among daily contact lens (CL) wearers in Australia. Design: A prospective, 12-month, population-based, case-control study. Participants: New cases of moderate and severe microbial keratitis in daily wear CL users presenting in Australia over a 12-month period were identified through surveillance of all ophthalmic practitioners. Case detection was augmented by record audits at major ophthalmic centers. Controls were users of daily wear CLs in the community identified using a national telephone survey. Testing: Cases and controls were interviewed by telephone to determine subject demographics and CL wear history. Multiple binary logistic regression was used to determine independent risk factors, and the univariate population attributable risk percentage (PAR%) was estimated for each risk factor. Main Outcome Measures: Independent risk factors, relative risk (with 95% confidence intervals [CIs]), and PAR%. Results: There were 90 eligible moderate and severe cases related to daily wear of CLs reported during the study period. We identified 1090 community controls using daily wear CLs. Independent risk factors for moderate and severe keratitis while adjusting for age, gender, and lens material type included poor storage case hygiene 6.4× (95% CI, 1.9-21.8; PAR, 49%), infrequent storage case replacement 5.4× (95% CI, 1.5-18.9; PAR, 27%), solution type 7.2× (95% CI, 2.3-22.5; PAR, 35%), occasional overnight lens use (<1 night per week) 6.5× (95% CI, 1.3-31.7; PAR, 23%), high socioeconomic status 4.1× (95% CI, 1.2-14.4; PAR, 31%), and smoking 3.7× (95% CI, 1.1-12.8; PAR, 31%). Conclusions: Moderate and severe microbial keratitis associated with daily use of CLs was independently associated with factors likely to cause contamination of CL storage cases (frequency of storage case replacement, hygiene, and solution type). Other factors included occasional overnight use of CLs, smoking, and socioeconomic class. Disease load may be considerably reduced by attention to modifiable risk factors related to CL storage case practice.
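For reference, the univariate population attributable risk percentage quoted above is conventionally obtained from the exposure prevalence and the strength of association; in a case-control design such as this one, the odds ratio usually stands in for the relative risk and the exposure prevalence is typically taken from the controls. A sketch of that calculation (notation illustrative, not the authors'):

\[ \mathrm{PAR\%} \approx \frac{p_0\,(\mathrm{OR} - 1)}{1 + p_0\,(\mathrm{OR} - 1)} \times 100, \]

where \(p_0\) is the proportion of community controls exposed to the risk factor (e.g. poor storage case hygiene) and \(\mathrm{OR}\) is the corresponding odds ratio (6.4 for poor case hygiene).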
Abstract:
Background: Critically ill patients are at high risk of pressure ulcer (PrU) development due to their high acuity and the invasive nature of the multiple interventions and therapies they receive. With reported incidence rates of PrU development in the adult critical care population as high as 56%, the identification of patients at high risk of PrU development is essential. This paper explores the association between PrU development and risk factors, as well as the use of risk assessment scales for critically ill patients in adult intensive care units. Method: A literature search from 2000 to 2012 using the CINAHL, Cochrane Library, EBSCOHost, Medline (via EBSCOHost), PubMed, ProQuest and Google Scholar databases was conducted. Key words used were: pressure ulcer/s; pressure sore/s; decubitus ulcer/s; bed sore/s; critical care; intensive care; critical illness; prevalence; incidence; prevention; management; risk factor; risk assessment scale. Results: Nineteen articles were included in this review: eight studies addressing PrU risk factors, eight addressing risk assessment scales and three overlapping both. Results from the studies reviewed identified 28 intrinsic and extrinsic risk factors which may lead to PrU development. Development of a risk factor prediction model in this patient population, although beneficial, appears problematic due to many issues such as diverse diagnoses and subsequent patient needs. Additionally, several risk assessment instruments have been developed for early screening of patients at higher risk of developing PrU in the ICU. However, no existing risk assessment scale is valid for identifying high-risk critically ill patients, with the majority of scales potentially over-predicting patients at risk of PrU development. Conclusion: Research findings on the risk factors for pressure ulcer development are inconsistent. Additionally, there is no consistent or clear evidence demonstrating that any scale is better or more effective than another for identifying patients at risk of PrU development. Furthermore, robust research is needed to identify the risk factors and develop valid scales for measuring the risk of PrU development in the ICU.
Abstract:
OBJECTIVES: To examine the prospective association between perception of health during pregnancy and cardiovascular risk factors of mothers 21 years after the index pregnancy. METHODS: Data were from the Mater University Study of Pregnancy (MUSP), a community-based prospective birth cohort study begun in Brisbane, Australia, in 1983. Logistic regression analyses were conducted. RESULTS: Data were available for 3692 women. Women who perceived themselves as not having had a straightforward pregnancy had twice the odds (adjusted OR 2.0, 95% CI 1.1-3.8) of being diagnosed with heart disease 21 years after the index pregnancy as compared to women with a straightforward pregnancy. In addition, women who had complications (other than serious pregnancy complications) during the pregnancy were also at 30% increased odds (adjusted OR 1.3, 95% CI 1.0-1.6) of having hypertension 21 years later. CONCLUSIONS: Overall, our study suggests that pregnant women who perceived that they had complications and did not have a straightforward pregnancy are likely to experience poorer cardiovascular outcomes 21 years after the pregnancy.
Interaction of psychosocial risk factors explains increased neck problems among female office workers
Abstract:
This study investigated the relationship between psychosocial risk factors and (1) neck symptoms and (2) neck pain and disability as measured by the neck disability index (NDI). Female office workers employed in local private and public organizations were invited to participate, with 333 completing a questionnaire. Data were collected on various risk factors including age, negative affectivity, history of previous neck trauma, physical work environment, and task demands. Sixty-one percent of the sample reported neck symptoms lasting more than 8 days in the last 12 months. The mean NDI of the sample was 15.5 out of 100, indicating mild neck pain and disability. In a hierarchical multivariate logistic regression, low supervisor support was the only psychosocial risk factor associated with the presence of neck symptoms. Similarly, low supervisor support was the only factor associated with the score on the NDI. These associations remained after adjustment for the potential confounders of age, negative affectivity, and physical risk factors. The interaction of job demands, decision authority, and supervisor support was significantly associated with the NDI in the final model, and this association increased when those with previous trauma were excluded. Interestingly, and somewhat contrary to initial expectations, as job demands increased, high decision authority had an increasing effect on the NDI when supervisor support was low.
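To make the final model concrete, a three-way interaction of this kind in a hierarchical logistic regression might be specified as follows (the notation and coding are illustrative, not taken from the study):

\[ \operatorname{logit} P(\text{neck symptoms}) = \beta_0 + \beta_1 D + \beta_2 A + \beta_3 S + \beta_4\,(D \times A \times S) + \boldsymbol{\gamma}^{\top}\mathbf{z}, \]

where \(D\), \(A\) and \(S\) denote job demands, decision authority and supervisor support, \(\mathbf{z}\) collects the confounders (age, negative affectivity, physical risk factors), and a full specification would normally also include the lower-order two-way interaction terms.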
Abstract:
Purpose Are eccentric hamstring strength and between-limb imbalance in eccentric strength, measured during the Nordic hamstring exercise, risk factors for hamstring strain injury (HSI)? Methods Elite Australian footballers (n=210) from five different teams participated. Eccentric hamstring strength during the Nordic hamstring exercise was measured at the commencement and conclusion of preseason training and during the season. Injury history and demographic data were also collected. Reports on prospectively occurring HSIs were completed by team medical staff. Relative risk (RR) was determined for univariate data and logistic regression was employed for multivariate data. Results Twenty-eight HSIs were recorded. Eccentric hamstring strength below 256N at the start of preseason and below 279N at the end of preseason increased the risk of future HSI 2.7-fold (relative risk, 2.7; 95% confidence interval, 1.3 to 5.5; p = 0.006) and 4.3-fold (relative risk, 4.3; 95% confidence interval, 1.7 to 11.0; p = 0.002) respectively. Between-limb imbalance in strength of greater than 10% did not increase the risk of future HSI. Univariate analysis did not reveal a significantly greater relative risk of future HSI in athletes who had sustained a lower limb injury of any kind within the previous 12 months. Logistic regression revealed interactions between both athlete age and history of HSI with eccentric hamstring strength, whereby the likelihood of future HSI in older athletes or athletes with a history of HSI was reduced if the athlete had high levels of eccentric strength. Conclusion Low levels of eccentric hamstring strength increased the risk of future HSI. Interaction effects suggest that the additional risk of future HSI associated with advancing age or previous injury was mitigated by higher levels of eccentric hamstring strength.
Abstract:
Purpose: Several occupational carcinogens are metabolized by polymorphic enzymes. The distributions of the polymorphic enzymes N-acetyltransferase 2 (NAT2; substrates: aromatic amines), glutathione S-transferase M1 (GSTM1; substrates: e.g., reactive metabolites of polycyclic aromatic hydrocarbons), and glutathione S-transferase T1 (GSTT1; substrates: small molecules with 1-2 carbon atoms) were investigated. Material and Methods: At the urological department in Lutherstadt Wittenberg, 136 patients with histologically proven transitional cell cancer of the urinary bladder were investigated with respect to all occupations held for more than 6 months. Patients were also asked about several occupational and non-occupational risk factors. The genotypes of NAT2, GSTM1, and GSTT1 were determined from leucocyte DNA by PCR. Results: Compared to the general population in Central Europe, the percentage of GSTT1-negative persons (22.1%) was unremarkable and the percentage of slow acetylators (59.6%) was in the upper normal range, while the percentage of GSTM1-negative persons (58.8%) was elevated in the entire group. Shifts in the distribution of the genotypes were observed in subgroups who had been exposed to asbestos (6/6 GSTM1 negative, 5/6 slow acetylators), rubber manufacturing (8/10 GSTM1 negative), and chlorinated solvents (9/15 GSTM1 negative). Conclusions: The overrepresentation of GSTM1-negative bladder cancer patients in this industrialized area, which was more pronounced in several occupationally exposed subgroups, points to an impact of the GSTM1-negative genotype in bladder carcinogenesis. [Article in German]
Abstract:
Deprivation has previously been shown to be an independent risk factor for the high prevalence of malnutrition observed in COPD (Collins et al., 2010). It has been suggested that the socioeconomic gradient observed in COPD is greater than in any other chronic disease (Prescott & Vestbo, 1999). The current study aimed to examine the influence of disease severity and social deprivation on malnutrition risk in outpatients with COPD. 424 COPD outpatients were screened using the ‘Malnutrition Universal Screening Tool’ (‘MUST’). COPD disease severity was recorded in accordance with the GOLD criteria and deprivation was established according to the patient’s geographical location (postcode) at the time of nutritional screening using the UK Government’s Index of Multiple Deprivation (IMD). The IMD ranks postcodes from 1 (most deprived) to 32,482 (least deprived). Disease severity was positively associated with an increased prevalence of malnutrition risk (p < 0.001) both within and between groups, whilst IMD rank was negatively associated with malnutrition (p = 0.020), i.e. those residing in less deprived areas were less likely to be malnourished. Within each category of disease severity the prevalence of malnutrition was two-fold greater in those residing in the most deprived areas compared to those residing in the least deprived areas. This study suggests that deprivation and disease severity are independent risk factors for malnutrition in COPD, both contributing to the widely variable prevalence of malnutrition. Consideration of these issues could assist with the targeted nutritional management of these patients.
Abstract:
Introduction Risk factor analyses for nosocomial infections (NIs) are complex. First, due to competing events for NI, the association between a risk factor and NI as measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to a NI than patients from different ICUs. We aimed to develop an analytical approach to account for both features and to use it to evaluate associations between patient- and ICU-level characteristics with both rates of NI and competing risks and with the cumulative probability of infection. Methods We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic model (rate based) and a predictive model (risk based). In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results There was a large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that there are remaining unobserved ICU-specific factors that influence NI occurrence. Heterogeneity across ICUs in terms of cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude. For example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia, but large increases in the risk. Others differed in sign; for example, respiratory vs cardiovascular diagnostic categories were associated with a reduced rate of nosocomial bacteremia, but an increased risk. Conclusions A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and to distinguish patient-level from ICU-level factors.
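The divergence between rate-based and risk-based associations reported here follows from the standard competing-risks relationship between cause-specific hazards and cumulative incidence (generic notation, not the authors'):

\[ F_{\mathrm{NI}}(t) = \int_0^t \lambda_{\mathrm{NI}}(s)\, S(s)\,\mathrm{d}s, \qquad S(s) = \exp\!\left(-\int_0^s \left[\lambda_{\mathrm{NI}}(u) + \lambda_{\mathrm{death}}(u) + \lambda_{\mathrm{discharge}}(u)\right]\mathrm{d}u\right), \]

so a covariate that barely changes the NI hazard \(\lambda_{\mathrm{NI}}\) can still raise the cumulative probability \(F_{\mathrm{NI}}(t)\) if it lowers the competing hazards of death or discharge (for example by prolonging the ICU stay), which is consistent with the APACHE II finding above.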
Abstract:
Background Australian subacute inpatient rehabilitation facilities face significant challenges from the ageing population and the increasing burden of chronic disease. Foot disease complications are a negative consequence of many chronic diseases. With the rapid expansion of subacute rehabilitation inpatient services, it seems imperative to investigate the prevalence of foot disease and foot disease risk factors in this population. The primary aim of this cross-sectional study was to determine the prevalence of active foot disease and foot disease risk factors in a subacute inpatient rehabilitation facility. Methods Eligible participants were all adults admitted at least overnight to a large Australian subacute inpatient rehabilitation facility over two different four-week periods. Consenting participants underwent a short non-invasive foot examination by a podiatrist utilising the validated Queensland Health High Risk Foot Form to collect data on age, sex, medical co-morbidity history, foot disease risk factor history and clinically diagnosed foot disease complications and foot disease risk factors. Descriptive statistics were used to determine the prevalence of clinically diagnosed foot disease complications, foot disease risk factors and groups of foot disease risk factors. Logistic regression analyses were used to investigate any associations between defined explanatory variables and appropriate foot disease outcome variables. Results Overall, 85 (88%) of 97 people admitted to the facility during the study periods consented; mean age was 80 (±9) years and 71% were female. The prevalence (95% confidence interval) of participants with active foot disease was 11.8% (6.3 – 20.5), 32.9% (23.9 – 43.5) had multiple foot disease risk factors, and overall, 56.5% (45.9 – 66.5) had at least one foot disease risk factor. A self-reported history of peripheral neuropathy diagnosis was independently associated with having multiple foot disease risk factors (OR 13.504, p = 0.001). Conclusion This study highlights the potential significance of the burden of foot disease in subacute inpatient rehabilitation facilities. In this study, one in eight subacute inpatients was admitted with active foot disease and one in two with at least one foot disease risk factor. Further multi-site studies and management guidelines are required to address the foot disease burden in subacute inpatient rehabilitation facilities. Keywords: Subacute; Inpatient; Foot; Complication; Prevalence
Abstract:
Background Cardiovascular disease and mental health both hold enormous public health importance, and both ranked highly in the results of the recent Global Burden of Disease Study 2010 (GBD 2010). For the first time, GBD 2010 has systematically and quantitatively assessed major depression as an independent risk factor for the development of ischemic heart disease (IHD) using comparative risk assessment methodology. Methods A pooled relative risk (RR) was calculated from studies identified through a systematic review with strict inclusion criteria designed to provide evidence of independent risk factor status. Accepted case definitions of depression included diagnosis by a clinician or by non-clinician raters adhering to Diagnostic and Statistical Manual of Mental Disorders (DSM) or International Classification of Diseases (ICD) classifications. We therefore refer to the exposure in this paper as major depression as opposed to the DSM-IV category of major depressive disorder (MDD). The population attributable fraction (PAF) was calculated using the pooled RR estimate. Attributable burden was calculated by multiplying the PAF by the underlying burden of IHD estimated as part of GBD 2010. Results The pooled relative risk of developing IHD in those with major depression was 1.56 (95% CI 1.30 to 1.87). Globally, almost 4 million IHD disability-adjusted life years (DALYs) were estimated to be attributable to major depression in 2010: 3.5 million years of life lost and 250,000 years of life lived with a disability. These findings highlight a previously underestimated mortality component of the burden of major depression. As a proportion of overall IHD burden, 2.95% (95% CI 1.48 to 4.46%) of IHD DALYs were estimated to be attributable to MDD in 2010. Eastern Europe and North Africa/Middle East demonstrated the highest proportions, with Asia Pacific, high income, representing the lowest. Conclusions The present work comprises the most robust systematic review of its kind to date. The key finding, that major depression may be responsible for approximately 3% of global IHD DALYs, warrants assessment for depression in patients at high risk of developing IHD or at risk of a repeat IHD event.
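For a dichotomous exposure such as major depression, the comparative risk assessment calculation described in the Methods reduces to the familiar attributable-fraction arithmetic (a sketch; the prevalence \(P\) of major depression is not reported in this abstract):

\[ \mathrm{PAF} = \frac{P\,(\mathrm{RR} - 1)}{P\,(\mathrm{RR} - 1) + 1}, \qquad \text{attributable IHD DALYs} = \mathrm{PAF} \times \text{total IHD DALYs}, \]

with the pooled \(\mathrm{RR} = 1.56\) entering the first expression and the GBD 2010 estimate of IHD burden entering the second.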
Abstract:
Background The Global Burden of Disease Study 2010 (GBD 2010) identified mental and substance use disorders as the 5th leading contributor to burden in 2010, measured by disability-adjusted life years (DALYs). This estimate was incomplete, as it excluded the burden resulting from the increased risk of suicide, which is captured elsewhere in GBD 2010's mutually exclusive list of diseases and injuries. Here, we estimate suicide DALYs attributable to mental and substance use disorders. Methods Relative-risk estimates of suicide due to mental and substance use disorders and the global prevalence of each disorder were used to estimate population attributable fractions. These were adjusted for global differences in the proportion of suicide due to mental and substance use disorders compared to other causes, then multiplied by the suicide DALYs reported in GBD 2010 to estimate attributable DALYs (with 95% uncertainty). Results Mental and substance use disorders were responsible for 22.5 million (14.8-29.8 million) of the 36.2 million (26.5-44.3 million) DALYs allocated to suicide in 2010. Depression was responsible for the largest proportion of suicide DALYs (46.1% (28.0%-60.8%)) and anorexia nervosa the lowest (0.2% (0.02%-0.5%)). DALYs occurred throughout the lifespan, with the largest proportions found in Eastern Europe and Asia, and in males aged 20-30 years. The inclusion of attributable suicide DALYs would have increased the overall burden of mental and substance use disorders (assigned to them in GBD 2010 as a direct cause) from 7.4% (6.2%-8.6%) to 8.3% (7.1%-9.6%) of global DALYs, and would have changed the global ranking from 5th to 3rd leading cause of burden. Conclusions Capturing the suicide burden attributable to mental and substance use disorders allows for more accurate estimates of burden. More consideration needs to be given to interventions targeted at populations with, or at risk for, mental and substance use disorders as an effective strategy for suicide prevention.