876 results for "Risks Assessment Methods"
Abstract:
Background The Global Burden of Disease, Injuries, and Risk Factor study 2013 (GBD 2013) is the first of a series of annual updates of the GBD. Risk factor quantification, particularly of modifiable risk factors, can help to identify emerging threats to population health and opportunities for prevention. The GBD 2013 provides a timely opportunity to update the comparative risk assessment with new data for exposure, relative risks, and evidence on the appropriate counterfactual risk distribution.

Methods Attributable deaths, years of life lost, years lived with disability, and disability-adjusted life-years (DALYs) have been estimated for 79 risks or clusters of risks using the GBD 2010 methods. Risk–outcome pairs meeting explicit evidence criteria were assessed for 188 countries for the period 1990–2013 by age and sex using three inputs: risk exposure, relative risks, and the theoretical minimum risk exposure level (TMREL). Risks are organised into a hierarchy with blocks of behavioural, environmental and occupational, and metabolic risks at the first level of the hierarchy. The next level in the hierarchy includes nine clusters of related risks and two individual risks, with more detail provided at levels 3 and 4 of the hierarchy. Compared with GBD 2010, six new risk factors have been added: handwashing practices, occupational exposure to trichloroethylene, childhood wasting, childhood stunting, unsafe sex, and low glomerular filtration rate. For most risks, data for exposure were synthesised with a Bayesian meta-regression method, DisMod-MR 2.0, or spatial-temporal Gaussian process regression. Relative risks were based on meta-regressions of published cohort and intervention studies. Attributable burden for clusters of risks and all risks combined took into account evidence on the mediation of some risks such as high body-mass index (BMI) through other risks such as high systolic blood pressure and high cholesterol.

Findings All risks combined account for 57·2% (95% uncertainty interval [UI] 55·8–58·5) of deaths and 41·6% (40·1–43·0) of DALYs. Risks quantified account for 87·9% (86·5–89·3) of cardiovascular disease DALYs, ranging to a low of 0% for neonatal disorders and neglected tropical diseases and malaria. In terms of global DALYs in 2013, six risks or clusters of risks each caused more than 5% of DALYs: dietary risks accounting for 11·3 million deaths and 241·4 million DALYs, high systolic blood pressure for 10·4 million deaths and 208·1 million DALYs, child and maternal malnutrition for 1·7 million deaths and 176·9 million DALYs, tobacco smoke for 6·1 million deaths and 143·5 million DALYs, air pollution for 5·5 million deaths and 141·5 million DALYs, and high BMI for 4·4 million deaths and 134·0 million DALYs. Risk factor patterns vary across regions and countries and with time. In sub-Saharan Africa, the leading risk factors are child and maternal malnutrition, unsafe sex, and unsafe water, sanitation, and handwashing. In women, in nearly all countries in the Americas, north Africa, and the Middle East, and in many other high-income countries, high BMI is the leading risk factor, with high systolic blood pressure as the leading risk in most of Central and Eastern Europe and south and east Asia. For men, high systolic blood pressure or tobacco use are the leading risks in nearly all high-income countries, in north Africa and the Middle East, Europe, and Asia. For men and women, unsafe sex is the leading risk in a corridor from Kenya to South Africa.
Interpretation Behavioural, environmental and occupational, and metabolic risks can explain half of global mortality and more than one-third of global DALYs, providing many opportunities for prevention. Of the larger risks, the attributable burden of high BMI has increased in the past 23 years. In view of the prominence of behavioural risk factors, behavioural and social science research on interventions for these risks should be strengthened. Many prevention and primary care policy options are available now to act on key risks.
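For readers unfamiliar with how attributable burden is computed in a comparative risk assessment, the core quantity is the population attributable fraction (PAF), which compares the observed exposure distribution with the counterfactual TMREL distribution. The short Python sketch below illustrates the categorical form of that calculation; the exposure categories, relative risks, and DALY total are invented and are not figures from GBD 2013.

# Sketch of a categorical population attributable fraction (PAF) calculation of
# the kind used in comparative risk assessment. All numbers are illustrative.

def paf_categorical(prevalence, relative_risk, counterfactual_prevalence):
    """PAF = (sum p_i*RR_i - sum p'_i*RR_i) / sum p_i*RR_i."""
    observed = sum(p * rr for p, rr in zip(prevalence, relative_risk))
    counterfactual = sum(p * rr for p, rr in zip(counterfactual_prevalence, relative_risk))
    return (observed - counterfactual) / observed

# Hypothetical exposure categories for one risk factor (e.g. low, medium, high).
prevalence = [0.50, 0.30, 0.20]        # observed exposure distribution
relative_risk = [1.0, 1.4, 2.1]        # relative risk of each category vs. the TMREL
tmrel_distribution = [1.0, 0.0, 0.0]   # counterfactual: everyone at minimum risk

paf = paf_categorical(prevalence, relative_risk, tmrel_distribution)
outcome_dalys = 1_000_000              # hypothetical DALYs for the linked outcome
print(f"PAF = {paf:.3f}, attributable DALYs = {paf * outcome_dalys:,.0f}")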
Abstract:
The EU-funded research project ALARM will develop and test methods and protocols for the assessment of large-scale environmental risks in order to minimise negative human impacts. Research focuses on the assessment and forecast of changes in biodiversity and in the structure, function, and dynamics of ecosystems. This includes the relationships between society, the economy and biodiversity.
Abstract:
Society’s increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, while also assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterised by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Although the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements offered by alternative design options, to allow trade-offs among contradictory aspects and to prevent the “risk shift”. In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances hazardous to humans and the environment are used or stored. The tools were mainly intended for application in the “conceptual” and “basic design” stages, when the project is still open to changes (owing to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools makes a substantial contribution towards filling the present gap in the availability of sound support for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on a set of leading key performance indicators (KPIs), which allow the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs are based on impact models (some of them complex) but are easy and quick to apply in practice. Their full evaluation is possible even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems of previously proposed methods for quantitative inherent safety assessment (use of arbitrary indices, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems under “out of control” conditions. In the assessment of layout plans, “ad hoc” tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by applying them to a large number of case studies covering different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes and plants (chemical industry, storage facilities, waste disposal). An experimental study (analysis of the thermal stability of nitrobenzaldehyde isomers) provided the input data needed to demonstrate the method for the inherent safety assessment of materials.
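The abstract does not report the KPI formulas, but the basic idea of normalising impact indicators against site-specific reference burdens and aggregating them under policy-dependent weights can be sketched in Python as follows; every name and number here is a hypothetical assumption, not material from the thesis.

# Illustrative aggregation of key performance indicators (KPIs) into a single
# score for comparing design options. All values and weights are hypothetical.

def normalise(value, reference):
    """Express a KPI relative to a site-specific reference burden."""
    return value / reference

def aggregate(kpis, references, weights):
    """Weighted sum of normalised KPIs; lower is better for impact-type KPIs."""
    return sum(w * normalise(v, r) for v, r, w in zip(kpis, references, weights))

# Two hypothetical design options assessed on economic, societal and
# environmental impact KPIs (lower = less impact).
option_a = [1.2e6, 0.8, 350.0]
option_b = [1.0e6, 1.1, 420.0]
references = [1.5e6, 1.0, 500.0]   # site-specific reference burdens
weights = [0.4, 0.3, 0.3]          # stand in for the sustainability policy

score_a = aggregate(option_a, references, weights)
score_b = aggregate(option_b, references, weights)
print("Preferred option:", "A" if score_a < score_b else "B")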
Abstract:
Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents in Europe, such as the ammonium nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios had not been considered in the respective site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are not yet properly identified, may remain unrecognised until they take place for the first time. The consideration of atypical scenarios, which deviate from normal expectations of unwanted events or from worst-case reference scenarios, is thus extremely challenging. A specific method named Dynamic Procedure for Atypical Scenarios Identification (DyPASI) was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed, by systematizing information from early signals of risk related to past events, near misses and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool for obtaining a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying accident causes of atypical events, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, demonstrating that its synergy with DyPASI would be an adequate strategy to improve hazard identification methodologies towards the capture of atypical accident scenarios.
Abstract:
Although mitigating GHG emissions is necessary to reduce the overall negative impacts of climate change on crop yields and agricultural production, certain mitigation measures may generate unintended consequences for food availability and access because of land use competition and the economic burden of mitigation. Prior studies have examined the co-impacts on food availability and global producer prices caused by alternative climate policies. More recent studies have looked at the reduction in total caloric intake driven by both changing income and changing food prices under one specific climate policy. However, because calorie demand is inelastic, consumers’ well-being is likely further reduced by increased food expenditures. Building on the existing literature, my dissertation explores how alternative climate policy designs might adversely affect both caloric intake and the staple food budget share to 2050, using the Global Change Assessment Model (GCAM) and a post-estimated metric of food availability and access (FAA). The dissertation first develops a set of new metrics and methods to explore new perspectives on food availability and access under new conditions. The FAA metric consists of two components: the fraction of GDP per capita spent on five categories of staple food, and total caloric intake relative to a reference level. Testing the metric against alternative expectations of the future yields results consistent with previous studies: economic growth dominates the improvement of FAA. As ambition increases towards stringent climate targets, two policy conditions tend to have large impacts on FAA, driven by competing land use and increasing food prices. Strict conservation policies confine the competition between bioenergy and agricultural production to existing commercial land, whereas pricing terrestrial carbon encourages large-scale afforestation. To avoid unintended outcomes for food availability and access for the poor, pricing land emissions in frontier forests has the advantage of selecting more productive land for agricultural activities compared with a full conservation approach, but the land carbon price should not be linked to the price of energy system emissions. These results are highly relevant to effective policy-making to reduce land use change emissions, such as Reducing Emissions from Deforestation and Forest Degradation (REDD).
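As a purely illustrative aid, the two components of an FAA-style metric described above (staple food budget share and caloric intake relative to a reference level) can be computed as in the Python sketch below; the variable names and values are assumptions for illustration rather than the dissertation's actual GCAM post-processing.

# Illustrative food availability and access (FAA) style components: staple food
# budget share and relative caloric intake. All numbers are hypothetical and are
# not GCAM outputs.

def faa_components(staple_expenditure, gdp_per_capita, kcal_per_capita, kcal_reference):
    budget_share = staple_expenditure / gdp_per_capita   # fraction of income spent on staples
    relative_intake = kcal_per_capita / kcal_reference   # caloric intake vs. reference level
    return budget_share, relative_intake

# Hypothetical scenario outputs for one region in 2050.
share, intake = faa_components(
    staple_expenditure=450.0,   # USD per capita per year on five staple categories
    gdp_per_capita=6000.0,      # USD per capita
    kcal_per_capita=2350.0,     # kcal per capita per day
    kcal_reference=2500.0,      # reference intake level
)
print(f"Staple budget share: {share:.1%}, relative caloric intake: {intake:.2f}")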
Abstract:
Background The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing.

Objectives To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN) with the conventional pulmonary artery catheter (PAC) using both thermodilution and Fick methods.

Design Prospective observational study.

Setting and participants Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary level cardiothoracic hospital.

Methods USCOM CO was compared with thermodilution measurements via PAC and with CO estimated using a modified Fick equation. The PAC was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use by an ICU RN, and a line of best fit was used to describe the operator learning curve.

Results In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (±standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min. The percent limits of agreement (LOA) for all measures were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time reduced on average by 0.6 min per measure, to less than 10 min at the end of the study.

Conclusions In 80% of our cohort, USCOM, PAC and Fick measures of CO showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was found to be satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
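For context on the agreement statistics quoted in the Results, the Python sketch below shows a generic Bland–Altman style calculation of bias, precision, and percentage limits of agreement for two paired CO methods; the paired values are invented and do not reproduce the study data.

# Minimal Bland-Altman style agreement analysis for two cardiac output methods.
# The paired measurements below are made up for illustration only.
import statistics

uscom = [3.1, 4.2, 5.0, 6.1, 2.8, 4.9]   # L/min, method 1
pac   = [3.4, 4.6, 5.3, 6.5, 3.1, 5.2]   # L/min, method 2 (thermodilution)

diffs = [a - b for a, b in zip(uscom, pac)]
means = [(a + b) / 2 for a, b in zip(uscom, pac)]

bias = statistics.mean(diffs)                  # mean difference
sd = statistics.stdev(diffs)                   # precision
loa = (bias - 1.96 * sd, bias + 1.96 * sd)     # 95% limits of agreement

# Percentage differences relative to the average of the two methods.
pct = [100 * d / m for d, m in zip(diffs, means)]
pct_bias = statistics.mean(pct)
pct_sd = statistics.stdev(pct)

print(f"Bias {bias:.2f} L/min, 95% LOA {loa[0]:.2f} to {loa[1]:.2f} L/min")
print(f"Percent LOA {pct_bias - 1.96 * pct_sd:.1f}% to {pct_bias + 1.96 * pct_sd:.1f}%")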
Abstract:
Traditionally, transport disadvantage has been identified using accessibility analysis, although the effectiveness of the accessibility planning approach in improving access to goods and services is not known. This paper undertakes a comparative assessment of measures of mobility, accessibility, and participation used to identify transport disadvantage, using the concept of activity spaces. Seven-day activity-travel diary data for 89 individuals were collected from two case study areas located in rural Northern Ireland. A spatial analysis was conducted to select the case study areas using criteria derived from the literature. The criteria relate to the levels of area accessibility and area mobility, which are known to influence the nature of transport disadvantage. Using the activity-travel diary data, individuals’ weekly as well as day-to-day variations in activity-travel patterns were visualised. A model was developed using the ArcGIS ModelBuilder tool and was run to derive scores for individual levels of mobility, accessibility, and participation in activities from the geovisualisation. Using these scores, a multiple regression analysis was conducted to identify patterns of transport disadvantage. This study found a positive association between mobility and accessibility, between mobility and participation, and between accessibility and participation in activities. However, area accessibility and area mobility were found to have little impact on individual mobility, accessibility, and participation in activities. Income vis-à-vis car-ownership was found to have a significant impact on individual levels of mobility and accessibility, whereas participation in activities was found to be a function of individual levels of income and occupational status.
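A minimal Python sketch of the kind of multiple regression described above (participation scores regressed on mobility and accessibility scores) is given below; the toy data and ordinary least-squares fit are illustrative assumptions, not the study's ArcGIS-derived scores or model.

# Sketch of a multiple regression of activity participation on mobility and
# accessibility scores. Data are toy values, not the study's diary-derived scores.
import numpy as np

mobility      = np.array([12.0, 8.5, 15.2, 6.1, 10.3, 9.8, 14.0, 7.2])
accessibility = np.array([ 5.1, 3.2,  6.8, 2.5,  4.4, 4.0,  6.1, 2.9])
participation = np.array([20.0, 14.0, 26.0, 10.0, 18.0, 16.0, 24.0, 12.0])

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(mobility), mobility, accessibility])
coef, *_ = np.linalg.lstsq(X, participation, rcond=None)

print("intercept, mobility, accessibility coefficients:", np.round(coef, 3))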
Abstract:
Enterococci are versatile Gram-positive bacteria that can survive under extreme conditions. Most enterococci are non-virulent and are found in the gastrointestinal tract of humans and animals. Other strains are opportunistic pathogens that contribute to a large number of nosocomial infections globally. Epidemiological studies have demonstrated a direct relationship between the density of enterococci in surface waters and the risk of swimmer-associated gastroenteritis. The distribution of infectious enterococcal strains from the hospital environment or other sources to environmental water bodies, through sewage discharge or other means, could increase the prevalence of these strains in the human population. Environmental water quality studies may benefit from focusing on a subset of Enterococcus spp. that are consistently associated with sources of faecal pollution, such as domestic sewage, rather than testing for the entire genus. E. faecalis and E. faecium are potentially good focal species for such studies, as they have been consistently identified as the dominant Enterococcus spp. in human faeces and sewage. In addition, enterococcal infections are predominantly caused by E. faecalis and E. faecium. The characterisation of E. faecalis and E. faecium is important for studying their population structures, particularly in environmental samples. By developing and implementing rapid, robust molecular genotyping techniques, it is possible to establish more accurately the relationship between human and environmental enterococci. Of particular importance is determining the distribution of high-risk enterococcal clonal complexes, such as E. faecium clonal complex 17 and E. faecalis clonal complexes 2 and 9, in recreational waters. These clonal complexes are recognised as particularly pathogenic enterococcal genotypes that cause severe disease in humans globally. The Pimpama-Coomera watershed is located in South East Queensland, Australia, and was investigated in this study mainly because it is used intensively for agricultural and recreational purposes and has a strong anthropogenic impact. The primary aim of this study was to develop novel, universally applicable, robust, rapid and cost-effective genotyping methods that are likely to yield more definitive results for the routine monitoring of E. faecalis and E. faecium, particularly in environmental water sources. To fulfil this aim, new genotyping methods were developed based on the interrogation of highly informative single nucleotide polymorphisms (SNPs) located in housekeeping genes of both E. faecalis and E. faecium. SNP genotyping was successfully applied in field investigations of the Coomera watershed, South-East Queensland, Australia. E. faecalis and E. faecium isolates were grouped into 29 and 23 SNP profiles, respectively. This study showed the high longitudinal diversity of E. faecalis and E. faecium over a period of two years, and both human-related and human-specific SNP profiles were identified. Furthermore, 4.25% of E. faecium strains isolated from water were found to correspond to the important clonal complex 17 (CC17). Strains that belong to CC17 cause the majority of hospital outbreaks and clinical infections globally. Of the six sampling sites of the Coomera River, Paradise Point had the highest number of human-related and human-specific E. faecalis and E. faecium SNP profiles. The secondary aim of this study was to determine the antibiotic-resistance profiles and virulence traits associated with environmental E. faecalis and E. faecium isolates compared with human pathogenic E. faecalis and E. faecium isolates. This was performed to predict the potential health risks associated with coming into contact with these strains in the Coomera watershed. In general, clinical isolates were found to be more resistant to all the antibiotics tested than water isolates, and they harboured more virulence traits. Multi-drug resistance was more prevalent in clinical isolates (71.18% of E. faecalis and 70.3% of E. faecium) than in water isolates (only 5.66% of E. faecium). However, tetracycline, gentamicin, ciprofloxacin and ampicillin resistance was observed in water isolates. The virulence gene esp was the most prevalent virulence determinant observed in clinical isolates (67.79% of E. faecalis and 70.37% of E. faecium), and this gene has been described as a human-specific marker for microbial source tracking (MST). The presence of esp in water isolates (16.36% of E. faecalis and 19.14% of E. faecium) could be indicative of human faecal contamination in these waterways. Finally, in order to compare overall gene expression between environmental and clinical strains of E. faecalis, a comparative gene hybridisation study was performed. The results of this investigation clearly demonstrated the up-regulation of genes associated with pathogenicity in E. faecalis isolated from water. The expression study was performed at physiological temperature relative to ambient temperature. The up-regulation of virulence genes demonstrates that environmental strains of E. faecalis can pose an increased health risk and can lead to serious disease, particularly if these strains belong to the virulent CC17 group. The genotyping techniques developed in this study not only provide a rapid, robust and highly discriminatory tool to characterise E. faecalis and E. faecium, but also enable the efficient identification of virulent enterococci distributed in environmental water sources.
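The grouping of isolates into SNP profiles can be illustrated with the short Python sketch below, in which isolates sharing identical alleles at a set of informative housekeeping-gene SNP positions fall into the same profile; the isolate names, loci, and alleles are invented for illustration.

# Illustrative grouping of isolates into SNP profiles: isolates with identical
# alleles at a set of informative housekeeping-gene SNP positions share a profile.
# All isolate names and alleles are invented.
from collections import defaultdict

isolates = {
    "water_01":    ("A", "G", "T", "C"),
    "water_02":    ("A", "G", "T", "T"),
    "clinical_01": ("A", "G", "T", "C"),
    "clinical_02": ("G", "A", "T", "C"),
}

profiles = defaultdict(list)
for name, alleles in isolates.items():
    profiles[alleles].append(name)

for i, (alleles, members) in enumerate(sorted(profiles.items()), start=1):
    print(f"SNP profile {i}: alleles={''.join(alleles)} members={members}")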
Abstract:
With the emergence of Unmanned Aircraft Systems (UAS) there is a growing need for safety standards and regulatory frameworks to manage the risks associated with their operations. The primary driver for airworthiness regulations (i.e., those governing the design, manufacture, maintenance and operation of UAS) is the risk presented to people in the regions overflown by the aircraft. Models characterising the nature of these risks are needed to inform the development of airworthiness regulations. The output from these models should include measures of collective, individual and societal risk. A brief review of these measures is provided. Based on the review, it was determined that a model of the operation of a UAS over inhabited areas must be capable of describing the distribution of possible impact locations, given a failure at a particular point in the flight plan. Existing models either do not take the impact distribution into consideration, or propose complex and computationally expensive methods for its calculation. A computationally efficient approach for estimating the boundary (and in turn the area) of the impact distribution for fixed-wing unmanned aircraft is proposed. A series of geometric templates that approximate the impact distributions is derived using an empirical analysis of the results obtained from a 6-Degree-of-Freedom (6DoF) simulation. The impact distributions can be aggregated to provide impact footprint distributions for a range of generic phases of flight and missions. The maximum impact footprint areas obtained from the geometric templates are shown to have a relative error of typically less than 1% compared with the areas calculated using the computationally more expensive 6DoF simulation. Computation times for the geometric models are on the order of one second or less using a standard desktop computer. Future work includes characterising the distribution of impact locations within the footprint boundaries.
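As a rough, hypothetical stand-in for the geometric templates mentioned above (the thesis derives its templates empirically from 6DoF simulation results), the Python sketch below bounds the impact footprint of a gliding fixed-wing aircraft with a simple ellipse parameterised by assumed glide ratios; none of the parameter values come from the study.

# Hypothetical elliptical template for the impact footprint of a fixed-wing
# unmanned aircraft after a failure at altitude h. Not the thesis's templates,
# which were fitted empirically to 6-DoF simulation results.
import math

def footprint_area(altitude_m, glide_ratio_max, glide_ratio_min, lateral_fraction=0.3):
    """Approximate the reachable impact region as an ellipse whose along-track
    extent runs from the minimum to the maximum glide range ahead of the failure
    point, with a lateral half-width taken as a fraction of the maximum range."""
    r_max = glide_ratio_max * altitude_m
    r_min = glide_ratio_min * altitude_m
    semi_major = (r_max - r_min) / 2.0
    semi_minor = lateral_fraction * r_max
    return math.pi * semi_major * semi_minor

area = footprint_area(altitude_m=120.0, glide_ratio_max=10.0, glide_ratio_min=2.0)
print(f"Approximate footprint area: {area / 1e4:.1f} hectares")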
Abstract:
Climate change presents risks to health that must be addressed by both decision-makers and public health researchers. Within the application of Environmental Health Impact Assessment (EHIA), there have been few attempts to incorporate climate change-related health risks as an input to the framework. This study used a focus group design to examine the perceptions of government, industry and academic specialists about the suitability of assessing the health consequences of climate change within an EHIA framework. Practitioners expressed concern over a number of factors relating to the current EHIA methodology and the inclusion of climate change-related health risks. These concerns related to the broad scope of issues that would need to be considered, problems with identifying appropriate health indicators, the limited amount of relevant qualitative information currently incorporated in assessments, and persistent issues surrounding stakeholder participation. It was suggested that improvements are needed in data collection processes, particularly in terms of adequate communication between environmental and health practitioners. Concerns were also raised about data privacy and usage, and how these could affect the assessment process. These findings may provide guidance for government and industry bodies seeking to improve the assessment of climate change-related health risks.
Abstract:
Background & Aims Nutrition screening and assessment enable early identification of malnourished people and those at risk of malnutrition. Appropriate assessment tools assist with informing and monitoring nutrition interventions. Tool choice needs to be appropriate to the population and setting.

Methods Community-dwelling people with Parkinson’s disease (>18 years) were recruited. Body mass index (BMI) was calculated from weight and height. Participants were classified as underweight according to World Health Organisation (WHO) (≤18.5 kg/m2) and age-specific (<65 years, ≤18.5 kg/m2; ≥65 years, ≤23.5 kg/m2) cut-offs. The Mini-Nutritional Assessment (MNA) screening (MNA-SF) and total assessment scores were calculated. The Patient-Generated Subjective Global Assessment (PG-SGA), including the Subjective Global Assessment (SGA), was performed. Sensitivity, specificity, positive predictive value, negative predictive value and weighted kappa statistic of each of the above compared to SGA were determined.

Results Median age of the 125 participants was 70.0 (35–92) years. Age-specific BMI (Sn 68.4%, Sp 84.0%) performed better than WHO (Sn 15.8%, Sp 99.1%) categories. MNA-SF performed better (Sn 94.7%, Sp 78.3%) than both BMI categorisations for screening purposes. MNA had higher specificity but lower sensitivity than PG-SGA (MNA Sn 84.2%, Sp 87.7%; PG-SGA Sn 100.0%, Sp 69.8%).

Conclusions BMI lacks sensitivity to identify malnourished people with Parkinson’s disease and should be used with caution. The MNA-SF may be a better screening tool in people with Parkinson’s disease. The PG-SGA performed well and may assist with informing and monitoring nutrition interventions. Further research should be conducted to validate screening and assessment tools in Parkinson’s disease.
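For reference, the diagnostic statistics reported above are derived from a 2x2 table against the SGA classification; the Python sketch below shows the standard calculations using invented counts rather than the study data.

# Sensitivity, specificity and predictive values of a screening tool against
# SGA-defined malnutrition. The 2x2 counts below are invented for illustration.

tp, fp, fn, tn = 18, 13, 1, 93   # tool positive/negative vs. SGA malnourished/well-nourished

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

print(f"Sn {sensitivity:.1%}  Sp {specificity:.1%}  PPV {ppv:.1%}  NPV {npv:.1%}")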
Abstract:
In earlier cultures and societies, hazards and risks to human health were dealt with by methods derived from myth, metaphor and ritual. In modern society, however, notions of hazard and risk have been transformed from the level of a folk discourse to that of an expert-centred concept (Plough & Krimsky, 1987). With the professionalization of risk and hazard analysis came a preferred framework for decision making based on a range of 'technical' methodologies (Giere, 1991). This is especially true for decision processes relating to risk assessment and management, and to impact assessment. Such approaches, however, often entail narrow, technically based theoretical assumptions about human behaviour and the natural world, and about the methods used. They therefore carry 'in-built' error factors that contribute considerable uncertainty to the results.
Abstract:
Introduction This investigation aimed to assess the consistency and accuracy of radiation therapists (RTs) performing cone beam computed tomography (CBCT) alignment to fiducial markers (FMs) (CBCTFM) and the soft tissue prostate (CBCTST).

Methods Six patients receiving prostate radiation therapy underwent daily CBCTs. Manual alignment of CBCTFM and CBCTST was performed by three RTs. Inter-observer agreement was assessed using a modified Bland–Altman analysis for each alignment method. Clinically acceptable 95% limits of agreement with the mean (LoAmean) were defined as ±2.0 mm for CBCTFM and ±3.0 mm for CBCTST. Differences between CBCTST alignment and the observer-averaged CBCTFM (AvCBCTFM) alignment were analysed. Clinically acceptable 95% LoA were defined as ±3.0 mm for the comparison of CBCTST and AvCBCTFM.

Results CBCTFM and CBCTST alignments were performed for 185 images. The CBCTFM 95% LoAmean were within ±2.0 mm in all planes. CBCTST 95% LoAmean were within ±3.0 mm in all planes. Comparison of CBCTST with AvCBCTFM resulted in 95% LoA of −4.9 to 2.6, −1.6 to 2.5 and −4.7 to 1.9 mm in the superior–inferior, left–right and anterior–posterior planes, respectively.

Conclusions Significant differences were found between soft tissue alignment and the predicted FM position. FMs are useful in reducing inter-observer variability compared with soft tissue alignment. Consideration needs to be given to margin design when using soft tissue matching due to increased inter-observer variability. This study highlights some of the complexities of soft tissue guidance for prostate radiation therapy.
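The "limits of agreement with the mean" approach used above compares each observer's alignment with the average of all observers for the same image; a generic Python sketch of that calculation, with invented millimetre offsets and without the study's specific corrections, is shown below.

# Sketch of inter-observer "limits of agreement with the mean": each observer's
# alignment is compared with the average of all observers for the same image.
# The millimetre offsets below are invented for illustration.
import statistics

# Rows = images, columns = observers (alignment offsets in mm along one axis).
alignments = [
    [1.2, 0.8, 1.0],
    [-0.5, -0.2, -0.4],
    [2.1, 1.8, 2.4],
    [0.0, 0.3, -0.1],
]

deviations = []
for row in alignments:
    mean_obs = statistics.mean(row)
    deviations.extend(value - mean_obs for value in row)

bias = statistics.mean(deviations)   # close to zero by construction
sd = statistics.stdev(deviations)
print(f"95% LoA with the mean: {bias - 1.96 * sd:.2f} to {bias + 1.96 * sd:.2f} mm")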
Abstract:
BACKGROUND Quantification of the disease burden caused by different risks informs prevention by providing an account of health loss different to that provided by a disease-by-disease analysis. No complete revision of global disease burden caused by risk factors has been done since a comparative risk assessment in 2000, and no previous analysis has assessed changes in burden attributable to risk factors over time.

METHODS We estimated deaths and disability-adjusted life years (DALYs; sum of years lived with disability [YLD] and years of life lost [YLL]) attributable to the independent effects of 67 risk factors and clusters of risk factors for 21 regions in 1990 and 2010. We estimated exposure distributions for each year, region, sex, and age group, and relative risks per unit of exposure by systematically reviewing and synthesising published and unpublished data. We used these estimates, together with estimates of cause-specific deaths and DALYs from the Global Burden of Disease Study 2010, to calculate the burden attributable to each risk factor exposure compared with the theoretical-minimum-risk exposure. We incorporated uncertainty in disease burden, relative risks, and exposures into our estimates of attributable burden.

FINDINGS In 2010, the three leading risk factors for global disease burden were high blood pressure (7·0% [95% uncertainty interval 6·2-7·7] of global DALYs), tobacco smoking including second-hand smoke (6·3% [5·5-7·0]), and alcohol use (5·5% [5·0-5·9]). In 1990, the leading risks were childhood underweight (7·9% [6·8-9·4]), household air pollution from solid fuels (HAP; 7·0% [5·6-8·3]), and tobacco smoking including second-hand smoke (6·1% [5·4-6·8]). Dietary risk factors and physical inactivity collectively accounted for 10·0% (95% UI 9·2-10·8) of global DALYs in 2010, with the most prominent dietary risks being diets low in fruits and those high in sodium. Several risks that primarily affect childhood communicable diseases, including unimproved water and sanitation and childhood micronutrient deficiencies, fell in rank between 1990 and 2010, with unimproved water and sanitation accounting for 0·9% (0·4-1·6) of global DALYs in 2010. However, in most of sub-Saharan Africa childhood underweight, HAP, and non-exclusive and discontinued breastfeeding were the leading risks in 2010, while HAP was the leading risk in south Asia. The leading risk factor in Eastern Europe, most of Latin America, and southern sub-Saharan Africa in 2010 was alcohol use; in most of Asia, North Africa and Middle East, and central Europe it was high blood pressure. Despite declines, tobacco smoking including second-hand smoke remained the leading risk in high-income north America and western Europe. High body-mass index has increased globally and it is the leading risk in Australasia and southern Latin America, and also ranks high in other high-income regions, North Africa and Middle East, and Oceania.

INTERPRETATION Worldwide, the contribution of different risk factors to disease burden has changed substantially, with a shift away from risks for communicable diseases in children towards those for non-communicable diseases in adults. These changes are related to the ageing population, decreased mortality among children younger than 5 years, changes in cause-of-death composition, and changes in risk factor exposures. New evidence has led to changes in the magnitude of key risks including unimproved water and sanitation, vitamin A and zinc deficiencies, and ambient particulate matter pollution.
The extent to which the epidemiological shift has occurred and what the leading risks currently are varies greatly across regions. In much of sub-Saharan Africa, the leading risks are still those associated with poverty and those that affect children.
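Estimates such as these are usually reported with uncertainty intervals summarised from draws of the underlying quantities; the Python sketch below shows a generic way to rank risks by attributable DALYs and derive a 95% interval from simulated draws, using invented figures rather than GBD results.

# Illustrative ranking of risk factors by attributable DALYs, with 95% intervals
# summarised from simulated draws. All figures are invented, not GBD estimates.
import random

random.seed(0)
risks = {                          # hypothetical mean attributable DALYs (millions)
    "high blood pressure": 170.0,
    "tobacco smoking": 155.0,
    "alcohol use": 135.0,
}

def draws(mean, rel_sd=0.06, n=1000):
    """Simulate draws around a mean with a relative standard deviation."""
    return [random.gauss(mean, rel_sd * mean) for _ in range(n)]

for name, mean in sorted(risks.items(), key=lambda kv: kv[1], reverse=True):
    d = sorted(draws(mean))
    lo, hi = d[int(0.025 * len(d))], d[int(0.975 * len(d))]
    print(f"{name}: {mean:.0f} (95% UI {lo:.0f}-{hi:.0f}) million DALYs")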