881 results for Disease severity
Abstract:
This study assessed the health-related quality of life (HRQoL), fatigue and physical activity levels of 28 persons with chronic kidney disease (CKD) on initial administration of an erythropoietin stimulating agent, and at 3 months, 6 months and 12 months. The sample comprised 15 females and 13 males whose ages ranged from 31 to 84 years. Physical activity was measured using the Human Activity Profile (HAP), which covers self-care, personal/household work, entertainment/social and independent exercise activities. Quality of life was measured using the SF-36, which gives scores on physical health (physical functioning, role-physical, bodily pain and general health) and mental health (vitality, social functioning, role-emotional and emotional well-being). Fatigue was measured by the Fatigue Severity Scale (FSS). Across all time points the renal sample engaged in considerably fewer HAP personal/household work activities and entertainment/social activities compared to healthy adults. The normative sample engaged in three times more independent exercise activities than the renal patients. One-way repeated measures ANOVAs indicated a significant change over time for the SF-36 scales of role-physical, vitality, emotional well-being and overall mental health. There was a significant difference in fatigue levels over time [F(3,11) = 3.78, p<.05]. Fatigue was highest at baseline and lowest at 6 months. The more breathlessness the CKD patient reported, the fewer activities undertaken and the greater the reported level of fatigue. There were no significant age differences over time for fatigue or physical activity. Age differences were only found for SF-36 mental health at 3 months (t=-2.41, df=14, p<.05): those younger than 65 years had lower emotional well-being compared to those aged over 65. Males had poorer physical health compared to females at 12 months. There were no significant gender differences on mental health at any time point. In the management of chronic kidney disease, early detection of a person's inability to engage in routine activities due to fatigue is necessary. Early detection would enable timely interventions to optimise HRQoL and independent exercise.
Abstract:
In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well documented. Other risk factors including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety have also been suggested to contribute to poor nutritional status. In Parkinson's disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition; however, the risks for malnutrition in this population are not well understood. The current study's aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data about age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke's Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with an average age of 70.2±9.3 (range 35-92) years and an average time since diagnosis of 7.3±5.9 (range 0-31) years. Average body mass index (BMI) was 26.0±5.5 kg/m². Of these, 15% (n=19) were malnourished (SGA-B). Multivariate logistic regression analysis revealed that older age (OR=1.16, CI=1.02-1.31), more depressive symptoms (OR=1.26, CI=1.07-1.48), lower levels of anxiety (OR=0.90, CI=0.82-0.99), and higher LDED per kg body weight (OR=1.57, CI=1.14-2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes, and proactively addressing the risk factors can help prevent declines in nutritional status. In the current study, older people with PD with depression and greater amounts of levodopa per body weight were at increased malnutrition risk.
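To make the two derived quantities above concrete, a minimal sketch follows: how a levodopa equivalent daily dose is normalised to body weight, and how a logistic-regression coefficient relates to the odds ratios quoted (OR = exp(beta) per unit increase). All values in the snippet are hypothetical and are not taken from the study.

```python
import math

def lded_per_kg(lded_mg: float, weight_kg: float) -> float:
    """Levodopa equivalent daily dose normalised to body weight (mg/kg)."""
    return lded_mg / weight_kg

# Hypothetical patient: 600 mg levodopa-equivalent per day, 70 kg body weight
dose = lded_per_kg(600, 70)            # ~8.6 mg/kg

# An odds ratio of 1.57 per 1 mg/kg corresponds to a log-odds coefficient of:
beta = math.log(1.57)
# so a patient receiving 2 mg/kg more than another has malnutrition odds multiplied by:
odds_multiplier = math.exp(2 * beta)   # ~2.46

print(f"{dose:.1f} mg/kg, odds multiplier {odds_multiplier:.2f}")
```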
Abstract:
Background Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs). Methods Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis. Findings Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa. Interpretation Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. 
Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world. Funding Bill & Melinda Gates Foundation.
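As a simplified illustration of the YLD construction described above (prevalence of each sequela weighted by its disability weight and summed over causes), the sketch below uses invented figures; the actual GBD analysis additionally involves DisMod-MR estimation, comorbidity simulation and uncertainty propagation, none of which is reproduced here.

```python
# Sequela-level inputs: (label, prevalent cases, disability weight). All figures invented.
sequelae = [
    ("moderate low back pain", 50_000, 0.05),
    ("major depressive episode", 20_000, 0.40),
    ("mild anaemia", 80_000, 0.005),
]

# Prevalence-based YLDs = prevalent cases x disability weight, summed over sequelae
ylds = sum(cases * weight for _, cases, weight in sequelae)
print(f"Total YLDs: {ylds:,.0f}")
```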
Abstract:
Migraine is a painful and debilitating neurovascular disease. Current migraine head pain treatments work with differing efficacies in migraineurs. The opioid system plays an important role in diverse biological functions including analgesia, drug response and pain reduction. The A118G single nucleotide polymorphism (SNP) in exon 1 of the μ-opioid receptor gene (OPRM1) has been associated with elevated pain responses and decreased pain threshold in a variety of populations. The aim of the current preliminary study was to test whether genotypes of the OPRM1 A118G SNP are associated with head pain severity in a clinical cohort of female migraineurs. A total of 153 sufferers of chronic migraine with aura were assessed for migraine head pain using the Migraine Disability Assessment Score instrument and classified into high and low pain severity groups. DNA was extracted and genotypes obtained for the A118G SNP. Logistic regression analysis adjusting for age effects showed the A118G SNP of the OPRM1 gene to be significantly associated with migraine pain severity in the test population (P = 0.0037). In particular, G118 allele carriers were more likely to be high pain sufferers compared to homozygous carriers of the A118 allele (OR = 3.125, 95% CI = 1.41, 6.93, P = 0.0037). These findings suggest that A118G genotypes of the OPRM1 gene may influence migraine-associated head pain in females. Further investigations are required to fully understand the effect of this gene variant on migraine head pain, including studies in males and in different migraine subtypes, as well as in response to head pain medication.
Abstract:
Background Viral and bacterial respiratory tract infections in early life are linked to the development of allergic airway inflammation and asthma. However, the mechanisms involved are not well understood. We have previously shown that neonatal and infant, but not adult, chlamydial lung infections in mice permanently alter inflammatory phenotype and physiology to increase the severity of allergic airway disease by increasing lung interleukin (IL)-13 expression, mucus hyper-secretion and airway hyper-responsiveness. This occurred through different mechanisms with infection at different ages. Neonatal infection suppressed inflammatory responses but enhanced systemic dendritic cell:T-cell IL-13 release and induced permanent alterations in lung structure (i.e., increased the size of alveoli). Infant infection enhanced inflammatory responses but had no effect on lung structure. Here we investigated the role of hematopoietic cells in these processes using bone marrow chimera studies. Methodology/Principal Findings Neonatal (<24-hours-old), infant (3-weeks-old) and adult (6-weeks-old) mice were infected with C. muridarum. Nine weeks after infection, bone marrow was collected and transferred into recipient age-matched irradiated naïve mice. Allergic airway disease was induced (8 weeks after adoptive transfer) by sensitization and challenge with ovalbumin. Reconstitution of irradiated naïve mice with bone marrow from mice infected as neonates resulted in the suppression of the hallmark features of allergic airway disease, including mucus hyper-secretion and airway hyper-responsiveness, which was associated with decreased IL-13 levels in the lung. In stark contrast, reconstitution with bone marrow from mice infected as infants increased the severity of allergic airway disease by increasing T helper type-2 cell cytokine release (IL-5 and IL-13), mucus hyper-secretion, airway hyper-responsiveness and IL-13 levels in the lung. Reconstitution with bone marrow from infected adult mice had no effect. Conclusions These results suggest that an infant chlamydial lung infection results in long-lasting alterations in hematopoietic cells that increase the severity of allergic airway disease in later life.
Abstract:
Aims and objectives This study sought to determine the relationship between health related quality of life (HRQoL), fatigue and activity levels of people with anaemia secondary to chronic kidney disease (CKD) over a 12 month period following the introduction of an erythropoietin stimulating agent (ESA). Background CKD occurs in five stages and is a complex chronic illness which severely impacts on an individual's HRQoL and ability to perform everyday activities. Fatigue is also a common symptom experienced by people with CKD. Design and methods Using a longitudinal repeated measures design, 28 people with CKD completed the SF-36, Human Activity Profile and Fatigue Severity Scale at the commencement of an ESA and then at 3, 6 and 12 months. Results Over the 12 month period, people reported a significant change in HRQoL in relation to role physical, vitality, mental health/emotional well-being and overall mental health. However, activity levels did not significantly improve during that time. Both the amount of breathlessness and the level of fatigue were highest at baseline and declined over time. Both fatigue and breathlessness were correlated with poorer reported general health over time. Conclusion Renal nurses, in dialysis units and CKD outpatient clinics, have repeated and frequent contact with people with CKD over long periods of time, and are in an ideal position to routinely assess fatigue and activity levels and to institute timely interventions. Early detection would enable timely nursing interventions to optimise HRQoL and independent activity. Relevance to clinical practice Drawing on rehabilitation nursing interventions could assist renal nurses to minimise the burden of fatigue and its impact on simple everyday activities and a person's quality of life. These interventions are important for people who are living at home and could assist in lowering the burden on home support services.
Abstract:
Objective To estimate the incidence and severity of invasive group A streptococcal infection in Victoria, Australia. Design Prospective active surveillance study. Setting Public and private laboratories, hospitals and general practitioners throughout Victoria. Patients People in Victoria diagnosed with group A streptococcal disease notified to the surveillance system between 1 March 2002 and 31 August 2004. Main outcome measure Confirmed invasive group A streptococcal disease. Results We identified 333 confirmed cases: an average annualised incidence rate of 2.7 (95% CI, 2.3-3.2) per 100 000 population per year. Rates were highest in people aged 65 years and older and those younger than 5 years. The case-fatality rate was 7.8%. Streptococcal toxic shock syndrome occurred in 48 patients (14.4%), with a case-fatality rate of 23%. Thirty cases of necrotising fasciitis were reported; five (17%) of these patients died. Type 1 (23%) was the most frequently identified emm sequence type in all age groups. All tested isolates were susceptible to penicillin and clindamycin. Two isolates (4%) were resistant to erythromycin. Conclusion The incidence of invasive group A streptococcal disease in temperate Australia is greater than previously appreciated and warrants greater public health attention, including its designation as a notifiable disease.
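A quick back-of-envelope check of the annualised rate reported above: the surveillance window (1 March 2002 to 31 August 2004) spans 2.5 years, and assuming a mid-period Victorian population of roughly 4.9 million (an assumption for illustration, not a figure from the study), 333 cases work out to about 2.7 per 100 000 per year.

```python
cases = 333
years = 2.5                 # 1 Mar 2002 - 31 Aug 2004
population = 4.9e6          # assumed mid-period population of Victoria (illustrative)

rate = cases / (population * years) * 1e5   # cases per 100 000 population per year
print(round(rate, 1))                       # ~2.7, consistent with the reported rate
```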
Abstract:
Background The implementation of the Australian Consumer Law in 2011 highlighted the need for better use of injury data to improve the effectiveness and responsiveness of product safety (PS) initiatives. In the PS system, resources are allocated to different priority issues using risk assessment tools. The rapid exchange of information (RAPEX) tool for prioritising hazards, developed by the European Commission, is currently being adopted in Australia. Injury data are required as a basic input to the RAPEX tool in the risk assessment process. One of the challenges in utilising injury data in the PS system is the complexity of translating detailed clinically coded data into broad categories such as those used in the RAPEX tool. Aims This study aims to translate hospital burns data into a simplified format by mapping International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification (ICD-10-AM) burn codes into RAPEX severity rankings, and to use these rankings to identify priority areas in childhood product-related burns data. Methods ICD-10-AM burn codes were mapped into four levels of severity using the RAPEX guide table by assigning rankings from 1 to 4, in order of increasing severity. RAPEX rankings were determined by the burn thickness and burn surface area (BSA), with information extracted from the fourth character of T20-T30 codes for burn thickness, and the fourth and fifth characters of T31 codes for the BSA. Following the mapping process, secondary data analysis of 2008-2010 Queensland Hospital Admitted Patient Data Collection (QHAPDC) paediatric data was conducted to identify priority areas in product-related burns. Results The application of RAPEX rankings to QHAPDC burn data showed that approximately 70% of paediatric burns in Queensland hospitals were categorised under RAPEX levels 1 and 2, 25% under RAPEX levels 3 and 4, and the remaining 5% were unclassifiable. In the PS system, prioritisations are made for issues categorised under RAPEX levels 3 and 4. Analysis of external cause codes within these levels showed that flammable materials (for children aged 10-15 years) and hot substances (for children aged under 2 years) were the most frequently identified products. Discussion and conclusions The mapping of ICD-10-AM burn codes into RAPEX rankings showed a favourable degree of compatibility between the two classification systems, suggesting that ICD-10-AM coded burn data can be simplified to more effectively support PS initiatives. Additionally, the secondary data analysis showed that only 25% of all admitted burn cases in Queensland were severe enough to trigger a PS response.
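The code-to-ranking step described in the Methods can be pictured with the following sketch. It reads the burn-degree digit (the fourth character) from an ICD-10-AM T20-T30 code and assigns a coarse ranking; the specific rules shown are hypothetical placeholders, since the published RAPEX guide table (which also incorporates burn surface area from the T31 codes) is not reproduced here.

```python
from typing import Optional

def rapex_rank(icd_code: str) -> Optional[int]:
    """Assign an illustrative severity ranking to an ICD-10-AM T20-T30 burn code."""
    if not (icd_code.startswith("T2") or icd_code.startswith("T30")):
        return None                              # not a burn-by-site code
    try:
        degree = int(icd_code.split(".")[1][0])  # fourth character = burn degree
    except (IndexError, ValueError):
        return None                              # no degree recorded
    # Hypothetical mapping by degree only: first degree -> 1, second -> 2, third -> 3;
    # the top RAPEX level would additionally require the BSA from T31 codes.
    return {1: 1, 2: 2, 3: 3}.get(degree)

print(rapex_rank("T24.2"))   # second-degree burn of hip/lower limb -> 2 (illustrative)
```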
Abstract:
Background Viral respiratory illness triggers asthma exacerbations, but the influence of respiratory illness on the acute severity and recovery of childhood asthma is unknown. Our objective was to evaluate the impact of a concurrent acute respiratory illness (based on a clinical definition and PCR detection of a panel of respiratory viruses, Mycoplasma pneumoniae and Chlamydia pneumoniae) on the severity and resolution of symptoms in children with a nonhospitalized exacerbation of asthma. Methods Subjects were children aged 2 to 15 years presenting to an emergency department for an acute asthma exacerbation and not hospitalized. Acute respiratory illness (ARI) was clinically defined. Nasopharyngeal aspirates (NPA) were examined for respiratory viruses, Chlamydia and Mycoplasma using PCR. The primary outcome was quality of life (QOL) on presentation, day 7 and day 14. Secondary outcomes were acute asthma severity score, asthma diary score, and cough diary score on days 5, 7, 10, and 14. Results On multivariate regression, presence of ARI was statistically but not clinically significantly associated with QOL score on presentation (B = 0.36, P = 0.025). By days 7 and 14, there was no difference between groups. Asthma diary score was significantly higher in children with ARI (B = 0.41, P = 0.039) on day 5 but not on presentation or subsequent days. Respiratory viruses were detected in 54% of the 78 NPAs obtained. There was no difference in any of the asthma outcomes between children grouped by positive or negative NPA. Conclusions The presence of a viral respiratory illness has a modest influence on asthma severity and does not influence recovery from a nonhospitalized asthma exacerbation.
Abstract:
Background Chronic kidney disease is a global public health problem of increasing prevalence. There are five stages of kidney disease, with stage 5 indicating end stage kidney disease (ESKD), at which point dialysis is required or death will eventually occur. Over the last two decades there have been increasing numbers of people commencing dialysis, and a majority of this increase has occurred in the population aged 65 years and over. In this older population it can be difficult to determine whether dialysis will provide any benefit over non-dialysis management. The poor prognosis for the population over 65 years raises issues around management of ESKD in this population. It is therefore important to review any research that has been undertaken in this area comparing outcomes of the older ESKD population who have commenced dialysis with those who have received non-dialysis management. Objective The primary objective was to assess the effect of dialysis compared with non-dialysis management for the population of 65 years and over with ESKD. Inclusion criteria Types of participants This review considered studies that included participants who were 65 years and older. These participants needed to have been diagnosed with ESKD for greater than three months and to be receiving either renal replacement therapy (RRT) (hemodialysis [HD] or peritoneal dialysis [PD]) or non-dialysis management. The settings for the studies included the home, self-care centre, satellite centre, hospital, hospice or nursing home. Types of intervention(s)/phenomena of interest This review considered studies where the intervention was RRT (HD or PD) for participants with ESKD. There was no restriction on the frequency of RRT or the length of time the participant received RRT. The comparator was participants who were not undergoing RRT. Types of studies This review considered both experimental and epidemiological study designs including randomized controlled trials, non-randomized controlled trials, quasi-experimental studies, before and after studies, prospective and retrospective cohort studies, case control studies and analytical cross sectional studies. This review also considered descriptive epidemiological study designs including case series, individual case reports and descriptive cross sectional studies for inclusion. This review included any of the following primary and secondary outcome measures:
• Primary outcome – survival measures
• Secondary outcomes – functional performance score (e.g. Karnofsky Performance score)
• Symptoms and severity of end stage kidney disease
• Hospital admissions
• Health related quality of life (e.g. KDQOL, SF-36 and HRQOL)
• Comorbidities (e.g. Charlson Comorbidity Index).
Abstract:
The transcriptome response of Atlantic salmon (Salmo salar) displaying advanced stages of amoebic gill disease (AGD) was investigated. Naïve smolt were challenged with AGD for 19 days, at which time all fish were euthanized and their severity of infection quantified through histopathological scoring. Gene expression profiles were compared between heavily infected and naïve individuals using a 17K Atlantic salmon cDNA microarray with real-time quantitative RT-PCR (qPCR) verification. Expression profiles were examined in the gill, anterior kidney, and liver. Twenty-seven transcripts were significantly differentially expressed within the gill; 20 of these transcripts were down-regulated in the AGD-affected individuals compared with naïve individuals. In contrast, only nine transcripts were significantly differentially expressed within the anterior kidney and five within the liver. Again, the majority of these transcripts were down-regulated within the diseased individuals. A down-regulation of transcripts involved in apoptosis (procathepsin L, cathepsin H precursor, and cystatin B) was observed in AGD-affected Atlantic salmon. Four transcripts encoding genes with antioxidant properties were also down-regulated in AGD-affected gill tissue according to qPCR analysis. The most up-regulated transcript within the gill was an unknown expressed sequence tag (EST) whose expression was 218-fold (± SE 66) higher within the AGD-affected gill tissue. Our results suggest that Atlantic salmon experiencing advanced stages of AGD demonstrate a general down-regulation of gene expression, which is most pronounced within the gill. We propose that this general gene suppression is parasite-mediated, thus allowing the parasite to withstand or ameliorate the host response.
Abstract:
BACKGROUND Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. METHODS We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and for 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with deaths at each age multiplied by a standardised lost life expectancy at that age. YLDs were calculated as prevalence of 1160 disabling sequelae, by age, sex, and cause, and weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. FINDINGS Global DALYs remained stable from 1990 (2·503 billion) to 2010 (2·490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010. YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. INTERPRETATION Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda, taking such patterns into account.
Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results.
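In compact form, the decomposition described in the Methods above can be written as follows (the notation is ours, not the study's):

```latex
% d_{a,c}: deaths at age a from cause c; L_a: standardised lost life expectancy at age a
% p_s: prevalence of disabling sequela s; w_s: its disability weight
\[
  \mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD},
  \qquad
  \mathrm{YLL} = \sum_{a,c} d_{a,c}\, L_{a},
  \qquad
  \mathrm{YLD} = \sum_{s} p_{s}\, w_{s}
\]
```

As stated in the abstract, neither term is age-weighted or discounted.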
Abstract:
BACKGROUND Measurement of the global burden of disease with disability-adjusted life-years (DALYs) requires disability weights that quantify health losses for all non-fatal consequences of disease and injury. There has been extensive debate about a range of conceptual and methodological issues concerning the definition and measurement of these weights. Our primary objective was a comprehensive re-estimation of disability weights for the Global Burden of Disease Study 2010 through a large-scale empirical investigation in which judgments about health losses associated with many causes of disease and injury were elicited from the general public in diverse communities through a new, standardised approach. METHODS We surveyed respondents in two ways: household surveys of adults aged 18 years or older (face-to-face interviews in Bangladesh, Indonesia, Peru, and Tanzania; telephone interviews in the USA) between Oct 28, 2009, and June 23, 2010; and an open-access web-based survey between July 26, 2010, and May 16, 2011. The surveys used paired comparison questions, in which respondents considered two hypothetical individuals with different, randomly selected health states and indicated which person they regarded as healthier. The web survey added questions about population health equivalence, which compared the overall health benefits of different life-saving or disease-prevention programmes. We analysed paired comparison responses with probit regression analysis on all 220 unique states in the study. We used results from the population health equivalence responses to anchor the results from the paired comparisons on the disability weight scale from 0 (implying no loss of health) to 1 (implying a health loss equivalent to death). Additionally, we compared new disability weights with those used in WHO's most recent update of the Global Burden of Disease Study for 2004. FINDINGS 13,902 individuals participated in household surveys and 16,328 in the web survey. Analysis of paired comparison responses indicated a high degree of consistency across surveys: correlations between individual survey results and results from analysis of the pooled dataset were 0·9 or higher in all surveys except in Bangladesh (r=0·75). Most of the 220 disability weights were located on the mild end of the severity scale, with 58 (26%) having weights below 0·05. Five (11%) states had weights below 0·01, such as mild anaemia, mild hearing or vision loss, and secondary infertility. The health states with the highest disability weights were acute schizophrenia (0·76) and severe multiple sclerosis (0·71). We identified a broad pattern of agreement between the old and new weights (r=0·70), particularly in the moderate-to-severe range. However, in the mild range below 0·2, many states had significantly lower weights in our study than previously. INTERPRETATION This study represents the most extensive empirical effort as yet to measure disability weights. By contrast with the popular hypothesis that disability assessments vary widely across samples with different cultural environments, we have reported strong evidence of highly consistent results.
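A minimal sketch (not the study's code) of the probit analysis of paired comparisons mentioned above: each response compares two hypothetical health states, the design matrix carries +1/-1 indicators for the pair (with one state fixed as a reference), and the fitted coefficients place the states on a common latent scale. The data below are simulated, and the anchoring of that scale to the 0-1 disability-weight range via the population-health-equivalence questions is not reproduced.

```python
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_states, n_resp = 5, 4000
true_loss = np.linspace(0.0, 2.0, n_states)   # latent health loss per state; state 0 = reference

X = np.zeros((n_resp, n_states - 1))          # state 0 dropped as the reference level
y = np.zeros(n_resp)
for r in range(n_resp):
    i, j = rng.choice(n_states, size=2, replace=False)   # states shown first and second
    if i > 0:
        X[r, i - 1] -= 1.0
    if j > 0:
        X[r, j - 1] += 1.0
    # respondent judges the first state healthier with probability Phi(loss_j - loss_i)
    y[r] = rng.random() < norm.cdf(true_loss[j] - true_loss[i])

fit = sm.Probit(y, X).fit(disp=False)
print(fit.params)    # approximately recovers true_loss[1:] relative to the reference state
```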
Abstract:
Emotional and role functioning difficulties are associated with chronic alcohol use and liver disease. Little is known about prospective changes in psychological and psychosocial functioning following orthotopic liver transplantation (OLT) amongst patients with alcoholic liver disease (ALD). We aimed to assess the functioning of this patient group post liver transplantation. Comprehensive psychosocial assessment of depression (Beck Depression Inventory [BDI]), anxiety (State-Trait Anxiety Inventory-Form X [STAI]) and psychosocial adjustment (Psychosocial Adjustment to Illness Scale-Self-Report version [PAIS-SR]) was conducted with 42 ALD patients available for pre- and post-OLT testing. Dependence severity was assessed by the Brief Michigan Alcoholism Screening Test (bMAST). Significant reductions in average anxiety and depression symptoms were observed 12 months post-OLT. Significant improvements in psychosocial adjustment to illness were also reported. Patients with higher levels of alcohol dependence severity at the pre-transplant assessment improved comparably to those with lower levels of dependence. In summary, the study found that OLT contributed to reducing overall levels of mood and anxiety symptoms in ALD patients, approximating general (non-clinical) population norms. Psychosocial adjustment also improved significantly post liver transplantation.