35 results for flock-level risk factors
in University of Queensland eSpace - Australia
Abstract:
SETTING: Hlabisa Tuberculosis Programme, Hlabisa, South Africa. OBJECTIVE: To determine trends in and risk factors for interruption of tuberculosis treatment. METHODS: Data were extracted from the control programme database starting in 1991. Temporal trends in treatment interruption are described; independent risk factors for treatment interruption were determined with a multiple logistic regression model, and Kaplan-Meier survival curves for treatment interruption were constructed for patients treated in 1994-1995. RESULTS: Overall 629 of 3610 surviving patients (17%) failed to complete treatment; this proportion increased from 11% (n = 79) in 1991/1992 to 22% (n = 201) in 1996. Independent risk factors for treatment interruption were diagnosis in 1994-1996 compared with 1991-1993 (odds ratio [OR] 1.9, 95% confidence interval [CI] 1.6-2.4); human immunodeficiency virus (HIV) positivity compared with HIV negativity (OR 1.8, 95% CI 1.4-2.4); supervision by village clinic compared with community health worker (OR 1.9, 95% CI 1.4-2.6); and male versus female sex (OR 1.3, 95% CI 1.1-1.6). Few patients interrupted treatment during the first 2 weeks, and the treatment interruption rate thereafter was constant at 1% per 14 days. CONCLUSIONS: The frequency of treatment interruption in this programme has increased recently. The strongest risk factor was year of diagnosis, perhaps reflecting the impact of an increased caseload on programme performance. Ensuring adherence to therapy in communities with a high level of migration remains a challenge even within community-based directly observed therapy programmes.
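The odds ratios above come from a multiple logistic regression model. As a minimal sketch of the unadjusted quantity being modelled, a crude odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table; the counts below are invented for illustration, not taken from the study:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: 20/100 exposed vs 10/100 unexposed interrupted treatment
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

An adjusted model would instead estimate each factor's OR while holding the others constant; this crude version shows only the single-factor association.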
Abstract:
OBJECTIVE To determine the prevalence, intensity and associated risk factors for infection with Ascaris, hookworms and Trichuris in three tea-growing communities in Assam, India. METHODS Single faecal samples were collected from 328 individuals and subjected to centrifugal flotation and the Kato-Katz quantitation technique, and the prevalence and intensity of infection with each parasite were calculated. Associations between parasite prevalence, intensity, and host and environmental factors were then examined using both univariate and multivariate analysis. RESULTS The overall prevalence of Ascaris was 38% [95% confidence interval (CI): 33, 43], and the individual prevalence of hookworm and Trichuris was 43% (95% CI: 38, 49). The strongest predictors for the intensity of one or more geohelminths using multiple regression (P less than or equal to 0.10) were socioeconomic status, age, household crowding, level of education, religion, use of footwear when outdoors, defecation practices, pig ownership and water source. CONCLUSION A universal blanket treatment with broad-spectrum anthelmintics together with promotion of scholastic and health education and improvements in sanitation is recommended for helminth control in the communities under study.
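The Kato-Katz technique converts a slide egg count into eggs per gram of faeces via a fixed multiplier, and the prevalence figures above carry Wald-style confidence intervals. A minimal sketch of both calculations follows; the multiplier of 24 is the standard factor for a 41.7 mg template and should be confirmed for the kit actually used, and the count of 125 positives is a hypothetical figure chosen to reproduce roughly 38% of n = 328:

```python
from math import sqrt

KATO_KATZ_MULTIPLIER = 24  # standard for a 41.7 mg template; assumption, verify per kit

def eggs_per_gram(egg_count, multiplier=KATO_KATZ_MULTIPLIER):
    """Convert a Kato-Katz slide egg count to eggs per gram of faeces."""
    return egg_count * multiplier

def prevalence_ci(positives, n, z=1.96):
    """Point prevalence with a simple Wald 95% CI."""
    p = positives / n
    se = sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

epg = eggs_per_gram(10)            # 10 eggs on the slide
p, lo, hi = prevalence_ci(125, 328)  # hypothetical positives out of 328 sampled
```

With 125/328 positive, this returns a prevalence near 0.38 with an interval close to the (33, 43) reported, which suggests a Wald-type interval is consistent with the abstract's figures.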
Abstract:
Background Estimates of the disease burden due to multiple risk factors can show the potential gain from combined preventive measures. But few such investigations have been attempted, and none on a global scale. Our aim was to estimate the potential health benefits from removal of multiple major risk factors. Methods We assessed the burden of disease and injury attributable to the joint effects of 20 selected leading risk factors in 14 epidemiological subregions of the world. We estimated population attributable fractions, defined as the proportional reduction in disease or mortality that would occur if exposure to a risk factor were reduced to an alternative level, from data for risk factor prevalence and hazard size. For every disease, we estimated joint population attributable fractions, for multiple risk factors, by age and sex, from the direct contributions of individual risk factors. To obtain the direct hazards, we reviewed publications and re-analysed cohort data to account for that part of hazard that is mediated through other risks. Results Globally, an estimated 47% of premature deaths and 39% of total disease burden in 2000 resulted from the joint effects of the risk factors considered. These risks caused a substantial proportion of important diseases, including diarrhoea (92-94%), lower respiratory infections (55-62%), lung cancer (72%), chronic obstructive pulmonary disease (60%), ischaemic heart disease (83-89%), and stroke (70-76%). Removal of these risks would have increased global healthy life expectancy by 9.3 years (17%), ranging from 4.4 years (6%) in the developed countries of the western Pacific to 16.1 years (43%) in parts of sub-Saharan Africa. Interpretation Removal of major risk factors would not only increase healthy life expectancy in every region, but also reduce some of the differences between regions. The potential for disease prevention and health gain from tackling major known risks simultaneously would be substantial.
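The population attributable fraction (PAF) defined above is commonly computed with Levin's formula, and joint PAFs for several factors are often approximated by combining individual fractions multiplicatively. This is a simplified sketch of those two steps, not the study's exact method (which used direct hazards adjusted for mediation through other risks); the prevalence and relative-risk inputs are illustrative assumptions:

```python
def paf(prevalence, rr):
    """Population attributable fraction for one risk factor
    via Levin's formula: p(RR-1) / (1 + p(RR-1))."""
    x = prevalence * (rr - 1)
    return x / (1 + x)

def joint_paf(pafs):
    """Joint PAF for several factors, assuming independent,
    multiplicative effects: 1 - prod(1 - PAF_i).
    A simplification; overlapping causal pathways would need adjustment."""
    remaining = 1.0
    for f in pafs:
        remaining *= (1 - f)
    return 1 - remaining

# Hypothetical: two risk factors, each at 50% prevalence with RR = 2
single = paf(0.5, 2.0)
combined = joint_paf([single, single])
```

The joint fraction is always less than the sum of the individual fractions, which is why attributable burdens for single risks cannot simply be added.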
Abstract:
To determine the occurrence of delirium in oncology inpatients and to identify and evaluate admission characteristics associated with the development of delirium during inpatient admission, a prospective observational study was conducted of 113 patients with a total of 145 admissions with a histological diagnosis of cancer admitted to the oncology unit over a period of ten weeks. At the point of inpatient admission, all patients were assessed for the presence of potential risk factors for the development of delirium. During the index admission patients were assessed daily for the presence of delirium using the Confusion Assessment Method. Delirium was confirmed by clinician assessment. Delirium developed in 26 of 145 admissions (18%), and 32 episodes of delirium were recorded, with 6 patients having 2 episodes of delirium during the index admission. Delirium occurred on average 3.3 days into the admission. The average duration of an episode of delirium was 2.1 days. Four patients with delirium (15%) died. All other cases of delirium were reversed. Factors significantly associated with development of delirium on multivariate analysis were: advanced age, cognitive impairment, low albumin level, bone metastases, and the presence of hematological malignancy. Hospital inpatient admission was significantly longer in the delirium group (mean: 8.8 days vs 4.5 days in the nondelirium group, P
Abstract:
Ross River virus (RRV) is a mosquito-borne arbovirus responsible for outbreaks of polyarthritic disease throughout Australia. To better understand the human and environmental factors driving such events, 57 historical reports on RRV outbreaks between 1896 and 1998 were examined collectively. The magnitude, regularity, seasonality, and locality of outbreaks were found to be wide ranging; however, analysis of climatic and tidal data highlighted that environmental conditions act differently in tropical, arid, and temperate regions. Overall, rainfall seems to be the single most important risk factor, with over 90% of major outbreak locations receiving higher than average rainfall in preceding months. Mean temperatures were close to average, particularly in tropical populations; however, in arid regions, below average maximum temperatures predominated, and in southeast temperate regions, above average minimum temperatures predominated. High spring tides preceded coastal outbreaks, both in the presence and absence of rainfall, and the relationship between rainfall and the Southern Oscillation Index and La Niña episodes suggests they may be useful predictive tools, but only in southeast temperate regions. Such heterogeneity predisposing outbreaks supports the notion that there are different RRV epidemiologies throughout Australia but also suggests that generic parameters for the prediction and control of outbreaks are of limited use at a local level.
Abstract:
Objectives To assess the associations between three measurements of socioeconomic position (SEP) - education, occupation and ability to cope on available income - and cardiovascular risk factors in three age cohorts of Australian women. Methods Cross-sectional analysis of three cohorts of Australian women aged 18-23, 45-50 and 70-75 years. Results In general, for all exposures and in all three cohorts, the odds of each adverse risk factor (smoking, obesity and physical inactivity) were lower in the most advantaged compared with the least advantaged. Within each of the three cohorts, the effects of each measurement of SEP on the outcomes were similar. There were, however, some notable between-cohort differences. The most marked differences were those with smoking. For women aged 70-75 (older), those with the highest educational attainment were more likely to have ever smoked than those with the lowest level of attainment. However, for the other two cohorts, this association was reversed, with a stronger association between low levels of education and ever smoking among those aged 18-23 (younger) than those aged 45-50 (mid-age). Similarly, for older women, those in the most skilled occupational classes were most likely to have ever smoked, with opposite findings for mid-age women. Education was also differently associated with physical inactivity across the three cohorts. Older women who were most educated were least likely to be physically inactive, whereas among the younger and mid-age cohorts there was little or no effect of education on physical inactivity. Conclusion These findings demonstrate the dynamic nature of the association between SEP and some health outcomes. Our findings do not appear to confirm previous suggestions that prestige-based measurements of SEP are more strongly associated with health-related behaviours than measurements that reflect material and psychosocial resources.
Abstract:
Background and Purpose. The re-admission of patients to intensive care is associated with increased morbidity, mortality, loss of morale for patients and family, and increased health costs. The aim of the present study was to identify factors which place patients at a higher risk of re-admission to intensive care. Method. A prospective study of patients who were re-admitted to a 22-bed tertiary level intensive care facility within a 12-month period. Data were kept on every patient re-admitted to intensive care, including standard demographic data, initial admission diagnosis, co-morbidities, re-admission diagnosis, mobility on discharge, secretions, airway, chest X-ray, PaCO2, PaO2, PaO2/FiO2, and time of discharge. Subjects included 74 patients who had been re-admitted to intensive care in a 12-month period and a comparison group of patients who were not re-admitted to intensive care. A cross-tabs procedure was initially used to estimate maximum likelihood. Significant factors with an age value of 65 years (p
Abstract:
Parkinson’s disease (PD) is a progressive, degenerative, neurological disease. The progressive disability associated with PD results in substantial burdens for those with the condition, their families and society in terms of increased health resource use, earnings loss of affected individuals and family caregivers, poorer quality of life, caregiver burden, disrupted family relationships, decreased social and leisure activities, and deteriorating emotional well-being. Currently, no cure is available and the efficacy of available treatments, such as medication and surgical interventions, decreases with longer duration of the disease. Whilst the cause of PD is unknown, genetic and environmental factors are believed to contribute to its aetiology. Descriptive and analytical epidemiological studies have been conducted in a number of countries in an effort to elucidate the cause, or causes, of PD. Rural residency, farming, well water consumption, pesticide exposure, metals and solvents have been implicated as potential risk factors for PD in some previous epidemiological studies. However, there is substantial disagreement between the results of existing studies. Therefore, the role of environmental exposures in the aetiology of PD remains unclear. The main component of this thesis consists of a case-control study that assessed the contribution of environmental exposures to the risk of developing PD. An existing, previously unanalysed, dataset from a local case-control study was analysed to inform the design of the new case-control study. The analysis results suggested that regular exposure to pesticides and head injury were important risk factors for PD. However, due to the substantial limitations of this existing study, further confirmation of these results was desirable with a more robustly designed epidemiological study. 
A new exposure measurement instrument (a structured interviewer-delivered questionnaire) was developed for the new case-control study to obtain data on demographic, lifestyle, environmental and medical factors. Prior to its use in the case-control study, the questionnaire was assessed for test-retest repeatability in a series of 32 PD cases and 29 healthy sex-, age- and residential suburb-matched electoral roll controls. High repeatability was demonstrated for lifestyle exposures, such as smoking and coffee/tea consumption (kappas 0.70-1.00). The majority of environmental exposures, including use of pesticides, solvents and exposure to metal dusts and fumes, also showed high repeatability (kappas >0.78). A consecutive series of 163 PD case participants was recruited from a neurology clinic in Brisbane. One hundred and fifty-one (151) control participants were randomly selected from the Australian Commonwealth Electoral Roll and individually matched to the PD cases on age (± 2 years), sex and current residential suburb. Participants ranged in age from 40-89 years (mean age 67 years). Exposure data were collected in face-to-face interviews. Odds ratios and 95% confidence intervals were calculated using conditional logistic regression for matched sets in SAS version 9.1. Consistent with previous studies, ever having been a regular smoker or coffee drinker was inversely associated with PD, with dose-response relationships evident for pack-years smoked and number of cups of coffee drunk per day. Passive smoking from ever having lived with a smoker or worked in a smoky workplace was also inversely related to PD. Ever having been a regular tea drinker was associated with decreased odds of PD. Hobby gardening was inversely associated with PD. However, use of fungicides in the home garden or occupationally was associated with increased odds of PD. Exposure to welding fumes, cleaning solvents, or thinners occupationally was associated with increased odds of PD. 
Ever having resided in a rural or remote area was inversely associated with PD. Ever having resided on a farm was associated with only moderately increased odds of PD. Whilst the current study's results suggest that environmental exposures on their own are only modest contributors to overall PD risk, the possibility that interaction with genetic factors may additively or synergistically increase risk should be considered. The results of this research support the theory that PD has a multifactorial aetiology and that environmental exposures are among a number of factors that contribute to PD risk. There was also evidence of interaction between some factors (e.g. smoking and welding) to moderate PD risk.
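The test-retest repeatability figures above are kappa statistics. A minimal sketch of the unweighted Cohen's kappa for two categorical ratings of the same subjects (e.g. a yes/no exposure question asked at test and retest) is shown below; the example answer vectors are invented:

```python
from collections import Counter

def cohens_kappa(rating1, rating2):
    """Unweighted Cohen's kappa: chance-corrected agreement between
    two categorical ratings of the same subjects."""
    assert len(rating1) == len(rating2)
    n = len(rating1)
    observed = sum(a == b for a, b in zip(rating1, rating2)) / n
    c1, c2 = Counter(rating1), Counter(rating2)
    # Expected agreement if the two ratings were independent
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical yes(1)/no(0) answers at interview and re-interview
kappa = cohens_kappa([1, 0, 1, 0, 1, 1], [1, 0, 1, 0, 1, 0])
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, so the reported values above 0.78 represent high repeatability.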
Abstract:
Background From the mid-1980s to mid-1990s, the WHO MONICA Project monitored coronary events and classic risk factors for coronary heart disease (CHD) in 38 populations from 21 countries. We assessed the extent to which changes in these risk factors explain the variation in the trends in coronary-event rates across the populations. Methods In men and women aged 35-64 years, non-fatal myocardial infarction and coronary deaths were registered continuously to assess trends in rates of coronary events. We carried out population surveys to estimate trends in risk factors. Trends in event rates were regressed on trends in risk score and in individual risk factors. Findings Smoking rates decreased in most male populations but trends were mixed in women; mean blood pressures and cholesterol concentrations decreased, body-mass index increased, and overall risk scores and coronary-event rates decreased. The model of trends in 10-year coronary-event rates against risk scores and single risk factors showed a poor fit, but this was improved with a 4-year time lag for coronary events. The explanatory power of the analyses was limited by imprecision of the estimates and homogeneity of trends in the study populations. Interpretation Changes in the classic risk factors seem to partly explain the variation in population trends in CHD. Residual variance is attributable to difficulties in measurement and analysis, including time lag, and to factors that were not included, such as medical interventions. The results support prevention policies based on the classic risk factors but suggest potential for prevention beyond these.
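The improved fit with a 4-year time lag means event-rate trends were regressed against risk-factor trends shifted back in time. The following is a toy sketch of that lagging step under simple ordinary least squares, not the published cross-population regression model; the trend series are invented:

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def lagged_slope(risk_trend, event_trend, lag):
    """Regress event trends on risk trends shifted by `lag` periods,
    so an event at time t is paired with risk at time t - lag."""
    if lag:
        return slope(risk_trend[:-lag], event_trend[lag:])
    return slope(risk_trend, event_trend)

# Hypothetical series where events track risk with a 2-period delay
risk = [0, 1, 2, 3, 4, 5]
events = [0, 0, 0, 3, 6, 9]
b = lagged_slope(risk, events, lag=2)
```

Pairing each outcome with the earlier exposure value is what allows a delayed effect of risk-factor change to show up as a better-fitting slope.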
Abstract:
This is an overview of the first burden of disease and injury studies carried out in Australia. Methods developed for the World Bank and World Health Organization Global Burden of Disease Study were adapted and applied to Australian population health data. Depression was found to be the top-ranking cause of non-fatal disease burden in Australia, causing 8% of the total years lost due to disability in 1996. Mental disorders overall were responsible for nearly 30% of the non-fatal disease burden. The leading causes of total disease burden (disability-adjusted life years [DALYs]) were ischaemic heart disease and stroke, together causing nearly 18% of the total disease burden. Depression was the fourth leading cause of disease burden, accounting for 3.7% of the total burden. Of the 10 major risk factors to which the disease burden can be attributed, tobacco smoking causes an estimated 10% of the total disease burden in Australia, followed by physical inactivity (7%).
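The DALYs cited above combine fatal and non-fatal burden: years of life lost to premature death (YLL) plus years lived with disability (YLD). The sketch below shows the basic arithmetic in its simplest undiscounted form; the published burden studies apply refinements such as discounting and age weighting, and all input numbers here are illustrative:

```python
def yll(deaths, life_expectancy_at_death):
    """Years of life lost: deaths times remaining standard life expectancy
    (undiscounted, no age weighting)."""
    return deaths * life_expectancy_at_death

def yld(incident_cases, disability_weight, avg_duration_years):
    """Years lived with disability: cases times a 0-1 disability weight
    times average duration of the condition."""
    return incident_cases * disability_weight * avg_duration_years

def daly(yll_value, yld_value):
    """Disability-adjusted life years: fatal plus non-fatal burden."""
    return yll_value + yld_value

# Hypothetical condition: 10 deaths at 30 years remaining life expectancy,
# plus 100 incident cases with disability weight 0.2 lasting 5 years each
total = daly(yll(10, 30), yld(100, 0.2, 5))
```

This additive structure is why a largely non-fatal condition like depression can rank highly on total DALYs: its YLD term is large even though its YLL term is small.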
Abstract:
Objectives: Resternotomy is a common part of cardiac surgical practice. Associated with resternotomy are the risks of cardiac injury and catastrophic hemorrhage and the subsequent elevated morbidity and mortality in the operating room or during the postoperative period. The technique of direct vision resternotomy is safe and has fewer, if any, serious cardiac injuries. The technique, the reduced need for groin cannulation and the overall low operative mortality and morbidity are the focus of this retrospective analysis. Methods: The records of 495 patients undergoing 546 resternotomies over a 21-year period to January 2000 were reviewed. All consecutive reoperations by the one surgeon comprised patients over the age of 20 at first resternotomy: M:F 343:203, mean age 57 years (range 20 to 85, median age 60). The mean NYHA grade was 2.3 (67 patients in class I, 273 in class II, 159 in class III, 43 in class IV, and 4 in class V), with elective reoperation in 94.6%. Cardiac injury was graded into five groups and the incidence of and reasons for groin cannulation estimated. The morbidity and mortality as a result of the reoperation and resternotomy were assessed. Results: The hospital/30 day mortality was 2.9% (95% CI: 1.6%-4.4%) (16 deaths) over the 21 years. First (481), second (53), and third (12) resternotomies produced 307 uncomplicated technical reopenings, 203 slower but uncomplicated procedures, 9 minor superficial cardiac lacerations, and no moderate or severe cardiac injuries. Direct vision resternotomy is crystallized into the principle that only adhesions that are visualized from below are divided and only sternal bone that is freed of adhesions is sewn. Groin exposure was never performed prophylactically for resternotomy. 
Fourteen patients (2.6%) had such cannulation for aortic dissection/aneurysm (9 patients), excessive sternal adherence of cardiac structures (3 patients), presurgery cardiac arrest (1 patient), and high aortic cannulation desired and not possible (1 patient). The average postoperative blood loss was 594 mL (95% CI: 558-631) in the first 12 hours. The need to return to the operating room for control of excessive bleeding was 2% (11 patients). Blood transfusion was given in 65% of the resternotomy procedures over the 21 years (mean 854 mL, 95% CI 765-945 mL) and 41% over the last 5 years. Conclusions: The technique of direct vision resternotomy has been associated with zero moderate or major cardiac injuries/catastrophic hemorrhage at reoperation. Few patients have required groin cannulation. In the postoperative period, there were acceptable blood loss and transfusion rates, reduced morbidity, and moderately low mortality for this potentially high-risk group.
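The reported mortality of 2.9% with a 95% CI corresponds to 16 deaths in 546 procedures. The paper does not state how its interval was computed; as one common approach for a proportion this small, a Wilson score interval can be sketched (it will not reproduce the published bounds exactly):

```python
from math import sqrt

def wilson_ci(events, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion.
    Better behaved than the Wald interval for small proportions."""
    p = events / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

# 16 hospital/30-day deaths among 546 resternotomies
p, lo, hi = wilson_ci(16, 546)
```

For rare events like a 2.9% mortality, the Wilson interval avoids the Wald interval's tendency to extend below zero or undercover.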