938 results for COHORT STUDIES


Relevance:

60.00%

Publisher:

Abstract:

Back symptoms are a major global public health problem, with a lifetime prevalence ranging from 50% to 80%. Research suggests that work-related factors contribute to the occurrence of back pain across industries. Despite the hazardous nature, strenuous tasks, and awkward postures associated with farm work, little is known about back injury and symptoms in farmworker adults and children, and research in the United States is particularly limited. This is a concern given the large proportion of migrant farmworkers in the United States without adequate access to healthcare, as well as the substantial number of youth working in agriculture. The present study describes back symptoms and identifies work-related factors associated with back pain in migrant farmworker families and farmworker high school students from Starr County, TX. Two separate datasets were used from two cohort studies, "Injury and Illness Surveillance in Migrant Farmworkers (MANOS)" (study A: n=267 families) and the "South Texas Adolescent Rural Research Study (STARRS)" (study B: n=345). Descriptive and inferential statistics, including multivariable logistic regression, were used to identify work-related factors associated with back pain in each study. In migrant farmworker families, the prevalence of chronic back pain during the last migration season ranged from 9.5% among the youngest children to 33.3% among mothers. Chronic back pain was significantly associated with increasing age; fairly bad/very bad quality of sleep while migrating; fewer than eight hours of sleep at home in Starr County, TX; depressive symptoms while migrating; self-provided water for washing hands/drinking; weeding at work; and exposure to pesticide drift/direct spray. Among farmworker adolescents, the prevalence of severe back symptoms was 15.7%. Severe back symptoms were significantly associated with being female; history of a prior accident/back injury; feeling tense, stressed, or anxious sometimes/often; lifting/carrying heavy objects outside of work; current tobacco use; increasing lifetime number of migrant farmworker years; working with/around knives; and working on corn crops. Overall, the results support that associations between work-related exposures and chronic back pain and severe back symptoms persist after controlling for the effect of non-work exposures in farmworker populations.
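Multivariable logistic regression, the method used in both analyses above, can be sketched in miniature with a single binary exposure. The counts below are hypothetical illustration, not the MANOS/STARRS data; for a saturated single-exposure model, the Newton-Raphson fit recovers the crude odds ratio of the 2×2 table exactly:

```python
import math

def fit_logistic(x, y, iters=25):
    """Fit p(y=1) = 1/(1+exp(-(b0+b1*x))) by Newton-Raphson (two parameters)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        # Gradient and Hessian of the log-likelihood, accumulated per subject.
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            w = p * (1.0 - p)
            g0 += yi - p
            g1 += (yi - p) * xi
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01       # 2x2 Hessian inverted by hand
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Hypothetical data: 30/100 exposed with back pain vs 15/100 unexposed.
x = [1] * 100 + [0] * 100
y = [1] * 30 + [0] * 70 + [1] * 15 + [0] * 85
b0, b1 = fit_logistic(x, y)
odds_ratio = math.exp(b1)  # equals the crude OR (30*85)/(70*15)
```

With additional covariates the same update generalizes to matrix form, which is what a multivariable fit does under the hood.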

Background. Risk factors underlying the development of Barrett's esophagus (BE) are poorly understood. Recent studies have examined the association between elevated body mass index (BMI) and BE with conflicting results. A systematic review of the literature was performed to study this association.

Methods. Cross-sectional, case-control, and cohort studies published through April 2007 that met strict inclusion and exclusion criteria were included. Thorough data abstraction, including reported crude or adjusted odds ratios or mean BMI, was performed. Crude odds ratios were estimated from the available information in three studies.

Results. Of 630 publications identified by our search terms, 59 were reviewed in detail and 12 were included in the final analyses. Three studies showed a statistically significant association between obesity and BE (30-32), while two studies found a statistically significant association between overweight and BE (31, 32). Two studies that reported BMI as a continuous variable found BMI in cases to be significantly higher than in the comparison group (30, 32). The remaining studies failed to show a significant association between elevated BMI and BE.

Conclusions. The data on the association between elevated BMI and BE are conflicting. It is important to identify other risk factors that, in combination with elevated BMI, may lead to BE. Further studies are needed to evaluate whether the presence of reflux symptoms or any particular pattern of obesity is independently associated with BE.

Key words. Barrett's esophagus, obesity, body mass index, gastroesophageal reflux disease, meta-analysis
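When a study reports only cell counts, a crude odds ratio and its log-based (Woolf) 95% confidence interval can be recovered directly, as was done for three studies above. A minimal sketch with hypothetical counts:

```python
import math

def crude_odds_ratio(a, b, c, d):
    """OR and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 40 obese BE cases, 60 obese controls,
# 20 non-obese BE cases, 80 non-obese controls.
or_, lo, hi = crude_odds_ratio(40, 60, 20, 80)
```

The interval excludes 1 here, which is how a reviewer would flag a statistically significant crude association before any adjustment.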

Background. Several studies have proposed a link between type 2 diabetes mellitus (DM2) and hepatitis C virus (HCV) infection, with conflicting results. Since DM2 and HCV both have a high prevalence, establishing a link between the two may guide further studies aimed at DM2 prevention. A systematic review was conducted to estimate the magnitude and direction of the association between DM2 and HCV. Temporality was assessed from cohort studies and from case-control studies where such information was available.

Methods. MEDLINE searches were conducted for studies that provided risk estimates and fulfilled criteria regarding the definition of exposure (HCV) and outcome (DM2). HCV was defined in terms of method of diagnosis, laboratory technique, and method of data collection; DM2 was defined in terms of the classification [World Health Organization (WHO) and American Diabetes Association (ADA)] [1-3] used for diagnosis, laboratory technique, and method of data collection. Standardized searches and data abstraction for construction of tables were performed. Unadjusted or adjusted measures of association for individual studies were obtained or calculated from the full text of the studies. Template designed by Dr. David Ramsey.

Results. Forty-six studies out of one hundred and nine potentially eligible articles met the inclusion and exclusion criteria and were classified by study design as cross-sectional (twenty-four), case-control (fifteen), or cohort studies (seven). The cohort studies showed an approximately three-fold higher occurrence (confidence interval 1.66-6.29) of DM2 in individuals with HCV compared to those unexposed to HCV, and the cross-sectional studies had a summary odds ratio of 2.53 (1.96, 3.25). In the case-control studies, the summary odds ratio for studies done in subjects with DM2 was 3.61 (1.93, 6.74); in HCV, it was 2.30 (1.56, 3.38); and all fifteen studies together yielded an odds ratio of 2.60 (1.82, 3.73).

Conclusion. The above results support the hypothesis that there is an association between DM2 and HCV. The temporal relationship evident from cohort studies and the proposed pathogenic mechanisms also suggest that HCV predisposes patients to the development of DM2. Further cohort or prospective studies are needed, however, to determine whether treatment of HCV infection prevents the development of DM2.
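Summary odds ratios like those above are conventionally obtained by inverse-variance pooling of the study-level log odds ratios, with each study's standard error recovered from its reported confidence interval. A fixed-effect sketch with hypothetical study estimates (not the ones pooled in this review):

```python
import math

def pooled_or(studies):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each study is (OR, ci_lo, ci_hi); the SE of ln(OR) is
    back-calculated from the 95% CI as (ln hi - ln lo) / (2 * 1.96)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2                 # weight = inverse variance
        num += w * math.log(or_)
        den += w
    center = num / den
    half = 1.96 / math.sqrt(den)
    return math.exp(center), math.exp(center - half), math.exp(center + half)

# Hypothetical study-level estimates: (OR, 95% CI lower, 95% CI upper).
est, lo, hi = pooled_or([(2.0, 1.2, 3.3), (3.0, 1.5, 6.0), (2.5, 1.8, 3.5)])
```

A random-effects pool would additionally inflate each variance by a between-study component; the fixed-effect version above is the simplest correct baseline.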

Objective. To evaluate the host risk factors associated with rifamycin-resistant Clostridium difficile (C. diff) infection in hospitalized patients compared to rifamycin-susceptible C. diff infection.

Background. C. diff is the most common definable cause of nosocomial diarrhea, affecting elderly hospitalized patients taking antibiotics for prolonged durations. The epidemiology of Clostridium difficile-associated disease is now changing with reports of a new hypervirulent strain causing hospital outbreaks. This new strain is associated with increased disease severity and mortality. Conventional therapy for C. diff includes metronidazole and vancomycin, but high recurrence rates and treatment failures are becoming a major concern. Rifamycin antibiotics are being developed as a new therapeutic option to treat C. diff infection after their efficacy was established in a few in vivo and in vitro studies. Some recent studies report an association between the hypervirulent strain and emerging rifamycin resistance. These findings underscore the need for clinical studies to better understand the efficacy of rifamycin drugs against C. diff.

Methods. This is a hospital-based, matched case-control study using de-identified data drawn from two prospective cohort studies involving C. diff patients at St Luke's Hospital. The C. diff isolates from these patients were screened for rifamycin resistance using agar dilution methods for minimum inhibitory concentrations (MIC) as part of Dr Zhi-Dong Jiang's study. Twenty-four rifamycin-resistant C. diff cases were identified, and each was matched with one rifamycin-susceptible C. diff control on the basis of ± 10 years of age and hospitalization 30 days before or after the case. De-identified data for the 48 subjects were obtained from Dr Kevin Garey's clinical study at St Luke's Hospital enrolling C. diff patients, and were reviewed to gather information about host risk factors, outcome variables, and relevant clinical characteristics.

Results. Medical diagnosis at the time of admission (p = 0.0281) and history of chemotherapy (p = 0.022) were identified as significant risk factors, while hospital stay ranging from 1 week to 1 month and artificial feeding were identified as important outcome variables (p = 0.072 and p = 0.081, respectively). Horn's Index, assessing the severity of underlying illness, and the duration of antibiotics showed no significant differences between cases and controls.

Conclusion. The study was a small project designed to identify host risk factors and understand the clinical implications of rifamycin resistance. The study was underpowered, and a larger sample size is needed to validate the results.

In order to take better advantage of the abundant results from large-scale genomic association studies, investigators are turning to genetic risk score (GRS) methods to combine the information from common modest-effect risk alleles into an efficient risk assessment statistic. The statistical properties of these GRSs are poorly understood. As a first step toward a better understanding of GRSs, a systematic analysis of recent investigations using a GRS was undertaken. GRS studies in the areas of coronary heart disease (CHD), cancer, and other common diseases were identified using bibliographic databases and by hand-searching reference lists and journals. Twenty-one independent case-control studies, cohort studies, and simulation studies (12 in CHD, 9 in other diseases) were identified. The underlying statistical assumptions of the GRS were investigated using the experience of the Framingham risk score. Improvements in the construction of a GRS guided by the concept of composite indicators are discussed. The GRS is a promising risk assessment tool for improving the prediction and diagnosis of common diseases.
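The simplest GRS of the kind described above is a count of risk alleles across SNPs, optionally weighted by each allele's log odds ratio from the source association study. A sketch with hypothetical per-allele odds ratios (not taken from any study in the review):

```python
import math

def genetic_risk_score(allele_counts, odds_ratios, weighted=True):
    """GRS for one person.
    allele_counts[i] is the number of risk alleles (0, 1, or 2) at SNP i;
    odds_ratios[i] is the per-allele OR for that SNP.
    Weighted form sums allele counts times ln(OR); unweighted is a plain count."""
    if weighted:
        return sum(c * math.log(r) for c, r in zip(allele_counts, odds_ratios))
    return sum(allele_counts)

# Hypothetical per-allele odds ratios for three modest-effect SNPs.
ors = [1.15, 1.25, 1.10]
score = genetic_risk_score([2, 1, 0], ors)
```

Because the weights are log odds ratios, the weighted score is (up to an intercept) the linear predictor of a logistic model, which is what makes it usable as a risk assessment statistic.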

Objective. To systematically review studies published in English on the relationship between plasma total homocysteine (Hcy) levels and the clinical and/or postmortem diagnosis of Alzheimer's disease (AD) in subjects over 60 years old.

Method. Medline, PubMed, PsycINFO, and Academic Search Premier were searched using the keywords "homocysteine", "Alzheimer disease", "dementia", and "cognitive disorders". In addition, relevant articles were identified in PubMed using the "related articles" link and by cross-referencing. To be included, a study had to clearly state its design, setting, population, and sample size; use the diagnostic criteria of the National Institute of Neurological and Communicative Disorders and Stroke (NINCDS) and the Alzheimer's Disease and Related Disorders Association (ADRDA); and describe how Hcy levels were measured or defined. Empirical investigations reporting quantitative data on the epidemiology of the relationship between plasma total Hcy (exposure) and AD (outcome) were included in the systematic review.

Results. A total of 7 studies out of 388 potential articles, comprising 2,989 subjects, met the inclusion criteria: four case-control and three cohort studies were identified. All 7 studies reported association statistics, such as the odds ratio (OR), relative rate (RR), or hazard ratio (HR) of AD, examined using multivariate and logistic regression analyses. Three case-comparison studies, Clarke et al. (1998) (OR: 4.5, 95% CI: 2.2-9.2), McIlroy et al. (2002) (OR: 2.9, 95% CI: 1.0-8.1), and Quadri et al. (2004) (OR: 3.7, 95% CI: 1.1-13.1), and two cohort studies, Seshadri et al. (2002) (RR: 1.8, 95% CI: 1.3-2.5) and Ravaglia et al. (2005) (HR: 2.1, 95% CI: 1.7-3.8), found a significant association between serum total Hcy and AD. One case-comparison study, Miller et al. (2002) (OR: 2.2, 95% CI: 0.3-16), and one cohort study, Luchsinger et al. (2004) (HR: 1.4, 95% CI: 0.7-2.3), failed to reject the null hypothesis.

Conclusions. The purpose of this review was to provide a thorough analysis of studies that examined the relationship between Hcy levels and AD. Five studies showed a positive, statistically significant association between elevated total Hcy values and AD, but the association was not statistically significant in two studies. Further research is needed to establish evidence of a strong, consistent association between serum total Hcy and AD, as well as the presence of the appropriate temporal relationship. To answer these questions, it is important to conduct more prospective studies that examine the occurrence of AD in individuals with and without elevated Hcy values at baseline. In addition, the international standardization of measurements and cut-off points for plasma Hcy levels across laboratories is a critical issue to be addressed in future studies on the topic.

Background: Cardiovascular diseases (CVD) are the leading cause of morbidity and mortality worldwide. CVD mainly comprises coronary heart disease and stroke, ranked first and fourth, respectively, among the leading causes of death in the United States. Influenza (flu) causes annual outbreaks and pandemics and is increasingly recognized as an important trigger for acute coronary syndromes and stroke. Influenza vaccination is an inexpensive and effective strategy for the prevention of influenza-related complications in high-risk individuals. Though it is recommended for all CVD patients, the influenza vaccine is still used at suboptimal levels in these patients owing to the prevailing controversy about its effectiveness in preventing CVD. This review was undertaken to critically assess the effectiveness of influenza vaccination as a primary or secondary prevention method for CVD.

Methods: A systematic review was conducted using the electronic databases OVID MEDLINE, PUBMED (National Library of Medicine), EMBASE, GOOGLE SCHOLAR, and TRIP (Turning Research into Practice). The search was limited to peer-reviewed articles published in English from January 1970 through May 2012. Case-control studies, cohort studies, and randomized controlled trials related to influenza vaccination and CVD, with data on at least one of the outcomes of interest, were identified. Only population-based epidemiologic studies of subjects aged 30 years or above, in all ethnic groups and of either sex, with clinical CVD outcomes of interest were included.

Results: Of the 16 studies (8 case-control studies, 6 cohort studies, and 2 randomized controlled trials) that met the inclusion criteria, 14 reported a significant benefit of influenza vaccination as a primary or secondary prevention method for new cardiovascular events. Contrary to these findings, two studies reported no significant benefit of vaccination in CVD prevention.

Conclusion: The available body of evidence indicates that vaccination against influenza is associated with a reduction in the risk of new CVD events, hospitalization for coronary heart disease and stroke, and the risk of death. The findings suggest that influenza vaccination is effective in CVD prevention and should be encouraged for the high-risk population. However, larger future studies such as randomized controlled trials are needed to further evaluate and confirm these findings.

Prevalent sampling is an efficient and focused approach to the study of the natural history of disease. Right-censored time-to-event data observed in prospective prevalent cohort studies are often subject to left-truncated sampling. Left-truncated samples are not randomly selected from the population of interest and carry a selection bias. Extensive studies have focused on estimating the unbiased distribution from left-truncated samples. In many applications, however, the exact date of disease onset is not observed. For example, in an HIV infection study, the exact HIV infection time is not observable, but it is known that infection occurred between two observable dates. Meeting these challenges motivated our study. We propose parametric models to estimate the unbiased distribution of left-truncated, right-censored time-to-event data with uncertain onset times. We first consider data from length-biased sampling, a special case of left-truncated sampling, and then extend the proposed method to general left-truncated sampling. With a parametric model, we construct the full likelihood given a biased sample with unobservable onset of disease. The parameters are estimated by maximizing the constructed likelihood, adjusting for the selection bias and the unobservable exact onset. Simulations are conducted to evaluate the finite-sample performance of the proposed methods. We apply the proposed method to an HIV infection study, estimating the unbiased survival function and covariate coefficients.
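The length-biased special case mentioned above has a closed-form illustration under an assumed exponential survival model (an assumption for this sketch, not the abstract's general method): length-biased draws from an exponential with mean 1/λ follow a Gamma(2, 1/λ) law, so the naive sample mean estimates 2/λ rather than 1/λ, and dividing by the shape 2 gives the bias-adjusted MLE:

```python
import random

random.seed(42)

true_mean = 2.0   # true mean survival time, 1/lambda (hypothetical)
n = 2000

# Length-biased density: f_LB(t) = t * f(t) / mean. For Exp(mean 2) this is
# the Gamma(shape=2, scale=2) density, so we draw directly from it.
sample = [random.gammavariate(2.0, true_mean) for _ in range(n)]

naive_mean = sum(sample) / n       # biased: converges to 2/lambda = 4, not 2
adjusted_mean = naive_mean / 2.0   # Gamma(2) MLE: mean/shape recovers 1/lambda
```

The simulation shows the selection bias concretely: ignoring the sampling scheme roughly doubles the estimated mean survival, while the likelihood-based adjustment recovers the truth.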

Methicillin-resistant Staphylococcus aureus healthcare-associated infections (MRSA HAIs) are a major cause of morbidity in hospitalized patients and pose a great economic burden to the hospitals caring for these patients. Intensified interventions aim to control MRSA HAIs, but their cost-effectiveness is largely unclear. We performed a review of the cost-effectiveness literature on intensified interventions and provide a summary of study findings, the status of economic research in the area, and information that will help decision-makers at the regional level and guide future research.

We conducted a literature search using the electronic databases PubMed, EBSCO, and The Cochrane Library, limited to English-language articles published after 1999. We reviewed a total of 1,356 titles and, after applying our inclusion and exclusion criteria, selected seven articles for the final review. We modified the Economic Evaluation Abstraction Form provided by the CDC and used this form to abstract data from the studies.

Of the seven selected articles, two were cohort studies and the remaining five were modeling studies. They were done in various countries, in different study settings, and with different variations of the intensified intervention. Overall, six of the seven studies reported that intensified interventions were dominant or at least cost-effective in their study setting, and this effect persisted on sensitivity testing.

We identified many gaps in research in this field. The cost-effectiveness research is mostly composed of modeling studies, which do not always clearly describe the intervention. The intervention and infection costs, and the sources for these costs, are not always explicit or are missing. In the modeling studies there is uncertainty associated with some key model inputs, but these inputs are not always identified, and the models are not always tested for internal consistency or validity. Studies usually test the short-term cost-effectiveness of intensified interventions but not the long-term results.

Our study's limitation was the inability to adjust for differences in study settings, intervention costs, disease costs, or effectiveness measures. Its strength is the presentation of a focused literature review of intensified interventions in hospital settings. Through this study we provide information that will help decision-makers at the regional level, help guide future research, and might change clinical care and policies.

Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health.

Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007.

Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16 × 16 km. In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and the Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study, and Canadian national cohort) to calculate the number of attributable annual deaths from all causes, all non-accidental causes, ischemic heart disease, and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift in our analysis using 2007 and 2012 population figures.

Results: Our model suggested a mean overall reduction in fine particle levels of 1 µg/m³ by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future due to Spanish population growth: taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%.

Conclusions: Effective implementation of air quality measures in Spain, in a scenario with a short-term projection, would produce an appreciable decline in fine particle concentrations, and this, in turn, would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
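Health impact assessments of this kind typically combine a cohort-derived relative risk per 10 µg/m³ with the modelled concentration change through a log-linear concentration-response function. A sketch with hypothetical inputs (not the study's actual CRFs or baseline rates):

```python
import math

def attributable_deaths_per_100k(rr_per_10, delta_c, baseline_rate_per_100k):
    """Deaths per 100,000 postponed by a fine particle drop of delta_c (ug/m3),
    using a log-linear concentration-response function where rr_per_10 is the
    relative risk per 10 ug/m3 of long-term exposure."""
    beta = math.log(rr_per_10) / 10.0
    attributable_fraction = 1.0 - math.exp(-beta * delta_c)
    return baseline_rate_per_100k * attributable_fraction

# Hypothetical: RR 1.07 per 10 ug/m3, a 1 ug/m3 reduction, and an
# all-cause baseline mortality rate of 900 per 100,000.
deaths = attributable_deaths_per_100k(1.07, 1.0, 900.0)
```

Multiplying the per-100,000 figure by the exposed population (and re-running with updated population counts) is exactly the step that produces the demographic-shift sensitivity described above.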

Final dissertation, Integrated Master's Degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014

Thesis (Ph.D.)--University of Washington, 2016-06

PURPOSE: To determine the efficacy of exercise training and its effects on outcomes in patients with heart failure.

METHODS: MEDLINE, Medscape, and the Cochrane Controlled Trials Registry were searched for trials of exercise training in heart failure patients. Data relating to training protocol, exercise capacity, and outcome measures were extracted and reviewed.

RESULTS: A total of 81 studies were identified: 30 randomized controlled trials, five nonrandomized controlled trials, nine randomized crossover trials, and 37 longitudinal cohort studies. Exercise training was performed in 2387 patients. The average increment in peak oxygen consumption was 17% in the 57 studies that measured oxygen consumption directly, 17% in 40 studies of aerobic training, 9% in three studies that used only strength training, 15% in 13 studies of combined aerobic and strength training, and 16% in the one study of inspiratory training. There were no reports of deaths directly related to exercise during more than 60,000 patient-hours of exercise training. During the training and follow-up periods of the randomized controlled trials, there were 56 combined events (deaths or adverse events) in the exercise groups and 75 combined events in the control groups (odds ratio [OR] = 0.98; 95% confidence interval [CI]: 0.61 to 1.32; P = 0.60). During this same period, 26 exercising and 41 nonexercising subjects died (OR = 0.71; 95% CI: 0.37 to 1.02; P = 0.06).

CONCLUSION: Exercise training is safe and effective in patients with heart failure. The risk of adverse events may be reduced, but further studies are required to determine whether there is any mortality benefit. (C) 2004 by Excerpta Medica Inc.

Background: Recent case-control studies suggest that, given equal smoking exposure, women may have a higher relative risk of developing lung cancer than men. Despite prospective data that conflict with this hypothesis, mechanistic studies to find a biologic basis for a sex difference continue.

Methods: We addressed the hypothesis directly by analyzing prospective data from former and current smokers in two large cohorts: the Nurses' Health Study of women and the Health Professionals Follow-up Study of men. We calculated incidence rates and hazard ratios of lung cancer in women compared with men, adjusting for age, number of cigarettes smoked per day, age at start of smoking, and time since quitting, using Cox proportional hazards models. We also reviewed published results from prospective analyses.

Results: From 1986 through 2000, 955 and 311 primary lung cancers were identified among 60,296 women and 25,397 men, respectively, who ranged in age from 40 to 79 years. Incidence rates per 100,000 person-years for women and men were 253 and 232, respectively, among current smokers and 81 and 73, respectively, among former smokers. The hazard ratio in women ever-smokers compared with men was 1.11 (95% confidence interval = 0.95 to 1.31). Six published prospective cohort studies allowed assessment of comparative susceptibility to lung cancer by sex; none supported an excess risk of lung cancer for women.

Conclusions: Women do not appear to have a greater susceptibility to lung cancer than men, given equal smoking exposure. Research should be focused on enhancing preventive interventions for all.
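The rate comparison described above reduces to person-time arithmetic: cases divided by person-years, scaled to 100,000, with a crude rate ratio and a Poisson-based confidence interval for the contrast. A sketch with hypothetical counts (not the NHS/HPFS figures):

```python
import math

def rate_per_100k(cases, person_years):
    """Incidence rate per 100,000 person-years."""
    return 1e5 * cases / person_years

def rate_ratio(cases_a, py_a, cases_b, py_b):
    """Crude rate ratio with a 95% CI from the Poisson SE of the log ratio."""
    rr = (cases_a / py_a) / (cases_b / py_b)
    se = math.sqrt(1.0 / cases_a + 1.0 / cases_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical: 250 cases over 100,000 person-years in group A
# versus 230 cases over 100,000 person-years in group B.
rr, lo, hi = rate_ratio(250, 1e5, 230, 1e5)
```

A confidence interval spanning 1, as here, is the crude analogue of the null adjusted hazard ratio reported in the abstract; the Cox model then refines this contrast by conditioning on smoking covariates.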

Objectives: To identify and examine differences in pre-existing morbidity between injured and non-injured population-based cohorts.

Methods: Administrative health data from Manitoba, Canada, were used to select a population-based cohort of injured people and a sample of non-injured people matched on age, gender, aboriginal status, and geographical location of residence at the date of injury. All individuals aged 18-64 years who had been hospitalized between 1988 and 1991 for injury (International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes 800-995) (n = 21,032) were identified from the Manitoba discharge database. The matched non-injured comparison group comprised individuals randomly selected 1:1 from the Manitoba population registry. Morbidity data for the 12 months prior to the date of injury were obtained by linking the two cohorts with all hospital discharge records and physician claims.

Results: Compared to the non-injured group, injured people had higher Charlson Comorbidity Index scores, 1.9 times higher rates of hospital admissions, and 1.7 times higher rates of physician claims in the year prior to the injury. Injured people had a rate of hospital admission for a mental health disorder 9.3 times higher, and a rate of physician claims for a mental health disorder 3.5 times higher, than non-injured people. These differences were all statistically significant (P < 0.001).

Conclusion: Injured people were shown to differ from the general non-injured population in terms of pre-existing morbidity. Existing population estimates of the attributable burden of injury obtained by extrapolating from observed outcomes in samples of injured cases may overestimate the magnitude of the problem.