872 results for Prospective Cohort


Relevance: 60.00%

Abstract:

OBJECTIVE: Therapeutic temperature modulation is recommended after cardiac arrest (CA). However, body temperature (BT) regulation has not been extensively studied in this setting. We investigated BT variation in CA patients treated with therapeutic hypothermia (TH) and analyzed its impact on outcome. METHODS: A prospective cohort of comatose CA patients treated with TH (32-34°C, 24h) at the medical/surgical intensive care unit of the Lausanne University Hospital was studied. Spontaneous BT was recorded on hospital admission. The following variables were measured during and after TH: time to target temperature (TTT = time from hospital admission to the induced BT target of <34°C), cooling rate ([spontaneous BT − induced BT target]/TTT) and time of passive rewarming to normothermia. Associations of spontaneous and induced BT with in-hospital mortality were examined. RESULTS: A total of 177 patients (median age 61 years; median time to return of spontaneous circulation [ROSC] 25 min) were studied. Non-survivors (N=90, 51%) had lower spontaneous admission BT than survivors (median 34.5 [interquartile range 33.7-35.9]°C vs. 35.1 [34.4-35.8]°C, p=0.04). Accordingly, time to target temperature was shorter among non-survivors (200 [25-363] min vs. 270 [158-375] min, p=0.03); however, when adjusting for admission BT, cooling rates were comparable between the two outcome groups (0.4 [0.2-0.5]°C/h vs. 0.3 [0.2-0.4]°C/h, p=0.65). A longer duration of passive rewarming (600 [464-744] min vs. 479 [360-600] min, p<0.001) was associated with mortality. CONCLUSIONS: Lower spontaneous admission BT and a longer time of passive rewarming were associated with in-hospital mortality after CA and TH. Impaired thermoregulation may be an important physiologic determinant of post-resuscitation disease and CA prognosis. When assessing the benefit of early cooling on outcome, future trials should adjust for patient admission temperature and use the cooling rate rather than the time to target temperature.
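
The cooling rate above is simply the temperature drop from the spontaneous admission BT to the induced target divided by the time to target. As a hedged illustration of that arithmetic only (the function name and example values are ours, not the study's data):

```python
# Minimal sketch of the cooling-rate definition quoted above:
# cooling rate = (spontaneous admission BT - induced BT target) / TTT.
# Illustrative numbers only, not patient data from the Lausanne cohort.

def cooling_rate_c_per_h(admission_bt_c: float, target_bt_c: float, ttt_min: float) -> float:
    """Return the cooling rate in degrees Celsius per hour."""
    return (admission_bt_c - target_bt_c) / (ttt_min / 60.0)

# e.g. admission BT 35.1 °C, target 34.0 °C, TTT 270 min -> ~0.24 °C/h
print(round(cooling_rate_c_per_h(35.1, 34.0, 270), 2))
```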

Relevance: 60.00%

Abstract:

BACKGROUND: Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE: To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN: Prospective cohort study. PARTICIPANTS: In a multicenter Swiss cohort, we studied 830 patients aged ≥65 years with VTE. MAIN MEASURES: We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major bleeding and to a first clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and for periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS: Overall, 413 (49.8%) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major bleeding (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95% confidence interval [CI] 1.32-3.68) and of clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95% CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy remained significantly associated with major bleeding (sub-hazard ratio [SHR] 1.83, 95% CI 1.03-3.25) and with clinically relevant non-major bleeding (SHR 1.60, 95% CI 1.06-2.42). CONCLUSIONS: Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
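
The unadjusted comparison above is an incidence rate ratio: events per unit of patient-time in the polypharmacy group divided by the corresponding rate in the comparison group. The sketch below illustrates that calculation with a log-normal confidence interval; the event counts are invented for illustration, since the abstract reports only the rounded rates and the published IRR of 2.18 was computed from the raw data.

```python
import math

def incidence_rate_ratio(events_exposed: int, py_exposed: float,
                         events_unexposed: int, py_unexposed: float):
    """Incidence rate ratio with an approximate 95% CI (log-normal approximation)."""
    irr = (events_exposed / py_exposed) / (events_unexposed / py_unexposed)
    se_log_irr = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lower = irr * math.exp(-1.96 * se_log_irr)
    upper = irr * math.exp(1.96 * se_log_irr)
    return irr, lower, upper

# Invented counts chosen to be roughly consistent with 9.0 vs. 4.1 events/100 patient-years.
print(incidence_rate_ratio(events_exposed=45, py_exposed=500,
                           events_unexposed=25, py_unexposed=610))
```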

Relevance: 60.00%

Abstract:

Over the past two decades, nosocomial infections caused by extended-spectrum beta-lactamase (ESBL)-producing Klebsiella spp. have become a major problem all around the world. This situation is of concern because there are limited antimicrobial options to treat patients infected with these pathogens, and also because this kind of resistance can spread to a wide variety of Gram-negative bacilli. Our objectives were to evaluate, among in-patients at a public university tertiary-care hospital with documented infection due to Klebsiella spp., the risk factors (cross-sectional analysis) for and the clinical impact (prospective cohort) of infection with an ESBL-producing strain. Study subjects were all patients admitted to the study hospital between April 2002 and October 2003 with a clinically and microbiologically confirmed infection caused by Klebsiella spp. at any body site, except infections restricted to the urinary tract. Of the 104 patients studied, 47 were infected with an ESBL-producing strain and 57 with a non-ESBL-producing strain. Independent risk factors associated with infection with an ESBL-producing strain were young age, exposure to mechanical ventilation, central venous catheter, use of any antimicrobial agent, and particularly use of a 4th-generation cephalosporin or a quinolone. Length of stay was significantly longer for patients infected with ESBL-producing strains than for those infected with non-ESBL-producing strains, although the fatality rate was not significantly affected by ESBL production in this cohort. In fact, mechanical ventilation and bacteremia were the only variables with an independent association with death detected in this investigation.
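
The risk-factor part of the analysis above can be read as a multivariable model yielding adjusted odds ratios for infection with an ESBL-producing strain. The abstract does not specify the modelling details, so the following is only a sketch of one common approach (logistic regression via the statsmodels library) on synthetic data; the variable names are hypothetical stand-ins for the covariates mentioned in the abstract.

```python
# Hedged sketch: logistic regression for adjusted odds ratios, on synthetic data
# (not the study hospital's records). The library choice (statsmodels) is ours.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 104
df = pd.DataFrame({
    "mechanical_ventilation": rng.integers(0, 2, n),
    "central_venous_catheter": rng.integers(0, 2, n),
    "prior_antimicrobial_use": rng.integers(0, 2, n),
    "age_years": rng.normal(45, 18, n),
})
# Synthetic outcome loosely driven by the covariates above.
linear_predictor = (-1 + 1.2 * df["mechanical_ventilation"]
                    + 0.8 * df["prior_antimicrobial_use"] - 0.02 * df["age_years"])
df["esbl"] = rng.binomial(1, 1 / (1 + np.exp(-linear_predictor)))

covariates = ["mechanical_ventilation", "central_venous_catheter",
              "prior_antimicrobial_use", "age_years"]
model = sm.Logit(df["esbl"], sm.add_constant(df[covariates])).fit(disp=0)
print(np.exp(model.params))  # adjusted odds ratios
```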

Relevance: 60.00%

Abstract:

CONTEXT: Infection of implantable cardiac devices is an emerging disease with significant morbidity, mortality, and health care costs. OBJECTIVES: To describe the clinical characteristics and outcome of cardiac device infective endocarditis (CDIE) with attention to its health care association and to evaluate the association between device removal during index hospitalization and outcome. DESIGN, SETTING, AND PATIENTS: Prospective cohort study using data from the International Collaboration on Endocarditis-Prospective Cohort Study (ICE-PCS), conducted June 2000 through August 2006 in 61 centers in 28 countries. Patients were hospitalized adults with definite endocarditis as defined by modified Duke endocarditis criteria. MAIN OUTCOME MEASURES: In-hospital and 1-year mortality. RESULTS: CDIE was diagnosed in 177 (6.4% [95% CI, 5.5%-7.4%]) of a total cohort of 2760 patients with definite infective endocarditis. The clinical profile of CDIE included advanced patient age (median, 71.2 years [interquartile range, 59.8-77.6]); causation by staphylococci (62 [35.0% {95% CI, 28.0%-42.5%}] Staphylococcus aureus and 56 [31.6% {95% CI, 24.9%-39.0%}] coagulase-negative staphylococci); and a high prevalence of health care-associated infection (81 [45.8% {95% CI, 38.3%-53.4%}]). There was coexisting valve involvement in 66 (37.3% [95% CI, 30.2%-44.9%]) patients, predominantly tricuspid valve infection (43/177 [24.3%]), with associated higher mortality. In-hospital and 1-year mortality rates were 14.7% (26/177 [95% CI, 9.8%-20.8%]) and 23.2% (41/177 [95% CI, 17.2%-30.1%]), respectively. Proportional hazards regression analysis showed a survival benefit at 1 year for device removal during the initial hospitalization (28/141 patients [19.9%] who underwent device removal during the index hospitalization had died at 1 year, vs 13/34 [38.2%] who did not undergo device removal; hazard ratio, 0.42 [95% CI, 0.22-0.82]). CONCLUSIONS: Among patients with CDIE, the rate of concomitant valve infection is high, as is mortality, particularly if there is valve involvement. Early device removal is associated with improved survival at 1 year.

Relevance: 60.00%

Abstract:

To assess the effectiveness of a multidisciplinary evaluation and referral process in a prospective cohort of general hospital patients with alcohol dependence. Alcohol-dependent patients were identified in the wards of the general hospital and its primary care center. They were evaluated and then referred to treatment by a multidisciplinary team; patients who agreed to participate in this cohort study were consecutively included and followed for 6 months. Patients who were not included were lost to follow-up, whereas all included patients were assessed by a research psychologist at the time of inclusion and 2 and 6 months later, in order to collect standardized baseline characteristics, salient features of the process, and patient outcomes (defined as treatment adherence and abstinence). Multidisciplinary evaluation and therapeutic referral was feasible and effective, with a success rate of 43% for treatment adherence and 28% for abstinence at 6 months. Among patients' characteristics, predictors of success were age over 45 years, not living alone, being employed and being motivated for treatment (RAATE-A score < 18), whereas successful process characteristics included detoxification of the patient at the time of referral and a full multidisciplinary referral meeting. This multidisciplinary model of evaluation and referral of alcohol-dependent patients in a general hospital had a satisfactory level of effectiveness. The predictors of success and failure make it possible to identify subsets of patients for whom new strategies of motivation and treatment referral should be designed.

Relevance: 60.00%

Abstract:

More than one hundred years ago the "protein hypothesis" of the pathogenesis of atherosclerosis and its association with cardiovascular disease was put forward on the basis of animal experiments; however, it has so far never been verified in humans. This theory was soon replaced by the "lipid hypothesis", which has been confirmed in humans since 1994. Epidemiological ecological studies in the 1960s showed significant associations between dietary animal protein and mortality from cardiovascular disease. However, animal protein intake was also significantly correlated with saturated fatty acid and cholesterol intake. In the last decades, two prospective cohort studies demonstrated a decreased cardiovascular risk in women with high versus low protein intake after adjusting for other dietary factors (e.g., saturated fats) and other cardiovascular risk factors. A direct cholesterol-lowering effect of proteins has not been shown. Although earlier research indicated that soy protein has cardioprotective effects compared with other proteins, these observations have not been confirmed by randomized placebo-controlled trials. However, most experts recommend the consumption of foods rich in plant proteins as alternatives to meat and dairy products that are rich in saturated fat and contain cholesterol. There are no scientific arguments for increasing the daily protein intake to more than 20% of total energy intake, as recommended by the guidelines, in order to improve cardiovascular health.

Relevance: 60.00%

Abstract:

A significant decrease in human immunodeficiency virus type 1 (HIV-1) vertical transmission has been observed worldwide in centers where interventions such as antiretroviral therapy (ART), elective cesarean section, and avoidance of breastfeeding have been implemented. This prospective cohort study aimed to assess the determinants of and the temporal trends in HIV-1 vertical transmission in the metropolitan area of Belo Horizonte, Brazil, from January 1998 to December 2005. The rate of HIV-1 vertical transmission decreased from 20% in 1998 to 3% in 2005. This decline was associated with increased use of more complex ART regimens during pregnancy. Multivariate analysis restricted to clinical variables demonstrated that absence of ART, neonatal respiratory distress/sepsis and breastfeeding were independently associated with HIV-1 vertical transmission. When laboratory parameters were included in the model, a high maternal viral load and absence of maternal ART were associated with HIV-1 vertical transmission. The results from this study confirm the impact of ART on the reduction of HIV-1 vertical transmission and indicate the need for improvement in the care and monitoring of mother-and-infant pairs affected by HIV-1.

Relevance: 60.00%

Abstract:

BACKGROUND: Frailty, as defined by the index derived from the Cardiovascular Health Study (CHS index), predicts the risk of adverse outcomes in older adults. Use of this index, however, is impractical in clinical practice. METHODS: We conducted a prospective cohort study in 6701 women 69 years or older to compare the predictive validity of a simple frailty index with the components of weight loss, inability to rise from a chair 5 times without using the arms, and reduced energy level (Study of Osteoporotic Fractures [SOF index]) with that of the CHS index with the components of unintentional weight loss, poor grip strength, reduced energy level, slow walking speed, and low level of physical activity. Women were classified as robust, of intermediate status, or frail using each index. Falls were reported every 4 months for 1 year. Disability (≥1 new impairment in performing instrumental activities of daily living) was ascertained at 4.5 years, and fractures and deaths were ascertained during 9 years of follow-up. Area under the curve (AUC) statistics from receiver operating characteristic curve analysis and −2 log-likelihood statistics were compared for models containing the CHS index vs the SOF index. RESULTS: Increasing evidence of frailty, as defined by either the CHS index or the SOF index, was similarly associated with an increased risk of adverse outcomes. Frail women had a higher age-adjusted risk of recurrent falls (odds ratio, 2.4), disability (odds ratio, 2.2-2.8), nonspine fracture (hazard ratio, 1.4-1.5), hip fracture (hazard ratio, 1.7-1.8), and death (hazard ratio, 2.4-2.7) (P < .001 for all models). The AUC comparisons revealed no differences between models with the CHS index vs the SOF index in discriminating falls (AUC = 0.61 for both models; P = .66), disability (AUC = 0.64; P = .23), nonspine fracture (AUC = 0.55; P = .80), hip fracture (AUC = 0.63; P = .64), or death (AUC = 0.72; P = .10). Results were similar when −2 log-likelihood statistics were compared. CONCLUSION: The simple SOF index predicts the risk of falls, disability, fracture, and death as well as the more complex CHS index and may provide a useful definition of frailty to identify older women at risk of adverse health outcomes in clinical practice.
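
The head-to-head comparison above hinges on AUC statistics from receiver operating characteristic analysis of models built with each index. A minimal sketch of such a comparison follows, using scikit-learn on synthetic scores (so the resulting AUCs have no clinical meaning); the two "index" variables are hypothetical.

```python
# Illustrative AUC comparison on synthetic data, not the Study of Osteoporotic Fractures cohort.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
outcome = rng.integers(0, 2, size=500)             # e.g. recurrent falls (0/1)
chs_like_score = outcome + rng.normal(0, 2, 500)   # noisy predictor standing in for the CHS index
sof_like_score = outcome + rng.normal(0, 2, 500)   # noisy predictor standing in for the SOF index

print("AUC (CHS-like):", round(roc_auc_score(outcome, chs_like_score), 2))
print("AUC (SOF-like):", round(roc_auc_score(outcome, sof_like_score), 2))
```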

Relevance: 60.00%

Abstract:

Background and Aims: To protect the population from environmental tobacco smoke (ETS), Switzerland introduced a nationwide but rather heterogeneous smoking ban in May 2010. The exposure situation of non-smoking hospitality workers before and after implementation of the new law is being assessed in a prospective cohort study. Methods: Exposure to ETS was measured using a novel method developed by the Institute for Work and Health in Lausanne: a passive sampler called MoNIC (Monitor of NICotine). The nicotine in the ETS is fixed onto a filter and transformed into a non-volatile nicotine salt, from which the number of passively smoked cigarettes is calculated. Badges were placed at the workplace and were also distributed to the participants for personal measurements. Additionally, a salivary sample was taken to determine nicotine concentration. Results: At baseline, Spearman's correlation between workplace and personal badges was 0.47. The average number of cigarette equivalents per day at the workplace obtained by badge dropped significantly, from 5.1 (95% CI: 2.4 to 7.9) at baseline to 0.3 (0.2 to 0.4) at the first follow-up (n=29) three months later (p<0.001). For personal badges, the number of passively smoked cigarettes declined from 1.5 (0.4 to 2.7) per day to 0.5 (0.3 to 0.8) (n=16). Salivary nicotine concentration in a subset of 13 participants who had worked on the day prior to the examination was 2.63 ng/ml before and 1.53 ng/ml after the ban (p=0.04). Spearman's correlation of salivary nicotine was 0.56 with workplace badge and 0.79 with personal badge concentrations. Conclusions: Workplace measurements clearly reflect the smoking regulation in a venue. The MoNIC badge proves to be a sensitive device for determining personal ETS exposure, and the number of passively smoked cigarettes is an easily grasped result for communicating with lay audiences and study participants.
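
The badge-versus-badge and badge-versus-saliva comparisons above are Spearman rank correlations. Purely as an illustration of that statistic (the values below are synthetic, not MoNIC measurements, and the SciPy call is our choice):

```python
# Spearman's rank correlation between workplace and personal exposure readings,
# on synthetic cigarette-equivalent values (not MoNIC data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
workplace_cig_eq = rng.gamma(2.0, 2.0, size=30)                      # cigarette equivalents/day at work
personal_cig_eq = 0.4 * workplace_cig_eq + rng.gamma(1.0, 1.0, 30)   # loosely related personal exposure

rho, p_value = spearmanr(workplace_cig_eq, personal_cig_eq)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```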

Relevance: 60.00%

Abstract:

Background: The association between alcohol consumption and coronary heart disease (CHD) has been widely studied. Most of these studies have concluded that moderate alcohol intake reduces the risk of CHD. There are numerous discussions regarding whether this association is causal or biased. The objective of this paper is to analyse the association between alcohol intake and CHD risk in the Spanish cohort of the European Prospective Investigation into Cancer (EPIC). Methods: Participants from the EPIC Spanish cohort were included (15 630 men and 25 808 women). The median follow-up period was 10 years. Ethanol intake was calculated using a validated dietary history questionnaire. Participants with a definite CHD event were considered cases. A Cox regression model adjusted for relevant covariates and stratified by age was fitted; separate models were fitted for men and women. Results: The crude CHD incidence rate was 300.6/100 000 person-years for men and 47.9/100 000 person-years for women. Moderate, high and very high consumption was associated with a reduced risk of CHD in men: hazard ratio 0.90 (95% CI 0.56 to 1.44) for former drinkers, 0.65 (95% CI 0.41 to 1.04) for low, 0.49 (95% CI 0.32 to 0.76) for moderate, 0.46 (95% CI 0.30 to 0.71) for high and 0.50 (95% CI 0.29 to 0.85) for very high consumers. A negative (inverse) association was also found in women, but with p values above 0.05 in all categories. Conclusions: Alcohol intake in men aged 29–69 years was associated with a more than 30% lower CHD incidence. This study is based on a large prospective cohort and is free of the abstainer error.
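
The abstract describes an age-stratified Cox regression adjusted for covariates, with separate models for men and women. It does not name the software, so the sketch below uses the lifelines Python library and synthetic data purely to show what such a model looks like; all column names are hypothetical.

```python
# Hedged sketch of an age-stratified Cox proportional hazards model (lifelines),
# fitted to synthetic data rather than the EPIC Spanish cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "followup_years": rng.exponential(10.0, n),   # time to CHD event or censoring
    "chd_event": rng.integers(0, 2, n),           # 1 = definite CHD event, 0 = censored
    "ethanol_g_day": rng.gamma(2.0, 10.0, n),     # alcohol intake
    "smoker": rng.integers(0, 2, n),              # example adjustment covariate
    "age_band": rng.integers(0, 4, n),            # stratification variable
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="chd_event", strata=["age_band"])
cph.print_summary()  # hazard ratios are exp(coef)
```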

Relevance: 60.00%

Abstract:

Objective: The aim of this investigation was to assess the effect of malabsorptive bariatric surgery (BS) on quality of life (QoL), applying the Nottingham Health Profile (NHP) and the bariatric analysis and reporting outcome system (BAROS). Design: A prospective cohort study was performed in 100 adult patients (>18 years) undergoing bariatric surgery by a malabsorptive technique, followed for one year. Research methods and procedures: Patients were monitored from the beginning of the BS program until one year after the intervention, applying the NHP questionnaire and the BAROS test. At baseline, the mean weight of the women was 132 ± 22 kg and the Body Mass Index (BMI) was 50.7 kg/m². Results: The values obtained for the different areas of the NHP questionnaire showed statistically significant differences (p < 0.001) with respect to baseline values. According to the BAROS test, 48% of patients lost 25-49% of their excess weight and 80.8% had resolved major comorbidities at 1 year. According to the Moorehead-Ardelt QoL score, there were major improvements in employment and self-esteem in 89% and 87% of patients, respectively, and improvements in physical activity and in sexual and social relationships. According to the total mean BAROS score, the outcome was considered “very good”. Conclusion: The NHP and BAROS questionnaires appear to be useful and easily applicable tools to assess the QoL of obese patients.

Relevance: 60.00%

Abstract:

Objective: Status epilepticus (SE) prognosis is mostly related to non-modifiable factors (especially age and etiology), but the specific role of treatment appropriateness (TA) has not been investigated. Methods: In a prospective cohort with incident SE (excluding postanoxic SE), TA was defined, according to recent European recommendations, in terms of drug dosage (±30% deviation) and sequence. Outcome at hospital discharge was categorized into mortality, new handicap, or return to baseline. Results: Among 225 adults, treatment was inappropriate in 37%. In univariate analyses, age, etiology, SE severity and comorbidity, but not TA, were significantly related to outcome. Etiology (95% CI 4.3-82.8) and SE severity (95% CI 1.2-2.4) were independent predictors of mortality and of lack of return to baseline conditions (etiology: 95% CI 3.9-14.0; SE severity: 95% CI 1.4-2.2). Moreover, TA did not improve outcome prediction in the corresponding ROC curves. Conclusions: This large analysis suggests that TA plays a negligible prognostic role in SE, probably reflecting the outstanding importance of the biological background. While awaiting treatment trials in SE, it appears questionable to devote further resources to refining treatment protocols based on existing compounds; rather, new therapeutic approaches should be identified and tested.

Relevance: 60.00%

Abstract:

OBJECTIVE: Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS: We conducted a prospective cohort study involving 991 patients ≥65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first major bleeding and the time to a first clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and for anticoagulation as a time-varying covariate. RESULTS: Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION: In elderly patients who receive anticoagulants for VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason to withhold anticoagulation beyond 3 months should be based on patient preferences and the risk of VTE recurrence.

Relevance: 60.00%

Abstract:

The goal of this study was to evaluate changes in plasma human immunodeficiency virus (HIV) RNA concentration [viral load (VL)] and CD4+ percentage (CD4%) during 6-12 weeks postpartum (PP) among HIV-infected women and to assess differences according to the reason for receipt of antiretrovirals (ARVs) during pregnancy [prophylaxis (PR) vs. treatment (TR)]. Data from a prospective cohort of HIV-infected pregnant women (National Institute of Child Health and Human Development International Site Development Initiative Perinatal Study) were analyzed. Women experiencing their first pregnancy who received ARVs for PR (started during pregnancy, stopped PP) or for TR (initiated prior to pregnancy and/or continued PP) were included and were followed PP. Increases in plasma VL (>0.5 log10) and decreases in CD4% (>20% relative decrease) between hospital discharge (HD) and PP were assessed. Of the 1,229 women enrolled, 1,119 met the inclusion criteria (PR: 601; TR: 518). At enrollment, 87% were asymptomatic. The median CD4% values were 34% (PR) and 25% (TR) at HD, and 29% (PR) and 24% (TR) at PP. The proportions of women with a VL increase were 60% (PR) and 19% (TR) (p < 0.0001), and the proportions with a CD4% decrease were 36% (PR) and 18% (TR) (p < 0.0001). Women receiving PR were more likely to exhibit an increase in VL [adjusted odds ratio (AOR) 7.7; 95% CI: 5.5-10.9] and a CD4% decrease (AOR 2.3; 95% CI: 1.6-3.2). In summary, women receiving PR are more likely to have VL increases and CD4% decreases than those receiving TR. The clinical implications of these VL and CD4% changes remain to be explored.
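
The two outcome definitions above translate directly into simple threshold checks: a rise of more than 0.5 log10 in VL and a relative drop of more than 20% in CD4%. A small sketch of those checks, with illustrative values only (the function names are ours):

```python
import math

def vl_increase_flag(vl_discharge: float, vl_postpartum: float) -> bool:
    """True if plasma viral load rose by more than 0.5 log10 between discharge and postpartum."""
    return math.log10(vl_postpartum) - math.log10(vl_discharge) > 0.5

def cd4_decrease_flag(cd4_pct_discharge: float, cd4_pct_postpartum: float) -> bool:
    """True if CD4% fell by more than 20% relative to the discharge value."""
    return (cd4_pct_discharge - cd4_pct_postpartum) / cd4_pct_discharge > 0.20

# Illustrative values: VL 400 -> 2000 copies/mL (+0.7 log10), CD4% 34 -> 26 (about a 24% relative drop).
print(vl_increase_flag(400, 2000), cd4_decrease_flag(34, 26))
```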

Relevance: 60.00%

Abstract:

OBJECTIVE. The main goal of this paper is to obtain a classification model based on feed-forward multilayer perceptrons in order to improve postpartum depression prediction during the 32 weeks after childbirth with high sensitivity and specificity, and to develop a tool to be integrated into a decision support system for clinicians. MATERIALS AND METHODS. Multilayer perceptrons were trained on data from 1397 women who had just given birth at seven Spanish general hospitals, including clinical, environmental and genetic variables. A prospective cohort study was conducted, with assessments just after delivery, at 8 weeks and at 32 weeks after delivery. The models were evaluated with the geometric mean of accuracies using a hold-out strategy. RESULTS. Multilayer perceptrons showed good performance (high sensitivity and specificity) as predictive models for postpartum depression. CONCLUSIONS. The use of these models in a decision support system can be clinically evaluated in future work. Analysis of the models by pruning leads to a qualitative interpretation of the influence of each variable, which is of interest for clinical protocols.
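
The evaluation metric named above, the geometric mean of accuracies, is the square root of the product of sensitivity and specificity on the hold-out set. The sketch below shows a feed-forward multilayer perceptron evaluated that way with scikit-learn on synthetic data; the architecture and class balance are assumptions, not the paper's settings.

```python
# Hedged sketch: MLP classifier with a hold-out split, scored by the geometric
# mean of sensitivity and specificity. Synthetic data, not the Spanish cohort.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=1400, n_features=20, weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
sensitivity, specificity = tp / (tp + fn), tn / (tn + fp)
print("geometric mean of accuracies:", round(float(np.sqrt(sensitivity * specificity)), 2))
```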