716 results for Prospective cohort
Abstract:
Sialic-acid-binding immunoglobulin-like lectin (Siglec) 9 mediates death signals in neutrophils. The objective of this study was to determine the heterogeneity of neutrophil death responses in septic shock patients and to analyze whether these ex vivo data are related to the severity and outcome of septic shock. In this prospective cohort study, blood samples from patients with septic shock (n = 26) in a medical-surgical intensive care unit (ICU) were taken within 24 h of starting treatment for septic shock (phase A), after circulatory stabilization (phase B), and 10 days after admission or at ICU discharge if earlier (phase C). Neutrophil death was quantified in the presence and absence of an agonistic anti-Siglec-9 antibody after 24 h ex vivo. In phase A, two distinct patterns of Siglec-9-mediated neutrophil death were observed: resistance to neutrophil death (n = 14; Siglec-9 nonresponders) and increased neutrophil death (n = 12; Siglec-9 responders) after Siglec-9 ligation, compared with neutrophils from normal donors. Experiments using a pharmacological pan-caspase inhibitor provided evidence for caspase-independent neutrophil death in Siglec-9 responders upon Siglec-9 ligation. There were no differences between Siglec-9 responders and nonresponders in length of ICU or hospital stay among survivors or in severity of organ dysfunction. Taken together, septic shock patients exhibit different ex vivo death responses of blood neutrophils after Siglec-9 ligation early in shock. Both the resistance and the increased susceptibility to Siglec-9-mediated neutrophil death tend to normalize within 72 h after shock onset. Further studies are required to understand the role of Siglec-9-mediated neutrophil death in septic shock.
Abstract:
OBJECTIVES: To investigate the contribution of a real-time PCR assay for the detection of Treponema pallidum in various biological specimens, with the secondary objective of comparing its value according to HIV status. METHODS: Prospective cohort of incident syphilis cases from three Swiss hospitals (Geneva and Bern University Hospitals, Outpatient Clinic for Dermatology of Triemli, Zurich) diagnosed between January 2006 and September 2008. A case-control study was nested into the cohort. Biological specimens (blood, lesion swab or urine) were taken at diagnosis, together with clinical information, and analysed by real-time PCR targeting the T. pallidum 47 kDa gene. RESULTS: 126 specimens were collected from 74 patients with primary (n = 26), secondary (n = 40) and latent (n = 8) syphilis. In primary syphilis, sensitivity was 80% in lesion swabs, 28% in whole blood, 55% in serum and 29% in urine, whereas in secondary syphilis it was 20%, 36%, 47% and 44%, respectively. In secondary syphilis, plasma and cerebrospinal fluid were also tested and yielded sensitivities of 100% and 50%, respectively. The overall sensitivity of T. pallidum PCR (irrespective of the compartment tested) was 65% during primary syphilis, 53% during secondary syphilis and zero during latent syphilis. No difference in serology or PCR results was observed among HIV-infected patients. Specificity was 100%. CONCLUSIONS: Syphilis PCR provides its best sensitivity in lesion swabs from primary syphilis and displays only moderate sensitivity in blood from primary and secondary syphilis. HIV status did not modify the internal validity of PCR for the diagnosis of primary or secondary syphilis.
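For reference, the sensitivity and specificity figures quoted above follow the standard definitions, where TP, FN, TN and FP denote true positives, false negatives, true negatives and false positives:

$$\text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP}$$

The 80% sensitivity for lesion swabs in primary syphilis, for example, means that 80% of swabs from confirmed primary cases tested PCR-positive.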
Abstract:
This editorial refers to ‘Increased risk of coronary heart disease among individuals reporting adverse impact of stress on their health: the Whitehall II prospective cohort study’, by H. Nabi et al., on page 2697.
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART), comprising three antiretroviral medications from at least two classes of drugs, is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunological thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 count ≤ 750 cells/mm³, or CD4 per cent ≤ 25%). This Cochrane review assesses the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment to the start of cART and while on cART. DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including the incidence of CDC category C and B clinical events and CD4 per cent (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random-effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared initiation of cART regardless of clinical or immunological condition with initiation deferred until CD4 per cent dropped below 15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included.
The adjusted hazard ratio for the effect on mortality of delaying cART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials to support either early or CD4-guided initiation of cART in HIV-infected children aged 2 to 5 years. Programmatic issues, such as the retention in care of children in ART programmes in resource-limited settings, will need to be considered when formulating the WHO 2013 recommendations.
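The review pooled the RCT results with a random-effects model. Below is a minimal sketch of the DerSimonian-Laird estimator commonly used for such pooling; the input effects and variances are illustrative placeholders, not the review's data.

```python
import numpy as np

def dersimonian_laird(yi, vi):
    """Pool per-study effects (e.g., log relative risks) with the
    DerSimonian-Laird random-effects method.
    yi: study effect estimates on the log scale; vi: within-study variances.
    Returns the pooled estimate, its standard error, and tau^2."""
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    w = 1.0 / vi                                   # fixed-effect weights
    y_fe = np.sum(w * yi) / np.sum(w)              # fixed-effect pooled estimate
    q = np.sum(w * (yi - y_fe) ** 2)               # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(yi) - 1)) / c)       # method-of-moments between-study variance
    w_re = 1.0 / (vi + tau2)                       # random-effects weights
    y_re = np.sum(w_re * yi) / np.sum(w_re)
    return y_re, np.sqrt(1.0 / np.sum(w_re)), tau2

# Placeholder log-RRs and variances for two trials (not the review's data).
log_rr, se, tau2 = dersimonian_laird([0.10, -0.25], [0.30, 0.20])
lo, hi = np.exp(log_rr - 1.96 * se), np.exp(log_rr + 1.96 * se)
print(f"pooled RR {np.exp(log_rr):.2f} (95% CI {lo:.2f} to {hi:.2f}); tau^2 = {tau2:.3f}")
```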
Abstract:
BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n=2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR, at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional hazards regression models. RESULTS 949 people died during follow-up, 318 from cardiovascular events. In people with eGFR <60 mL/min/1.73 m² (n=852), there was an increased risk of all-cause mortality (HR, 1.48; P=0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P=0.1), among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P=0.04) and of total sugars (HR per 100 g/d, 1.62; P=0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline laboratory and food intake values only, white population. CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR <60 mL/min/1.73 m². This effect may be mediated by the impact of increasing total sugars intake on subsequent cardiovascular events.
Abstract:
OBJECTIVE To investigate the evolution of delirium in nursing home (NH) residents and its possible predictors. DESIGN Post-hoc analysis of a prospective cohort assessment. SETTING Ninety NHs in Switzerland. PARTICIPANTS A total of 14,771 NH residents. MEASUREMENTS The Resident Assessment Instrument Minimum Data Set and the Nursing Home Confusion Assessment Method were used to follow subsyndromal and full delirium in NH residents; discrete Markov chain modeling was used to describe long-term trajectories, and multiple logistic regression analyses to determine predictors of the trajectories. RESULTS We identified four major types of delirium time course in NHs. Increasing severity of cognitive impairment and of depressive symptoms at the initial assessment predicted the different delirium time courses. CONCLUSION More pronounced cognitive impairment and depressive symptoms at the initial assessment are associated with different subsequent evolutions of delirium. The presence and evolution of delirium in the first year after NH admission predicted the subsequent course of delirium until death.
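As a rough illustration of the discrete Markov chain modeling mentioned above, delirium-state trajectories can be summarized by a transition matrix estimated from observed state sequences. The three states and the sequences below are hypothetical stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical states: 0 = no delirium, 1 = subsyndromal, 2 = full delirium.
# Each list is one resident's sequence of periodic assessments.
sequences = [
    [0, 0, 1, 1, 0],
    [1, 2, 2, 1, 1],
    [0, 1, 2, 2, 2],
]

n_states = 3
counts = np.zeros((n_states, n_states))
for seq in sequences:
    for a, b in zip(seq, seq[1:]):      # count observed transitions a -> b
        counts[a, b] += 1

# Maximum-likelihood estimate of the transition matrix: normalize each row.
transition = counts / counts.sum(axis=1, keepdims=True)
print(np.round(transition, 2))          # row i, column j: P(next = j | now = i)
```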
Abstract:
OBJECTIVE To expand the limited information on the prognostic impact of quantitatively obtained collateral function in patients with coronary artery disease (CAD) and to estimate the causality of such a relation. DESIGN Prospective cohort study with long-term observation of clinical outcome. SETTING University hospital. PATIENTS One thousand one hundred and eighty-one patients with chronic stable CAD underwent 1771 quantitative, coronary pressure-derived collateral flow index (CFI) measurements, obtained during a 1-min coronary balloon occlusion (CFI is the ratio of mean distal coronary occlusive pressure to mean aortic pressure, with central venous pressure subtracted from each). A subgroup of 152 patients was included in randomised trials on the longitudinal effect of different arteriogenic protocols on CFI. INTERVENTIONS Collection of long-term follow-up information on clinical outcome. MAIN OUTCOME MEASURES All-cause mortality and major adverse cardiac events. RESULTS The cumulative 15-year survival rate was 48% in patients with CFI<0.25 and 65% in the group with CFI≥0.25 (p=0.0057). The cumulative 10-year survival rate was 75% in patients without arteriogenic therapy and 88% (p=0.0482) in the group with arteriogenic therapy and a significant increase in CFI at follow-up. By proportional hazards analysis, the following variables predicted increased all-cause mortality: age, low CFI, left ventricular end-diastolic pressure and number of vessels with CAD. CONCLUSIONS A well-functioning coronary collateral circulation independently predicts lowered mortality in patients with chronic CAD. This relation appears to be causal, because augmented collateral function under arteriogenic therapy is associated with prolonged survival.
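Written out, the collateral flow index defined above is

$$\mathrm{CFI} = \frac{P_{\mathrm{occl}} - P_{\mathrm{CV}}}{P_{\mathrm{ao}} - P_{\mathrm{CV}}}$$

where $P_{\mathrm{occl}}$ is the mean distal coronary pressure during balloon occlusion, $P_{\mathrm{ao}}$ the mean aortic pressure, and $P_{\mathrm{CV}}$ the central venous pressure.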
Abstract:
Background Infections with vancomycin-resistant enterococci (VRE) are a growing concern in hospitals. The impact of vancomycin resistance in enterococcal urinary tract infection is not well defined. Aim To describe the epidemiology of enterococcal bacteriuria in a hospital and to compare the clinical picture and patient outcomes depending on vancomycin resistance. Methods This was a 6-month prospective cohort study of hospital patients who were admitted with, or who developed, enterococcal bacteriuria in a 1250-bed tertiary care hospital. We examined clinical presentation, diagnostic work-up, management, and outcomes. Findings We included 254 patients with enterococcal bacteriuria; 160 (63%) were female and the median age was 65 years (range: 17–96). A total of 116 (46%) bacteriurias were hospital-acquired and 145 (57%) catheter-associated. Most patients presented with asymptomatic bacteriuria (ASB) (119; 47%) or pyelonephritis (64; 25%); 51 (20%) had unclassifiable bacteriuria and 20 (8%) had cystitis. Secondary bloodstream infection was detected in 8 (3%) patients. Seventy of the 119 patients (59%) with ASB received antibiotics (mostly vancomycin). There were 74 (29%) VRE bacteriurias. VRE and vancomycin-susceptible enterococci (VSE) produced similar rates of pyelonephritis [19 (25%) vs 45 (25%); P = 0.2], cystitis, and ASB. Outcomes such as ICU transfer [10 (14%) VRE vs 17 (9%) VSE; P = 0.3], hospital length of stay (6.8 vs 5.0 days; P = 0.08), and mortality [10 (14%) vs 13 (7%); P = 0.1] did not vary with vancomycin susceptibility. Conclusions Vancomycin resistance neither affected the clinical presentation nor impacted patient outcomes in this cohort of inpatients with enterococcal bacteriuria. Almost half of our cohort had enterococcal ASB, and more than 50% of these asymptomatic patients received unnecessary antibiotics. Antimicrobial stewardship efforts should address overtreatment of enterococcal bacteriurias.
Abstract:
We assessed the impact of antiviral prophylaxis and preemptive therapy on the incidence and outcomes of cytomegalovirus (CMV) disease in a nationwide prospective cohort of solid organ transplant recipients. Risk factors associated with CMV disease and graft failure-free survival were analyzed using Cox regression models. One thousand two hundred thirty-nine patients transplanted from May 2008 until March 2011 were included; 466 (38%) patients received CMV prophylaxis and 522 (42%) patients were managed preemptively. Overall incidence of CMV disease was 6.05% and was linked to CMV serostatus (D+/R− vs. R+, hazard ratio [HR] 5.36 [95% CI 3.14–9.14], p < 0.001). No difference in the incidence of CMV disease was observed in patients receiving antiviral prophylaxis as compared to the preemptive approach (HR 1.16 [95% CI 0.63–2.17], p = 0.63). CMV disease was not associated with a lower graft failure-free survival (HR 1.27 [95% CI 0.64–2.53], p = 0.50). Nevertheless, patients followed by the preemptive approach had an inferior graft failure-free survival after a median of 1.05 years of follow-up (HR 1.63 [95% CI 1.01–2.64], p = 0.044). The incidence of CMV disease in this cohort was low and not influenced by the preventive strategy used. However, patients on CMV prophylaxis were more likely to be free from graft failure.
Abstract:
Background Escherichia coli is a common cause of asymptomatic and symptomatic bacteriuria in hospitalized patients. Asymptomatic bacteriuria (ASB) is frequently treated with antibiotics without a clear indication. Our goal was to determine patient and pathogen factors suggestive of ASB. Methods We conducted a 12-month prospective cohort study of adult inpatients with E. coli bacteriuria seen at a tertiary care hospital in St. Louis, Missouri, USA. Urine cultures were taken at the discretion of treating physicians. Bacterial isolates were tested for 14 putative virulence genes using high-throughput dot-blot hybridization. Results The median age of the 287 study patients was 65 (range: 19–101) years; 78% were female. Seventy percent had community-acquired bacteriuria. One hundred ten (38.3%) patients had ASB and 177 (61.7%) had symptomatic urinary tract infection (sUTI). Asymptomatic patients were more likely than symptomatic patients to have congestive heart failure (p = 0.03), a history of myocardial infarction (p = 0.01), chronic pulmonary disease (p = 0.045), peripheral vascular disease (p = 0.04), and dementia (p = 0.03). Patients with sUTI were more likely to be neutropenic at the time of bacteriuria (p = 0.046). Chronic pulmonary disease [OR 2.1 (95% CI 1.04, 4.1)] and dementia [OR 2.4 (95% CI 1.02, 5.8)] were independent predictors of asymptomatic bacteriuria. Absence of pyuria was not predictive of ASB. None of the individual virulence genes tested was associated with ASB, nor was the total number of genes. Conclusions Asymptomatic E. coli bacteriuria in hospitalized patients was frequent and more common in patients with dementia and chronic pulmonary disease. Bacterial virulence factors could not discriminate symptomatic from asymptomatic bacteriurias; asymptomatic E. coli bacteriuria cannot be predicted by virulence screening.
Abstract:
Health-related quality of life (HRQOL) is an important measure of the effects of chronic liver disease in affected patients and helps guide interventions to improve well-being. However, the relationship between HRQOL and survival in liver transplant candidates remains unclear. We examined whether the Physical Component Summary (PCS) and Mental Component Summary (MCS) scores from the Short Form 36 (SF-36) Health Survey were associated with survival in liver transplant candidates. We administered the SF-36 questionnaire (version 2.0) to patients in the Pulmonary Vascular Complications of Liver Disease study, a multicenter prospective cohort of patients evaluated for liver transplantation in 7 academic centers in the United States between 2003 and 2006. Cox proportional hazards models were used with death as the primary outcome and with adjustment for liver transplantation as a time-varying covariate. The mean age of the 252 participants was 54 ± 10 years; 64% were male, and 94% were white. During 422 person-years of follow-up, 147 patients (58%) were listed, 75 patients (30%) underwent transplantation, 49 patients (19%) died, and 3 patients were lost to follow-up. Lower baseline PCS scores were associated with an increased mortality rate despite adjustment for age, gender, Model for End-Stage Liver Disease score, and liver transplantation (P for trend = 0.0001). The MCS score was not associated with mortality (P for trend = 0.53). In conclusion, the PCS score significantly predicts survival in liver transplant candidates, and interventions directed toward improving physical status may be helpful in improving outcomes in liver transplant candidates.
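The study's key methodological point, entering liver transplantation as a time-varying covariate in a Cox model, can be sketched with the lifelines library. The long-format data below are simulated, and the variable names are illustrative assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for i in range(200):
    pcs = rng.normal(40, 10)                  # simulated baseline PCS score
    t_tx = rng.uniform(0.5, 3.0)              # hypothetical transplant time (years)
    t_end = rng.exponential(2.0 + pcs / 20)   # longer survival with higher PCS
    if rng.random() < 0.5 and t_end > t_tx:
        # Transplanted: the covariate switches from 0 to 1 at t_tx.
        rows.append((i, 0.0, t_tx, 0, 0, pcs))
        rows.append((i, t_tx, t_end, 1, 1, pcs))
    else:
        rows.append((i, 0.0, t_end, 1, 0, pcs))

df = pd.DataFrame(rows, columns=["id", "start", "stop",
                                 "event", "transplanted", "pcs"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios for PCS and for transplant status
```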
Abstract:
IMPORTANCE Little is known about whether sex differences in acute coronary syndrome (ACS) presentation exist in young patients and what factors determine the absence of chest pain in ACS presentation. OBJECTIVES To evaluate sex differences in ACS presentation and to estimate associations between sex, sociodemographic factors, gender identity, psychosocial and clinical factors, markers of coronary disease severity, and absence of chest pain in young patients with ACS. DESIGN, SETTING, PARTICIPANTS We conducted a prospective cohort study of 1015 patients (30% women) 55 years or younger, hospitalized for ACS and enrolled in the GENESIS PRAXY (Gender and Sex Determinants of Cardiovascular Disease: From Bench to Beyond Premature Acute Coronary Syndrome) study (January 2009-September 2012). MAIN OUTCOMES AND MEASURES The McSweeney Acute and Prodromal Myocardial Infarction Symptom Survey was administered during hospitalization. RESULTS The median age for both sexes was 49 years. Women were more likely than men to have non-ST-segment elevation myocardial infarction (37.5% vs 30.7%; P = .03) and to present without chest pain (19.0% vs 13.7%; P = .03). Patients without chest pain reported fewer symptoms overall, and no discernible pattern of non-chest pain symptoms was found. In the multivariate model, being a woman (odds ratio [OR], 1.95 [95% CI, 1.23-3.11]; P = .005) and tachycardia (OR, 2.07 [95% CI, 1.20-3.56]; P = .009) were independently associated with ACS presentation without chest pain. Patients without chest pain did not differ significantly from those with chest pain in terms of ACS type, troponin level elevation, or coronary stenosis. CONCLUSIONS AND RELEVANCE Chest pain was the most common ACS symptom in both sexes. Although women were more likely than men to present without chest pain, the absence of chest pain was not associated with markers of coronary disease severity. Strategies that explicitly incorporate assessment of common non-chest pain symptoms need to be evaluated.
Abstract:
BACKGROUND Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration and reclassification of cardiovascular mortality. METHODS 3267 patients (2283 men), aged 18-95 years, at intermediate to high risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were selected using forward and backward stepwise Cox proportional hazards models. RESULTS During 10 years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during the stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001) and the integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by conventional risk factors alone.
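A minimal sketch of the discrimination comparison reported above: fit nested Cox models with and without a biomarker and compare Harrell's C-index via lifelines. The data are simulated, and the single "biomarker" stands in for the study's four-marker panel.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000

age = rng.normal(60, 10, n)                  # conventional risk factor
il6 = rng.exponential(2.0, n)                # stand-in biomarker
hazard = np.exp(0.03 * (age - 60) + 0.25 * il6)
time = rng.exponential(10.0 / hazard)
event = (time < 10).astype(int)              # administrative censoring at 10 years
time = np.minimum(time, 10)

df = pd.DataFrame({"T": time, "E": event, "age": age, "il6": il6})

base = CoxPHFitter().fit(df[["T", "E", "age"]], "T", "E")   # conventional model
full = CoxPHFitter().fit(df, "T", "E")                      # + biomarker

print("C-index, conventional only:", round(base.concordance_index_, 3))
print("C-index, with biomarker:   ", round(full.concordance_index_, 3))
```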
Abstract:
Medial arterial calcification is accelerated in patients with CKD and strongly associated with increased arterial rigidity and cardiovascular mortality. Recently, a novel in vitro blood test that provides an overall measure of calcification propensity by monitoring the maturation time (T50) of calciprotein particles in serum was described. We used this test to measure serum T50 in a prospective cohort of 184 patients with stages 3 and 4 CKD, with a median of 5.3 years of follow-up. At baseline, the major determinants of serum calcification propensity included higher serum phosphate, ionized calcium, increased bone osteoclastic activity, and lower free fetuin-A, plasma pyrophosphate, and albumin concentrations, which accounted for 49% of the variation in this parameter. Increased serum calcification propensity at baseline independently associated with aortic pulse wave velocity in the complete cohort and progressive aortic stiffening over 30 months in a subgroup of 93 patients. After adjustment for demographic, renal, cardiovascular, and biochemical covariates, including serum phosphate, risk of death among patients in the lowest T50 tertile was more than two times the risk among patients in the highest T50 tertile (adjusted hazard ratio, 2.2; 95% confidence interval, 1.1 to 5.4; P=0.04). This effect was lost, however, after additional adjustment for aortic stiffness, suggesting a shared causal pathway. Longitudinally, serum calcification propensity measurements remained temporally stable (intraclass correlation=0.81). These results suggest that serum T50 may be helpful as a biomarker in designing methods to improve defenses against vascular calcification.
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet these scores focus only on treatment priority, have suboptimal performance, and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority, (b) medical risk and thus the need for in-hospital treatment, and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded, and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge, from the nurses' and, where involved, the social workers' perspectives. To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care, and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints are defined as (a) initial triage priority (high vs low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus the need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs, and patients' care needs after returning home. Discussion: Using a reliable initial triage system to estimate initial treatment priority, the need for in-hospital treatment, and post-acute care needs is an innovative and persuasive approach to a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and to optimize resource allocation to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared in a later randomized controlled trial against a usual-care control group in terms of resource use, length of hospital stay, overall costs, and patient outcomes, namely mortality, re-hospitalization, quality of life, and satisfaction with care.
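As a sketch of how one component of such a triage algorithm might be derived, the example below fits a logistic regression for the adverse 30-day outcome from admission vital signs and the PACD score. All data, coefficients, and variable choices are simulated assumptions for illustration, not results of this study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 5000

# Simulated admission data: temperature, heart rate, systolic BP, PACD score.
X = np.column_stack([
    rng.normal(37.0, 0.8, n),
    rng.normal(90, 20, n),
    rng.normal(120, 25, n),
    rng.integers(0, 22, n),
])
# Hypothetical true model: tachycardia, hypotension and a higher PACD raise risk.
logit = -2 + 0.04 * (X[:, 1] - 90) - 0.02 * (X[:, 2] - 120) + 0.15 * (X[:, 3] - 10)
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # adverse 30-day outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```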