178 results for PROSPECTIVE COHORT
Abstract:
In the discussion about the rationale for spine registries, two basic questions have to be answered. The first concerns the value of orthopaedic registries per se, considering them as observational studies and comparing the evidence they generate with that of randomised controlled trials. The second asks whether the need for registries in spine surgery is similar to that in the arthroplasty sector. The widely held view that randomised controlled trials are the 'gold standard' for evaluation and that observational methods have little or no value ignores the limitations of randomised trials, which may prove unnecessary, inappropriate, impossible, or inadequate. In addition, the external validity of randomised trials, and hence the ability to generalise their results, is often low. Therefore, the false conflict between those who advocate randomised trials in all situations and those who believe observational data provide sufficient evidence needs to be replaced with mutual recognition of their complementary roles. The fact that many surgical techniques and technologies were introduced into spine surgery without randomised trials or prospective cohort comparisons makes the need for spine registries even greater than in joint arthroplasty. An essential methodological prerequisite for a registry is a common terminology for reporting results, together with a sophisticated technology that networks all participants so that one central data pool is created and accessed. Recognising this need, the Spine Society of Europe has researched and developed Spine Tango, the first European spine registry, which can be accessed at www.eurospine.org.
Abstract:
BACKGROUND: The medical specialties doctors choose for their careers play an important part in the development of health-care services. This study aimed to investigate the influence of gender, personality traits, career motivation and life goal aspirations on the choice of medical specialty. METHODS: As part of a prospective cohort study of Swiss medical school graduates on career development, 522 fourth-year residents were asked in which specialty they wanted to qualify. They also assessed their career motivation and life goal aspirations. Data on personality traits such as sense of coherence, self-esteem, and gender role orientation had been collected at the first assessment, four years earlier, in their final year of medical school. Data were analysed by univariate and multivariate analyses of variance and covariance. RESULTS: By their fourth year of residency, 439 (84.1%) participants had made their specialty choice. Of these, 45 (8.6%) aspired to primary care, 126 (24.1%) to internal medicine, 68 (13.0%) to surgical specialties, 31 (5.9%) to gynaecology & obstetrics (G&O), 40 (7.7%) to anaesthesiology/intensive care, 44 (8.4%) to paediatrics, 25 (4.8%) to psychiatry and 60 (11.5%) to other specialties. Female residents tended to choose G&O, paediatrics, and anaesthesiology, whereas males more often chose surgical specialties; the remaining specialties showed no gender-related differences in frequency distribution. Gender had the strongest significant influence on specialty choice, followed by career motivation, personality traits, and life goals. Multivariate analyses of covariance indicated that career motivation and life goals mediated the influence of personality on career choice: personality traits were no longer significant after controlling for career motivation and life goals as covariates. The effect of gender remained significant after controlling for personality traits, career motivation and life goals. CONCLUSION: Gender had the greatest impact on specialty and career choice, but career motivation and life goals were also relevant influencing factors. Senior physicians mentoring junior physicians should pay special attention to these aspects. Motivational guidance throughout medical training should not focus only on the professional career but should also consider the personal life goals of those being mentored.
Abstract:
Purpose: The aim of this paper was to review the clinical literature on resonance frequency analysis (RFA) and Periotest in order to assess the validity and prognostic value of each technique in detecting implants at risk of failure. Material and methods: A search of the PubMed database was made for clinical studies using the RFA and/or Periotest techniques. Results: A limited number of clinical reports were found. No randomized controlled clinical trials or prospective cohort studies could be found for validity testing of the techniques. Consequently, only a narrative review was prepared, covering general aspects of the techniques, factors influencing measurements and the clinical relevance of the techniques. Conclusions: Factors such as bone density, upper or lower jaw, abutment length and supracrestal implant length seem to influence both RFA and Periotest measurements. Data suggest that high RFA and low Periotest values indicate successfully integrated implants, and that low or decreasing RFA and high or increasing Periotest values may be signs of ongoing disintegration and/or marginal bone loss. However, single readings using either technique are of limited clinical value. The prognostic value of the RFA and Periotest techniques in predicting loss of implant stability has yet to be established in prospective clinical studies.
Abstract:
Objectives: To assess the ability to predict tooth loss on the basis of clinical and radiographic parameters. Methods: Clinical and radiographic data from a five-year prospective cohort were studied to identify causes of progressive tooth loss in older subjects. Results: 363 subjects with a baseline mean age of 67.1 years (SD 4.7, range 60-75) were studied, 51.4% of them women, including 59.5% never smokers and 33.0% current smokers. At baseline the subjects had, on average, 22.4 teeth (SD 6.4). A risk of tooth loss was self-assessed by 16.0% of subjects, while 34% of subjects actually lost teeth. Tooth loss was due to caries in 24.7% of subjects (178 teeth), periodontitis in 15.4% (133 teeth), peri-apical lesions in 5.9% (32 teeth), combined periodontal/peri-apical lesions in 3.4% (18 teeth), and teeth irrational to treat in 7.5% (58 teeth). 122 of the extracted teeth (34%) could have been saved but were extracted. At year five, severe caries, periodontitis, peri-apical lesions, combined periodontal/peri-apical lesions, and teeth irrational to treat were found in 6.3%, 7.2%, 2.6%, 4.6%, and 1.2% of subjects, respectively. Signs of osteoporosis increased by 11.2% (Klemetti index). Linear regression analysis did not retain smoking habits as an explanatory factor. Explanatory factors were researcher prediction of extraction needs, subject self-assessment of risk, and change in osteoporosis status (r2 = 0.39, ANOVA, F = 22.6, p < 0.001). Conclusions: Caries and periodontitis are the primary causes of extraction. Progressive osteoporosis is associated with tooth loss. Radiographs and subjects' self-assessment of risk for tooth loss are robust predictors.
Abstract:
OBJECTIVES: To carry out long-term follow-up after percutaneous closure of patent foramen ovale (PFO) in patients with cryptogenic stroke. DESIGN: Prospective cohort study. SETTING: Single tertiary care centre. PARTICIPANTS: 525 consecutive patients (mean (SD) age 51 (12) years; 56% male). INTERVENTIONS: Percutaneous PFO closure without intraprocedural echocardiography. MAIN OUTCOME MEASURES: Freedom from recurrent embolic events. RESULTS: A mean (SD) of 1.7 (1.0) clinically apparent embolic events had occurred per patient, and 186 patients (35%) had had more than one event. An atrial septal aneurysm was associated with the PFO in 161 patients (31%). All patients were followed up prospectively for up to 11 years. The implantation procedure failed in two patients (0.4%). There were 13 procedural complications (2.5%), none with long-term sequelae. Contrast transoesophageal echocardiography at 6 months showed complete closure in 86% of patients, and a minimal, moderate or large residual shunt in 9%, 3% and 2%, respectively. Patients with small occluders (<30 mm; n = 429) had fewer residual shunts than those with large occluders (11% vs 27%; p<0.001). During a mean (SD) follow-up of 2.9 (2.2) years (median 2.3 years; total 1534 patient-years), six ischaemic strokes, nine transient ischaemic attacks (TIAs) and two peripheral emboli occurred. Freedom from recurrent stroke, TIA, or peripheral embolism was 98% at 1 year, 97% at 2 years and 96% at 5 and 10 years. A residual shunt (hazard ratio = 3.4; 95% CI 1.3 to 9.2) was a risk factor for recurrence. CONCLUSIONS: This study attests to the long-term safety and efficacy of percutaneous PFO closure guided by fluoroscopy only for secondary prevention of paradoxical embolism in a large cohort of consecutive patients.
Abstract:
OBJECTIVE: To test the feasibility of and interactions among three software-driven critical care protocols. DESIGN: Prospective cohort study. SETTING: Intensive care units in six European and American university hospitals. PATIENTS: 174 cardiac surgery and 41 septic patients. INTERVENTIONS: Application of software-driven protocols for cardiovascular management, sedation, and weaning during the first 7 days of intensive care. MEASUREMENTS AND RESULTS: All protocols were used simultaneously in 85% of the cardiac surgery and 44% of the septic patients, and any one of the protocols was used for 73% and 44% of the study duration, respectively. Protocol use was discontinued in 12% of patients by the treating clinician and in 6% for technical or administrative reasons. The number of protocol steps per unit of time was similar in the two diagnostic groups (n.s. for all protocols). Initial hemodynamic stability (a protocol target) was achieved in 26 ± 18 min (mean ± SD) in cardiac surgery and in 24 ± 18 min in septic patients. Sedation targets were reached in 2.4 ± 0.2 h in cardiac surgery and in 3.6 ± 0.2 h in septic patients. The weaning protocol was started in 164 (94%; 154 extubated) cardiac surgery and in 25 (60%; 9 extubated) septic patients. The median time from starting weaning to extubation (a protocol target) was 89 min (IQR 44-154) for the cardiac surgery patients and 96 min (IQR 56-205) for the septic patients. CONCLUSIONS: Multiple software-driven treatment protocols can be applied simultaneously with high acceptance and rapid achievement of primary treatment goals. Time to reach these primary goals may provide a performance indicator.
Abstract:
AIMS: It is unclear whether transcatheter aortic valve implantation (TAVI) addresses an unmet clinical need for those currently rejected for surgical aortic valve replacement (SAVR), and whether there is a subgroup of high-risk patients benefiting more from TAVI than from SAVR. In this two-centre, prospective cohort study, we compared baseline characteristics and 30-day mortality between TAVI and SAVR in consecutive patients undergoing invasive treatment for aortic stenosis. METHODS AND RESULTS: We pre-specified different adjustment methods to examine the effect of TAVI, as compared with SAVR, on overall 30-day mortality: crude univariable logistic regression, multivariable analysis adjusted for baseline characteristics, analysis adjusted for propensity scores, propensity score matched analysis, and weighted analysis using the inverse probability of treatment (IPT) as weights. A total of 1,122 patients were included in the study: 114 undergoing TAVI and 1,008 undergoing SAVR. The crude mortality rate was greater in the TAVI group (9.6% vs. 2.3%), yielding an odds ratio (OR) of 4.57 (95% CI 2.17-9.65). Compared with patients undergoing SAVR, patients undergoing TAVI were older, more likely to be in NYHA class III or IV, and had a considerably higher logistic EuroSCORE and more comorbid conditions. The adjusted OR depended on the method used to control for confounding and ranged from 0.60 (0.11-3.36) to 7.57 (0.91-63.0). We examined the distribution of propensity scores and found them to overlap sufficiently only in a narrow range. In patients with sufficient overlap of propensity scores, adjusted ORs ranged from 0.35 (0.04-2.72) to 3.17 (0.31-31.9). In patients with insufficient overlap, we consistently found increased odds of death associated with TAVI compared with SAVR irrespective of the method used to control for confounding, with adjusted ORs ranging from 5.88 (0.67-51.8) to 25.7 (0.88-750). Approximately one third of patients undergoing TAVI were found to be potentially eligible for a randomised comparison of TAVI versus SAVR. CONCLUSIONS: Both measured and unmeasured confounding limit the conclusions that can be drawn from observational comparisons of TAVI versus SAVR. Our study indicates that TAVI could be associated with either substantial benefits or harms. Randomised comparisons of TAVI versus SAVR are warranted.
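The IPT-weighted analysis pre-specified above can be sketched as follows. This is a minimal illustration of propensity-score estimation and inverse-probability-of-treatment weighting under assumed, hypothetical column names (`tavi`, `death_30d`, and a few illustrative covariates), not the study's actual analysis code.

```python
# Minimal sketch of propensity-score and IPT-weighted estimation of the
# effect of TAVI vs SAVR on 30-day mortality. All column names are
# hypothetical placeholders, not the study's variables.
import pandas as pd
import statsmodels.api as sm

def ipt_weighted_fit(df: pd.DataFrame):
    # 1) Propensity score: modelled probability of receiving TAVI
    #    given baseline characteristics.
    covariates = ["age", "nyha_class", "logistic_euroscore"]  # illustrative
    X = sm.add_constant(df[covariates])
    ps = sm.Logit(df["tavi"], X).fit(disp=0).predict(X)

    # 2) IPT weights: 1/ps for treated patients, 1/(1 - ps) for controls.
    w = df["tavi"] / ps + (1 - df["tavi"]) / (1 - ps)

    # 3) Weighted logistic regression of 30-day mortality on treatment;
    #    exp(coefficient on "tavi") is the adjusted odds ratio.
    model = sm.GLM(df["death_30d"], sm.add_constant(df["tavi"]),
                   family=sm.families.Binomial(), freq_weights=w)
    return model.fit()
```

As the abstract notes, such weighting only removes confounding by the measured covariates, and only where propensity scores overlap between groups.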
Abstract:
Sialic-acid-binding immunoglobulin-like lectin (Siglec) 9 mediates death signals in neutrophils. The objective of this study was to determine the heterogeneity of neutrophil death responses in septic shock patients and to analyze whether these ex vivo data are related to the severity and outcome of septic shock. In this prospective cohort study, blood samples from patients with septic shock (n = 26) in a medical-surgical intensive care unit (ICU) were taken within 24 h of starting treatment for septic shock (phase A), after circulatory stabilization (phase B), and 10 days after admission or at ICU discharge if earlier (phase C). Neutrophil death was quantified in the presence and absence of an agonistic anti-Siglec-9 antibody after 24 h ex vivo. In phase A, two distinct patterns of Siglec-9-mediated neutrophil death were observed relative to neutrophils from normal donors: resistance to neutrophil death after Siglec-9 ligation (n = 14; Siglec-9 nonresponders) and increased neutrophil death after Siglec-9 ligation (n = 12; Siglec-9 responders). Experiments using a pharmacological pan-caspase inhibitor provided evidence for caspase-independent neutrophil death in Siglec-9 responders upon Siglec-9 ligation. There were no differences between Siglec-9 responders and nonresponders in ICU or hospital length of stay among survivors, or in severity of organ dysfunction. Taken together, septic shock patients exhibit different ex vivo death responses of blood neutrophils after Siglec-9 ligation early in shock. Both the resistance and the increased susceptibility to Siglec-9-mediated neutrophil death tended to normalize within 72 h after shock. Further studies are required to understand the role of Siglec-9-mediated neutrophil death in septic shock.
Abstract:
OBJECTIVES: To investigate the contribution of a real-time PCR assay for the detection of Treponema pallidum in various biological specimens, with the secondary objective of comparing its value according to HIV status. METHODS: Prospective cohort of incident syphilis cases from three Swiss hospitals (Geneva and Bern University Hospitals, Outpatient Clinic for Dermatology of Triemli, Zurich) diagnosed between January 2006 and September 2008. A case-control study was nested within the cohort. Biological specimens (blood, lesion swab or urine) were collected at diagnosis, together with clinical information, and analysed by real-time PCR targeting the T pallidum 47 kDa gene. RESULTS: 126 specimens were collected from 74 patients with primary (n = 26), secondary (n = 40) and latent (n = 8) syphilis. In primary syphilis, sensitivity was 80% in lesion swabs, 28% in whole blood, 55% in serum and 29% in urine, whereas in secondary syphilis it was 20%, 36%, 47% and 44%, respectively. In secondary syphilis, plasma and cerebrospinal fluid were also tested, yielding sensitivities of 100% and 50%, respectively. The global sensitivity of T pallidum PCR (irrespective of the compartment tested) was 65% during primary syphilis, 53% during secondary syphilis and zero during latent syphilis. No difference in serology or PCR results was observed among HIV-infected patients. Specificity was 100%. CONCLUSIONS: Syphilis PCR performs best on lesion swabs from primary syphilis and displays only moderate sensitivity in blood from primary and secondary syphilis. HIV status did not modify the internal validity of PCR for the diagnosis of primary or secondary syphilis.
Abstract:
This editorial refers to ‘Increased risk of coronary heart disease among individuals reporting adverse impact of stress on their health: the Whitehall II prospective cohort study’, by H. Nabi et al., on page 2697.
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART), comprising three antiretroviral medications from at least two classes of drugs, is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunological thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, CD4 count ≤ 750 cells/mm³ or per cent CD4 ≤ 25%). This Cochrane review assesses the currently available evidence on the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies which followed children from enrolment to the start of cART and on cART. DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random-effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial; both compared initiation of cART regardless of clinical and immunological condition with deferred initiation until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included.
The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials to support either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues, such as the retention in care of children in ART programmes in resource-limited settings, will need to be considered when formulating the WHO 2013 recommendations.
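For the RCT outcomes above, the review expressed effects as relative risks with 95% confidence intervals. As a minimal sketch, the standard log-RR calculation looks like the following; the example counts are the CDC B events quoted above with hypothetical group sizes, and the function is an illustration rather than the review's actual code.

```python
# Minimal sketch: relative risk with a 95% CI from two-group event counts,
# using the standard error of log(RR). Illustrative only.
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Return (RR, CI lower, CI upper) for group A vs group B.
    Requires non-zero event counts in both groups; zero cells need a
    continuity correction (as for the RR of 2.9 quoted above)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    return (rr,
            math.exp(math.log(rr) - 1.96 * se),
            math.exp(math.log(rr) + 1.96 * se))

# 8 vs 11 CDC B events, assuming (hypothetically) ~60 children per arm:
print(relative_risk(8, 61, 11, 61))
```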
Abstract:
BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, rising to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains area, west of Sydney, Australia. Participants from the general community enrolled in the Blue Mountains Eye Study (n = 2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR at 60 mL/min/1.73 m². OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional hazards regression models. RESULTS 949 people died during follow-up, 318 of cardiovascular events. In people with eGFR < 60 mL/min/1.73 m² (n = 852), those with higher relative energy intake had an increased risk of all-cause mortality (HR, 1.48; P = 0.03) but no increased risk of cardiovascular mortality (HR, 1.59; P = 0.1) compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P = 0.04) and total sugars (HR per 100 g/d, 1.62; P = 0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake, baseline-only laboratory and food intake values, white population. CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR < 60 mL/min/1.73 m². This effect may be mediated by the influence of increasing total sugars intake on subsequent cardiovascular events.
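The adjusted Cox models described above can be sketched with the lifelines library. This is a minimal illustration under assumed, hypothetical file and column names, with a couple of placeholder covariates, not the study's actual analysis.

```python
# Minimal sketch of an adjusted Cox proportional hazards model for
# all-cause mortality with the paper's dichotomizations. File and
# column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("bmes_cohort.csv")  # hypothetical data file

# Dichotomize relative energy intake at 100%, as in the study.
df["high_energy_intake"] = (df["relative_energy_intake_pct"] > 100).astype(int)

# Restrict to participants with eGFR < 60 mL/min/1.73 m^2, the subgroup
# in which the HR of 1.48 for all-cause mortality was reported.
sub = df[df["egfr"] < 60]

# Fit an adjusted model; "age" and "sex" stand in for the covariates.
cph = CoxPHFitter()
cph.fit(sub[["followup_years", "died", "high_energy_intake", "age", "sex"]],
        duration_col="followup_years", event_col="died")
cph.print_summary()  # hazard ratios are exp(coef)
```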
Abstract:
OBJECTIVE To investigate the evolution of delirium in nursing home (NH) residents and its possible predictors. DESIGN Post-hoc analysis of a prospective cohort assessment. SETTING Ninety NHs in Switzerland. PARTICIPANTS 14,771 NH residents. MEASUREMENTS The Resident Assessment Instrument Minimum Data Set and the Nursing Home Confusion Assessment Method were used to assess subsyndromal or full delirium at follow-up, with discrete Markov chain modeling to describe long-term trajectories and multiple logistic regression analyses to determine predictors of the trajectories. RESULTS We identified four major types of delirium time course in NHs. Increasing severity of cognitive impairment and of depressive symptoms at the initial assessment predicted the different delirium time courses. CONCLUSION More pronounced cognitive impairment and depressive symptoms at the initial assessment are associated with different subsequent evolutions of delirium. The presence and evolution of delirium in the first year after NH admission predicted the subsequent course of delirium until death.
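The discrete Markov chain modeling mentioned above can be sketched as follows. The three states and all transition counts are illustrative assumptions for the sketch, not the study's data.

```python
# Minimal sketch of discrete Markov chain trajectory modeling over
# delirium states. States and counts are illustrative, not study data.
import numpy as np

states = ["no_delirium", "subsyndromal", "full_delirium"]

# Row-stochastic transition matrix estimated from observed
# assessment-to-assessment transition counts (hypothetical numbers).
counts = np.array([[800, 150, 50],
                   [200, 250, 100],
                   [ 60, 120, 170]], dtype=float)
P = counts / counts.sum(axis=1, keepdims=True)

# Propagate an initial state distribution over k follow-up assessments
# to describe a long-term trajectory.
dist = np.array([1.0, 0.0, 0.0])  # everyone delirium-free at admission
for _ in range(4):                # four follow-up assessments
    dist = dist @ P
print(dict(zip(states, dist.round(3))))
```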
Abstract:
OBJECTIVE To expand the limited information on the prognostic impact of quantitatively obtained collateral function in patients with coronary artery disease (CAD) and to assess whether such a relation is causal. DESIGN Prospective cohort study with long-term observation of clinical outcome. SETTING University hospital. PATIENTS One thousand one hundred and eighty-one patients with chronic stable CAD undergoing 1,771 quantitative, coronary pressure-derived collateral flow index (CFI) measurements, obtained during a 1-min coronary balloon occlusion (CFI is the ratio of mean distal coronary occlusive pressure to mean aortic pressure, with central venous pressure subtracted from each). A subgroup of 152 patients was included in randomised trials on the longitudinal effect of different arteriogenic protocols on CFI. INTERVENTIONS Collection of long-term follow-up information on clinical outcome. MAIN OUTCOME MEASURES All-cause mortality and major adverse cardiac events. RESULTS The cumulative 15-year survival rate was 48% in patients with CFI<0.25 and 65% in the group with CFI≥0.25 (p=0.0057). The cumulative 10-year survival rate was 75% in patients without arteriogenic therapy and 88% (p=0.0482) in the group with arteriogenic therapy and a significant increase in CFI at follow-up. By proportional hazards analysis, the following variables predicted increased all-cause mortality: age, low CFI, left ventricular end-diastolic pressure and number of vessels with CAD. CONCLUSIONS A well-functioning coronary collateral circulation independently predicts lowered mortality in patients with chronic CAD. This relation appears to be causal, because augmented collateral function achieved by arteriogenic therapy is associated with prolonged survival.
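Written out, the collateral flow index defined above is, with P_occl the mean distal coronary occlusive pressure, P_ao the mean aortic pressure and CVP the central venous pressure:

    CFI = (P_occl - CVP) / (P_ao - CVP)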
Abstract:
Background Infections with vancomycin-resistant enterococci (VRE) are a growing concern in hospitals. The impact of vancomycin resistance in enterococcal urinary tract infection is not well defined. Aim To describe the epidemiology of enterococcal bacteriuria in a hospital and to compare the clinical picture and patient outcomes according to vancomycin resistance. Methods This was a 6-month prospective cohort study of patients who were admitted with, or who developed, enterococcal bacteriuria in a 1250-bed tertiary care hospital. We examined clinical presentation, diagnostic work-up, management, and outcomes. Findings We included 254 patients with enterococcal bacteriuria; 160 (63%) were female and the median age was 65 years (range: 17–96). A total of 116 (46%) bacteriurias were hospital-acquired and 145 (57%) were catheter-associated. Most patients presented with asymptomatic bacteriuria (ASB) (119; 47%) or pyelonephritis (64; 25%); 51 (20%) had unclassifiable bacteriuria and 20 (8%) had cystitis. Secondary bloodstream infection was detected in 8 (3%) patients. Seventy of the 119 patients with ASB (59%) received antibiotics (mostly vancomycin). There were 74 (29%) VRE bacteriurias. VRE and vancomycin-susceptible enterococci (VSE) produced similar rates of pyelonephritis [19 (25%) vs 45 (25%); P = 0.2], cystitis, and ASB. Outcomes such as ICU transfer [10 (14%) VRE vs 17 (9%) VSE; P = 0.3], hospital length of stay (6.8 vs 5.0 days; P = 0.08), and mortality [10 (14%) vs 13 (7%); P = 0.1] did not vary with vancomycin susceptibility. Conclusions Vancomycin resistance affected neither the clinical presentation nor patient outcomes in this cohort of inpatients with enterococcal bacteriuria. Almost half of our cohort had enterococcal ASB, and more than 50% of these asymptomatic patients received unnecessary antibiotics. Antimicrobial stewardship efforts should address overtreatment of enterococcal bacteriuria.