872 results for Prospective Cohort


Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: To determine whether the virulence of HIV-1 has been changing since its introduction into Switzerland. DESIGN: A prospective cohort study of HIV-1 infected individuals with well-characterized pre-therapy disease history. METHODS: To minimize the effect of recently imported viruses and ethnicity-associated host factors, the analysis was restricted to the white, north-west-European majority population of the cohort. Virulence was characterized by the decline slope of the CD4 cell count (n = 817 patients), the decline slope of the CD4:CD8 ratio (n = 815 patients) and the viral setpoint (n = 549 patients) in untreated patients with sufficient data points. Linear regression models were used to detect correlations between the date of diagnosis (ranging between 1984 and 2003) and the virulence markers, controlling for gender, exposure category, age and CD4 cell count at entry. RESULTS: We found no correlation between any of the virulence markers and the date of diagnosis. Inspection of short-term trends confirmed that virulence has fluctuated around a stable level over time. CONCLUSIONS: The lack of long-term time trends in the virulence markers indicates that HIV-1 is not evolving towards increasing or decreasing virulence at a perceptible rate. Both highly virulent and attenuated strains have apparently been unable to spread at the population level. This result suggests that either the evolution of virulence may be slow or inhibited due to evolutionary constraints, or HIV-1 may have already evolved to optimal virulence in the human host.
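
The trend analysis described above (regressing a virulence marker on the date of diagnosis) can be sketched generically. A minimal sketch; the variable names and simulated data are illustrative, not taken from the cohort:

```python
import numpy as np

def trend_slope(year, marker):
    """Fit marker ~ intercept + year by ordinary least squares
    and return the estimated slope (change per year)."""
    X = np.column_stack([np.ones_like(year, dtype=float), year])
    coef, *_ = np.linalg.lstsq(X, marker, rcond=None)
    return coef[1]

# Illustrative data: a virulence marker fluctuating around a stable
# level across diagnosis years 1984-2003, i.e. no real trend.
rng = np.random.default_rng(0)
year = rng.uniform(1984, 2003, size=500)
marker = -50.0 + rng.normal(0.0, 5.0, size=500)  # flat mean over time
print(round(float(trend_slope(year, marker)), 3))
```

A slope estimate near zero (relative to its standard error) is what "no correlation between the virulence markers and the date of diagnosis" corresponds to; the study additionally controlled for gender, exposure category, age and CD4 cell count at entry.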

In the discussion about the rationale for spine registries, two basic questions have to be answered. The first concerns the value of orthopaedic registries per se, considering them as observational studies and comparing the evidence they generate with that of randomised controlled trials. The second asks whether the need for registries in spine surgery is similar to that in the arthroplasty sector. The widely held view that randomised controlled trials are the 'gold standard' for evaluation and that observational methods have little or no value ignores the limitations of randomised trials: they may prove unnecessary, inappropriate, impossible, or inadequate. In addition, the external validity of randomised trials, and hence the ability to generalise their results, is often low. Therefore, the false conflict between those who advocate randomised trials in all situations and those who believe observational data provide sufficient evidence needs to be replaced with mutual recognition of their complementary roles. The fact that many surgical techniques or technologies were introduced into spine surgery without randomised trials or prospective cohort comparisons makes the need for spine registries even greater than in joint arthroplasty. An essential methodological prerequisite for a registry is a common terminology for reporting results, together with a sophisticated technology that networks all participants so that one central data pool is created and accessed. Recognising this need, the Spine Society of Europe has researched and developed Spine Tango, the first European spine registry, which can be accessed at www.eurospine.org.

BACKGROUND: The medical specialties chosen by doctors for their careers play an important part in the development of health-care services. This study aimed to investigate the influence of gender, personality traits, career motivation and life goal aspirations on the choice of medical specialty. METHODS: As part of a prospective cohort study of Swiss medical school graduates on career development, 522 fourth-year residents were asked in what specialty they wanted to qualify. They also assessed their career motivation and life goal aspirations. Data concerning personality traits such as sense of coherence, self-esteem, and gender role orientation were collected at the first assessment, four years earlier, in their final year of medical school. Data analyses were conducted by univariate and multivariate analyses of variance and covariance. RESULTS: In their fourth year of residency 439 (84.1%) participants had made their specialty choice. Of these, 45 (8.6%) subjects aspired to primary care, 126 (24.1%) to internal medicine, 68 (13.0%) to surgical specialties, 31 (5.9%) to gynaecology & obstetrics (G&O), 40 (7.7%) to anaesthesiology/intensive care, 44 (8.4%) to paediatrics, 25 (4.8%) to psychiatry and 60 (11.5%) to other specialties. Female residents tended to choose G&O, paediatrics, and anaesthesiology, whereas male residents more often chose surgical specialties; the other specialties showed no gender-related differences in frequency distribution. Gender had the strongest significant influence on specialty choice, followed by career motivation, personality traits, and life goals. Multivariate analyses of covariance indicated that career motivation and life goals mediated the influence of personality on career choice. Personality traits were no longer significant after controlling for career motivation and life goals as covariates. The effect of gender remained significant after controlling for personality traits, career motivation and life goals.
CONCLUSION: Gender had the greatest impact on specialty and career choice, but there were also two other relevant influencing factors, namely career motivation and life goals. Senior physicians mentoring junior physicians should pay special attention to these aspects. Motivational guidance throughout medical training should not only focus on the professional career but also consider the personal life goals of those being mentored.

Purpose: The aim of this paper was to review the clinical literature on the resonance frequency analysis (RFA) and Periotest techniques in order to assess the validity and prognostic value of each technique in detecting implants at risk of failure. Material and methods: A search was made using the PubMed database to find clinical studies using the RFA and/or Periotest techniques. Results: A limited number of clinical reports were found. No randomized controlled clinical trials or prospective cohort studies could be found for validity testing of the techniques. Consequently, only a narrative review was prepared to cover general aspects of the techniques, factors influencing measurements and the clinical relevance of the techniques. Conclusions: Factors such as bone density, upper or lower jaw, abutment length and supracrestal implant length seem to influence both RFA and Periotest measurements. Data suggest that high RFA and low Periotest values indicate successfully integrated implants and that low/decreasing RFA and high/increasing Periotest values may be signs of ongoing disintegration and/or marginal bone loss. However, single readings using either of the techniques are of limited clinical value. The prognostic value of the RFA and Periotest techniques in predicting loss of implant stability has yet to be established in prospective clinical studies. To cite this article: Aparicio C, Lang N P, Rangert B. Validity and clinical significance of biomechanical testing of implant/bone interface. Clin. Oral Imp. Res., 17 (Suppl. 2), 2006; 2-7.

Objectives: To assess the ability to predict tooth loss on the basis of clinical and radiographic parameters. Methods: Clinical and radiographic data from a five-year prospective cohort were studied to identify causes of progressive tooth loss in older subjects. Results: 363 subjects with a baseline mean age of 67.1 years (S.D. ± 4.7, range: 60-75) and 51.4% women were studied, including 59.5% never smokers and 33.0% current smokers. At baseline the subjects had, on average, 22.4 teeth (S.D. ± 6.4). Self-assessed tooth loss risk was identified by 16.0% of subjects, while 34% of subjects lost teeth. Tooth loss due to caries was found in 24.7% of the subjects (178 teeth), periodontitis in 15.4% (133 teeth), peri-apical lesions in 5.9% (32 teeth), combined periodontal/peri-apical lesions in 3.4% (18 teeth), and teeth irrational to treat in 7.5% (58 teeth). 122 of the extracted teeth (34%) could have been saved but were extracted. At year five, severe caries, periodontitis, peri-apical lesions, combined periodontal/peri-apical lesions, and teeth irrational to treat were found in 6.3%, 7.2%, 2.6%, 4.6%, and 1.2% of subjects, respectively. Signs of osteoporosis increased by 11.2% (Klemetti index). Linear regression analysis failed to include smoking habits as explanatory. Explanatory factors were researcher prediction of extraction needs, subject self-assessment of risk, and change in osteoporosis status (r2 = 0.39; ANOVA, F = 22.6, p < 0.001). Conclusions: Caries and periodontitis are the primary causes of extraction. Progressive osteoporosis is associated with tooth loss. Radiographs and subjects' self-assessment of risk for tooth loss are robust predictors.

Prospective cohort studies have provided evidence on longer-term mortality risks of fine particulate matter (PM2.5), but due to their complexity and costs, only a few have been conducted. By linking monitoring data to the U.S. Medicare system by county of residence, we developed a retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), comprising over 20 million enrollees in the 250 largest counties during 2000-2002. We estimated log-linear regression models having as outcome the age-specific mortality rate for each county and as the main predictor the county's average PM2.5 level over the study period. Area-level covariates were used to adjust for socio-economic status and smoking. We reported results under several degrees of adjustment for spatial confounding and with stratification by eastern, central and western counties. We estimated that a 10 µg/m3 increase in PM2.5 is associated with a 7.6% increase in mortality (95% CI: 4.4 to 10.8%). We found a stronger association in the eastern counties than nationally, with no evidence of an association in western counties. When adjusted for spatial confounding, the estimated log-relative risks drop by 50%. We demonstrated the feasibility of using Medicare data to establish cohorts for follow-up for effects of air pollution. Particulate matter (PM) air pollution is a global public health problem (1). In developing countries, levels of airborne particles still reach concentrations at which serious health consequences are well-documented; in developed countries, recent epidemiologic evidence shows continued adverse effects, even though particle levels have declined in the last two decades (2-6). Increased mortality associated with higher levels of PM air pollution has been of particular concern, giving an imperative for stronger protective regulations (7). Evidence on PM and health comes from studies of acute and chronic adverse effects (6).
The London Fog of 1952 provides dramatic evidence of the unacceptable short-term risk of extremely high levels of PM air pollution (8-10); multi-site time-series studies of daily mortality show that far lower levels of particles are still associated with short-term risk (5,11-13). Cohort studies provide complementary evidence on the longer-term risks of PM air pollution, indicating the extent to which exposure reduces life expectancy. The design of these studies involves follow-up of cohorts for mortality over periods of years to decades and an assessment of mortality risk in association with estimated long-term exposure to air pollution (2-4,14-17). Because of the complexity and costs of such studies, only a small number have been conducted. The most rigorously executed, including the Harvard Six Cities Study and the American Cancer Society's (ACS) Cancer Prevention Study II, have provided generally consistent evidence for an association of long-term exposure to particulate matter air pollution with increased all-cause and cardio-respiratory mortality (2,4,14,15). Results from these studies have been used in risk assessments conducted for setting the U.S. National Ambient Air Quality Standard (NAAQS) for PM and for estimating the global burden of disease attributable to air pollution (18,19). Additional prospective cohort studies are necessary, however, to confirm associations between long-term exposure to PM and mortality, to broaden the populations studied, and to refine estimates by regions across which particle composition varies. Toward this end, we have used data from the U.S. Medicare system, which covers nearly all persons 65 years of age and older in the United States.
We linked Medicare mortality data to PM2.5 (particulate matter less than 2.5 µm in aerodynamic diameter) air pollution monitoring data to create a new retrospective cohort study, the Medicare Air Pollution Cohort Study (MCAPS), consisting of 20 million persons from 250 counties and representing about 50% of the US population of elderly living in urban settings. In this paper, we report on the relationship between longer-term exposure to PM2.5 and mortality risk over the period 2000 to 2002 in the MCAPS.
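
In a log-linear mortality model of this kind, a fitted coefficient beta (per µg/m3) converts to a percent increase per 10 µg/m3 as 100 * (exp(10 * beta) - 1). A minimal sketch; the coefficient below is back-solved from the reported 7.6% for illustration, not taken from the study's fitted model:

```python
import math

def pct_increase(beta, delta=10.0):
    """Percent change in a mortality rate implied by a log-linear
    coefficient `beta` (per µg/m3) for a `delta` µg/m3 rise in PM2.5."""
    return 100.0 * (math.exp(beta * delta) - 1.0)

# A coefficient of ~0.007325 per µg/m3 corresponds to the reported
# ~7.6% increase per 10 µg/m3 (value back-solved for illustration).
print(round(pct_increase(0.007325), 1))  # → 7.6
```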

OBJECTIVES: To carry out long-term follow-up after percutaneous closure of patent foramen ovale (PFO) in patients with cryptogenic stroke. DESIGN: Prospective cohort study. SETTING: Single tertiary care centre. PARTICIPANTS: 525 consecutive patients (mean (SD) age 51 (12) years; 56% male). INTERVENTIONS: Percutaneous PFO closure without intraprocedural echocardiography. MAIN OUTCOME MEASURES: Freedom from recurrent embolic events. RESULTS: A mean (SD) of 1.7 (1.0) clinically apparent embolic events occurred for each patient, and 186 patients (35%) had >1 event. An atrial septal aneurysm was associated with the PFO in 161 patients (31%). All patients were followed up prospectively for up to 11 years. The implantation procedure failed in two patients (0.4%). There were 13 procedural complications (2.5%) without any long-term sequelae. Contrast transoesophageal echocardiography at 6 months showed complete closure in 86% of patients, and a minimal, moderate or large residual shunt in 9%, 3% and 2%, respectively. Patients with small occluders (<30 mm; n = 429) had fewer residual shunts (small 11% vs large 27%; p<0.001). During a mean (SD) follow-up of 2.9 (2.2) years (median 2.3 years; total 1534 patient-years), six ischaemic strokes, nine transient ischaemic attacks (TIAs) and two peripheral emboli occurred. Freedom from recurrent stroke, TIA, or peripheral embolism was 98% at 1 year, 97% at 2 years and 96% at 5 and 10 years, respectively. A residual shunt (hazard ratio = 3.4; 95% CI 1.3 to 9.2) was a risk factor for recurrence. CONCLUSIONS: This study attests to the long-term safety and efficacy of percutaneous PFO closure guided by fluoroscopy only for secondary prevention of paradoxical embolism in a large cohort of consecutive patients.
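
Event-free survival figures such as "96% at 5 years" are typically Kaplan-Meier estimates, which handle the varying follow-up (censoring) in such cohorts. A minimal sketch of the estimator on made-up follow-up data, not the study's:

```python
import numpy as np

def km_event_free(times, events, t):
    """Kaplan-Meier estimate of the probability of remaining event-free
    at time t. `times` are follow-up times; `events` flags recurrences
    (1) versus censoring (0)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk = len(times)
    surv = 1.0
    for ti, ei in zip(times, events):
        if ti > t:
            break
        if ei == 1:
            surv *= 1.0 - 1.0 / at_risk  # multiply by conditional survival
        at_risk -= 1                      # leave risk set (event or censored)
    return surv

# Illustrative: 5 patients, recurrences at t=1 and t=3, rest censored.
print(round(km_event_free([1, 2, 3, 4, 5], [1, 0, 1, 0, 0], 5.0), 3))  # → 0.533
```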

OBJECTIVE: To test the feasibility of and interactions among three software-driven critical care protocols. DESIGN: Prospective cohort study. SETTING: Intensive care units in six European and American university hospitals. PATIENTS: 174 cardiac surgery and 41 septic patients. INTERVENTIONS: Application of software-driven protocols for cardiovascular management, sedation, and weaning during the first 7 days of intensive care. MEASUREMENTS AND RESULTS: All protocols were used simultaneously in 85% of the cardiac surgery and 44% of the septic patients, and any one of the protocols was used for 73% and 44% of the study duration, respectively. Protocol use was discontinued in 12% of patients by the treating clinician and in 6% for technical/administrative reasons. The number of protocol steps per unit of time was similar in the two diagnostic groups (n.s. for all protocols). Initial hemodynamic stability (a protocol target) was achieved in 26 ± 18 min (mean ± SD) in cardiac surgery and in 24 ± 18 min in septic patients. Sedation targets were reached in 2.4 ± 0.2 h in cardiac surgery and in 3.6 ± 0.2 h in septic patients. The weaning protocol was started in 164 (94%; 154 extubated) cardiac surgery and in 25 (60%; 9 extubated) septic patients. The median (interquartile range) time from starting weaning to extubation (a protocol target) was 89 min (range 44-154 min) for the cardiac surgery patients and 96 min (range 56-205 min) for the septic patients. CONCLUSIONS: Multiple software-driven treatment protocols can be applied simultaneously with high acceptance and rapid achievement of primary treatment goals. Time to reach these primary goals may provide a performance indicator.

AIMS: It is unclear whether transcatheter aortic valve implantation (TAVI) addresses an unmet clinical need for those currently rejected for surgical aortic valve replacement (SAVR) and whether there is a subgroup of high-risk patients benefiting more from TAVI than from SAVR. In this two-centre, prospective cohort study, we compared baseline characteristics and 30-day mortality between TAVI and SAVR in consecutive patients undergoing invasive treatment for aortic stenosis. METHODS AND RESULTS: We pre-specified different adjustment methods to examine the effect of TAVI as compared with SAVR on overall 30-day mortality: crude univariable logistic regression analysis, multivariable analysis adjusted for baseline characteristics, analysis adjusted for propensity scores, propensity score matched analysis, and weighted analysis using the inverse probability of treatment (IPT) as weights. A total of 1,122 patients were included in the study: 114 undergoing TAVI and 1,008 undergoing SAVR. The crude mortality rate was greater in the TAVI group (9.6% vs. 2.3%), yielding an odds ratio (OR) of 4.57 (95% CI 2.17-9.65). Compared to patients undergoing SAVR, patients undergoing TAVI were older, more likely to be in NYHA class III and IV, and had a considerably higher logistic EuroSCORE and more comorbid conditions. The adjusted OR depended on the method used to control for confounding and ranged from 0.60 (0.11-3.36) to 7.57 (0.91-63.0). We examined the distribution of propensity scores and found scores to overlap sufficiently only in a narrow range. In patients with sufficient overlap of propensity scores, adjusted ORs ranged from 0.35 (0.04-2.72) to 3.17 (0.31-31.9). In patients with insufficient overlap, we consistently found increased odds of death associated with TAVI compared with SAVR irrespective of the method used to control for confounding, with adjusted ORs ranging from 5.88 (0.67-51.8) to 25.7 (0.88-750).
Approximately one third of patients undergoing TAVI were found to be potentially eligible for a randomised comparison of TAVI versus SAVR. CONCLUSIONS: Both measured and unmeasured confounding limit the conclusions that can be drawn from observational comparisons of TAVI versus SAVR. Our study indicates that TAVI could be associated with either substantial benefits or harms. Randomised comparisons of TAVI versus SAVR are warranted.
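
Of the adjustment methods listed above, inverse-probability-of-treatment weighting reweights each patient by the inverse of the probability of the treatment actually received. A sketch of the weight calculation, assuming propensity scores have already been estimated (e.g. by logistic regression on the baseline characteristics); the numbers are purely illustrative:

```python
import numpy as np

def ipt_weights(treated, propensity):
    """Inverse-probability-of-treatment weights: 1/p for treated
    subjects and 1/(1-p) for controls, given propensity scores p."""
    treated = np.asarray(treated, dtype=float)
    propensity = np.asarray(propensity, dtype=float)
    return treated / propensity + (1.0 - treated) / (1.0 - propensity)

# Illustrative: a treated patient with a low propensity score gets a
# large weight, since few comparable patients received the treatment.
w = ipt_weights([1, 0], [0.1, 0.9])
print(w)  # → [10. 10.]
```

The extreme weights produced when propensity scores barely overlap are one way the overlap problem described in the abstract manifests in practice.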

Sialic-acid-binding immunoglobulin-like lectin (Siglec) 9 mediates death signals in neutrophils. The objective of this study was to determine the heterogeneity of neutrophil death responses in septic shock patients and to analyze whether these ex vivo data are related to the severity and outcome of septic shock. In this prospective cohort study, blood samples of patients with septic shock (n = 26) in a medical-surgical intensive care unit (ICU) were taken within 24 h of starting the treatment of septic shock (phase A), after circulatory stabilization (phase B), and 10 days after admission or at ICU discharge if earlier (phase C). Neutrophil death was quantified in the presence and absence of an agonistic anti-Siglec-9 antibody after 24 h ex vivo. In phase A, two distinct patterns of Siglec-9-mediated neutrophil death were observed: resistance to neutrophil death (n = 14; Siglec-9 nonresponders) and increased neutrophil death (n = 12; Siglec-9 responders) after Siglec-9 ligation compared with neutrophils from normal donors. Experiments using a pharmacological pan-caspase-inhibitor provided evidence for caspase-independent neutrophil death in Siglec-9 responders upon Siglec-9 ligation. There were no differences between Siglec-9 responders and nonresponders in length of ICU or hospital stay of survivors or severity of organ dysfunction. Taken together, septic shock patients exhibit different ex vivo death responses of blood neutrophils after Siglec-9 ligation early in shock. Both the resistance and the increased susceptibility to Siglec-9-mediated neutrophil death tend to normalize within 72 h after shock. Further studies are required to understand the role of Siglec-9-mediated neutrophil death in septic shock.

OBJECTIVES: To investigate the contribution of a real-time PCR assay for the detection of Treponema pallidum in various biological specimens, with the secondary objective of comparing its value according to HIV status. METHODS: Prospective cohort of incident syphilis cases from three Swiss hospitals (Geneva and Bern University Hospitals, Outpatient Clinic for Dermatology of Triemli, Zurich) diagnosed between January 2006 and September 2008. A case-control study was nested into the cohort. Biological specimens (blood, lesion swab or urine) were taken at diagnosis, together with clinical information, and analysed by real-time PCR targeting the T pallidum 47 kDa gene. RESULTS: 126 specimens were collected from 74 patients with primary (n = 26), secondary (n = 40) and latent (n = 8) syphilis. Among primary syphilis cases, sensitivity was 80% in lesion swabs, 28% in whole blood, 55% in serum and 29% in urine, whereas among secondary syphilis cases it was 20%, 36%, 47% and 44%, respectively. Among secondary syphilis cases, plasma and cerebrospinal fluid were also tested and provided sensitivities of 100% and 50%, respectively. The global sensitivity of T pallidum PCR (irrespective of the compartment tested) was 65% during primary, 53% during secondary and 0% during latent syphilis. No difference regarding serology or PCR results was observed among HIV-infected patients. Specificity was 100%. CONCLUSIONS: Syphilis PCR provides better sensitivity in lesion swabs from primary syphilis and displays only moderate sensitivity in blood from primary and secondary syphilis. HIV status did not modify the internal validity of PCR for the diagnosis of primary or secondary syphilis.
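
Sensitivity and specificity figures of this kind reduce to simple proportions of a 2x2 table of assay results against confirmed diagnosis. A sketch with purely illustrative counts, not the study's data:

```python
def sensitivity(true_pos, false_neg):
    """Proportion of confirmed cases the assay detects."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of non-cases the assay correctly calls negative."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts only: 20 of 25 case specimens testing positive
# gives the kind of ~80% sensitivity reported for lesion swabs, and
# no false positives among 50 non-cases gives 100% specificity.
print(sensitivity(20, 5))   # → 0.8
print(specificity(50, 0))   # → 1.0
```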

This editorial refers to ‘Increased risk of coronary heart disease among individuals reporting adverse impact of stress on their health: the Whitehall II prospective cohort study’†, by H. Nabi et al., on page 2697

BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunologic thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 count ≤ 750 cells/mm3, or per cent CD4 ≤ 25%). This Cochrane review summarises the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies which followed children from enrolment to the start of cART and while on cART.
DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial and both compared initiation of cART regardless of clinical-immunological conditions with deferred initiation until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratios for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). 
AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating WHO 2013 recommendations.
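
The relative risks with 95% confidence intervals quoted above follow the standard log-scale Wald construction. A sketch with illustrative counts, not the trial's (whose zero event cells would need a continuity correction this simple version does not handle):

```python
import math

def relative_risk_ci(a, n1, c, n2, z=1.96):
    """Relative risk of an event in group 1 (a events among n1 subjects)
    versus group 2 (c events among n2 subjects), with a Wald confidence
    interval built on the log scale. Requires non-zero event counts."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative: 10/100 events vs. 5/100 events.
rr, lo, hi = relative_risk_ci(10, 100, 5, 100)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 2.0 0.71 5.64
```

As in the review's subgroup analyses, small event counts produce very wide intervals, which is why the point estimates there are so imprecise.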

BACKGROUND Mortality risk for people with chronic kidney disease is substantially greater than that for the general population, increasing to a 7-fold greater risk for those on dialysis therapy. Higher body mass index, generally due to higher energy intake, appears protective for people on dialysis therapy, but the relationship between energy intake and survival in those with reduced kidney function is unknown. STUDY DESIGN Prospective cohort study with a median follow-up of 14.5 (IQR, 11.2-15.2) years. SETTING & PARTICIPANTS Blue Mountains Area, west of Sydney, Australia. Participants in the general community enrolled in the Blue Mountains Eye Study (n = 2,664) who underwent a detailed interview, food frequency questionnaire, and physical examination including body weight, height, blood pressure, and laboratory tests. PREDICTORS Relative energy intake, food components (carbohydrates, total sugars, fat, protein, and water), and estimated glomerular filtration rate (eGFR). Relative energy intake was dichotomized at 100%, and eGFR at 60 mL/min/1.73 m2. OUTCOMES All-cause and cardiovascular mortality. MEASUREMENTS All-cause and cardiovascular mortality using unadjusted and adjusted Cox proportional hazards regression models. RESULTS 949 people died during follow-up, 318 from cardiovascular events. In people with eGFR < 60 mL/min/1.73 m2 (n = 852), there was an increased risk of all-cause mortality (HR, 1.48; P = 0.03), but no increased risk of cardiovascular mortality (HR, 1.59; P = 0.1), among those with higher relative energy intake compared with those with lower relative energy intake. Increasing intake of carbohydrates (HR per 100 g/d, 1.50; P = 0.04) and total sugars (HR per 100 g/d, 1.62; P = 0.03) was significantly associated with increased risk of cardiovascular mortality. LIMITATIONS Under-reporting of energy intake; baseline laboratory and food intake values only; white population.
CONCLUSIONS Increasing relative energy intake was associated with increased all-cause mortality in patients with eGFR < 60 mL/min/1.73 m2. This effect may be mediated by the impact of increasing total sugars intake on subsequent cardiovascular events.

OBJECTIVE To investigate the evolution of delirium in nursing home (NH) residents and its possible predictors. DESIGN Post-hoc analysis of a prospective cohort assessment. SETTING Ninety NHs in Switzerland. PARTICIPANTS 14,771 NH residents. MEASUREMENTS The Resident Assessment Instrument Minimum Data Set and the Nursing Home Confusion Assessment Method were used to assess subsyndromal or full delirium in NH residents at follow-up, with discrete Markov chain modeling used to describe long-term trajectories and multiple logistic regression analyses to determine predictors of the trajectories. RESULTS We identified four major types of delirium time courses in NHs. Increasing severity of cognitive impairment and of depressive symptoms at the initial assessment predicted the different delirium time courses. CONCLUSION More pronounced cognitive impairment and depressive symptoms at the initial assessment are associated with different subsequent evolutions of delirium. The presence and evolution of delirium in the first year after NH admission predicted the subsequent course of delirium until death.
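
Discrete Markov chain modeling of this kind amounts to propagating a distribution over delirium states through a transition matrix estimated from repeated assessments. A sketch with a hypothetical 3-state matrix; the states and probabilities are illustrative, not estimated from the registry data:

```python
import numpy as np

# Hypothetical 3-state model: "no delirium", "subsyndromal delirium",
# "full delirium". Each row gives transition probabilities from one
# state at an assessment to the states at the next assessment.
P = np.array([
    [0.85, 0.10, 0.05],   # from no delirium
    [0.30, 0.50, 0.20],   # from subsyndromal delirium
    [0.10, 0.30, 0.60],   # from full delirium
])

def state_distribution(start, steps):
    """Distribution over states after `steps` assessments, starting
    from the distribution `start`, under the chain with matrix P."""
    dist = np.asarray(start, dtype=float)
    for _ in range(steps):
        dist = dist @ P
    return dist

# Long-run behaviour starting from "no delirium" at admission.
long_run = state_distribution([1.0, 0.0, 0.0], 100)
print(np.round(long_run, 3))
```

Distinct long-term trajectories, like the four time courses the study reports, correspond in this framework to different starting states or different estimated transition matrices across resident subgroups.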