Abstract:
OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients from a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 d. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from the EQ-5D); all parameters were measured at 30 d. RESULTS Of 3186 patients (mean age 71 y, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of the QoL measure. Associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analyses, with evidence of effect modification (P for interaction < 0.05) by age and main diagnosis group. CONCLUSION Nutritional risk is significant in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.
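The unadjusted version of the associations reported above reduces to 2×2-table arithmetic. Below is a minimal Python sketch, with invented counts rather than the study's data, of how an odds ratio and its Wald 95% CI are obtained; the abstract's adjusted estimates come from multivariate models, which this does not reproduce.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table.

    a: events among exposed      b: non-events among exposed
    c: events among unexposed    d: non-events among unexposed
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Illustrative counts only (not the study's data): deaths vs survivors
# among patients with NRS >= 3 and NRS < 3.
print(odds_ratio_ci(a=150, b=737, c=70, d=2229))
```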
Abstract:
BACKGROUND & AIMS Non-selective beta-blockers (NSBB) are used in patients with cirrhosis and oesophageal varices. Experimental data suggest that NSBB inhibit angiogenesis and reduce bacterial translocation, which may prevent hepatocellular carcinoma (HCC). We therefore assessed the effect of NSBB on HCC by performing a systematic review with meta-analyses of randomized trials. METHODS Electronic and manual searches were combined. Authors were contacted for unpublished data. Included trials assessed NSBB for patients with cirrhosis; the control group could receive any intervention other than NSBB. Fixed- and random-effects meta-analyses were performed with I² as a measure of heterogeneity. Subgroup, sensitivity, regression and sequential analyses were performed to evaluate heterogeneity, bias and the robustness of the results after adjusting for multiple testing. RESULTS Twenty-three randomized trials on 2618 patients with cirrhosis were included, of which 12 reported HCC incidence and 23 reported HCC mortality. The mean duration of follow-up was 26 months (range 8-82). In total, 47 of 694 patients randomized to NSBB developed HCC vs 65 of 697 controls (risk difference -0.026; 95% CI -0.052 to -0.001; number needed to treat 38 patients). There was no heterogeneity (I² = 7%) or evidence of small-study effects (Egger's P = 0.402). The result was not confirmed in sequential analysis, which suggested that 3719 patients would be needed to achieve the required information size. NSBB did not reduce HCC-related mortality (RD -0.011; 95% CI -0.040 to 0.017). CONCLUSIONS Non-selective beta-blockers may prevent HCC in patients with cirrhosis.
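The number needed to treat follows directly from the risk difference (NNT = 1/|RD|). A quick check in Python using the event counts quoted above; the crude pooling gives roughly 39 because the meta-analysis weights per-trial estimates rather than lumping raw counts, which is why the reported NNT is 38.

```python
# Reproduce the risk-difference / NNT arithmetic from the pooled counts:
# 47/694 HCC events on NSBB vs 65/697 among controls.
events_nsbb, n_nsbb = 47, 694
events_ctrl, n_ctrl = 65, 697

rd = events_nsbb / n_nsbb - events_ctrl / n_ctrl  # ~ -0.026
nnt = 1 / abs(rd)                                 # ~ 39 (reported: 38)
print(f"risk difference = {rd:.3f}, NNT = {nnt:.0f}")
```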
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of 10 449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In women at high ATR, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES even in women at high ATR is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
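The adjusted hazard ratios above come from Cox proportional hazards models fitted to the pooled patient-level data. A hedged sketch of such a fit with the lifelines library; the toy rows and column names are invented and stand in for the pooled trial database.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented illustration of a 3-year MACE analysis; not the pooled data.
df = pd.DataFrame({
    "days_to_mace": [1095, 240, 980, 730, 90, 1095, 400, 1050],
    "mace":         [0,    1,   0,   1,   1,  0,    1,   0],  # 1 = event
    "high_atr":     [0,    1,   1,   1,   0,  0,    1,   0],  # high atherothrombotic risk
    "new_gen_des":  [1,    0,   1,   1,   1,  0,    0,   1],  # new- vs early-generation DES
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_mace", event_col="mace")
cph.print_summary()  # hazard ratios with 95% CIs
```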
Abstract:
BACKGROUND Management of tuberculosis in patients with HIV in eastern Europe is complicated by the high prevalence of multidrug-resistant tuberculosis, low rates of drug susceptibility testing, and poor access to antiretroviral therapy (ART). We report 1 year mortality estimates from a multiregional (eastern Europe, western Europe, and Latin America) prospective cohort study: the TB:HIV study. METHODS Consecutive HIV-positive patients aged 16 years or older with a diagnosis of tuberculosis between Jan 1, 2011, and Dec 31, 2013, were enrolled from 62 HIV and tuberculosis clinics in 19 countries in eastern Europe, western Europe, and Latin America. The primary endpoint was death within 12 months after starting tuberculosis treatment; all deaths were classified according to whether or not they were tuberculosis related. Follow-up was either until death, the final visit, or 12 months after baseline, whichever occurred first. Risk factors for all-cause and tuberculosis-related deaths were assessed using Kaplan-Meier estimates and Cox models. FINDINGS Of 1406 patients (834 in eastern Europe, 317 in western Europe, and 255 in Latin America), 264 (19%) died within 12 months. 188 (71%) of these deaths were tuberculosis related. The probability of all-cause death was 29% (95% CI 26-32) in eastern Europe, 4% (3-7) in western Europe, and 11% (8-16) in Latin America (p<0·0001) and the corresponding probabilities of tuberculosis-related death were 23% (20-26), 1% (0-3), and 4% (2-8), respectively (p<0·0001). Patients receiving care outside eastern Europe had a 77% decreased risk of death: adjusted hazard ratio (aHR) 0·23 (95% CI 0·16-0·31). In eastern Europe, compared with patients who started a regimen with at least three active antituberculosis drugs, those who started fewer than three active antituberculosis drugs were at a higher risk of tuberculosis-related death (aHR 3·17; 95% CI 1·83-5·49), as were those who did not have baseline drug-susceptibility tests (2·24; 1·31-3·83). Other prognostic factors for increased tuberculosis-related mortality were disseminated tuberculosis and a low CD4 cell count. 18% of patients were receiving ART at tuberculosis diagnosis in eastern Europe compared with 44% in western Europe and 39% in Latin America (p<0·0001); 12 months later the proportions were 67% in eastern Europe, 92% in western Europe, and 85% in Latin America (p<0·0001). INTERPRETATION Patients with HIV and tuberculosis in eastern Europe have a risk of death nearly four times higher than that of patients in western Europe and Latin America. This increased mortality rate is associated with modifiable risk factors such as lack of drug susceptibility testing and suboptimal initial antituberculosis treatment in settings with a high prevalence of drug resistance. Urgent action is needed to improve tuberculosis care for patients living with HIV in eastern Europe. FUNDING EU Seventh Framework Programme.
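The regional death probabilities above are Kaplan-Meier estimates. A hedged sketch of the approach using lifelines; the handful of rows below are invented, not the TB:HIV data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "months": [12, 3, 12, 7, 12, 2, 12, 12, 5, 12],   # follow-up, capped at 12
    "died":   [0,  1, 0,  1, 0,  1, 0,  0,  1, 0],
    "region": ["east", "east", "west", "east", "west",
               "east", "latam", "west", "latam", "latam"],
})

kmf = KaplanMeierFitter()
for region, grp in df.groupby("region"):
    kmf.fit(grp["months"], event_observed=grp["died"], label=region)
    # 1 - S(t) at the end of follow-up = 12-month death probability
    print(region, round(1 - kmf.survival_function_.iloc[-1, 0], 2))
```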
Abstract:
Background and Study Aim Intra- and paraventricular tumors are frequently associated with cerebrospinal fluid (CSF) pathway obstruction. The aim of an endoscopic approach is therefore to restore patency of the CSF pathways and to obtain a tumor biopsy. Because endoscopic tumor biopsy may increase tumor cell dissemination, this study sought to evaluate this risk. Patients, Materials, and Methods Forty-four patients who underwent endoscopic biopsies for ventricular or paraventricular tumors between 1993 and 2011 were included in the study. Charts and images were reviewed retrospectively to evaluate rates of adverse events, mortality, and tumor cell dissemination. Results Postoperative clinical condition improved in 63.0% of patients, remained stable in 30.4%, and worsened in 6.6%. One patient (2.2%) had a postoperative thalamic stroke leading to hemiparesis and hemineglect. No procedure-related deaths occurred. Postoperative tumor cell dissemination was observed in 14.3% of patients available for follow-up. Conclusions For patients presenting with occlusive hydrocephalus due to tumors in or adjacent to the ventricular system, endoscopic CSF diversion is the procedure of first choice. Tumor biopsy in the current study did not affect safety or efficacy.
Abstract:
Survivors of childhood cancer have higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year- and sex-standardized mortality ratios (SMR) and absolute excess risks (AER) for different COD, by Poisson regression. We included 3965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory systems (SMR 12.7), and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, 21.3% by second cancers (AER 5.3) and 33.3% by circulatory diseases (AER 8.3). Our study confirms elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increasing proportion of deaths caused by late toxicities of treatment. The results underline the importance of clinical follow-up continuing years after the end of treatment for childhood cancer.
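The SMR is simply observed over expected deaths, and the AER is the excess expressed per person-time (conventionally per 10,000 person-years in this literature). Back-calculating from the abstract's overall figures; the expected count is approximate, and the abstract's AERs are cause- and period-specific, so the overall value below is purely illustrative.

```python
observed = 246          # deaths among 5-year survivors
smr = 11.0              # reported standardized mortality ratio
person_years = 49_704

expected = observed / smr                            # ~22.4 expected deaths
aer = (observed - expected) / person_years * 10_000  # per 10,000 person-years
print(f"expected deaths ~ {expected:.1f}, overall AER ~ {aer:.1f}")
```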
Abstract:
We used meat-inspection data collected over three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so that they can be communicated to, and managed by, the slaughter industry and veterinary services. There were three main predictors of WCC risk at meat inspection: the slaughtered animal's sex and age, and the size of the slaughterhouse in which it was processed. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle with clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with fattening tend to end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights that the risk factors for WCC are as complex as the production system itself, interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system giving farmers feedback on condemnation reasons and on their performance compared with the national/regional average could be a first step towards improving herd management and financial returns for producers.
Abstract:
BACKGROUND Strategies to improve risk prediction are of major importance in patients with heart failure (HF). Fibroblast growth factor 23 (FGF-23) is an endocrine regulator of phosphate and vitamin D homeostasis associated with increased cardiovascular risk. We aimed to assess the prognostic effect of FGF-23 on mortality in HF patients, with a particular focus on differences between patients with HF with preserved ejection fraction (HFpEF) and patients with HF with reduced ejection fraction (HFrEF). METHODS AND RESULTS FGF-23 levels were measured in 980 patients with HF enrolled in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, including 511 patients with HFrEF and 469 patients with HFpEF, with a median follow-up time of 8.6 years. FGF-23 was additionally measured in a second cohort comprising 320 patients with advanced HFrEF. FGF-23 was independently associated with mortality, with an adjusted hazard ratio per 1-SD increase of 1.30 (95% confidence interval, 1.14-1.48; P<0.001) in patients with HFrEF, whereas no such association was found in patients with HFpEF (P for interaction=0.043). External validation confirmed the significant association with mortality, with an adjusted hazard ratio per 1-SD increase of 1.23 (95% confidence interval, 1.02-1.60; P=0.027). FGF-23 demonstrated increased discriminatory power for mortality in addition to N-terminal pro-B-type natriuretic peptide (C-statistic: 0.59 versus 0.63) and an improvement in the net reclassification index (39.6%; P<0.001). CONCLUSIONS FGF-23 is independently associated with an increased risk of mortality in patients with HFrEF but not in those with HFpEF, suggesting different pathophysiologic roles in the two entities.
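A "hazard ratio per 1-SD increase" is obtained by standardizing the (typically log-transformed) biomarker before the Cox fit, so that exp(coef) is the per-SD hazard ratio. A sketch with lifelines on invented data; the column names are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "years": rng.exponential(8.6, n),             # follow-up time
    "died": rng.integers(0, 2, n),                # 1 = death observed
    "log_fgf23": np.log(rng.lognormal(4, 1, n)),  # invented marker values
})
# Standardize so one unit of the covariate = one standard deviation.
df["fgf23_per_sd"] = (df["log_fgf23"] - df["log_fgf23"].mean()) / df["log_fgf23"].std()

cph = CoxPHFitter()
cph.fit(df[["years", "died", "fgf23_per_sd"]], duration_col="years", event_col="died")
print(cph.hazard_ratios_)  # hazard ratio per 1-SD increase
```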
Abstract:
AIMS High-density lipoproteins (HDLs) are considered anti-atherogenic. Recent experimental findings suggest that their biological properties can be modified in certain clinical conditions by accumulation of serum amyloid A (SAA). The effect of SAA on the association between HDL-cholesterol (HDL-C) and cardiovascular outcome remains unknown. METHODS AND RESULTS We examined the association of SAA and HDL-C with mortality in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, which included 3310 patients undergoing coronary angiography. To validate our findings, we analysed 1255 participants of the German Diabetes and Dialysis study (4D) and 4027 participants of the Cooperative Health Research in the Region of Augsburg (KORA) S4 study. In LURIC, SAA concentrations predicted all-cause and cardiovascular mortality. In patients with low SAA, higher HDL-C was associated with lower all-cause and cardiovascular mortality. In contrast, in patients with high SAA, higher HDL-C was associated with increased all-cause and cardiovascular mortality, indicating that SAA indeed modifies the beneficial properties of HDL. We complemented these clinical observations with in vitro experiments, in which SAA impaired vascular functions of HDL. Using measured HDL-C and SAA from the LURIC study, we further derived a formula for the simple calculation of the amount of biologically 'effective' HDL-C. In the 4D and KORA S4 studies, measured HDL-C was not associated with clinical outcomes, whereas calculated 'effective' HDL-C significantly predicted better outcome. CONCLUSION The acute-phase protein SAA modifies the biological effects of HDL-C in several clinical conditions. The concomitant measurement of SAA is a simple, useful, and clinically applicable surrogate for the vascular functionality of HDL.
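The effect modification described here, where the direction of the HDL-C association flips with SAA, is what an interaction term tests. A hedged sketch of a Cox model with an HDL-C × SAA interaction on invented data; this is not the LURIC analysis, and the published 'effective' HDL-C formula is deliberately not reproduced.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "years": rng.exponential(8, n),
    "died": rng.integers(0, 2, n),
    "hdl_c": rng.normal(50, 12, n),     # mg/dL, invented
    "high_saa": rng.integers(0, 2, n),  # 1 = SAA above a cut-off
})
df["hdl_x_saa"] = df["hdl_c"] * df["high_saa"]  # interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
# A non-zero hdl_x_saa coefficient means the HDL-C association differs
# between low- and high-SAA patients.
cph.print_summary()
```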
Abstract:
Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions, such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions, where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with a timing and amount strongly related to burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small- (DBH <12 cm) and intermediate-diameter (DBH 12–36 cm) trees face a higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-severity sites, mortality remains low and at a level similar to unburnt beech forests. Tree diameter, the presence of fungal infestation and elevation are the most significant drivers of mortality. The risk of dying increases toward higher elevation and is higher for small-diameter than for large-diameter trees. In the case of secondary fungal infestation, beech generally faces a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from those occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.
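Drivers of a binary mortality outcome like this are commonly screened with logistic regression. A hedged sketch on simulated trees, with effect sizes invented to mimic the direction of the reported drivers (smaller diameter, fungal infestation and higher elevation increase risk); it is not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "dbh_cm": rng.uniform(5, 60, n),           # diameter at breast height
    "fungi": rng.integers(0, 2, n),            # secondary fungal infestation
    "elevation_m": rng.uniform(800, 1800, n),
})
# Invented data-generating relationship, for illustration only.
logit = -1 - 0.05 * df["dbh_cm"] + 1.2 * df["fungi"] + 0.001 * df["elevation_m"]
df["dead"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("dead ~ dbh_cm + fungi + elevation_m", data=df).fit()
print(model.summary())  # odds ratios are exp(coef)
```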
Abstract:
BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models were used to examine the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation, and programme expansion, and corrected for unascertained mortality. RESULTS Among 17 038 patients, the median CD4 count at initiation increased from 119 cells/μL (IQR 54-180) in 2008 to 257 cells/μL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/μL and CD4 counts ≥300 cells/μL. After adjustment, patients with CD4 counts ≥300 cells/μL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with a CD4 count of 150-199 cells/μL. This increased risk for patients with CD4 counts ≥300 cells/μL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/μL and LTFU, while the association between CD4 counts ≥300 cells/μL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. With programmes initiating patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
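Hazard ratios against a reference CD4 stratum, as reported above, come from coding the categories as indicator variables and omitting the reference. A hedged lifelines sketch on invented data; the 150-199 cells/μL group is dropped so all hazard ratios are relative to it.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "months": rng.uniform(1, 24, n),  # follow-up, invented
    "ltfu": rng.integers(0, 2, n),    # 1 = lost to follow-up
    "cd4": rng.choice(["<100", "100-149", "150-199", "200-299", ">=300"], n),
})

# Indicator coding with CD4 150-199 as the omitted reference category.
X = pd.get_dummies(df["cd4"], prefix="cd4").drop(columns="cd4_150-199").astype(float)
X = pd.concat([X, df[["months", "ltfu"]]], axis=1)

cph = CoxPHFitter()
cph.fit(X, duration_col="months", event_col="ltfu")
cph.print_summary()  # HRs relative to the 150-199 cells/uL stratum
```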
Abstract:
OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and the Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS Among 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200 copies/mL) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/μL. Estimates of the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
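Inverse-probability weighting compares strategies by weighting each person by the inverse of the probability of following the strategy they actually followed, so that each weighted group resembles the full cohort. A minimal point-treatment sketch on invented data; the study's implementation (time-varying strategies over follow-up) is considerably more involved.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(40, 10, n),
    "baseline_cd4": rng.normal(350, 100, n),
    "less_frequent": rng.integers(0, 2, n),  # 1 = 6-monthly, 0 = 3-monthly monitoring
    "died": rng.integers(0, 2, n),
})

# Step 1: model the probability of the less frequent strategy given covariates.
ps_model = LogisticRegression().fit(df[["age", "baseline_cd4"]], df["less_frequent"])
p = ps_model.predict_proba(df[["age", "baseline_cd4"]])[:, 1]

# Step 2: weight by the inverse probability of the strategy actually followed.
w = np.where(df["less_frequent"] == 1, 1 / p, 1 / (1 - p))

mask = df["less_frequent"] == 1
risk_less = np.average(df["died"][mask], weights=w[mask])
risk_more = np.average(df["died"][~mask], weights=w[~mask])
print(f"weighted mortality risk ratio ~ {risk_less / risk_more:.2f}")
```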