808 results for Mortality Risk
Abstract:
BACKGROUND Management of tuberculosis in patients with HIV in eastern Europe is complicated by the high prevalence of multidrug-resistant tuberculosis, low rates of drug susceptibility testing, and poor access to antiretroviral therapy (ART). We report 1-year mortality estimates from a multiregional (eastern Europe, western Europe, and Latin America) prospective cohort study: the TB:HIV study. METHODS Consecutive HIV-positive patients aged 16 years or older with a diagnosis of tuberculosis between Jan 1, 2011, and Dec 31, 2013, were enrolled from 62 HIV and tuberculosis clinics in 19 countries in eastern Europe, western Europe, and Latin America. The primary endpoint was death within 12 months after starting tuberculosis treatment; all deaths were classified according to whether or not they were tuberculosis related. Follow-up continued until death, the final visit, or 12 months after baseline, whichever occurred first. Risk factors for all-cause and tuberculosis-related deaths were assessed using Kaplan-Meier estimates and Cox models. FINDINGS Of 1406 patients (834 in eastern Europe, 317 in western Europe, and 255 in Latin America), 264 (19%) died within 12 months. 188 (71%) of these deaths were tuberculosis related. The probability of all-cause death was 29% (95% CI 26-32) in eastern Europe, 4% (3-7) in western Europe, and 11% (8-16) in Latin America (p<0·0001), and the corresponding probabilities of tuberculosis-related death were 23% (20-26), 1% (0-3), and 4% (2-8), respectively (p<0·0001). Patients receiving care outside eastern Europe had a 77% decreased risk of death: adjusted hazard ratio (aHR) 0·23 (95% CI 0·16-0·31).
In eastern Europe, compared with patients who started a regimen with at least three active antituberculosis drugs, those who started fewer than three active antituberculosis drugs were at a higher risk of tuberculosis-related death (aHR 3·17; 95% CI 1·83-5·49), as were those who did not have baseline drug-susceptibility tests (2·24; 1·31-3·83). Other prognostic factors for increased tuberculosis-related mortality were disseminated tuberculosis and a low CD4 cell count. At tuberculosis diagnosis, 18% of patients in eastern Europe were receiving ART, compared with 44% in western Europe and 39% in Latin America (p<0·0001); 12 months later the proportions were 67% in eastern Europe, 92% in western Europe, and 85% in Latin America (p<0·0001). INTERPRETATION Patients with HIV and tuberculosis in eastern Europe have a risk of death nearly four times higher than that in patients from western Europe and Latin America. This increased mortality rate is associated with modifiable risk factors such as lack of drug susceptibility testing and suboptimal initial antituberculosis treatment in settings with a high prevalence of drug resistance. Urgent action is needed to improve tuberculosis care for patients living with HIV in eastern Europe. FUNDING EU Seventh Framework Programme.
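The 12-month mortality probabilities above come from Kaplan-Meier estimation on right-censored follow-up. As a rough illustration of how the estimator works (this is not the TB:HIV data or code, and the follow-up times below are invented), a minimal sketch:

```python
# Minimal Kaplan-Meier estimator for right-censored survival data.
# Illustrative only: the times and event flags are made up.

def kaplan_meier(times, events):
    """Return (time, survival) pairs.

    times  -- follow-up time for each patient (e.g. months)
    events -- 1 if the patient died at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        # Group all patients with the same follow-up time.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            # Each death time multiplies the running survival
            # probability by (1 - deaths / number at risk).
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        # Censored patients leave the risk set without a step.
        n_at_risk -= removed
    return curve

# Hypothetical follow-up times (months) and death indicators:
curve = kaplan_meier([2, 3, 3, 5, 8, 12, 12], [1, 1, 0, 1, 0, 0, 0])
```

The 12-month all-cause mortality estimates quoted in the abstract correspond to one minus the value of such a curve at 12 months, computed per region.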
Abstract:
Background and Study Aim Intra- and paraventricular tumors are frequently associated with cerebrospinal fluid (CSF) pathway obstruction. Thus the aim of an endoscopic approach is to restore patency of the CSF pathways and to obtain a tumor biopsy. Because endoscopic tumor biopsy may increase tumor cell dissemination, this study sought to evaluate that risk. Patients, Materials, and Methods Forty-four patients who underwent endoscopic biopsies for ventricular or paraventricular tumors between 1993 and 2011 were included in the study. Charts and images were reviewed retrospectively to evaluate rates of adverse events, mortality, and tumor cell dissemination. Results Postoperative clinical condition improved in 63.0% of patients, remained stable in 30.4%, and worsened in 6.6%. One patient (2.2%) had a postoperative thalamic stroke leading to hemiparesis and hemineglect. No procedure-related deaths occurred. Postoperative tumor cell dissemination was observed in 14.3% of patients available for follow-up. Conclusions For patients presenting with occlusive hydrocephalus due to tumors in or adjacent to the ventricular system, endoscopic CSF diversion is the procedure of first choice. Tumor biopsy in the current study did not affect safety or efficacy.
Abstract:
Survivors of childhood cancer have higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year-, and sex-standardized mortality ratios (SMR) and absolute excess risks (AER) for different COD, by Poisson regression. We included 3965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory (SMR 12.7) systems, and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, 21.3% by second cancers (AER 5.3), and 33.3% by circulatory diseases (AER 8.3). Our study confirms elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increasing proportion of deaths caused by late toxicities of treatment. The results underline the importance of clinical follow-up continuing years after the end of treatment for childhood cancer.
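The SMR and AER figures above reduce to simple ratios of observed to expected deaths. A minimal sketch of both calculations, using the cohort's headline totals (246 observed deaths, 49,704 person-years) but a hypothetical expected count chosen only to match the reported SMR scale; the AER unit (per 1,000 person-years) is also an assumption for illustration:

```python
# Standardized mortality ratio and absolute excess risk.
# The expected-death count below is hypothetical, back-derived
# to be consistent with the reported SMR of about 11.

def smr(observed, expected):
    """Ratio of observed to expected deaths."""
    return observed / expected

def aer_per_1000(observed, expected, person_years):
    """Excess deaths per 1,000 person-years at risk."""
    return (observed - expected) / person_years * 1000

print(round(smr(246, 22.4), 1))               # prints 11.0
print(round(aer_per_1000(246, 22.4, 49704), 2))  # prints 4.5
```

In practice both quantities are computed per stratum (age, calendar year, sex) and per cause of death, with expected counts taken from general-population rates.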
Abstract:
We used meat-inspection data collected over three years in Switzerland to evaluate slaughterhouse-level, farm-level, and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so they can be communicated to, and managed by, the slaughter industry and veterinary services. At meat inspection, there were three important predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse in which it was processed. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses, whereas cattle with clinical syndromes that are not externally visible (e.g. pneumonia lesions), which are associated with cattle fattening, tend to end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights that the risk factors for WCC are as complex as the production system itself, interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, introducing a benchmarking system that gives farmers feedback on condemnation reasons and on their performance relative to the national/regional average could be a first step towards improving herd management and financial returns for producers.
Abstract:
BACKGROUND Strategies to improve risk prediction are of major importance in patients with heart failure (HF). Fibroblast growth factor 23 (FGF-23) is an endocrine regulator of phosphate and vitamin D homeostasis associated with increased cardiovascular risk. We aimed to assess the prognostic effect of FGF-23 on mortality in HF patients, with a particular focus on differences between patients with HF with preserved ejection fraction (HFpEF) and patients with HF with reduced ejection fraction (HFrEF). METHODS AND RESULTS FGF-23 levels were measured in 980 patients with HF enrolled in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, including 511 patients with HFrEF and 469 patients with HFpEF, with a median follow-up time of 8.6 years. FGF-23 was additionally measured in a second cohort comprising 320 patients with advanced HFrEF. FGF-23 was independently associated with mortality, with an adjusted hazard ratio per 1-SD increase of 1.30 (95% confidence interval, 1.14-1.48; P<0.001) in patients with HFrEF, whereas no such association was found in patients with HFpEF (P for interaction=0.043). External validation confirmed the significant association with mortality, with an adjusted hazard ratio per 1-SD increase of 1.23 (95% confidence interval, 1.02-1.60; P=0.027). FGF-23 demonstrated increased discriminatory power for mortality in addition to N-terminal pro-B-type natriuretic peptide (C-statistic: 0.59 versus 0.63) and an improvement in the net reclassification index (39.6%; P<0.001). CONCLUSIONS FGF-23 is independently associated with an increased risk of mortality in patients with HFrEF but not in those with HFpEF, suggesting different pathophysiologic roles in the two entities.
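The "hazard ratio per 1-SD increase" reported above is a rescaling of the per-unit Cox coefficient: HR per SD = exp(beta × SD), where beta is the log hazard ratio per unit of the biomarker. A minimal sketch of the conversion, with an invented log-hazard ratio and SD rather than the LURIC estimates:

```python
import math

# Rescale a per-unit Cox log hazard ratio to a per-1-SD hazard ratio.
# The beta and SD below are hypothetical, chosen only for illustration.

def hr_per_sd(beta_per_unit, sd):
    """exp(beta * SD): the hazard ratio for a 1-SD increase."""
    return math.exp(beta_per_unit * sd)

# Hypothetical: log-HR of 0.005 per unit of biomarker, SD of 52 units
print(round(hr_per_sd(0.005, 52), 2))  # prints 1.3
```

Reporting per-SD rather than per-unit hazard ratios makes biomarkers with different measurement scales directly comparable.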
Abstract:
AIMS High-density lipoproteins (HDLs) are considered anti-atherogenic. Recent experimental findings suggest that their biological properties can be modified in certain clinical conditions by accumulation of serum amyloid A (SAA). The effect of SAA on the association between HDL-cholesterol (HDL-C) and cardiovascular outcome remains unknown. METHODS AND RESULTS We examined the association of SAA and HDL-C with mortality in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, which included 3310 patients undergoing coronary angiography. To validate our findings, we analysed 1255 participants of the German Diabetes and Dialysis (4D) study and 4027 participants of the Cooperative Health Research in the Region of Augsburg (KORA) S4 study. In LURIC, SAA concentrations predicted all-cause and cardiovascular mortality. In patients with low SAA, higher HDL-C was associated with lower all-cause and cardiovascular mortality. In contrast, in patients with high SAA, higher HDL-C was associated with increased all-cause and cardiovascular mortality, indicating that SAA indeed modifies the beneficial properties of HDL. We complemented these clinical observations with in vitro experiments in which SAA impaired vascular functions of HDL. We further derived a formula for the simple calculation of the amount of biologically 'effective' HDL-C from measured HDL-C and SAA in the LURIC study. In the 4D and KORA S4 studies, measured HDL-C was not associated with clinical outcomes, whereas calculated 'effective' HDL-C significantly predicted better outcome. CONCLUSION The acute-phase protein SAA modifies the biological effects of HDL-C in several clinical conditions. The concomitant measurement of SAA is a simple, useful, and clinically applicable surrogate for the vascular functionality of HDL.
Abstract:
Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions, such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions, where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with a timing and amount strongly related to burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small- (DBH <12 cm) and intermediate-diameter (DBH 12–36 cm) trees face a higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-severity sites, mortality remains low, at a level similar to unburnt beech forests. Beech tree diameter, the presence of fungal infestation, and elevation are the most significant drivers of mortality. The risk of beech dying increases toward higher elevation and is higher for small-diameter than for large-diameter trees. In the case of secondary fungal infestation, beech generally faces a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from those occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.
Abstract:
BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public-sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models examined the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation, and programme expansion, and were corrected for unascertained mortality. RESULTS Among 17 038 patients, the median CD4 count at initiation increased from 119 cells/μL (IQR 54-180) in 2008 to 257 cells/μL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/μL and CD4 counts ≥300 cells/μL. After adjustment, patients with CD4 counts ≥300 cells/μL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with a CD4 count of 150-199 cells/μL. This increased risk for patients with CD4 counts ≥300 cells/μL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/μL and LTFU, while the association between CD4 counts ≥300 cells/μL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. With programmes initiating patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
Abstract:
OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS Among 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
Abstract:
Worker populations are potentially exposed to multiple chemical substances simultaneously during the performance of routine tasks. The acute health effects of exposure to toxic concentrations of these substances are usually well described. However, very little is known about the long-term health effects of chronic low-dose exposure to all but a few chemical substances. A mortality study was performed on a population of workers employed at a butyl rubber manufacturing plant in Baton Rouge, Louisiana, for the period 1943-1978, with special emphasis on potential exposure to methyl chloride. The study population was enumerated using company records. The mortality experience of the population was evaluated by comparing the number of observed deaths (total and cause-specific) to the expected number of deaths, based on U.S. general-population age-, race-, and sex-specific rates. An internal comparison population was assembled to address the lack of comparability when U.S. rates are used to calculate expected deaths in an employed population. There were 18% fewer total observed deaths than expected when U.S. death rates were used. Deaths from specific causes were also fewer than expected, except where the numbers of observed and expected deaths were small. Similar results were obtained when the population was characterized by intensity and duration of potential exposure to methyl chloride. When the internal comparison population was used to evaluate overall mortality of the study population, the relative risk was about 1.2. The study results were discussed and conclusions drawn in light of certain limitations of the methodology and the study population size.
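The observed-versus-expected comparison described above is indirect standardization: expected deaths are accumulated as person-years times the reference (e.g. U.S.) death rate in each age/race/sex stratum, and the ratio of observed to expected deaths summarizes the cohort's mortality relative to the reference. A minimal sketch with hypothetical strata and counts (not the Baton Rouge data):

```python
# Indirect standardization: expected deaths from reference rates.
# All person-years, rates, and the observed count are hypothetical.

def expected_deaths(strata):
    """strata: iterable of (person_years, reference_death_rate) pairs."""
    return sum(py * rate for py, rate in strata)

# Hypothetical age strata: (person-years, reference rate per person-year)
strata = [(12000, 0.002), (8000, 0.006), (3000, 0.015)]
exp = expected_deaths(strata)   # 24 + 48 + 45 = 117 expected deaths
obs = 96                        # hypothetical observed death count
print(obs / exp)                # SMR below 1.0, i.e. fewer deaths than expected
```

An SMR below 1.0 in an occupational cohort often reflects the "healthy worker effect", which is exactly why the study also assembled an internal comparison population.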
Abstract:
Severe liver injury (SLI) due to drugs is a frequent cause of catastrophic illness and hospitalization. Owing to significant morbidity, mortality, and excess medical care costs, it poses a public health challenge. The role of associated risk factors, such as alcohol consumption, in contributing to the high mortality remains to be studied. This study was conducted to assess the impact of alcohol use on mortality in patients with idiosyncratic drug-induced liver injury (IDILI), while adjusting for age, gender, race/ethnicity, and education level. The data from this study indicate only a small excess risk of death among IDILI patients using alcohol, and the difference was not statistically significant. The major contribution of this study to the field of public health is that it excludes a large hazard of alcohol consumption for mortality among IDILI patients.
Abstract:
A retrospective cohort study was conducted among 1542 patients diagnosed with chronic lymphocytic leukemia (CLL) between 1970 and 2001 at the M. D. Anderson Cancer Center (MDACC). Changes in clinical characteristics and the impact of CLL on life expectancy were assessed across three decades (1970–2001), and the role of clinical factors in the prognosis of CLL was evaluated among patients diagnosed between 1985 and 2001 using Kaplan-Meier and Cox proportional hazards methods. Among 1485 CLL patients diagnosed from 1970 to 2001, patients in the recent cohort (1985–2001) were diagnosed at a younger age and an earlier stage compared with the earliest cohort (1970–1984). There was a 44% reduction in mortality among patients diagnosed in 1985–1995 compared with those diagnosed in 1970–1984, after adjusting for age, sex, and Rai stage, among patients who ever received treatment. There was an overall loss of 11 years of life expectancy (5 years for stage 0) among the 1485 patients compared with the expected life expectancy based on the age-, sex-, and race-matched US general population, with a 43% decrease in the 10-year survival rate. Abnormal cytogenetics was associated with shorter progression-free (PF) survival after adjusting for age, sex, Rai stage, and beta-2 microglobulin (beta-2M), whereas older age, abnormal cytogenetics, and a higher beta-2M level were adverse predictors for overall survival. No increased risk of second cancer overall was observed; however, patients who received treatment for CLL had an elevated risk of developing acute myeloid leukemia (AML) and Hodgkin disease (HD). Two out of three patients who developed AML were treated with alkylating agents. In conclusion, CLL patients had improved survival over time. The identification of clinical predictors of PF/overall survival has important clinical significance. Close surveillance for the development of second cancers is critical to improve the quality of life of long-term survivors.
Abstract:
The global social and economic burden of HIV/AIDS is great: over forty million people were reported to be living with HIV/AIDS at the end of 2005, two million of them children from birth to 15 years of age. Antiretroviral therapy has been shown to improve growth and survival of HIV-infected individuals. The purpose of this study is to describe a cohort of HIV-infected pediatric patients and assess the association of clinical factors with growth and mortality outcomes. This was a historical cohort study. Medical records of infants and children receiving HIV care at the Mulago Pediatric Infectious Disease Clinic (PIDC) in Uganda between July 2003 and March 2006 were analyzed. Height and weight measurements were age- and sex-standardized to the Centers for Disease Control and Prevention (CDC) 2000 reference. Descriptive and logistic regression analyses were performed to identify covariates associated with risk of stunting, being underweight, and mortality. Longitudinal regression analysis with a mixed model using an autoregressive covariance structure was used to compare changes in height and weight before and after initiation of highly active antiretroviral therapy (HAART). The study population comprised 1059 patients 0-20 years of age, the majority of whom were aged thirteen years and below (74.6%). Mean height-for-age before initiation of HAART was in the 10th percentile, mean weight-for-age was in the 8th percentile, and mean weight-for-height was in the 23rd percentile. Initiation of HAART resulted in improvement in both the mean standardized weight-for-age Z score and weight-for-age percentiles (p <0.001). Baseline age and weight-for-age Z score were associated with stunting (p <0.001). A negative weight-for-age Z score was associated with stunting (OR 4.60, CI 3.04-5.49). Compared with the 0-2 years age category, risk of death decreased from 84% in the >2-8 years category to 21% in the >13 years category (p <0.05).
This pediatric population gained weight significantly more rapidly than height after starting HAART. A low weight-for-age Z score was associated with poor survival in children. These findings suggest that age, weight, and height measurements be monitored closely at Mulago PIDC.
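The weight-for-age and height-for-age Z scores used above standardize a child's measurement against an age- and sex-specific reference distribution. A minimal sketch of the basic calculation, with invented reference values rather than the actual CDC 2000 parameters (which use the more involved LMS method to handle skewed distributions):

```python
# Simple Z score against a reference distribution.
# The reference mean and SD below are invented for illustration;
# they are not CDC 2000 growth-reference values.

def z_score(value, ref_mean, ref_sd):
    """Standard deviations above (+) or below (-) the reference mean."""
    return (value - ref_mean) / ref_sd

# Hypothetical child: weight 13.2 kg against a reference mean of
# 16.3 kg with SD 2.1 kg for that age/sex group
z = z_score(13.2, 16.3, 2.1)
print(round(z, 2))   # a negative Z score, i.e. below the reference mean
```

A Z score of -2 or lower is the conventional cutoff for classifying a child as underweight (weight-for-age) or stunted (height-for-age).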
Abstract:
Hypertension is a significant risk factor for cardiovascular disease, which in turn is a major cause of morbidity and mortality worldwide. While the pathogenesis of vascular injury and subsequent end-organ damage is complex, emerging data support a role for the complement system in endovascular diseases. The complement Factor H Y402H polymorphism has been associated with a number of vasculopathies, including age-related macular degeneration (AMD), ischemic stroke, and myocardial infarction. The current study evaluated the relationship of the Y402H polymorphism with hypertension and microalbuminuria in the large bi-racial Atherosclerosis Risk in Communities (ARIC) study. The Y402H polymorphism was associated with a 48% increase (p = 0.042) in the risk of developing incident hypertension in African American participants. No significant association was found between the Y402H polymorphism and microalbuminuria. The results from this investigation reveal the first association of the Factor H Y402H polymorphism with an increased risk of incident hypertension in African Americans.
Abstract:
Background. Liver cancer mortality continues to be a significant contributor to deaths worldwide and in the U.S., yet studies remain lacking on how the mortality burden varies by racial group or by heavy alcohol use. This study evaluated the geographic distribution of liver cancer mortality across population groups in Texas and the U.S. over a 24-year period, and determined whether alcohol dependence or abuse correlates with mortality rates. Methods. The Spatial Scan Statistic was used to identify regions of excess liver cancer mortality in Texas counties and the U.S. from 1980 to 2003. The scan was run with a maximum spatial cluster size of 50% of the population at risk, and all analyses used publicly available data. Alcohol abuse data by state and ethnicity were extracted from SAMHSA datasets for the period 2000–2004. Results. The geographic analysis of liver cancer mortality in Texas and the U.S. identified four and seven regions, respectively, with statistically significant excess mortality rates, with elevated relative risks ranging from 1.38–2.07 and 1.05–1.623 (p = 0.001), respectively. Conclusion. This study revealed seven regions of excess liver cancer mortality across the U.S. and four regions in Texas between 1980 and 2003, and demonstrated a correlation between elevated liver cancer mortality rates and reported alcohol dependence among Hispanic and Other populations.
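The Spatial Scan Statistic used above scores each candidate cluster with a Poisson log-likelihood ratio comparing the risk inside the cluster with the risk outside; the highest-scoring cluster is then tested for significance by Monte Carlo simulation. A minimal sketch of the scoring step only, with hypothetical counts rather than the Texas/U.S. data:

```python
import math

# Poisson log-likelihood ratio for one candidate cluster, as used by
# spatial scan statistics. All counts below are hypothetical.

def poisson_llr(c, e, C, E):
    """Score a candidate cluster.

    c, e -- observed and expected cases inside the cluster
    C, E -- observed and expected cases in the whole study area
    """
    # Only clusters whose relative risk exceeds that of the rest
    # of the map count as excess-mortality candidates.
    if c / e <= (C - c) / (E - e):
        return 0.0
    return (c * math.log(c / e)
            + (C - c) * math.log((C - c) / (E - e))
            - C * math.log(C / E))

# Hypothetical cluster: 50 observed vs 30 expected deaths,
# out of 500 observed and 500 expected in the whole study area
print(round(poisson_llr(50, 30, 500, 500), 2))  # prints 5.97
```

In a full scan, this score is computed over many circular windows of varying size (up to the chosen maximum, such as 50% of the population at risk), and the maximum score is compared against the maxima from simulated random datasets to obtain a p-value.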