936 results for toxicological mortality data


Relevance:

30.00%

Publisher:

Abstract:

AIM: To study prospectively patients after heart transplantation with respect to quality of life, mortality, morbidity, and clinical parameters before and up to 10 years after the operation. METHODS: Sixty patients (47.9 ± 10.9 years; 57 men, 3 women) transplanted at the University of Vienna Hospital, Department for Heart and Thorax Surgery, were included in this study. They were assessed when placed on the waiting list and again 1, 5, and 10 years after transplantation. The variables evaluated included physical and emotional complaints, well-being, mortality, and morbidity. In the sample of patients who survived 10 years (n = 23), morbidity (infections, malignancies, graft arteriosclerosis, and rejection episodes) as well as quality of life were evaluated. RESULTS: Actuarial survival rates were 83.3%, 66.7%, and 48.3% at 1, 5, and 10 years after transplantation, respectively. During the first year, infections were the most important cause of premature death. Malignancies were the main cause of mortality between years 1 and 5, and graft arteriosclerosis between years 5 and 10. Physical complaints diminished significantly after the operation but increased significantly between years 5 and 10 (p < 0.001); however, trembling (p < 0.05) and paraesthesias (p < 0.01) diminished continuously. Emotional complaints such as depression and dysphoria (both p < 0.05) increased until the tenth year after their nadir at year 1. In long-term survivors, 3 malignancies (lung, skin, thyroid) were diagnosed 6 to 9 years postoperatively. Three patients (13%) had signs of graft arteriosclerosis at year 10; 9 patients (40%) suffered rejection episodes during the 10-year course, none serious enough to require immediate therapy. Quality of life at 10 years was good in these patients. CONCLUSIONS: Heart transplantation is a successful therapy for patients with terminal heart disease. Long-term survivors feel well after 10 years and report a good quality of life.
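The actuarial survival figures above come from standard life-table/product-limit estimation. As a minimal sketch of the idea (not the authors' code, and using invented follow-up times), a Kaplan-Meier estimator can be written as:

```python
import numpy as np

def kaplan_meier(times, events, query):
    """Product-limit (Kaplan-Meier) survival estimate.

    times:  follow-up time for each patient (years)
    events: 1 = death observed, 0 = censored (alive at last contact)
    query:  time points at which to report S(t)
    Ties are processed record by record, which is adequate for a sketch.
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    surv, steps = 1.0, []
    for i, (t, d) in enumerate(zip(times, events)):
        if d:                        # death: shrink survival by (r - 1) / r
            surv *= (n - i - 1) / (n - i)
        steps.append((t, surv))
    out = []
    for q in query:                  # step-function lookup of S(t)
        s = 1.0
        for t, sv in steps:
            if t <= q:
                s = sv
        out.append(s)
    return out

# Hypothetical 6-patient cohort: deaths at years 1, 3, 5; censored at 2, 4, 6
surv = kaplan_meier([1, 2, 3, 4, 5, 6], [1, 0, 1, 0, 1, 0], query=[3, 5])
```

With these toy data the estimate steps down to 0.625 by year 3 and 0.3125 by year 5; the abstract's 83.3/66.7/48.3% values would arise the same way from the real 60-patient follow-up.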


Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute substantially to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models were fitted for each species: one ignoring the measurement error (the "naïve" approach) and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where the measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make models less generalizable. The evaluation of the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, across different spatial patterns and diameter distributions revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that were clustered. Results show a systematic bias even when all the assumptions made by the authors are met. I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of this variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with the traditional logistic regression model as a tool to predict individual tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and live trees for all species. In conclusion, I showed that the proposed techniques do increase the accuracy of individual tree mortality models and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.


The relation between residential magnetic field exposure from power lines and mortality from neurodegenerative conditions was analyzed among 4.7 million persons of the Swiss National Cohort (linking mortality and census data), covering the period 2000-2005. Cox proportional hazards models were used to analyze the relation between living in proximity to 220-380 kV power lines and the risk of death from neurodegenerative diseases, with adjustment for a range of potential confounders. Overall, the adjusted hazard ratio for Alzheimer's disease in persons living within 50 m of a 220-380 kV power line was 1.24 (95% confidence interval (CI): 0.80, 1.92) compared with persons who lived at a distance of 600 m or more. There was a dose-response relation between years of residence in the immediate vicinity of power lines and Alzheimer's disease: persons living within 50 m for at least 5 years had an adjusted hazard ratio of 1.51 (95% CI: 0.91, 2.51), increasing to 1.78 (95% CI: 1.07, 2.96) with at least 10 years and to 2.00 (95% CI: 1.21, 3.33) with at least 15 years. The pattern was similar for senile dementia. There was little evidence for an increased risk of amyotrophic lateral sclerosis, Parkinson's disease, or multiple sclerosis.


BACKGROUND: Exposure to intermittent magnetic fields of 16 Hz has been shown to reduce heart rate variability, and decreased heart rate variability predicts cardiovascular mortality. We examined mortality from cardiovascular causes in railway workers with varying exposure to intermittent 16.7 Hz magnetic fields. METHODS: We studied a cohort of 20,141 Swiss railway employees between 1972 and 2002, including highly exposed train drivers (median lifetime exposure 120.5 µT-years) and less exposed shunting yard engineers (42.1 µT-years), train attendants (13.3 µT-years) and station masters (5.7 µT-years). During 464,129 person-years of follow-up, 5,413 deaths were recorded, of which 3,594 were attributed to cardiovascular diseases. We analyzed data using Cox proportional hazards models. RESULTS: For all cardiovascular mortality, the hazard ratio compared with station masters was 0.99 (95% CI: 0.91, 1.08) in train drivers, 1.13 (95% CI: 0.98, 1.30) in shunting yard engineers, and 1.09 (95% CI: 1.00, 1.19) in train attendants. Corresponding hazard ratios for arrhythmia-related deaths were 1.04 (95% CI: 0.68, 1.59), 0.58 (95% CI: 0.24, 1.37) and 1.30 (95% CI: 0.87, 1.93), and for acute myocardial infarction 1.00 (95% CI: 0.73, 1.36), 1.56 (95% CI: 1.04, 2.32), and 1.14 (95% CI: 0.85, 1.53). The hazard ratio per 100 µT-years of cumulative exposure was 0.94 (95% CI: 0.71, 1.24) for arrhythmia-related deaths and 0.91 (95% CI: 0.75, 1.11) for acute myocardial infarction. CONCLUSION: This study provides evidence against an association between long-term occupational exposure to intermittent 16.7 Hz magnetic fields and cardiovascular mortality.
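The hazard ratios above come from Cox proportional hazards models. A minimal one-covariate Newton-Raphson fit of the Cox partial likelihood shows the mechanics (an illustrative sketch, not the study's code; real analyses use survival packages with tie corrections, censoring and multiple covariates):

```python
import numpy as np

def cox_beta(time, event, x, iters=30):
    """One-covariate Cox model via Newton-Raphson on the partial likelihood.
    Assumes (nearly) untied event times; risk sets built with a cumsum."""
    order = np.argsort(-np.asarray(time, float))   # longest survivors first
    d = np.asarray(event, bool)[order]
    x = np.asarray(x, float)[order]
    beta = 0.0
    for _ in range(iters):
        r = np.exp(beta * x)
        s0 = np.cumsum(r)          # sum of risk scores over each risk set
        s1 = np.cumsum(r * x)
        s2 = np.cumsum(r * x * x)
        grad = np.sum(x[d] - s1[d] / s0[d])
        hess = -np.sum(s2[d] / s0[d] - (s1[d] / s0[d]) ** 2)
        beta -= grad / hess        # Newton step on the log partial likelihood
    return beta
```

If x is cumulative exposure measured in µT-years, exp(100 * beta) gives a hazard ratio per 100 µT-years, the scale reported in the abstract.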


While studies from other countries have shown excess mortality in diabetic individuals compared with the general population, comparable long-term data are not available for Switzerland.


BACKGROUND: Erythropoiesis-stimulating agents (ESAs) reduce anemia in cancer patients and may improve quality of life, but there are concerns that ESAs might increase mortality. OBJECTIVES: Our objectives were to examine the effect of ESAs and identify factors that modify the effects of ESAs on overall survival, progression-free survival, thromboembolic and cardiovascular events, as well as need for transfusions and other important safety and efficacy outcomes in cancer patients. SEARCH STRATEGY: We searched the Cochrane Library, Medline, Embase and conference proceedings for eligible trials. Manufacturers of ESAs were contacted to identify additional trials. SELECTION CRITERIA: We included randomized controlled trials comparing epoetin or darbepoetin plus red blood cell transfusions (as necessary) versus red blood cell transfusions (as necessary) alone, to prevent or treat anemia in adult or pediatric cancer patients with or without concurrent antineoplastic therapy. DATA COLLECTION AND ANALYSIS: We performed a meta-analysis of randomized controlled trials comparing epoetin alpha, epoetin beta or darbepoetin alpha plus red blood cell transfusions versus transfusion alone, for prophylaxis or therapy of anemia during or after anti-cancer treatment. Patient-level data were obtained and analyzed by independent statisticians at two academic departments, using fixed-effects and random-effects meta-analysis. Analyses were according to the intention-to-treat principle. Primary endpoints were on-study mortality and overall survival during the longest available follow-up, regardless of anticancer treatment, and in patients receiving chemotherapy. Tests for interactions were used to identify differences in effects of ESAs on mortality across pre-specified subgroups. The present review reports only the results for the primary endpoint. MAIN RESULTS: A total of 13,933 cancer patients from 53 trials were analyzed; 1,530 patients died on study and 4,993 overall. ESAs increased on-study mortality (combined hazard ratio [cHR] 1.17; 95% CI 1.06-1.30) and worsened overall survival (cHR 1.06; 95% CI 1.00-1.12), with little heterogeneity between trials (I² 0%, p = 0.87 and I² 7.1%, p = 0.33, respectively). Thirty-eight trials enrolled 10,441 patients receiving chemotherapy. The cHR was 1.10 (95% CI 0.98-1.24) for on-study mortality and 1.04 (95% CI 0.97-1.11) for overall survival. There was little evidence for a difference between trials of patients receiving different cancer treatments (p for interaction = 0.42). AUTHORS' CONCLUSIONS: ESA treatment in cancer patients increased on-study mortality and worsened overall survival. For patients undergoing chemotherapy the increase was less pronounced, but an adverse effect could not be excluded.
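Combined hazard ratios and I² heterogeneity statistics of the kind reported above come from inverse-variance pooling of per-trial log hazard ratios. A compact fixed-effect sketch (illustrative only; the review itself pooled patient-level data with both fixed- and random-effects models):

```python
import numpy as np

def pool_fixed(hr, lo, hi):
    """Inverse-variance fixed-effect pooling of hazard ratios.
    hr, lo, hi: per-trial HR and its 95% CI bounds."""
    y = np.log(hr)                                  # log hazard ratios
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)     # SE from CI width
    w = 1.0 / se**2                                 # inverse-variance weights
    y_pool = np.sum(w * y) / np.sum(w)
    se_pool = np.sqrt(1.0 / np.sum(w))
    q = np.sum(w * (y - y_pool) ** 2)               # Cochran's Q
    df = len(y) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    chr_ = np.exp(y_pool)
    ci = (np.exp(y_pool - 1.96 * se_pool), np.exp(y_pool + 1.96 * se_pool))
    return chr_, ci, i2
```

Pooling three hypothetical identical trials with HR 1.17 (95% CI 1.06-1.30) returns a combined HR of 1.17 with I² = 0%, mirroring the no-heterogeneity case in the abstract.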


BACKGROUND: Erythropoiesis-stimulating agents reduce anaemia in patients with cancer and could improve their quality of life, but these drugs might increase mortality. We therefore did a meta-analysis of randomised controlled trials in which these drugs plus red blood cell transfusions were compared with transfusion alone for prophylaxis or treatment of anaemia in patients with cancer. METHODS: Data for patients treated with epoetin alfa, epoetin beta, or darbepoetin alfa were obtained and analysed by independent statisticians using fixed-effects and random-effects meta-analysis. Analyses were by intention to treat. Primary endpoints were mortality during the active study period and overall survival during the longest available follow-up, irrespective of anticancer treatment, and in patients given chemotherapy. Tests for interactions were used to identify differences in effects of erythropoiesis-stimulating agents on mortality across prespecified subgroups. FINDINGS: Data from a total of 13,933 patients with cancer in 53 trials were analysed. 1,530 patients died during the active study period and 4,993 overall. Erythropoiesis-stimulating agents increased mortality during the active study period (combined hazard ratio [cHR] 1.17, 95% CI 1.06-1.30) and worsened overall survival (1.06, 1.00-1.12), with little heterogeneity between trials (I² 0%, p = 0.87 for mortality during the active study period, and I² 7.1%, p = 0.33 for overall survival). 10,441 patients on chemotherapy were enrolled in 38 trials. The cHR for mortality during the active study period was 1.10 (0.98-1.24), and 1.04 (0.97-1.11) for overall survival. There was little evidence for a difference between trials of patients given different anticancer treatments (p for interaction = 0.42). INTERPRETATION: Treatment with erythropoiesis-stimulating agents in patients with cancer increased mortality during active study periods and worsened overall survival. The increased risk of death associated with treatment with these drugs should be balanced against their benefits. FUNDING: German Federal Ministry of Education and Research, Medical Faculty of University of Cologne, and Oncosuisse (Switzerland).


BACKGROUND: The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. METHODS: We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. RESULTS: During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratio was associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). 
CONCLUSIONS: In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical endpoint trials, prognostic studies, and patient management.


BACKGROUND: The aim was to compare cause-specific mortality, self-rated health (SRH) and risk factors in the French- and German-speaking parts of Switzerland and to discuss to what extent variations between these regions reflect differences between France and Germany. METHODS: Data were used from the general population of German- and French-speaking Switzerland, with 2.8 million individuals aged 45-74 years contributing 176,782 deaths between 1990 and 2000. Adjusted mortality risks were calculated from the Swiss National Cohort, a longitudinal census-based record linkage study. Results were contrasted with cross-sectional analyses of SRH and risk factors (Swiss Health Survey 1992/3) and with cross-sectional national and international mortality rates for 1980, 1990 and 2000. RESULTS: Despite similar all-cause mortality, there were substantial differences in cause-specific mortality between the Swiss regions. Deaths from circulatory disease were more common in German-speaking Switzerland, while causes related to alcohol consumption were more prevalent in French-speaking Switzerland. Many but not all of the mortality differences between the two regions could be explained by variations in risk factors. Similar patterns were found between Germany and France. CONCLUSION: Characteristic mortality and behavioural differentials between the German- and French-speaking parts of Switzerland could also be found between Germany and France. However, some of the international variations in mortality were in line neither with the Swiss regional comparison nor with differences in risk factors; these could relate to peculiarities in the assignment of cause of death. With its cultural diversity, Switzerland offers the opportunity to examine cultural determinants of mortality without bias due to different statistical systems or national health policies.


INTRODUCTION: It is unclear to what level mean arterial blood pressure (MAP) should be increased during septic shock in order to improve outcome. In this study we investigated the association between MAP values of 70 mmHg or higher, vasopressor load, 28-day mortality and disease-related events in septic shock. METHODS: This is a post hoc analysis of data from the control group of a multicenter trial and includes 290 septic shock patients in whom a mean MAP ≥70 mmHg could be maintained during shock. Demographic and clinical data, MAP, vasopressor requirements during the shock period, disease-related events and 28-day mortality were documented. Logistic regression models, adjusted for the geographic region of the study center, age, presence of chronic arterial hypertension, Simplified Acute Physiology Score (SAPS) II and the mean vasopressor load during the shock period, were calculated to investigate the association between MAP or MAP quartiles ≥70 mmHg and mortality or the frequency and occurrence of disease-related events. RESULTS: There was no association between MAP or MAP quartiles and mortality or the occurrence of disease-related events. These associations were not influenced by age or pre-existing arterial hypertension (all P > 0.05). The mean vasopressor load was associated with mortality (relative risk (RR), 1.83; 95% confidence interval (CI), 1.4-2.38; P < 0.001), the number of disease-related events (P < 0.001) and the occurrence of acute circulatory failure (RR, 1.64; 95% CI, 1.28-2.11; P < 0.001), metabolic acidosis (RR, 1.79; 95% CI, 1.38-2.32; P < 0.001), renal failure (RR, 1.49; 95% CI, 1.17-1.89; P = 0.001) and thrombocytopenia (RR, 1.33; 95% CI, 1.06-1.68; P = 0.01). CONCLUSIONS: MAP levels of 70 mmHg or higher do not appear to be associated with improved survival in septic shock. Elevating MAP above 70 mmHg by augmenting vasopressor dosages may increase mortality. Future trials are needed to identify the lowest acceptable MAP level that ensures tissue perfusion while avoiding unnecessarily high catecholamine infusions.


INTRODUCTION: Despite the key role of hemodynamic goals, there are few data addressing the question of which hemodynamic variables are associated with outcome or should be targeted in cardiogenic shock patients. The aim of this study was to investigate the association between hemodynamic variables and cardiogenic shock mortality. METHODS: Medical records and the patient data management system of a multidisciplinary intensive care unit (ICU) were reviewed for patients admitted because of cardiogenic shock. In all patients, the hourly time integral of hemodynamic variables during the first 24 hours after ICU admission was calculated. If a hemodynamic variable was associated with 28-day mortality, the hourly time integral of drops below clinically relevant threshold levels was computed. Regression models and receiver operating characteristic analyses were calculated. All statistical models were adjusted for age, admission year, mean catecholamine doses and the Simplified Acute Physiology Score II (excluding hemodynamic counts) in order to account for the influence of age, changes in therapies during the observation period, the severity of cardiovascular failure and the severity of the underlying disease on 28-day mortality. RESULTS: One hundred and nineteen patients were included. Cardiac index (CI) (P = 0.01) and cardiac power index (CPI) (P = 0.03) were the only hemodynamic variables separately associated with mortality. The hourly time integral of CI drops below 3 and 2.75 (both P = 0.02) and 2.5 (P = 0.03) L/min/m² was associated with death, but not that of CI drops below 2 L/min/m² or lower thresholds (all P > 0.05). The hourly time integral of CPI drops below 0.5-0.8 W/m² (all P = 0.04) was associated with 28-day mortality, but not that of CPI drops below 0.4 W/m² or lower thresholds (all P > 0.05). CONCLUSIONS: During the first 24 hours after ICU admission, CI and CPI are the most important hemodynamic variables separately associated with 28-day mortality in patients with cardiogenic shock. A CI of 3 L/min/m² and a CPI of 0.8 W/m² were most predictive of 28-day mortality. Since our results must be considered hypothesis-generating, randomized controlled trials are required to evaluate whether targeting these levels as early resuscitation endpoints can improve mortality in cardiogenic shock.
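An "hourly time integral of drops" below a threshold can be computed from an hourly series by summing the shortfall in each hour. A minimal sketch (the series values are hypothetical, not study data; units for CI would be L/min/m² × h):

```python
import numpy as np

def drop_integral(hourly_values, threshold):
    """Hourly time integral of drops below `threshold`:
    for each hourly value, add max(threshold - value, 0)."""
    v = np.asarray(hourly_values, float)
    return float(np.sum(np.clip(threshold - v, 0.0, None)))

# Hypothetical first 4 hours of cardiac index (L/min/m2) after admission:
ci_series = [3.0, 2.5, 2.0, 3.5]
integral = drop_integral(ci_series, threshold=3.0)   # 0 + 0.5 + 1.0 + 0
```

Here the patient spends two hours below the 3 L/min/m² threshold, accumulating a deficit of 1.5 L/min/m² × h; the study relates such integrals, at a range of thresholds, to 28-day mortality.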


BACKGROUND Improved survival among HIV-infected individuals on antiretroviral therapy (ART) has focused attention on AIDS-related cancers, including Kaposi sarcoma (KS). However, the effect of KS on response to ART is not well described in Southern Africa. We assessed the effect of KS on survival and on immunologic and virologic treatment responses at 6 and 12 months after initiation of ART. METHODS We analyzed prospectively collected data from a cohort of HIV-infected adults initiating ART in South Africa. Differences in mortality between those with and without KS at ART initiation were estimated with Cox proportional hazards models. Log-binomial models were used to assess differences in CD4 count response and HIV virologic suppression within a year of initiating treatment. RESULTS Between January 2001 and January 2008, 13,847 HIV-infected adults initiated ART at the study clinics. Those with KS at ART initiation (n = 247; 2%) were similar to those without KS (n = 13,600; 98%) with respect to age (35 vs. 35 years), presenting CD4 count (74 vs. 85 cells/mm³) and proportion on TB treatment (37% vs. 30%). In models adjusted for sex, baseline CD4 count, age, treatment site, tuberculosis and year of ART initiation, KS patients were over three times more likely to have died at any time after ART initiation (hazard ratio [HR]: 3.62; 95% CI: 2.71-4.84) than those without KS. The increased risk was highest within the first year on ART (HR: 4.05; 95% CI: 2.95-5.55) and attenuated thereafter (HR: 2.30; 95% CI: 1.08-4.89). Those with KS also gained, on average, 29 fewer CD4 cells (95% CI: 7-52 cells/mm³) and were less likely to increase their CD4 count by 50 cells from baseline (RR: 1.43; 95% CI: 0.99-2.06) within the first 6 months of treatment. CONCLUSIONS HIV-infected adults presenting with KS have an increased risk of mortality even after initiation of ART, with the greatest risk in the first year. Among those who survive the first year on therapy, patients with KS demonstrated a poorer immunologic response to ART than those without KS.


BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS In total, 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3,706 deaths. Mean CD4 count slopes per week ranged from 1.4 (95% confidence interval [CI]: 1.2 to 1.6) cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, the CD4 count slope and HIV RNA level provided additional, independent prognostic information.
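A per-patient CD4 slope over a 6-month window can be estimated by ordinary least squares on the visit measurements. A minimal sketch with hypothetical visit data (the study itself modelled slopes across patients with linear mixed models; this per-patient OLS is only the simplest version):

```python
import numpy as np

def cd4_slope_per_week(weeks, cd4):
    """Least-squares CD4 slope (cells/mm3 per week) for one patient."""
    slope, _intercept = np.polyfit(np.asarray(weeks, float),
                                   np.asarray(cd4, float), deg=1)
    return float(slope)

# Hypothetical patient: CD4 rises from 100 to 124 cells/mm3 over 12 weeks
slope = cd4_slope_per_week([0, 4, 8, 12], [100, 108, 116, 124])
```

For this patient the slope is 2 cells/mm³ per week, comfortably in the "rising CD4" range associated with suppressed HIV RNA in the abstract.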


Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To appropriately deal with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPWs and proper imputation procedures can be easily constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU.
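The IPW idea can be sketched numerically: outcomes traced for a subsample of LTFU patients are up-weighted by the inverse of the tracing probability, recovering the full-cohort mortality that a complete-case analysis misses. A toy simulation (all rates and the tracing fraction are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
# Non-ignorable LTFU: 30% of patients are lost, and mortality is
# 25% among the lost vs 5% among the retained (invented rates).
lost = rng.random(n) < 0.30
dead = np.where(lost, rng.random(n) < 0.25, rng.random(n) < 0.05)

# Outcomes are observed for retained patients and for a 10% tracing
# sample of LTFU patients (e.g. registry linkage or a survey).
traced = lost & (rng.random(n) < 0.10)

# Complete-case estimate ignores everyone lost -> biased low (~0.05)
p_naive = dead[~lost].mean()

# IPW: traced LTFU patients stand in for all LTFU, weight 1 / 0.10
w = np.zeros(n)
w[~lost] = 1.0
w[traced] = 1.0 / 0.10
p_ipw = np.sum(w * dead) / np.sum(w)   # close to the true mortality of 0.11
```

The true cohort mortality here is 0.30 × 0.25 + 0.70 × 0.05 = 0.11; the naive estimate sits near 0.05, while the weighted estimate recovers it. In practice the tracing probability may itself be modeled rather than fixed by design.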


OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa, CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A, all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates due to worse adherence when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and as the percentage of the observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a three-times-higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a three-times-higher failure rate). Eleven percent was explained by non-HIV-related mortality. CONCLUSIONS VL monitoring moderately reduces mortality when assuming improved adherence and decreased failure rates.
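The modelling idea, that earlier failure detection under VL monitoring shortens the time spent on a failing regimen and thus lowers mortality, can be sketched with a toy Monte Carlo simulation. All rates and delays below are invented for illustration and the model is far simpler than the study's stochastic simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
horizon = 5.0                          # years of follow-up
t_fail = rng.exponential(10.0, n)      # time to virologic failure (mean 10 y)

def mortality(detect_delay):
    """Expected 5-year mortality for a given failure-detection delay.
    Baseline hazard 0.015/y; 0.10/y while failure is undetected
    (illustrative assumptions, not the study's parameters)."""
    undetected = np.clip(np.minimum(t_fail + detect_delay, horizon) - t_fail,
                         0.0, None)    # undetected-failure time in window
    cum_hazard = 0.015 * horizon + (0.10 - 0.015) * undetected
    return float(np.mean(1.0 - np.exp(-cum_hazard)))

m_vl = mortality(0.25)    # VL monitoring: failure found within ~3 months
m_cd4 = mortality(1.5)    # CD4-only monitoring: much later detection
rr = m_vl / m_cd4         # relative risk, analogous to scenario A
```

Because the only difference between arms is the detection delay, the RR falls below 1 by construction, echoing the study's scenario A; scenarios B and C would add switching delays and adherence-dependent failure rates on top.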