79 results for proportional hazards
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Objectives: We assessed mortality associated with immunologic and virologic patterns of response at 6 months of highly active antiretroviral therapy (HAART) in HIV-infected individuals from resource-limited countries in Africa and South America. Methods: Patients who initiated HAART between 1996 and 2007, were aged 16 years or older, and had at least 1 measurement (HIV-1 RNA plasma viral load or CD4 cell count) at 6 months of therapy (3-9 month window) were included. Therapy response was categorized as complete, discordant (virologic only or immunologic only), and absent. Associations between 6-month response to therapy and all-cause mortality were assessed by Cox proportional hazards regression. Robust standard errors were calculated to account for intrasite correlation. Results: A total of 7160 patients, corresponding to 15,107 person-years, were analyzed. In multivariable analysis adjusted for age at HAART initiation, baseline clinical stage and CD4 cell count, year of HAART initiation, clinic, and occurrence of an AIDS-defining condition within the first 6 months of treatment, discordant and absent responses were associated with an increased risk of death. Conclusions: As in reports from high-income countries, discordant immunologic and virologic responses were associated with a risk of death intermediate between that of complete and absent responses in this large cohort of HIV-1-infected patients from resource-limited countries. Our results support a recommendation for wider availability of plasma viral load testing to monitor antiretroviral therapy in these settings.
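A minimal sketch of this kind of model in Python with the lifelines library, using simulated data; every column name and number below is hypothetical, so this illustrates the technique (Cox regression with cluster-robust standard errors by clinic) rather than reproducing the authors' analysis.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 300
    site = rng.integers(1, 11, size=n)                 # 10 hypothetical clinics
    response = rng.integers(0, 3, size=n)              # 0 = complete, 1 = discordant, 2 = absent
    hazard = np.exp(0.5 * (response == 1) + 1.0 * (response == 2))
    time_to_death = rng.exponential(5.0 / hazard)      # years until death
    censoring = rng.exponential(6.0, size=n)           # administrative censoring

    df = pd.DataFrame({
        "years": np.minimum(time_to_death, censoring),
        "death": (time_to_death <= censoring).astype(int),
        "discordant": (response == 1).astype(int),     # complete response is the reference group
        "absent": (response == 2).astype(int),
        "site": site,
    })

    cph = CoxPHFitter()
    # cluster_col requests sandwich (robust) standard errors grouped by clinic,
    # the same idea as the intrasite correlation adjustment described above.
    cph.fit(df, duration_col="years", event_col="death", cluster_col="site")
    cph.print_summary()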
Abstract:
The aim of this analysis was to assess the effect of body mass index (BMI) on 1-year outcomes in patients enrolled in a contemporary percutaneous coronary intervention trial comparing a sirolimus-eluting stent with a durable polymer to a biolimus-eluting stent with a biodegradable polymer. A total of 1,707 patients who underwent percutaneous coronary intervention were randomized to treatment with either biolimus-eluting stents (n = 857) or sirolimus-eluting stents (n = 850). Patients were assigned to 1 of 3 groups according to BMI: normal (<25 kg/m²), overweight (25 to 30 kg/m²), or obese (>30 kg/m²). At 1 year, the incidence of the composite of cardiac death, myocardial infarction, and clinically justified target vessel revascularization was assessed. In addition, rates of clinically justified target lesion revascularization and stent thrombosis were assessed. Cox proportional-hazards analysis, adjusted for clinical differences, was used to develop models for 1-year mortality. Forty-five percent of the patients (n = 770) were overweight, 26% (n = 434) were obese, and 29% (n = 497) had normal BMIs. At 1-year follow-up, the cumulative rate of cardiac death, myocardial infarction, and clinically justified target vessel revascularization was significantly higher in the obese group (8.7% in normal-weight, 11.3% in overweight, and 14.5% in obese patients, p = 0.01). BMI (hazard ratio 1.47, 95% confidence interval 1.02 to 2.14, p = 0.04) was an independent predictor of stent thrombosis. Stent type had no impact on the composite of cardiac death, myocardial infarction, and clinically justified target vessel revascularization at 1 year in the 3 BMI groups (hazard ratio 1.08, 95% confidence interval 0.63 to 1.83, p = 0.73). In conclusion, BMI was an independent predictor of major adverse cardiac events at 1-year clinical follow-up. The higher incidence of stent thrombosis in the obese group may suggest the need for a weight-adjusted dose of clopidogrel.
Abstract:
Background: With expanding pediatric antiretroviral therapy (ART) access, children will begin to experience treatment failure and require second-line therapy. We evaluated the probability and determinants of virologic failure and switching in children in South Africa. Methods: Pooled analysis of routine individual data from children who initiated ART in 7 South African treatment programs with 6-monthly viral load and CD4 monitoring produced Kaplan-Meier estimates of the probability of virologic failure (2 consecutive unsuppressed viral loads with the second being >1000 copies/mL, after ≥24 weeks of therapy) and of switch to second-line therapy. Cox proportional hazards models stratified by program were used to determine predictors of these outcomes. Results: The 3-year probability of virologic failure among 5485 children was 19.3% (95% confidence interval: 17.6 to 21.1). Use of nevirapine or ritonavir alone in the initial regimen (compared with efavirenz) and exposure to prevention of mother-to-child transmission regimens were independently associated with failure [adjusted hazard ratios (95% confidence interval): 1.77 (1.11 to 2.83), 2.39 (1.57 to 3.64) and 1.40 (1.02 to 1.92), respectively]. Among 252 children with ≥1 year of follow-up after failure, 38% were switched to second-line therapy. The median (interquartile range) time between failure and switch was 5.7 (2.9-11.0) months. Conclusions: Triple ART based on nevirapine or on ritonavir as a single protease inhibitor seems to be associated with a higher risk of virologic failure. A low proportion of virologically failing children were switched.
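The failure probability quoted above comes from a Kaplan-Meier analysis; a minimal, hypothetical sketch with the lifelines library (all follow-up times made up) shows how a cumulative failure probability at 3 years of this kind is read off the fitted curve.

    import pandas as pd
    from lifelines import KaplanMeierFitter

    # Hypothetical follow-up data: years on ART until confirmed virologic failure (failed = 1)
    # or censoring at the last visit (failed = 0).
    df = pd.DataFrame({
        "years":  [0.8, 1.5, 2.2, 3.0, 3.0, 1.1, 2.7, 3.0],
        "failed": [1,   0,   1,   0,   0,   1,   0,   0],
    })

    kmf = KaplanMeierFitter()
    kmf.fit(df["years"], event_observed=df["failed"])
    # 1 - S(3) is the cumulative probability of virologic failure by 3 years.
    print(1 - kmf.predict(3.0))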
Abstract:
BACKGROUND: Tumor levels of steroid hormone receptors, a factor used to select adjuvant treatment for early-stage breast cancer, are currently determined with immunohistochemical assays. These assays have a discordance of 10%-30% with previously used extraction assays. We assessed the concordance and predictive value of hormone receptor status as determined by immunohistochemical and extraction assays on specimens from International Breast Cancer Study Group Trials VIII and IX. These trials predominantly used extraction assays and compared adjuvant chemoendocrine therapy with endocrine therapy alone among pre- and postmenopausal patients with lymph node-negative breast cancer. Trial conclusions were that combination therapy provided a benefit to pre- and postmenopausal patients with estrogen receptor (ER)-negative tumors but not to ER-positive postmenopausal patients. ER-positive premenopausal patients required further study. METHODS: Tumor specimens from 571 premenopausal and 976 postmenopausal patients on which extraction assays had determined ER and progesterone receptor (PgR) levels before randomization from October 1, 1988, through October 1, 1999, were re-evaluated with an immunohistochemical assay in a central pathology laboratory. The endpoint was disease-free survival. Hazard ratios of recurrence or death for treatment comparisons were estimated with Cox proportional hazards regression models, and discriminatory ability was evaluated with the c index. All statistical tests were two-sided. RESULTS: Concordance of hormone receptor status determined by both assays ranged from 74% (kappa = 0.48) for PgR among postmenopausal patients to 88% (kappa = 0.66) for ER in postmenopausal patients. Hazard ratio estimates were similar for the association between disease-free survival and ER status (among all patients) or PgR status (among postmenopausal patients) as determined by the two methods. However, among premenopausal patients treated with endocrine therapy alone, the discriminatory ability of PgR status as determined by immunohistochemical assay was statistically significantly better (c index = 0.60 versus 0.51; P = .003) than that determined by extraction assay, and so immunohistochemically determined PgR status could predict disease-free survival. CONCLUSIONS: Trial conclusions in which ER status (for all patients) or PgR status (for postmenopausal patients) was determined by immunohistochemical assay supported those determined by extraction assays. However, among premenopausal patients, trial conclusions drawn from PgR status differed--immunohistochemically determined PgR status could predict response to endocrine therapy, unlike that determined by the extraction assay.
Abstract:
BACKGROUND: Patients coinfected with hepatitis C virus (HCV) and HIV experience higher mortality rates than patients infected with HIV alone. We designed a study to determine whether risks for later mortality are similar for HCV-positive and HCV-negative individuals when subjects are stratified on the basis of baseline CD4+ T-cell counts. METHODS: Antiretroviral-naive individuals who initiated highly active antiretroviral therapy (HAART) between 1996 and 2002 were included in the study. HCV-positive and HCV-negative individuals were stratified separately by baseline CD4+ T-cell count in 50 cells/µl increments. Cox proportional hazards regression was used to model the effect of these strata with other variables on survival. RESULTS: CD4+ T-cell strata below 200 cells/µl, but not above, imparted an increased relative hazard (RH) of mortality for both HCV-positive and HCV-negative individuals. Among HCV-positive individuals, after adjustment for baseline age, HIV RNA levels, history of injection drug use and adherence to therapy, only CD4+ T-cell strata of <50 cells/µl (RH=4.60; 95% confidence interval [CI] 2.72-7.76) and 50-199 cells/µl (RH=2.49; 95% CI 1.63-3.81) were significantly associated with increased mortality when compared with those initiating therapy at cell counts >500 cells/µl. The same baseline CD4+ T-cell strata were found for HCV-negative individuals. CONCLUSION: In a within-groups analysis, the baseline CD4+ T-cell strata that are associated with increased RHs for mortality are the same for HCV-positive and HCV-negative individuals initiating HAART. However, a between-groups analysis reveals a higher absolute mortality risk for HCV-positive individuals.
Abstract:
BACKGROUND: We sought to characterize the impact that hepatitis C virus (HCV) infection has on CD4 cells during the first 48 weeks of antiretroviral therapy (ART) in previously ART-naive human immunodeficiency virus (HIV)-infected patients. METHODS: The HIV/AIDS Drug Treatment Programme at the British Columbia Centre for Excellence in HIV/AIDS distributes all ART in this Canadian province. Eligible individuals were those whose first-ever ART included 2 nucleoside reverse transcriptase inhibitors and either a protease inhibitor or a nonnucleoside reverse transcriptase inhibitor and who had a documented result of HCV antibody testing. Outcomes were binary events (time to an increase of ≥75 CD4 cells/mm³ or an increase of ≥10% in the percentage of CD4 cells in the total T cell population [CD4 cell fraction]) and continuous repeated measures. Statistical analyses used parametric and nonparametric methods, including multivariate mixed-effects linear regression analysis and Cox proportional hazards analysis. RESULTS: Of 1186 eligible patients, 606 (51%) were positive and 580 (49%) were negative for HCV antibodies. HCV antibody-positive patients were slower to have an absolute (P<.001) and a fraction (P = .02) CD4 cell event. In adjusted Cox proportional hazards analysis (controlling for age, sex, baseline absolute CD4 cell count, baseline plasma viral load [pVL], type of ART initiated, AIDS diagnosis at baseline, adherence to ART regimen, and number of CD4 cell measurements), HCV antibody-positive patients were less likely to have an absolute CD4 cell event (adjusted hazard ratio [AHR], 0.84 [95% confidence interval [CI], 0.72-0.98]) and somewhat less likely to have a CD4 cell fraction event (AHR, 0.89 [95% CI, 0.70-1.14]) than HCV antibody-negative patients. In multivariate mixed-effects linear regression analysis, HCV antibody-negative patients had increases of an average of 75 cells in the absolute CD4 cell count and 4.4% in the CD4 cell fraction, compared with 20 cells and 1.1% in HCV antibody-positive patients, during the first 48 weeks of ART, after adjustment for time-updated pVL, number of CD4 cell measurements, and other factors. CONCLUSION: HCV antibody-positive HIV-infected patients may have an altered immunologic response to ART.
Abstract:
To compare the prediction of hip fracture risk by several quantitative bone ultrasound (QUS) devices, 7062 Swiss women ≥70 years of age were measured with three QUSs (two of the heel, one of the phalanges). Heel QUSs were both predictive of hip fracture risk, whereas the phalanges QUS was not. INTRODUCTION: As the number of hip fractures is expected to increase during the next decades, it is important to develop strategies to detect subjects at risk. Quantitative bone ultrasound (QUS), an ionizing radiation-free method, which is transportable, could be interesting for this purpose. MATERIALS AND METHODS: The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk (SEMOF) study is a multicenter cohort study, which compared three QUSs for the assessment of hip fracture risk in a sample of 7609 elderly ambulatory women ≥70 years of age. Two QUSs measured the heel (Achilles+; GE-Lunar and Sahara; Hologic), and one measured the phalanges (DBM Sonic 1200; IGEA). Cox proportional hazards regression was used to estimate the hazard of the first hip fracture, adjusted for age, BMI, and center, and areas under the ROC curves were calculated to compare the devices and their parameters. RESULTS: Of the 7609 women who were included in the study, 7062 women 75.2 ± 3.1 (SD) years of age were prospectively followed for 2.9 ± 0.8 years. Eighty women reported a hip fracture. A decrease by 1 SD of the QUS variables corresponded to an increase of the hip fracture risk from 2.3 (95% CI, 1.7, 3.1) to 2.6 (95% CI, 1.9, 3.4) for the three variables of Achilles+ and from 2.2 (95% CI, 1.7, 3.0) to 2.4 (95% CI, 1.8, 3.2) for the three variables of Sahara. Risk gradients did not differ significantly among the variables of the two heel QUS devices. On the other hand, the phalanges QUS (DBM Sonic 1200) was not predictive of hip fracture risk, with an adjusted hazard ratio of 1.2 (95% CI, 0.9, 1.5), even after reanalysis of the digitalized data and use of different cut-off levels (1700 or 1570 m/s). CONCLUSIONS: In this population of elderly women, the heel QUS devices were both predictive of hip fracture risk, whereas the phalanges QUS device was not.
Abstract:
BACKGROUND: We evaluated the ability of CA15-3 and alkaline phosphatase (ALP) to predict breast cancer recurrence. PATIENTS AND METHODS: Data from seven International Breast Cancer Study Group trials were combined. The primary end point was relapse-free survival (RFS) (time from randomization to first breast cancer recurrence), and analyses included 3953 patients with one or more CA15-3 and ALP measurement during their RFS period. CA15-3 was considered abnormal if >30 U/ml or >50% higher than the first value recorded; ALP was recorded as normal, abnormal, or equivocal. Cox proportional hazards models with a time-varying indicator for abnormal CA15-3 and/or ALP were utilized. RESULTS: Overall, 784 patients (20%) had a recurrence, before which 274 (35%) had one or more abnormal CA15-3 and 35 (4%) had one or more abnormal ALP. Risk of recurrence increased by 30% for patients with abnormal CA15-3 [hazard ratio (HR) = 1.30; P = 0.0005], and by 4% for those with abnormal ALP (HR = 1.04; P = 0.82). Recurrence risk was greatest for patients with either (HR = 2.40; P < 0.0001) and with both (HR = 4.69; P < 0.0001) biomarkers abnormal. ALP better predicted liver recurrence. CONCLUSIONS: CA15-3 was better able to predict breast cancer recurrence than ALP, but use of both biomarkers together provided a better early indicator of recurrence. Whether routine use of these biomarkers improves overall survival remains an open question.
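As a concrete reading of the CA15-3 rule above (abnormal if above 30 U/ml or more than 50% higher than the first recorded value), a small hypothetical helper in Python; in the actual analysis this flag enters the Cox model as a time-varying indicator, which is not reproduced here.

    def ca15_3_abnormal(current_u_ml, first_u_ml):
        """Abnormal if above 30 U/ml or more than 50% above the first recorded value."""
        return current_u_ml > 30.0 or current_u_ml > 1.5 * first_u_ml

    # Hypothetical series: first value 22 U/ml, later values 34 and 25 U/ml.
    print(ca15_3_abnormal(34.0, 22.0))   # True (exceeds both criteria)
    print(ca15_3_abnormal(25.0, 22.0))   # False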
Abstract:
BACKGROUND: Aromatase inhibitors are considered standard adjuvant endocrine treatment of postmenopausal women with hormone receptor-positive breast cancer, but it remains uncertain whether aromatase inhibitors should be given upfront or sequentially with tamoxifen. Awaiting results from ongoing randomized trials, we examined prognostic factors of an early relapse among patients in the BIG 1-98 trial to aid in treatment choices. PATIENTS AND METHODS: Analyses included all 7707 eligible patients treated on BIG 1-98. The median follow-up was 2 years, and the primary end point was breast cancer relapse. Cox proportional hazards regression was used to identify prognostic factors. RESULTS: Two hundred and eighty-five patients (3.7%) had an early relapse (3.1% on letrozole, 4.4% on tamoxifen). Predictive factors for early relapse were node positivity (P < 0.001), absence of both receptors being positive (P < 0.001), high tumor grade (P < 0.001), HER-2 overexpression/amplification (P < 0.001), large tumor size (P = 0.001), treatment with tamoxifen (P = 0.002), and vascular invasion (P = 0.02). There were no significant interactions between treatment and the covariates, though letrozole appeared to provide a greater than average reduction in the risk of early relapse in patients with many involved lymph nodes, large tumors, and vascular invasion present. CONCLUSION: Upfront letrozole resulted in significantly fewer early relapses than tamoxifen, even after adjusting for significant prognostic factors.
Abstract:
BACKGROUND: The prognostic relevance of the collateral circulation is still controversial. The goal of this study was to assess the impact on survival of quantitatively obtained, recruitable coronary collateral flow in patients with stable coronary artery disease during 10 years of follow-up. METHODS AND RESULTS: Eight hundred forty-five individuals (age, 62 ± 11 years), 106 patients without coronary artery disease and 739 patients with chronic stable coronary artery disease, underwent a total of 1053 quantitative, coronary pressure-derived collateral measurements between March 1996 and April 2006. All patients were prospectively included in a collateral flow index (CFI) database containing information on recruitable collateral flow parameters obtained during a 1-minute coronary balloon occlusion. CFI was calculated as CFI = (P(occl) - CVP)/(P(ao) - CVP), where P(occl) is mean coronary occlusive pressure, P(ao) is mean aortic pressure, and CVP is central venous pressure. Patients were divided into groups with poorly developed (CFI < 0.25) or well-grown collateral vessels (CFI ≥ 0.25). Follow-up information on the occurrence of all-cause mortality and major adverse cardiac events after study inclusion was collected. Cumulative 10-year survival rates in relation to all-cause deaths and cardiac deaths were 71% and 88%, respectively, in patients with low CFI and 89% and 97% in the group with high CFI (P=0.0395, P=0.0109). In Cox proportional hazards analysis, the following variables independently predicted elevated cardiac mortality: age, low CFI (as a continuous variable), and current smoking. CONCLUSIONS: A well-functioning coronary collateral circulation saves lives in patients with chronic stable coronary artery disease. Depending on the exact amount of collateral flow recruitable during a brief coronary occlusion, long-term cardiac mortality is reduced to one fourth compared with the situation without collateral supply.
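To make the CFI definition above concrete, a small worked example with hypothetical pressures (all in mmHg):

    def collateral_flow_index(p_occl, p_ao, cvp):
        """CFI = (P(occl) - CVP) / (P(ao) - CVP), all pressures in mmHg."""
        return (p_occl - cvp) / (p_ao - cvp)

    # Hypothetical occlusion: P(occl) = 35, P(ao) = 90, CVP = 8.
    cfi = collateral_flow_index(35, 90, 8)
    print(round(cfi, 2))   # 0.33, i.e. above the 0.25 cut-off for well-grown collaterals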
Abstract:
BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions of ≥12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints including calculations based on the modification of diet in renal disease (MDRD) formula were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
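The primary endpoint rests on the Cockcroft-Gault estimate of creatinine clearance; a minimal sketch of that formula, assuming weight in kg and serum creatinine in mg/dl (the cohort's exact conventions are not stated in the abstract), with a hypothetical patient crossing the 10 ml/min threshold.

    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
        """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault equation."""
        cgfr = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
        return 0.85 * cgfr if female else cgfr

    # Hypothetical 45-year-old, 70 kg man whose creatinine rises from 0.90 to 1.05 mg/dl.
    baseline = cockcroft_gault(45, 70, 0.90, female=False)    # about 103 ml/min
    follow_up = cockcroft_gault(45, 70, 1.05, female=False)   # about 88 ml/min
    print(baseline - follow_up >= 10)                          # True: meets the 10 ml/min reduction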
Abstract:
PURPOSE: To compare clinical outcomes of endovascular and open aortic repair of abdominal aortic aneurysms (AAAs) in young patients at low risk. It was hypothesized that endovascular aneurysm repair (EVAR) compares favorably with open aneurysm repair (OAR) in these patients. MATERIALS AND METHODS: Twenty-five patients aged 65 years or younger with a low perioperative surgical risk profile underwent EVAR at a single institution between April 1994 and May 2007 (23 men; mean age, 62 ± 2.8 years). A sex- and risk-matched group of 25 consecutive patients aged 65 years or younger who underwent OAR was used as a control group (23 men; mean age, 59 ± 3.9 years). Patient outcomes and complications were classified according to Society of Vascular Surgery/International Society for Cardiovascular Surgery reporting standards. RESULTS: Mean follow-up times were 7.1 ± 3.2 years after EVAR and 5.9 ± 1.8 years after OAR (P=.1020). Total complication rates were 20% after EVAR and 52% after OAR (P=.0378), and all complications were mild or moderate. Mean intensive care unit times were 0.2 ± 0.4 days after EVAR and 1.1 ± 0.4 days after OAR (P<.0001), and mean lengths of hospital stay were 2.3 ± 1.0 days after EVAR and 5.0 ± 2.1 days after OAR (P<.0001). Cumulative rates of long-term patient survival did not differ between EVAR and OAR (P=.144). No AAA-related deaths or aortoiliac ruptures occurred during follow-up for EVAR and OAR. In addition, no surgical conversions were necessary in EVAR recipients. Cumulative rates of freedom from secondary procedures were not significantly different between the EVAR and OAR groups (P=.418). In a multivariable Cox proportional hazards analysis adjusted for patient age, maximum AAA diameter, and cardiac risk score, all-cause mortality rates (odds ratio [OR], 0.125; 95% CI, 0.010-1.493; P=.100) and the need for secondary procedures (OR, 5.014; 95% CI, 0.325-77.410; P=.537) did not differ between EVAR and OAR. CONCLUSIONS: Results from this observational study indicate that EVAR offers a favorable alternative to OAR in young patients at low risk.
Abstract:
BACKGROUND: Exposure to intermittent magnetic fields of 16 Hz has been shown to reduce heart rate variability, and decreased heart rate variability predicts cardiovascular mortality. We examined mortality from cardiovascular causes in railway workers exposed to varying degrees to intermittent 16.7 Hz magnetic fields. METHODS: We studied a cohort of 20,141 Swiss railway employees between 1972 and 2002, including highly exposed train drivers (median lifetime exposure 120.5 µT-years), and less or little exposed shunting yard engineers (42.1 µT-years), train attendants (13.3 µT-years) and station masters (5.7 µT-years). During 464,129 person-years of follow-up, 5,413 deaths were recorded and 3,594 deaths were attributed to cardiovascular diseases. We analyzed the data using Cox proportional hazards models. RESULTS: For all cardiovascular mortality the hazard ratio compared to station masters was 0.99 (95%CI: 0.91, 1.08) in train drivers, 1.13 (95%CI: 0.98, 1.30) in shunting yard engineers, and 1.09 (95%CI: 1.00, 1.19) in train attendants. Corresponding hazard ratios for arrhythmia-related deaths were 1.04 (95%CI: 0.68, 1.59), 0.58 (95%CI: 0.24, 1.37) and 10 (95%CI: 0.87, 1.93), and for acute myocardial infarction 1.00 (95%CI: 0.73, 1.36), 1.56 (95%CI: 1.04, 2.32), and 1.14 (95%CI: 0.85, 1.53). The hazard ratio per 100 µT-years of cumulative exposure was 0.94 (95%CI: 0.71, 1.24) for arrhythmia-related deaths and 0.91 (95%CI: 0.75, 1.11) for acute myocardial infarction. CONCLUSION: This study provides evidence against an association between long-term occupational exposure to intermittent 16.7 Hz magnetic fields and cardiovascular mortality.
Abstract:
BACKGROUND: The outcome of Kaposi sarcoma varies. While many patients do well on highly active antiretroviral therapy, others have progressive disease and need chemotherapy. In order to predict which patients are at risk of unfavorable evolution, we established a prognostic score. METHOD: A survival analysis (Kaplan-Meier method; Cox proportional hazards models) of 144 patients with Kaposi sarcoma prospectively included in the Swiss HIV Cohort Study, from January 1996 to December 2004, was conducted. OUTCOME ANALYZED: use of chemotherapy or death. VARIABLES ANALYZED: demographics, tumor staging [T0 or T1 (16)], CD4 cell counts and HIV-1 RNA concentration, human herpesvirus 8 (HHV8) DNA in plasma, and serological titers to latent and lytic antigens. RESULTS: Of 144 patients, 54 needed chemotherapy or died. In the univariate analysis, tumor stage T1, CD4 cell count below 200 cells/µl, positive HHV8 DNA and absence of antibodies against the HHV8 lytic antigen at the time of diagnosis were significantly associated with a bad outcome. In the multivariate analysis, the following variables were associated with an increased risk of unfavorable outcome: T1 [hazard ratio (HR) 5.22; 95% confidence interval (CI) 2.97-9.18], CD4 cell count below 200 cells/µl (HR 2.33; 95% CI 1.22-4.45) and positive HHV8 DNA (HR 2.14; 95% CI 1.79-2.85). We created a score with these variables ranging from 0 to 4: T1 stage counted for two points, CD4 cell count below 200 cells/µl for one point, and positive HHV8 viral load for one point. Each point increase was associated with an HR of 2.26 (95% CI 1.79-2.85). CONCLUSION: In the multivariate analysis, tumor stage (T1), CD4 cell count (<200 cells/µl) and positive HHV8 DNA in plasma at the time of diagnosis predict evolution towards death or the need for chemotherapy.
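A minimal sketch of the 0-4 point score described above (function and argument names are mine, not the authors'); per the abstract, each additional point corresponds to an HR of about 2.26 for death or the need for chemotherapy.

    def ks_prognostic_score(t1_stage, cd4_below_200, hhv8_dna_positive):
        """0-4 point score: T1 stage = 2 points, CD4 < 200 cells/µl = 1 point, detectable plasma HHV8 DNA = 1 point."""
        return 2 * int(t1_stage) + int(cd4_below_200) + int(hhv8_dna_positive)

    # Hypothetical patient with T1 disease and detectable HHV8 DNA but CD4 >= 200 cells/µl:
    print(ks_prognostic_score(True, False, True))   # 3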
Abstract:
OBJECTIVES: To assess paediatric antiretroviral treatment (ART) outcomes and their associations from a collaborative cohort representing 20% of the South African national treatment programme. DESIGN AND SETTING: Multi-cohort study of 7 public sector paediatric ART programmes in Gauteng, Western Cape and KwaZulu-Natal provinces. SUBJECTS: ART-naive children (≤16 years) who commenced treatment with ≥3 antiretroviral drugs before March 2008. OUTCOME MEASURES: Time to death or loss to follow-up was assessed using the Kaplan-Meier method. Associations between baseline characteristics and mortality were assessed with Cox proportional hazards models stratified by site. Immune status, virological suppression and growth were described in relation to duration of ART. RESULTS: The median (interquartile range) age of 6 078 children with 9 368 child-years of follow-up was 43 (15 - 83) months, with 29% being < 18 months. Most were severely ill at ART initiation. More than 75% of children were appropriately monitored at 6-monthly intervals, with viral load suppression (< 400 copies/ml) being 80% or above throughout 36 months of treatment. Mortality and retention in care at 3 years were 7.7% (95% confidence interval 7.0 - 8.6%) and 81.4% (80.1 - 82.6%), respectively. Together with young age, all markers of disease severity (low weight-for-age z-score, high viral load, severe immune suppression, stage 3/4 disease and anaemia) were independently associated with mortality. CONCLUSIONS: Dramatic clinical benefit for children accessing the national ART programme is demonstrated. Higher mortality in infants and those with advanced disease highlights the need for early diagnosis of HIV infection and commencement of ART.