893 results for MORTALITY RISK


Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The objective of this study was to evaluate the impact of newer therapies on the highest risk patients with congenital diaphragmatic hernia (CDH), those with agenesis of the diaphragm. SUMMARY BACKGROUND DATA: CDH remains a significant cause of neonatal mortality. Many novel therapeutic interventions have been used in these infants. Those children with large defects or agenesis of the diaphragm have the highest mortality and morbidity. METHODS: Twenty centers from 5 countries collected data prospectively on all liveborn infants with CDH over a 10-year period. The treatment and outcomes in these patients were examined. Patients were followed until death or hospital discharge. RESULTS: A total of 1,569 patients with CDH were seen between January 1995 and December 2004 in 20 centers. A total of 218 patients (14%) had diaphragmatic agenesis and underwent repair. The overall survival for all patients was 68%, while survival was 54% in patients with agenesis. When patients with diaphragmatic agenesis from the first 2 years were compared with similar patients from the last 2 years, there was significantly less use of ECMO (75% vs. 52%) and an increased use of inhaled nitric oxide (iNO) (30% vs. 80%). There was a trend toward improved survival in patients with agenesis from 47% in the first 2 years to 59% in the last 2 years. The survivors with diaphragmatic agenesis had prolonged hospital stays compared with patients without agenesis (median, 68 vs. 30 days). For the last 2 years of the study, 36% of the patients with agenesis were discharged on tube feedings and 22% on oxygen therapy. CONCLUSIONS: There has been a change in the management of infants with CDH with less frequent use of ECMO and a greater use of iNO in high-risk patients with a potential improvement in survival. However, the mortality, hospital length of stay, and morbidity in agenesis patients remain significant.

Relevance:

30.00%

Publisher:

Abstract:

AIMS No standardized local thrombolysis regimen exists for the treatment of pulmonary embolism (PE). We retrospectively investigated the efficacy and safety of fixed low-dose ultrasound-assisted catheter-directed thrombolysis (USAT) for intermediate- and high-risk PE. METHODS AND RESULTS Fifty-two patients (aged 65 ± 14 years), of whom 14 had high-risk PE (troponin positive in all) and 38 intermediate-risk PE (troponin positive in 91%), were treated with intravenous unfractionated heparin and USAT using 10 mg of recombinant tissue plasminogen activator per device over the course of 15 h. Bilateral USAT was performed in 83% of patients. During 3-month follow-up, two [3.8%; 95% confidence interval (CI) 0.5-13%] patients died (one from cardiogenic shock and one from recurrent PE). Major non-fatal bleeding occurred in two (3.8%; 95% CI, 0.5-13%) patients: one intrathoracic bleeding after cardiopulmonary resuscitation requiring transfusion, and one intrapulmonary bleeding requiring lobectomy. Mean pulmonary artery pressure decreased from 37 ± 9 mmHg at baseline to 25 ± 8 mmHg at 15 h (P < 0.001) and cardiac index increased from 2.0 ± 0.7 to 2.7 ± 0.9 L/min/m² (P < 0.001). Echocardiographic right-to-left ventricular end-diastolic dimension ratio decreased from 1.42 ± 0.21 at baseline to 1.06 ± 0.23 at 24 h (n = 21; P < 0.001). The greatest haemodynamic benefit from USAT was found in patients with high-risk PE and in those with symptom duration < 14 days. CONCLUSION A standardized catheter intervention approach using fixed low-dose USAT for the treatment of intermediate- and high-risk PE was associated with rapid improvement in haemodynamic parameters and low rates of bleeding complications and mortality.
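With only two events among 52 patients, the death and major-bleeding rates above carry wide confidence intervals. A minimal sketch of how such an interval can be computed, using the Wilson score method (the authors' exact method is not stated in the abstract, so the bounds differ slightly from the reported 0.5-13%):

```python
import math

def wilson_ci(events: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = events / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - margin) / denom, (centre + margin) / denom

# 2 deaths (and, separately, 2 major bleeds) among 52 patients
lo, hi = wilson_ci(2, 52)
print(f"3.8% (95% CI {lo:.1%}-{hi:.1%})")
```

The Wilson interval gives roughly 1.1-13.0% here; exact (Clopper-Pearson) limits, which the authors may have used, are a little wider at the lower end.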

Relevance:

30.00%

Publisher:

Abstract:

Diarrheal disease is a leading cause of morbidity and mortality, especially in children in developing countries. Global mortality from diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and the first human case of cryptosporidiosis was not reported until 1976. This study was conducted to ascertain the risk factors for first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses demonstrated that infants greater than six months of age had a two-fold risk of infection compared with infants less than six months of age (RR = 2.17; 95% CI 1.01-4.82). When stratified by sex, male infants greater than six months of age were four times more likely to become infected than male infants less than six months of age. Among female infants, there was no difference in risk between the two age groups. Female infants less than six months of age were twice as likely to become infected as male infants of the same age; the reverse held for infants greater than six months of age, i.e., males had twice the risk of infection compared with females of the same age group. Further analysis revealed an increased risk of cryptosporidiosis in infants who were attended in childbirth by traditional childbirth attendants compared with infants attended by modern childbirth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% CI 1.05-36.06). The final significant risk factor was the number of people residing in the household.
Infants in households with more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors.
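The relative risks quoted above (e.g. RR = 2.17, 95% CI 1.01-4.82) follow the standard 2×2-table recipe with a log-scale confidence interval. A sketch with hypothetical counts, since the abstract does not reproduce the raw tables:

```python
import math

def relative_risk(a: int, b: int, c: int, d: int, z: float = 1.96):
    """RR and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    rr = (a / (a + b)) / (c / (c + d))
    # standard error of log(RR)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# hypothetical counts: 20/100 infected among older infants vs 10/108 among younger
rr, lo, hi = relative_risk(20, 80, 10, 98)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these illustrative counts the estimate lands near the study's RR of 2.17 with a similarly wide interval, showing why the lower bound sits just above 1.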

Relevance:

30.00%

Publisher:

Abstract:

Cancer is a chronic disease that often necessitates recurrent hospitalizations, a costly pattern of medical care utilization. In chronically ill patients, most readmissions are for treatment of the same condition that caused the preceding hospitalization. There is concern that rather than reducing costs, earlier discharge may shift costs from the initial hospitalization to emergency center visits. This is the first descriptive study to measure the incidence of emergency center visits (ECVs) after hospitalization at The University of Texas M. D. Anderson Cancer Center (UTMDACC), to identify the risk factors for and outcomes of these ECVs, and to compare 30-day all-cause mortality and costs for episodes of care with and without ECVs. We identified all hospitalizations at UTMDACC with admission dates from September 1, 1993 through August 31, 1997 that met inclusion criteria. Data were obtained electronically, primarily from UTMDACC's institutional database. Demographic factors, clinical factors, duration of the index hospitalization, method of payment for care, and year of hospitalization were determined for each hospitalization. The overall incidence of ECVs was 18%. Forty-five percent of ECVs resulted in hospital readmission (8% of all hospitalizations). In 1% of ECVs the patient died in the emergency center, and in the remaining 54% the patient was discharged home. Risk factors for ECVs were marital status, type of index hospitalization, cancer type, and duration of the index hospitalization. The overall 30-day all-cause mortality rate was 8.6% for hospitalizations with an ECV and 5.3% for those without. In all subgroups, the 30-day all-cause mortality rate was higher for groups with ECVs than for those without. The most important factor increasing cost was having an ECV: in all patient subgroups, the cost per episode of care with an ECV was at least 1.9 times the cost per episode without one.
The higher costs and poorer outcomes of episodes of care with ECVs and hospital readmissions suggest that interventions to avoid these ECVs or mitigate their costs are needed. Further research is needed into the methodological issues involved in studying health care utilization by cancer patients.

Relevance:

30.00%

Publisher:

Abstract:

Breast cancer incidence and mortality rates for Hispanic women are lower than for non-Hispanic white (NHW) women, but rates have recently increased more rapidly among Hispanic women. Many studies have shown a consistent increase in breast cancer risk associated with modest or high alcohol intake, but few included Hispanic women. Alcohol consumption and risk of breast cancer were investigated in a New Mexico statewide population-based case-control study. The New Mexico Tumor Registry ascertained women newly diagnosed with breast cancer (1992–1994) aged 30–74 years. Controls were identified by random digit dialing and were frequency-matched for ethnicity, age group, and health planning district. In-person interviews of 712 cases and 844 controls were conducted. Data were collected on breast cancer risk factors, including alcohol intake. Recent alcohol intake data were collected for a four-week period six months prior to interview. Past alcohol intake included information on alcohol consumption at ages 25, 35, and 50. A history of alcohol consumption was reported by 81% of cases and 85% of controls. Of these women, 42% of cases and 48% of controls reported recent alcohol intake. Results for past alcohol intake did not show any trend with breast cancer risk and were nonsignificant. Multivariate-adjusted odds ratios for recent alcohol intake and breast cancer suggested an increased risk at the highest level for both ethnic groups, but estimates were unstable and statistically nonsignificant. A low level of recent alcohol intake (<148 grams/week) was associated with a reduced risk for NHW women (odds ratio (OR) = 0.49, 95% confidence interval (CI) 0.35–0.69). This pattern was independent of hormone-receptor status. The reduced breast cancer risk for low alcohol intake was present for both premenopausal (OR = 0.29, 95% CI 0.15–0.56) and postmenopausal NHW women (OR = 0.56, 95% CI 0.35–0.90).
The possibility of an increased risk associated with high alcohol intake could not be adequately addressed, because there were few drinkers with more than light to moderate intake, especially among Hispanic women. An alcohol-estrogen link is hypothesized to be the mechanism responsible for increased breast cancer risk, but this has not been consistently substantiated. More studies are needed of the mechanism underlying the association between alcohol intake and breast cancer.
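The odds ratios above (e.g. OR = 0.49, 95% CI 0.35-0.69) come from case-control 2×2 tables. A sketch using Woolf's logit interval, with hypothetical cell counts since the abstract does not give them:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR and 95% CI (Woolf's method) from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: low-intake drinkers among NHW cases vs controls
or_, lo, hi = odds_ratio(120, 250, 210, 215)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

These illustrative counts yield an OR near the reported 0.49 with an interval excluding 1, the pattern described for low recent intake in NHW women.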

Relevance:

30.00%

Publisher:

Abstract:

Blood cholesterol and blood pressure development in childhood and adolescence have an important impact on future adult levels of cholesterol and blood pressure, and on the risk of cardiovascular disease. The U.S. has higher mortality rates from coronary heart disease than Japan. A longitudinal comparison of risk factor development in children in the two countries provides more understanding of the causes of cardiovascular disease and its prevention. Such comparisons have not been reported in the past. In Project HeartBeat!, 506 non-Hispanic white, 136 black, and 369 Japanese children participated in the study in the U.S. and Japan from 1991 to 1995. A synthetic cohort of ages 8 to 18 years was composed of three cohorts with starting ages of 8, 11, and 14 years. A multilevel regression model was used for data analysis. The study revealed that the Japanese children had significantly higher slopes of mean total cholesterol (TC) and high-density lipoprotein (HDL) cholesterol levels than the U.S. children after adjusting for age and sex. The mean TC level of Japanese children was not significantly different from that of white and black children. The mean HDL level of Japanese children was significantly higher than that of white and black children after adjusting for age and sex. The ratio of HDL/TC in Japanese children was significantly higher than in U.S. whites, but not significantly different from that in black children. The Japanese group had significantly lower mean diastolic blood pressure at phase IV (DBP4) and phase V (DBP5) than the two U.S. groups. The Japanese group also showed significantly higher slopes in systolic blood pressure, DBP5, and DBP4 during the study period than both U.S. groups. The differences were independent of height and body mass index. The study provided the first longitudinal comparison of blood cholesterol and blood pressure between U.S. and Japanese children and adolescents.
It revealed the dynamic development of these risk factors in the three ethnic groups.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease, and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration and reclassification of cardiovascular mortality. METHODS 3267 patients (2283 men), aged 18-95 years, at intermediate-to-high-risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were included based on forward and backward Cox proportional stepwise selection models. RESULTS During 10-years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001), and integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by simple conventional risk factors.

Relevance:

30.00%

Publisher:

Abstract:

AIMS High-density lipoprotein (HDL) cholesterol is a strong predictor of cardiovascular mortality. This work aimed to investigate whether the presence of coronary artery disease (CAD) impacts on its predictive value. METHODS AND RESULTS We studied 3141 participants (2191 males, 950 females) of the LUdwigshafen RIsk and Cardiovascular health (LURIC) study. They had a mean ± standard deviation age of 62.6 ± 10.6 years, body mass index of 27.5 ± 4.1 kg/m², and HDL cholesterol of 38.9 ± 10.8 mg/dL. The cohort consisted of 699 people without CAD, 1515 patients with stable CAD, and 927 patients with unstable CAD. The participants were prospectively followed for cardiovascular mortality over a median (inter-quartile range) period of 9.9 (8.7-10.7) years. A total of 590 participants died from cardiovascular diseases. High-density lipoprotein cholesterol by tertiles was inversely related to cardiovascular mortality in the entire cohort (P = 0.009). There was significant interaction between HDL cholesterol and CAD in predicting the outcome (P = 0.007). In stratified analyses, HDL cholesterol was strongly associated with cardiovascular mortality in people without CAD [3rd vs. 1st tertile: HR (95% CI) = 0.37 (0.18-0.74), P = 0.005], but not in patients with stable [3rd vs. 1st tertile: HR (95% CI) = 0.81 (0.61-1.09), P = 0.159] and unstable [3rd vs. 1st tertile: HR (95% CI) = 0.91 (0.59-1.41), P = 0.675] CAD. These results were replicated by analyses in 3413 participants of the AtheroGene cohort and 5738 participants of the ESTHER cohort, and by a meta-analysis comprising all three cohorts. CONCLUSION The inverse relationship of HDL cholesterol with cardiovascular mortality is weakened in patients with CAD. The usefulness of considering HDL cholesterol for cardiovascular risk stratification seems limited in such patients.

Relevance:

30.00%

Publisher:

Abstract:

There is a need to validate risk assessment tools for hospitalised medical patients at risk of venous thromboembolism (VTE). We investigated whether a predefined cut-off of the Geneva Risk Score, as compared to the Padua Prediction Score, accurately distinguishes low-risk from high-risk patients regardless of the use of thromboprophylaxis. In the multicentre, prospective Explicit ASsessment of Thromboembolic RIsk and Prophylaxis for Medical PATients in SwitzErland (ESTIMATE) cohort study, 1,478 hospitalised medical patients were enrolled of whom 637 (43%) did not receive thromboprophylaxis. The primary endpoint was symptomatic VTE or VTE-related death at 90 days. The study is registered at ClinicalTrials.gov, number NCT01277536. According to the Geneva Risk Score, the cumulative rate of the primary endpoint was 3.2% (95% confidence interval [CI] 2.2-4.6%) in 962 high-risk vs 0.6% (95% CI 0.2-1.9%) in 516 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.5% vs 0.8% (p=0.029), respectively. In comparison, the Padua Prediction Score yielded a cumulative rate of the primary endpoint of 3.5% (95% CI 2.3-5.3%) in 714 high-risk vs 1.1% (95% CI 0.6-2.3%) in 764 low-risk patients (p=0.002); among patients without prophylaxis, this rate was 3.2% vs 1.5% (p=0.130), respectively. Negative likelihood ratio was 0.28 (95% CI 0.10-0.83) for the Geneva Risk Score and 0.51 (95% CI 0.28-0.93) for the Padua Prediction Score. In conclusion, among hospitalised medical patients, the Geneva Risk Score predicted VTE and VTE-related mortality and compared favourably with the Padua Prediction Score, particularly for its accuracy to identify low-risk patients who do not require thromboprophylaxis.
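The negative likelihood ratios reported for the two scores are (1 - sensitivity)/specificity for the high-risk classification. A sketch for the Geneva Risk Score, with event counts back-calculated approximately from the reported cumulative rates (so the result lands near, not exactly at, the reported 0.28):

```python
def negative_lr(tp: int, fn: int, tn: int, fp: int) -> float:
    """LR- = (1 - sensitivity) / specificity for a binary risk classification."""
    sensitivity = tp / (tp + fn)   # events correctly flagged high-risk
    specificity = tn / (tn + fp)   # non-events correctly flagged low-risk
    return (1 - sensitivity) / specificity

# Geneva Risk Score, counts approximated from the reported rates:
# ~31 of 962 high-risk and ~3 of 516 low-risk patients had the primary endpoint
lr_minus = negative_lr(tp=31, fn=3, tn=513, fp=931)
print(f"LR- ≈ {lr_minus:.2f}")
```

A lower LR- means a negative (low-risk) classification more strongly rules out the outcome, which is why the Geneva score's 0.28 compares favourably with the Padua score's 0.51.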

Relevance:

30.00%

Publisher:

Abstract:

Medial arterial calcification is accelerated in patients with CKD and strongly associated with increased arterial rigidity and cardiovascular mortality. Recently, a novel in vitro blood test that provides an overall measure of calcification propensity by monitoring the maturation time (T50) of calciprotein particles in serum was described. We used this test to measure serum T50 in a prospective cohort of 184 patients with stages 3 and 4 CKD, with a median of 5.3 years of follow-up. At baseline, the major determinants of serum calcification propensity included higher serum phosphate, ionized calcium, increased bone osteoclastic activity, and lower free fetuin-A, plasma pyrophosphate, and albumin concentrations, which accounted for 49% of the variation in this parameter. Increased serum calcification propensity at baseline independently associated with aortic pulse wave velocity in the complete cohort and progressive aortic stiffening over 30 months in a subgroup of 93 patients. After adjustment for demographic, renal, cardiovascular, and biochemical covariates, including serum phosphate, risk of death among patients in the lowest T50 tertile was more than two times the risk among patients in the highest T50 tertile (adjusted hazard ratio, 2.2; 95% confidence interval, 1.1 to 5.4; P=0.04). This effect was lost, however, after additional adjustment for aortic stiffness, suggesting a shared causal pathway. Longitudinally, serum calcification propensity measurements remained temporally stable (intraclass correlation=0.81). These results suggest that serum T50 may be helpful as a biomarker in designing methods to improve defenses against vascular calcification.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: There are differences in the literature regarding outcomes of premature small-for-gestational-age (SGA) and appropriate-for-gestational-age (AGA) infants, possibly due to failure to take into account gestational age (GA) at birth. OBJECTIVE: To compare mortality and respiratory morbidity of SGA and AGA premature newborn infants. DESIGN/METHODS: A retrospective study was done of the 2,487 infants born without congenital anomalies. RESULTS: Controlling for GA, premature SGA infants were at higher risk for mortality (odds ratio 3.1, P = 0.001) and at lower risk of respiratory distress syndrome (RDS) (OR = 0.71, P = 0.02) than AGA infants. However, multivariate logistic regression modeling found that the odds of having RDS varied between SGA and AGA infants by GA. RDS risk was reduced in SGA infants at GA ≥32 wk (OR = 0.41, 95% CI 0.27-0.63; P < 0.01). After controlling for GA, SGA infants were at significantly higher risk of developing chronic lung disease than AGA infants (OR = 2.2, 95% CI 1.2-3.9, P = 0.01). There was no significant difference between SGA and AGA infants in total days on ventilator. Among infants who survived, mean length of hospital stay was significantly longer in SGA infants born between 26 and 36 wk GA than in AGA infants. CONCLUSIONS: Premature SGA infants have significantly higher mortality, significantly higher risk of developing chronic lung disease, and longer hospital stay than premature AGA infants. Even the reduced risk of RDS in infants born at ≥32 wk GA (conferred possibly by intrauterine stress leading to accelerated lung maturation) appears transient and is counterbalanced by the adverse effects of poor intrauterine growth on long-term pulmonary outcomes such as chronic lung disease.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Elevated resting heart rate is known to be detrimental to morbidity and mortality in cardiovascular disease, though its effect in patients with ischemic stroke is unclear. We analyzed the effect of baseline resting heart rate on myocardial infarction (MI) in patients with a recent noncardioembolic cerebral ischemic event participating in PERFORM. METHODS We compared fatal or nonfatal MI using adjusted Cox proportional hazards models for PERFORM patients with baseline heart rate <70 bpm (n=8,178) or ≥70 bpm (n=10,802). In addition, heart rate was analyzed as a continuous variable. Other cerebrovascular and cardiovascular outcomes were also explored. RESULTS Heart rate ≥70 bpm was associated with increased relative risk for fatal or nonfatal MI (HR 1.32, 95% CI 1.03-1.69, P=0.029). For every 5-bpm increase in heart rate, there was an increase in relative risk for fatal and nonfatal MI (11.3%, P=0.0002). Heart rate ≥70 bpm was also associated with increased relative risk for a composite of fatal or nonfatal ischemic stroke, fatal or nonfatal MI, or other vascular death (excluding hemorrhagic death) (P<0.0001); vascular death (P<0.0001); all-cause mortality (P<0.0001); and fatal or nonfatal stroke (P=0.04). For every 5-bpm increase in heart rate, there were increases in relative risk for fatal or nonfatal ischemic stroke, fatal or nonfatal MI, or other vascular death (4.7%, P<0.0001), vascular death (11.0%, P<0.0001), all-cause mortality (8.0%, P<0.0001), and fatal and nonfatal stroke (2.4%, P=0.057). CONCLUSION Elevated heart rate ≥70 bpm places patients with a noncardioembolic cerebral ischemic event at increased risk for MI.
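Under the log-linear Cox model used here, the reported 11.3% per-5-bpm increase compounds multiplicatively across larger heart-rate differences. A quick illustration, as an extrapolation for intuition rather than a result reported by the study:

```python
hr_per_5bpm = 1.113   # reported: 11.3% higher MI risk per 5-bpm increase

# implied relative risk for a 20-bpm higher baseline heart rate,
# assuming the per-5-bpm hazard ratio compounds log-linearly
hr_20bpm = hr_per_5bpm ** (20 / 5)
print(f"{hr_20bpm:.2f}")   # roughly a 53% higher risk of MI
```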

Relevance:

30.00%

Publisher:

Abstract:

Cirrhotic patients with chronic hepatitis C virus (HCV) infection remain at risk for complications following sustained virological response (SVR). Therefore, we aimed to evaluate treatment efficacy with the number needed to treat (NNT) to prevent clinical endpoints. Mortality and cirrhosis-related morbidity were assessed in an international multicentre cohort of consecutively treated patients with HCV genotype 1 infection and cirrhosis. The NNT to prevent death or clinical disease progression (any cirrhosis-related event or death) in one patient was determined with the adjusted (event-free) survival among patients without SVR and adjusted hazard ratio of SVR. Overall, 248 patients were followed for a median of 8.3 (IQR 6.2-11.1) years. Fifty-nine (24%) patients attained SVR. Among patients without SVR, the adjusted 5-year survival and event-free survival were 94.4% and 80.0%, respectively. SVR was associated with reduced all-cause mortality (HR 0.15, 95% CI 0.05-0.48, P = 0.002) and clinical disease progression (HR 0.16, 95% CI 0.07-0.36, P < 0.001). The NNT to prevent one death in 5 years declined from 1052 (95% CI 937-1755) at 2% SVR (interferon monotherapy) to 61 (95% CI 54-101) at 35% SVR (peginterferon and ribavirin). At 50% SVR, which might be expected with triple therapy, the estimated NNT was 43 (95% CI 38-71). The NNT to prevent clinical disease progression in one patient in 5 years was 302 (95% CI 271-407), 18 (95% CI 16-24) and 13 (95% CI 11-17) at 2%, 35% and 50% SVR, respectively. In conclusion, the NNT to prevent clinical endpoints among cirrhotic patients with HCV genotype 1 has declined enormously with the improvement of antiviral therapy.
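The NNT figures can be reproduced from the two quantities the abstract provides: the adjusted 5-year survival without SVR (94.4%) and the hazard ratio for death with SVR (0.15), scaled by the SVR rate. A sketch (small differences from the reported 1052, 61, and 43 reflect rounding in the published inputs):

```python
def nnt_death(svr_rate: float, s0: float = 0.944, hr: float = 0.15) -> float:
    """NNT to prevent one death in 5 years at a given SVR rate.
    Under proportional hazards, 5-year survival with SVR is s0**hr,
    so each patient achieving SVR gains (1 - s0) - (1 - s0**hr) in
    absolute risk reduction; scale by the fraction achieving SVR."""
    arr_per_svr = (1 - s0) - (1 - s0 ** hr)
    return 1 / (svr_rate * arr_per_svr)

for rate in (0.02, 0.35, 0.50):
    print(f"SVR {rate:.0%}: NNT ≈ {nnt_death(rate):.0f}")
```

The same construction with the clinical disease progression inputs (event-free survival 80.0%, HR 0.16) yields the second set of NNTs.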

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND The objective of the present investigation is to assess the baseline mortality-adjusted 10-year survival of rectal cancer patients. METHODS Ten-year survival was analyzed in 771 consecutive American Joint Committee on Cancer (AJCC) stage I-IV rectal cancer patients undergoing open resection between 1991 and 2008, using risk-adjusted Cox proportional hazards regression models adjusting for population-based baseline mortality. RESULTS The median follow-up of patients alive was 8.8 years. The 10-year relative, overall, and cancer-specific survival were 66.5% [95% confidence interval (CI) 61.3-72.1], 48.7% (95% CI 44.9-52.8), and 66.4% (95% CI 62.5-70.5), respectively. During the 10-year period, 47.3% of all deaths in the entire patient sample (stage I-IV), and 33.6% of those in patients with stage I-III disease, were related to rectal cancer. For patients with AJCC stage I rectal cancer, the 10-year overall survival was 96% and did not differ significantly from that of an average population matched for gender, age, and calendar year (p = 0.151). For the more advanced tumor stages, however, survival was significantly impaired (p < 0.001). CONCLUSIONS Retrospective investigations of survival after rectal cancer resection should adjust for baseline mortality because a large fraction of deaths is not cancer related. Stage I rectal cancer patients, compared to patients with more advanced disease stages, have a relative survival close to 100% and can thus be considered cured. Using this relative-survival approach, the real public health burden caused by rectal cancer can reliably be analyzed and reported.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Treatment of patients with paediatric acute lymphoblastic leukaemia has evolved such that the risk of late effects in survivors treated in accordance with contemporary protocols could differ from that noted in those treated decades ago. We aimed to estimate the risk of late effects in children with standard-risk acute lymphoblastic leukaemia treated with contemporary protocols. METHODS We used data from similarly treated members of the Childhood Cancer Survivor Study cohort, a multicentre, North American study of 5-year survivors of childhood cancer diagnosed between 1970 and 1986. We included cohort members if they were aged 1.0-9.9 years at the time of diagnosis of acute lymphoblastic leukaemia and had received treatment consistent with contemporary standard-risk protocols for acute lymphoblastic leukaemia. We calculated mortality rates and standardised mortality ratios, stratified by sex and survival time, after diagnosis of acute lymphoblastic leukaemia. We calculated standardised incidence ratios and absolute excess risk for subsequent neoplasms with age-specific, sex-specific, and calendar-year-specific rates from the Surveillance, Epidemiology and End Results Program. Outcomes were compared with a sibling cohort and the general US population. FINDINGS We included 556 (13%) of 4329 cohort members treated for acute lymphoblastic leukaemia. Median follow-up of the survivors from 5 years after diagnosis was 18.4 years (range 0.0-33.0). 28 (5%) of 556 participants had died (standardised mortality ratio 3.5, 95% CI 2.3-5.0). 16 (57%) deaths were due to causes other than recurrence of acute lymphoblastic leukaemia. Six (1%) survivors developed a subsequent malignant neoplasm (standardised incidence ratio 2.6, 95% CI 1.0-5.7). 107 participants (95% CI 81-193) in each group would need to be followed up for 1 year to observe one extra chronic health disorder in the survivor group compared with the sibling group.
415 participants (376-939) in each group would need to be followed up for 1 year to observe one extra severe, life-threatening, or fatal disorder in the group of survivors. Survivors did not differ from siblings in their educational attainment, rate of marriage, or independent living. INTERPRETATION The prevalence of adverse long-term outcomes in children treated for standard-risk acute lymphoblastic leukaemia according to contemporary protocols is low, but regular care from a knowledgeable primary-care practitioner is warranted. FUNDING National Cancer Institute, American Lebanese-Syrian Associated Charities, Swiss Cancer Research.
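The standardised mortality ratio above is observed deaths divided by the deaths expected in a matched general population, and its confidence interval follows from the Poisson distribution of the observed count. A sketch using Byar's approximation to the exact Poisson limits (28 observed deaths and an SMR of 3.5 imply roughly 8 expected deaths, which closely reproduces the reported 2.3-5.0 interval):

```python
import math

def smr_with_ci(observed: int, expected: float, z: float = 1.96):
    """SMR with a 95% CI via Byar's approximation to exact Poisson limits."""
    lo = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3
    hi = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                           + z / (3 * math.sqrt(observed + 1))) ** 3
    return observed / expected, lo / expected, hi / expected

# 28 observed deaths; SMR 3.5 implies ~8 expected deaths (an assumption
# back-calculated from the abstract, not a figure it states directly)
smr, lo, hi = smr_with_ci(observed=28, expected=8.0)
print(f"SMR {smr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```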