121 results for mortality-incidence ratio

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

BACKGROUND: We investigated the incidence and outcome of progressive multifocal leukoencephalopathy (PML) in human immunodeficiency virus (HIV)-infected individuals before and after the introduction of combination antiretroviral therapy (cART) in 1996. METHODS: From 1988 through 2007, 226 cases of PML were reported to the Swiss HIV Cohort Study. By chart review, we confirmed 186 cases and recorded all-cause and PML-attributable mortality. For the survival analysis, 25 patients with postmortem diagnosis and 2 without CD4+ T cell counts were excluded, leaving a total of 159 patients (89 before 1996 and 70 during 1996-2007). RESULTS: The incidence rate of PML decreased from 0.24 cases per 100 patient-years (PY; 95% confidence interval [CI], 0.20-0.29 cases per 100 PY) before 1996 to 0.06 cases per 100 PY (95% CI, 0.04-0.10 cases per 100 PY) from 1996 onward. Patients who received a diagnosis before 1996 had a higher frequency of prior acquired immunodeficiency syndrome-defining conditions (P = .007) but similar CD4+ T cell counts (60 vs. 71 cells/microL; P = .25), compared with patients who received a diagnosis during 1996 or thereafter. The median time to PML-attributable death was 71 days (interquartile range, 44-140 days), compared with 90 days (interquartile range, 54-313 days) for all-cause mortality. The PML-attributable 1-year mortality rate decreased from 82.3 cases per 100 PY (95% CI, 58.8-115.1 cases per 100 PY) during the pre-cART era to 37.6 cases per 100 PY (95% CI, 23.4-60.5 cases per 100 PY) during the cART era. In multivariate models, cART was the only factor associated with lower PML-attributable mortality (hazard ratio, 0.18; 95% CI, 0.07-0.50; P < .001), whereas all-cause mortality was associated with baseline CD4+ T cell count (hazard ratio per increase of 100 cells/microL, 0.52; 95% CI, 0.32-0.85; P = .010) and cART use (hazard ratio, 0.37; 95% CI, 0.19-0.75; P = .006). CONCLUSIONS: cART reduced the incidence and PML-attributable 1-year mortality, regardless of baseline CD4+ T cell count, whereas overall mortality was dependent on cART use and baseline CD4+ T cell count.
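The incidence rates above are expressed as cases per 100 patient-years with Poisson confidence intervals. The abstract does not report the person-time denominators, so the sketch below uses an assumed, back-calculated denominator purely to illustrate the arithmetic; the function name and the figure of 37,000 patient-years are hypothetical.

```python
import math

def incidence_rate_per_100py(cases, patient_years):
    """Crude incidence rate per 100 patient-years with an approximate
    95% Poisson confidence interval (normal approximation on the log scale)."""
    rate = cases / patient_years * 100
    se_log = 1 / math.sqrt(cases)
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)

# Hypothetical example: 89 confirmed pre-1996 PML cases over an assumed
# 37,000 patient-years of follow-up (the denominator is not given above).
print(incidence_rate_per_100py(89, 37_000))  # ~0.24 per 100 PY, CI roughly 0.20-0.30
```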

Relevance: 90.00%

Abstract:

BACKGROUND: The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. METHODS: We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. RESULTS: During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratio was associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). CONCLUSIONS: In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
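The analysis described above is a cohort-stratified, covariate-adjusted Cox proportional hazards model. A minimal sketch of that kind of model using the lifelines library is shown below; the data frame, column names, and synthetic data are hypothetical stand-ins, not the study's dataset or code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in data: one row per patient with follow-up time,
# death indicator, an ADE indicator, adjustment covariates and cohort.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "followup_months": rng.exponential(43, n),
    "died": rng.integers(0, 2, n),
    "ade_nhl": rng.integers(0, 2, n),          # e.g. non-Hodgkin lymphoma as ADE
    "age": rng.normal(40, 10, n),
    "baseline_cd4": rng.normal(250, 100, n),
    "cohort": rng.choice(["A", "B", "C"], n),
})

cph = CoxPHFitter()
# strata= gives each cohort its own baseline hazard, mirroring the
# stratification by cohort described in the abstract.
cph.fit(df, duration_col="followup_months", event_col="died", strata=["cohort"])
cph.print_summary()  # hazard ratios are exp(coef) for each covariate
```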

Relevance: 90.00%

Abstract:

BACKGROUND Treatment of patients with paediatric acute lymphoblastic leukaemia has evolved such that the risk of late effects in survivors treated in accordance with contemporary protocols could be different from that noted in those treated decades ago. We aimed to estimate the risk of late effects in children with standard-risk acute lymphoblastic leukaemia treated with contemporary protocols. METHODS We used data from similarly treated members of the Childhood Cancer Survivor Study cohort. The Childhood Cancer Survivor Study is a multicentre, North American study of 5-year survivors of childhood cancer diagnosed between 1970 and 1986. We included cohort members if they were aged 1·0-9·9 years at the time of diagnosis of acute lymphoblastic leukaemia and had received treatment consistent with contemporary standard-risk protocols for acute lymphoblastic leukaemia. We calculated mortality rates and standardised mortality ratios, stratified by sex and survival time, after diagnosis of acute lymphoblastic leukaemia. We calculated standardised incidence ratios and absolute excess risk for subsequent neoplasms with age-specific, sex-specific, and calendar-year-specific rates from the Surveillance, Epidemiology and End Results Program. Outcomes were compared with a sibling cohort and the general US population. FINDINGS We included 556 (13%) of 4329 cohort members treated for acute lymphoblastic leukaemia. Median follow-up of the survivors from 5 years after diagnosis was 18·4 years (range 0·0-33·0). 28 (5%) of 556 participants had died (standardised mortality ratio 3·5, 95% CI 2·3-5·0). 16 (57%) deaths were due to causes other than recurrence of acute lymphoblastic leukaemia. Six (1%) survivors developed a subsequent malignant neoplasm (standardised incidence ratio 2·6, 95% CI 1·0-5·7). 107 participants (95% CI 81-193) in each group would need to be followed up for 1 year to observe one extra chronic health disorder in the survivor group compared with the sibling group. 415 participants (376-939) in each group would need to be followed up for 1 year to observe one extra severe, life-threatening, or fatal disorder in the group of survivors. Survivors did not differ from siblings in their educational attainment, rate of marriage, or independent living. INTERPRETATION The prevalence of adverse long-term outcomes in children treated for standard-risk acute lymphoblastic leukaemia according to contemporary protocols is low, but regular care from a knowledgeable primary-care practitioner is warranted. FUNDING National Cancer Institute, American Lebanese-Syrian Associated Charities, Swiss Cancer Research.
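The "number of participants who would need to be followed up for 1 year to observe one extra event" is the reciprocal of the absolute excess event rate in survivors versus siblings. The underlying rates are not reported in the abstract, so the values in the sketch below are hypothetical, chosen only to show how a figure of roughly 107 arises.

```python
def needed_to_follow_one_year(rate_survivors, rate_siblings):
    """Participants per group followed for 1 year to observe one extra
    event among survivors: the reciprocal of the absolute excess rate
    (both rates expressed per person-year)."""
    return 1 / (rate_survivors - rate_siblings)

# Hypothetical rates: 2.93 vs 2.00 chronic health disorders per 100 person-years
print(needed_to_follow_one_year(0.0293, 0.0200))  # ~107.5, compare with the 107 above
```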

Relevance: 90.00%

Abstract:

BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. Since no systematic review has compared the two regimens, a systematic review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF compared to antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimens and other supportive care. We included full-text, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions and outcomes was available. We excluded cross-over trials, quasi-randomised trials and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods. We did final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review: one with 40 breast cancer patients receiving high-dose chemotherapy and G-CSF compared to antibiotics, and a second one evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias as high in the G-CSF trial, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate, because of the risk of performance bias (neither patients nor personnel were blinded), but low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference for infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, due to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in terms of median survival time in the trial comparing GM-CSF and antibiotics. Two-year survival rates were 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence). Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without any evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15, low quality of evidence). There were 5 episodes of grade IV infection out of 360 cycles in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities such as diarrhoea, stomatitis, infections, neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombocytopenia occurred significantly more frequently in the GM-CSF arm (60.8%) compared to the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Infection-related mortality, incidence of febrile neutropenia, and quality of life were not reported in this trial. AUTHORS' CONCLUSIONS As we only found two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
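Several of the effect estimates above are simple risk ratios computed from event counts, with confidence intervals taken on the log scale. As a check on the arithmetic, the sketch below reproduces the grade IV infection comparison (5 of 360 cycles versus 3 of 334 cycles); the function is an illustrative implementation of the standard log-normal (Katz) interval, not code from the review.

```python
import math

def risk_ratio(events_a, total_a, events_b, total_b, z=1.96):
    """Risk ratio with an approximate 95% CI on the log scale."""
    rr = (events_a / total_a) / (events_b / total_b)
    se_log = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

# Grade IV infections: 5/360 cycles (GM-CSF) vs 3/334 cycles (cotrimoxazole)
print(risk_ratio(5, 360, 3, 334))  # ~ (1.55, 0.37, 6.42), matching the values above
```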

Relevance: 80.00%

Abstract:

Introduction The survival of patients admitted to an emergency department is determined by the severity of acute illness and the quality of care provided. The high number and the wide spectrum of severity of illness of admitted patients make an immediate assessment of all patients unrealistic. The aim of this study is to evaluate a scoring system based on readily available physiological parameters immediately after admission to an emergency department (ED) for the purpose of identification of at-risk patients. Methods This prospective observational cohort study includes 4,388 consecutive adult patients admitted via the ED of a 960-bed tertiary referral hospital over a period of six months. Occurrence of each of seven potential vital sign abnormalities (threat to airway, abnormal respiratory rate, oxygen saturation, systolic blood pressure, heart rate, low Glasgow Coma Scale and seizures) was collected and added up to generate the vital sign score (VSS). VSSinitial was defined as the VSS in the first 15 minutes after admission, VSSmax as the maximum VSS throughout the stay in ED. Occurrence of single vital sign abnormalities in the first 15 minutes and VSSinitial and VSSmax were evaluated as potential predictors of hospital mortality. Results Logistic regression analysis identified all evaluated single vital sign abnormalities except seizures and abnormal respiratory rate to be independent predictors of hospital mortality. Increasing VSSinitial and VSSmax were significantly correlated to hospital mortality (odds ratio (OR) 2.80, 95% confidence interval (CI) 2.50 to 3.14, P < 0.0001 for VSSinitial; OR 2.36, 95% CI 2.15 to 2.60, P < 0.0001 for VSSmax). The predictive power of VSS was highest if collected in the first 15 minutes after ED admission (log-rank Chi-square 468.1, P < 0.0001 for VSSinitial; log-rank Chi-square 361.5, P < 0.0001 for VSSmax). Conclusions Vital sign abnormalities and VSS collected in the first minutes after ED admission can identify patients at risk of an unfavourable outcome.
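The vital sign score is a simple count of how many of the seven listed abnormalities are present on admission. The sketch below illustrates that scoring rule; the numeric cut-offs are assumptions for illustration, since the abstract does not state the study's exact thresholds.

```python
def vital_sign_score(threat_to_airway, respiratory_rate, spo2,
                     systolic_bp, heart_rate, gcs, seizures):
    """Count of vital sign abnormalities (0-7). The thresholds below are
    illustrative assumptions, not the cut-offs used in the study."""
    abnormalities = [
        threat_to_airway,                          # boolean
        respiratory_rate < 8 or respiratory_rate > 25,
        spo2 < 90,                                 # oxygen saturation, %
        systolic_bp < 90,                          # mm Hg
        heart_rate < 40 or heart_rate > 130,       # beats per minute
        gcs < 9,                                   # low Glasgow Coma Scale
        seizures,                                  # boolean
    ]
    return sum(abnormalities)

# Example: hypotensive, tachycardic patient without airway threat or seizures
print(vital_sign_score(False, 22, 94, 85, 135, 12, False))  # VSSinitial = 2
```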

Relevance: 80.00%

Abstract:

The occurrence of depression in patients with coronary heart disease (CHD) substantially increases the likelihood of a poorer cardiovascular prognosis. Although antidepressants are generally effective in decreasing depression, their use in patients with CHD is controversial. We carried out a meta-analysis to evaluate the health effects of selective serotonin reuptake inhibitors (SSRIs) versus placebo or no antidepressants in patients with CHD and depression. Observational studies and randomized controlled trials (RCTs) were searched in MEDLINE, EMBASE, PsycINFO, Cochrane Controlled Clinical Trial Register and other trial registries, and references of relevant articles. Primary outcomes were readmission for CHD (including myocardial infarction, unstable angina, and stroke) and all-cause mortality; the secondary outcome was severity of depression symptoms. Seven articles on 6 RCTs involving 2,461 participants were included. One study incorrectly randomized participants, and another was a reanalysis of RCT data. These were considered observational and analyzed separately. When only properly randomized trials were considered (n = 734 patients), patients on SSRIs showed no significant differences in mortality (risk ratio 0.39, 95% confidence interval 0.08 to 2.01) or CHD readmission rates (0.74, 0.44 to 1.23) compared to controls. Conversely, when all studies were included, SSRI use was associated with a significant decrease in CHD readmission (0.63, 0.46 to 0.86) and mortality rates (0.56, 0.35 to 0.88). A significantly greater improvement in depression symptoms was always apparent in patients on SSRIs with all selected indicators. In conclusion, in patients with CHD and depression, SSRI medication decreases depression symptoms and may improve CHD prognosis.

Relevance: 80.00%

Abstract:

BACKGROUND: Multidimensional preventive home visit programs aim at maintaining health and autonomy of older adults and preventing disability and subsequent nursing home admission, but results of randomized controlled trials (RCTs) have been inconsistent. Our objective was to systematically review RCTs examining the effect of home visit programs on mortality, nursing home admissions, and functional status decline. METHODS: Data sources were MEDLINE, EMBASE, Cochrane CENTRAL database, and references. Studies were reviewed to identify RCTs that compared outcome data of older participants in preventive home visit programs with control group outcome data. Publications reporting 21 trials were included. Data on study population, intervention characteristics, outcomes, and trial quality were double-extracted. We conducted random effects meta-analyses. RESULTS: Pooled effects estimates revealed statistically nonsignificant, favorable, and heterogeneous effects on mortality (odds ratio [OR] 0.92, 95% confidence interval [CI], 0.80-1.05), functional status decline (OR 0.89, 95% CI, 0.77-1.03), and nursing home admission (OR 0.86, 95% CI, 0.68-1.10). A beneficial effect on mortality was seen in younger study populations (OR 0.74, 95% CI, 0.58-0.94) but not in older populations (OR 1.14, 95% CI, 0.90-1.43). Functional decline was reduced in programs including a clinical examination in the initial assessment (OR 0.64, 95% CI, 0.48-0.87) but not in other trials (OR 1.00, 95% CI, 0.88-1.14). There was no single factor explaining the heterogeneous effects of trials on nursing home admissions. CONCLUSION: Multidimensional preventive home visits have the potential to reduce disability burden among older adults when based on multidimensional assessment with clinical examination. Effects on nursing home admissions are heterogeneous and likely depend on multiple factors including population factors, program characteristics, and health care setting.
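The pooled odds ratios above come from random-effects meta-analysis. A compact sketch of standard inverse-variance pooling with a DerSimonian-Laird between-trial variance is given below; the input log-odds ratios and variances are hypothetical, not the reviewed trials.

```python
import math

def random_effects_pool(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios;
    returns the pooled OR with a 95% confidence interval."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)   # between-trial variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return tuple(math.exp(x) for x in (pooled, pooled - 1.96 * se, pooled + 1.96 * se))

# Hypothetical log-ORs and within-trial variances from five home visit trials
print(random_effects_pool([-0.20, 0.05, -0.35, -0.10, 0.15],
                          [0.04, 0.09, 0.06, 0.05, 0.12]))
```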

Relevance: 80.00%

Abstract:

INTRODUCTION Early use of corticosteroids in patients affected by pandemic (H1N1)v influenza A infection, although relatively common, remains controversial. METHODS Prospective, observational, multicenter study from 23 June 2009 through 11 February 2010, reported in the European Society of Intensive Care Medicine (ESICM) H1N1 registry. RESULTS Two hundred twenty patients admitted to an intensive care unit (ICU) with completed outcome data were analyzed. Invasive mechanical ventilation was used in 155 (70.5%). Sixty-seven (30.5%) of the patients died in ICU and 75 (34.1%) whilst in hospital. One hundred twenty-six (57.3%) patients received corticosteroid therapy on admission to ICU. Patients who received corticosteroids were significantly older and were more likely to have coexisting asthma, chronic obstructive pulmonary disease (COPD), and chronic steroid use. These patients receiving corticosteroids had increased likelihood of developing hospital-acquired pneumonia (HAP) [26.2% versus 13.8%, p < 0.05; odds ratio (OR) 2.2, confidence interval (CI) 1.1-4.5]. Patients who received corticosteroids had significantly higher ICU mortality than patients who did not (46.0% versus 18.1%, p < 0.01; OR 3.8, CI 2.1-7.2). Cox regression analysis adjusted for severity and potential confounding factors identified that early use of corticosteroids was not significantly associated with mortality [hazard ratio (HR) 1.3, 95% CI 0.7-2.4, p = 0.4] but was still associated with an increased rate of HAP (OR 2.2, 95% CI 1.0-4.8, p < 0.05). When only patients developing acute respiratory distress syndrome (ARDS) were analyzed, similar results were observed. CONCLUSIONS Early use of corticosteroids in patients affected by pandemic (H1N1)v influenza A infection did not result in better outcomes and was associated with increased risk of superinfections.

Relevance: 80.00%

Abstract:

BACKGROUND Up to 1 in 6 patients undergoing transcatheter aortic valve implantation (TAVI) present with low-ejection fraction, low-gradient (LEF-LG) severe aortic stenosis, and concomitant relevant mitral regurgitation (MR) is present in 30% to 55% of these patients. The effect of MR on clinical outcomes of LEF-LG patients undergoing TAVI is unknown. METHODS AND RESULTS Of 606 consecutive patients undergoing TAVI, 113 (18.7%) patients with LEF-LG severe aortic stenosis (mean gradient ≤40 mm Hg, aortic valve area <1.0 cm(2), left ventricular ejection fraction <50%) were analyzed. LEF-LG patients were dichotomized into ≤mild MR (n=52) and ≥moderate MR (n=61). Primary end point was all-cause mortality at 1 year. No differences in mortality were observed at 30 days (P=0.76). At 1 year, LEF-LG patients with ≥moderate MR had an adjusted 3-fold higher rate of all-cause mortality (11.5% versus 38.1%; adjusted hazard ratio, 3.27 [95% confidence interval, 1.31-8.15]; P=0.011), as compared with LEF-LG patients with ≤mild MR. Mortality was mainly driven by cardiac death (adjusted hazard ratio, 4.62; P=0.005). As compared with LEF-LG patients with ≥moderate MR assigned to medical therapy, LEF-LG patients with ≥moderate MR undergoing TAVI had significantly lower all-cause mortality (hazard ratio, 0.38; 95% confidence interval, 0.19-0.75) at 1 year. CONCLUSIONS Moderate or severe MR is a strong independent predictor of late mortality in LEF-LG patients undergoing TAVI. However, LEF-LG patients assigned to medical therapy have a dismal prognosis independent of MR severity, suggesting that TAVI should not be withheld from symptomatic patients with LEF-LG severe aortic stenosis even in the presence of moderate or severe MR.

Relevance: 80.00%

Abstract:

INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P <0.001) and renal function (total daily urine output: HR = 1.02; 95% CI, 1.01 to 1.03; P <0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and renal component of the SOFA score), GCS component of the SOFA score, total SOFA score and worsening thrombocytopaenia were also independently associated with secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5. Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements failed to be retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.

Relevance: 80.00%

Abstract:

The long-term risk associated with different coronary artery disease (CAD) presentations in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) is poorly characterized. We pooled patient-level data for women enrolled in 26 randomized clinical trials. Of 11,577 women included in the pooled database, 10,133 with known clinical presentation received a DES. Of them, 5,760 (57%) had stable angina pectoris (SAP), 3,594 (35%) had unstable angina pectoris (UAP) or non-ST-segment-elevation myocardial infarction (NSTEMI), and 779 (8%) had ST-segment-elevation myocardial infarction (STEMI) as clinical presentation. A stepwise increase in 3-year crude cumulative mortality was observed in the transition from SAP to STEMI (4.9% vs 6.1% vs 9.4%; p <0.01). Conversely, no differences in crude mortality rates were observed between 1 and 3 years across clinical presentations. After multivariable adjustment, STEMI was independently associated with greater risk of 3-year mortality (hazard ratio [HR] 3.45; 95% confidence interval [CI] 1.99 to 5.98; p <0.01), whereas no differences were observed between UAP or NSTEMI and SAP (HR 0.99; 95% CI 0.73 to 1.34; p = 0.94). In women with acute coronary syndromes (ACS), use of new-generation DES was associated with reduced risk of major adverse cardiac events (HR 0.58; 95% CI 0.34 to 0.98). The magnitude and direction of the effect with new-generation DES was uniform between women with or without ACS (p for interaction = 0.66). In conclusion, in women across the clinical spectrum of CAD, STEMI was associated with a greater risk of long-term mortality. Conversely, the adjusted risk of mortality between UAP or NSTEMI and SAP was similar. New-generation DESs provide improved long-term clinical outcomes irrespective of the clinical presentation in women.

Relevance: 80.00%

Abstract:

BACKGROUND As access to antiretroviral therapy (ART) expands, increasing numbers of older patients will start treatment and need specialised long-term care. However, the effect of age in ART programmes in resource-constrained settings is poorly understood. The HIV epidemic is ageing rapidly and South Africa has one of the highest HIV population prevalences worldwide. We explored the effect of age on mortality of patients on ART in South Africa and whether this effect is mediated by baseline immunological status. METHODS In this retrospective cohort analysis, we studied HIV-positive patients aged 16-80 years who started ART for the first time in six large South African cohorts of the International Epidemiologic Databases to Evaluate AIDS-Southern Africa collaboration, in KwaZulu-Natal, Gauteng, and Western Cape (two primary care clinics, three hospitals, and a large rural cohort). The primary outcome was mortality. We ascertained patients' vital status through linkage to the National Population Register. We used inverse probability weighting to correct mortality for loss to follow-up. We estimated mortality using Cox's proportional hazards and competing risks regression. We tested the interaction between baseline CD4 cell count and age. FINDINGS Between Jan 1, 2004, and Dec 31, 2013, 84,078 eligible adults started ART. Of these, we followed up 83,566 patients for 174,640 patient-years. 8% (1817 of 23,258) of patients aged 16-29 years died compared with 19% (93 of 492) of patients aged 65 years or older. The age-adjusted mortality hazard ratio was 2·52 (95% CI 2·01-3·17) for people aged 65 years or older compared with those 16-29 years of age. In patients starting ART with a CD4 count of less than 50 cells per μL, the adjusted mortality hazard ratio was 2·52 (2·04-3·11) for people aged 50 years or older compared with those 16-39 years old. Mortality was highest in patients with CD4 counts of less than 50 cells per μL, and 15% (1103 of 7295) of all patients aged 50 years or older starting ART were in this group. The proportion of patients aged 50 years or older enrolling in ART increased with successive years, from 6% (290 of 4999) in 2004 to 10% (961 of 9657) in 2012-13, comprising 9% of total enrolment (7295 of 83,566). At the end of the study, 6304 (14%) of 44,909 patients still alive and in care were aged 50 years or older. INTERPRETATION Health services need reorientation towards HIV diagnosis and starting of ART in older individuals. Policies are needed for long-term care of older people with HIV. FUNDING National Institutes of Health (National Institute of Allergy and Infectious Diseases), US Agency for International Development, and South African Centre for Epidemiological Modelling and Analysis.
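Inverse probability weighting for loss to follow-up, as used above, means modelling each patient's probability of having an ascertained vital status and then weighting observed patients by the inverse of that probability. The sketch below illustrates the idea on simulated data; the variables, model and weights are generic assumptions, not the cohort's actual analysis.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(40, 12, n),
    "cd4": rng.normal(250, 120, n),
})
# Simulated tracing: younger patients and those with higher CD4 counts are
# assumed (arbitrarily) to be lost to follow-up slightly more often.
p_lost = 0.3 / (1 + np.exp(0.03 * (df["age"] - 40) - 0.004 * (df["cd4"] - 250)))
df["traced"] = rng.random(n) > p_lost

# Model the probability of being traced, then give traced patients a
# weight equal to the inverse of that probability (untraced patients: 0).
model = LogisticRegression().fit(df[["age", "cd4"]], df["traced"])
p_traced = model.predict_proba(df[["age", "cd4"]])[:, 1]
df["ipw"] = np.where(df["traced"], 1.0 / p_traced, 0.0)

# A weighted survival model (e.g. weighted Cox regression) would use df["ipw"]
print(df.loc[df["traced"], "ipw"].describe())
```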

Relevance: 80.00%

Abstract:

Treatment of chronic myeloid leukemia (CML) has been profoundly improved by the introduction of tyrosine kinase inhibitors (TKIs). Long-term survival with imatinib is excellent, with an 8-year survival rate of ∼88%. Long-term toxicity of TKI treatment, especially carcinogenicity, has become a concern. We analyzed data of the CML study IV for the development of secondary malignancies. In total, 67 secondary malignancies were found in 64 of 1525 CML patients in chronic phase treated with TKI (n=61) and interferon-α only (n=3). The most common malignancies (n≥4) were prostate, colorectal and lung cancer, non-Hodgkin's lymphoma (NHL), malignant melanoma, non-melanoma skin tumors and breast cancer. The standardized incidence ratio (SIR) for all malignancies excluding non-melanoma skin tumors was 0.88 (95% confidence interval (CI) 0.63-1.20) for men and 1.06 (95% CI 0.69-1.55) for women. SIRs were between 0.49 (95% CI 0.13-1.34) for colorectal cancer in men and 4.29 (95% CI 1.09-11.66) for NHL in women. The SIR for NHL was significantly increased for men and women. An increase in the incidence of secondary malignancies could not be ascertained. The increased SIR for NHL has to be considered and long-term follow-up of CML patients is warranted, as the rate of secondary malignancies may increase over time. Leukemia advance online publication, 26 February 2016; doi:10.1038/leu.2016.20.
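A standardised incidence ratio is the number of observed second malignancies divided by the number expected from age-, sex- and calendar-year-specific population rates, with a Poisson confidence interval on the observed count. The sketch below shows the calculation with an exact (Garwood) interval; the observed and expected counts are hypothetical, since the abstract reports only the resulting SIRs.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardised incidence ratio with an exact Poisson 95% CI
    (Garwood chi-square formulation)."""
    sir = observed / expected
    lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lo, hi

# Hypothetical: 7 observed cases against 1.63 expected from population rates
print(sir_with_ci(7, 1.63))  # ~ (4.29, 1.73, 8.85)
```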

Relevance: 80.00%

Abstract:

BACKGROUND Cardiac troponin detected by new-generation, highly sensitive assays predicts clinical outcomes among patients with stable coronary artery disease (SCAD) treated medically. The prognostic value of baseline high-sensitivity cardiac troponin T (hs-cTnT) elevation in SCAD patients undergoing elective percutaneous coronary interventions is not well established. This study assessed the association of preprocedural levels of hs-cTnT with 1-year clinical outcomes among SCAD patients undergoing percutaneous coronary intervention. METHODS AND RESULTS Between 2010 and 2014, 6974 consecutive patients were prospectively enrolled in the Bern Percutaneous Coronary Interventions Registry. Among patients with SCAD (n=2029), 527 (26%) had elevated preprocedural hs-cTnT above the upper reference limit of 14 ng/L. The primary end point, mortality within 1 year, occurred in 20 patients (1.4%) with normal hs-cTnT versus 39 patients (7.7%) with elevated baseline hs-cTnT (P<0.001). Patients with elevated hs-cTnT had increased risks of all-cause (hazard ratio 5.73; 95% confidence interval 3.34-9.83; P<0.001) and cardiac mortality (hazard ratio 4.68; 95% confidence interval 2.12-10.31; P<0.001). Preprocedural hs-cTnT elevation remained an independent predictor of 1-year mortality after adjustment for relevant risk factors, including age, sex, and renal failure (adjusted hazard ratio 2.08; 95% confidence interval 1.10-3.92; P=0.024). A graded mortality risk was observed across higher tertiles of elevated preprocedural hs-cTnT, but not among patients with hs-cTnT below the upper reference limit. CONCLUSIONS Preprocedural elevation of hs-cTnT is observed in one fourth of SCAD patients undergoing elective percutaneous coronary intervention. Increased levels of preprocedural hs-cTnT are proportionally related to the risk of death and emerged as independent predictors of all-cause mortality within 1 year. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT02241291.

Relevance: 80.00%

Abstract:

BACKGROUND The application of therapeutic hypothermia (TH) for 12 to 24 hours following out-of-hospital cardiac arrest (OHCA) has been associated with decreased mortality and improved neurological function. However, the optimal duration of cooling is not known. We aimed to investigate whether targeted temperature management (TTM) at 33 ± 1 °C for 48 hours compared to 24 hours results in a better long-term neurological outcome. METHODS The TTH48 trial is an investigator-initiated pragmatic international trial in which patients resuscitated from OHCA are randomised to TTM at 33 ± 1 °C for either 24 or 48 hours. Inclusion criteria are: age older than 17 and below 80 years; presumed cardiac origin of arrest; and Glasgow Coma Score (GCS) <8 on admission. The primary outcome is neurological outcome at 6 months using the Cerebral Performance Category score (CPC) by an assessor blinded to treatment allocation and dichotomised to good (CPC 1-2) or poor (CPC 3-5) outcome. Secondary outcomes are: 6-month mortality; incidence of infection, bleeding and organ failure; and CPC at hospital discharge, at day 28 and at day 90 following OHCA. Assuming that 50 % of the patients treated for 24 hours will have a poor outcome at 6 months, a study including 350 patients (175/arm) will have 80 % power (with a significance level of 5 %) to detect an absolute 15 % difference in primary outcome between treatment groups. A safety interim analysis was performed after the inclusion of 175 patients. DISCUSSION This is the first randomised trial to investigate the effect of the duration of TTM at 33 ± 1 °C in adult OHCA patients. We anticipate that the results of this trial will add significant knowledge regarding the management of cooling procedures in OHCA patients. TRIAL REGISTRATION NCT01689077.
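The stated sample size (175 patients per arm for 80% power to detect an absolute 15% difference from an assumed 50% poor-outcome rate at a two-sided 5% significance level) follows from the standard two-proportion formula. The sketch below reproduces that calculation; it lands slightly below 175, presumably because the trial rounds up or allows for dropouts (an assumption on our part).

```python
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-sided comparison of two
    proportions (normal approximation with pooled variance under H0)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

# 50% poor neurological outcome in the 24-hour arm vs 35% in the 48-hour arm
print(n_per_arm(0.50, 0.35))  # ~169 per arm; the trial plans 175 per arm
```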