Abstract:
Bovine mastitis is a frequent problem in Swiss dairy herds. One of the main pathogens causing significant economic loss is Staphylococcus aureus. Various Staph. aureus genotypes with different biological properties have been described. Genotype B (GTB) of Staph. aureus was identified as the most contagious and one of the most prevalent strains in Switzerland. The aim of this study was to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB in Swiss dairy herds with an elevated yield-corrected herd somatic cell count (YCHSCC). One hundred dairy herds with a mean YCHSCC between 200,000 and 300,000 cells/mL in 2010 were recruited and each farm was visited once during milking. A standardized protocol investigating demography, mastitis management, cow husbandry, milking system, and milking routine was completed during the visit. A bulk tank milk (BTM) sample was analyzed by real-time PCR for the presence of Staph. aureus GTB to classify the herds into 2 groups: Staph. aureus GTB-positive and Staph. aureus GTB-negative. Moreover, quarter milk samples were aseptically collected for bacteriological culture from cows with a somatic cell count ≥150,000 cells/mL on the last test-day before the visit. The culture results allowed us to allocate the Staph. aureus GTB-negative farms to Staph. aureus non-GTB and Staph. aureus-free groups. Multivariable multinomial logistic regression models were built to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB. The prevalence of Staph. aureus GTB herds was 16% (n=16), whereas that of Staph. aureus non-GTB herds was 38% (n=38). Herds that sent lactating cows to seasonal communal pastures had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 10.2, 95% CI: 1.9-56.6), compared with herds without communal pasturing.
Herds that purchased heifers had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) compared with herds without purchase of heifers. Furthermore, herds that did not use udder ointment as supportive therapy for acute mastitis had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 8.5, 95% CI: 1.6-58.4) or Staph. aureus non-GTB (odds ratio: 6.1, 95% CI: 1.3-27.8) than herds that used udder ointment occasionally or regularly. Herds in which the milker performed unrelated activities during milking had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) compared with herds in which the milker did not perform unrelated activities at milking. Awareness of the 4 potential risk factors identified in this study can guide the implementation of intervention strategies to improve udder health in both Staph. aureus GTB and Staph. aureus non-GTB herds.
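The herd-level estimates above come from multivariable multinomial logistic regression; for a single binary exposure, the unadjusted odds ratio and its Wald confidence interval can be sketched from a 2x2 table as below. The counts are hypothetical, for illustration only, and a crude estimate like this will generally differ from the adjusted estimates reported.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (not the study's data):
# 8 of 12 communal-pasture herds positive vs. 8 of 88 other herds.
or_, lo, hi = odds_ratio_ci(8, 4, 8, 80)
print(or_, lo, hi)
```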
Abstract:
Poor udder health represents a serious problem in dairy production and has been investigated intensively, but heifers generally have not been the main focus of mastitis control. The aim of this study was to evaluate the prevalence, risk factors and consequences of heifer mastitis in Switzerland. The study included 166,518 heifers of different breeds (Swiss Red Pied, Swiss Brown Cattle and Holstein). Monthly somatic cell counts (SCCs) provided by the main dairy breeding organisations in Switzerland were monitored for 3 years; the prevalence of subclinical mastitis (SCM) was determined on the basis of SCCs ≥100,000 cells/mL at the first test date. The probability of having SCM at the first test date during lactation was modelled using logistic regression. Analysed factors included data for the genetic background, morphological traits, geographical region, season of parturition and milk composition. The overall prevalence of SCM in heifers during the period from 2006 to 2010 was 20.6%. Higher frequencies of SCM were present in heifers of the Holstein breed (odds ratio, OR, 1.62), heifers with high fat:protein ratios (OR 1.97) and heifers with low milk urea concentrations combined with high milk protein concentrations (OR 3.97). Traits associated with a low risk of SCM were high set udders, high overall breeding values and low milk breeding values. Heifers with SCM on the first test day had a higher risk of either developing chronic mastitis or leaving the herd prematurely.
Abstract:
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of the different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment than the absolute risk. We found that this ratio could also vary substantially across the different approaches to risk estimation. Sometimes the ratio of risk between two techniques ranged between values smaller and larger than one, which translates into inconsistent conclusions about whether one technique carries a higher risk than another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared with the other techniques investigated, even though the magnitude of this reduction varied substantially with the approach used.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
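As an illustration of why the inter-technique risk ratio depends on the choice of risk model, a minimal sketch comparing a generic linear and a generic linear-exponential dose-response is shown below. The functional forms, coefficients, and doses are illustrative assumptions, not the models or data used in the study.

```python
import math

def linear_risk(dose_gy, beta=1.0):
    # Linear no-threshold model: risk proportional to organ dose.
    return beta * dose_gy

def linear_exponential_risk(dose_gy, beta=1.0, alpha=0.1):
    # Linear-exponential model: an exponential cell-kill term
    # bends the dose-response curve downward at high doses.
    return beta * dose_gy * math.exp(-alpha * dose_gy)

# Two hypothetical techniques with different mean organ doses (Gy):
dose_a, dose_b = 5.0, 15.0
ratio_linear = linear_risk(dose_b) / linear_risk(dose_a)
ratio_linexp = linear_exponential_risk(dose_b) / linear_exponential_risk(dose_a)
print(ratio_linear, ratio_linexp)  # same dose pair, different risk ratios
```

Under the linear model the ratio is simply the dose ratio (3.0 here), while the cell-kill term in the non-linear model shrinks it toward 1, which is one way the ranking of techniques can shift with the model chosen.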
Abstract:
PURPOSE To analyze the frequency of perforation of the sinus membrane during maxillary sinus floor elevation (SFE) and to assess possible risk factors. MATERIALS AND METHODS Seventy-seven cases of SFE performed with a lateral window approach were evaluated retrospectively. Clinical and radiographic variables potentially influencing the risk of sinus membrane perforation were evaluated and divided into patient-related factors (age, sex, smoking habit); surgery-related factors (type of surgical approach, side, units, sites, and technique of osteotomy); and maxillary sinus-related factors (presence and height of septum, height of residual ridge, thickness of lateral sinus wall, width of antrum, and thickness and status of sinus membrane). RESULTS The following factors presented with at least a 10% difference in rates of perforations: smokers (46.2%) versus nonsmokers (23.4%), simultaneous (32%) versus staged (18.5%) approach, mixed premolar-molar sites (41.2%) versus premolar-only sites (16.7%) versus molar-only sites (26.2%), presence of septa (42.9%) versus no septa (23.8%), and minimum height of residual ridge ≤4 mm (34.2%) versus > 4 mm (20.5%). These same parameters, except minimum height of residual ridge, also showed an odds ratio above 2. However, none of the comparisons reached statistical significance. CONCLUSION The present study failed to demonstrate any factor that statistically significantly increased the risk of sinus membrane perforation during SFE using the lateral window approach.
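The "odds ratio above 2" statement can be reproduced directly from the perforation rates reported above; a minimal sketch of the unadjusted calculation:

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

def or_from_rates(p_exposed, p_unexposed):
    """Unadjusted odds ratio from two perforation proportions."""
    return odds(p_exposed) / odds(p_unexposed)

# Perforation rates reported in the abstract:
print(or_from_rates(0.462, 0.234))  # smokers vs. nonsmokers, ~2.8
print(or_from_rates(0.429, 0.238))  # septa vs. no septa, ~2.4
print(or_from_rates(0.320, 0.185))  # simultaneous vs. staged, ~2.1
```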
Abstract:
BACKGROUND AND PURPOSE To assess the association of lesion location and risk of aspiration and to establish predictors of transient versus extended risk of aspiration after supratentorial ischemic stroke. METHODS Atlas-based localization analysis was performed in consecutive patients with MRI-proven first-time acute supratentorial ischemic stroke. Standardized swallowing assessment was carried out within 8±18 hours and 7.8±1.2 days after admission. RESULTS In a prospective, longitudinal analysis, 34 of 94 patients (36%) were classified as having acute risk of aspiration, which was extended (≥7 days) or transient (<7 days) in 17 cases. There were no between-group differences in age, sex, cause of stroke, risk factors, prestroke disability, lesion side, or the degree of age-related white-matter changes. Correcting for stroke volume and National Institutes of Health Stroke Scale with a multiple logistic regression model, significant adjusted odds ratios in favor of acute risk of aspiration were demonstrated for the internal capsule (adjusted odds ratio, 6.2; P<0.002) and the insular cortex (adjusted odds ratio, 4.8; P<0.003). In a multivariate model of extended versus transient risk of aspiration, combined lesions of the frontal operculum and insular cortex was the only significant independent predictor of poor recovery (adjusted odds ratio, 33.8; P<0.008). CONCLUSIONS Lesions of the insular cortex and the internal capsule are significantly associated with acute risk of aspiration after stroke. Combined ischemic infarctions of the frontal operculum and the insular cortex are likely to cause extended risk of aspiration in stroke patients, whereas risk of aspiration tends to be transient in subcortical stroke.
Abstract:
BACKGROUND AND AIMS We investigated the association between significant liver fibrosis, determined by the AST-to-platelet ratio index (APRI), and all-cause mortality among HIV-infected patients prescribed antiretroviral therapy (ART) in Zambia. METHODS Among HIV-infected adults who initiated ART, we categorized baseline APRI scores according to established thresholds for significant hepatic fibrosis (APRI ≥1.5) and cirrhosis (APRI ≥2.0). Using multivariable logistic regression, we identified risk factors for elevated APRI, including demographic characteristics, body mass index (BMI), HIV clinical and immunologic status, and tuberculosis. In the subset tested for hepatitis B surface antigen (HBsAg), we investigated the association of hepatitis B virus co-infection with APRI score. Using Kaplan-Meier analysis and Cox proportional hazards regression, we determined the association of elevated APRI with death during ART. RESULTS Among 20,308 adults in the analysis cohort, 1,027 (5.1%) had significant liver fibrosis at ART initiation, including 616 (3.0%) with cirrhosis. Risk factors for significant fibrosis or cirrhosis included male sex, BMI <18, WHO clinical stage 3 or 4, CD4+ count <200 cells/mm³, and tuberculosis. Among the 237 (1.2%) who were tested, HBsAg-positive patients had four times the odds (adjusted odds ratio, 4.15; 95% CI, 1.71-10.04) of significant fibrosis compared with HBsAg-negative patients. Both significant fibrosis (adjusted hazard ratio 1.41, 95% CI, 1.21-1.64) and cirrhosis (adjusted hazard ratio 1.57, 95% CI, 1.31-1.89) were associated with increased all-cause mortality. CONCLUSION Liver fibrosis may be a risk factor for mortality during ART among HIV-infected individuals in Africa. APRI is an inexpensive and potentially useful test for liver fibrosis in resource-constrained settings.
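For reference, APRI as commonly defined (the abstract does not spell out the formula) and the thresholds it uses can be sketched as follows; the patient values are hypothetical:

```python
def apri(ast_iu_l, ast_uln_iu_l, platelets_10e9_l):
    """AST-to-platelet ratio index, as commonly defined:
    ((AST / upper limit of normal AST) / platelets [10^9/L]) * 100."""
    return (ast_iu_l / ast_uln_iu_l) / platelets_10e9_l * 100

def classify(score):
    # Thresholds used in the abstract.
    if score >= 2.0:
        return "cirrhosis"
    if score >= 1.5:
        return "significant fibrosis"
    return "below threshold"

# Hypothetical patient: AST 80 IU/L (ULN 40 IU/L), platelets 120 x 10^9/L.
score = apri(80, 40, 120)
print(score, classify(score))
```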
Abstract:
BACKGROUND Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN We used a prospective cohort study. PARTICIPANTS In a multicenter Swiss cohort, we studied 830 patients aged ≥ 65 years with VTE. MAIN MEASURES We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major and clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS Overall, 413 (49.8 %) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95 % confidence interval [CI] 1.32-3.68) and clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95 % CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy was significantly associated with major (sub-hazard ratio [SHR] 1.83, 95 % CI 1.03-3.25) and clinically relevant non-major bleeding (SHR 1.60, 95 % CI 1.06-2.42). CONCLUSIONS Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
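The incidence rate ratios can be approximated directly from the event rates reported above; the small difference from the published IRR of 2.18 presumably reflects rounding of the underlying counts and person-time:

```python
def rate_ratio(rate_exposed, rate_unexposed):
    """Ratio of two incidence rates (events per 100 patient-years)."""
    return rate_exposed / rate_unexposed

# Rates reported in the abstract (events/100 patient-years):
print(rate_ratio(9.0, 4.1))   # major bleeding, ~2.2 (reported IRR 2.18)
print(rate_ratio(14.8, 8.0))  # clinically relevant non-major bleeding, 1.85
```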
Abstract:
OBJECTIVE Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS We conducted a prospective cohort study involving 991 patients ≥ 65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were as follows: the time to a first major episode of bleeding; and clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason against providing anticoagulation beyond 3 months should be based on patient preferences and the risk of VTE recurrence.
Abstract:
OBJECTIVE To determine the prognostic accuracy of cardiac biomarkers alone and in combination with clinical scores in elderly patients with non-high-risk pulmonary embolism (PE). DESIGN Ancillary analysis of a Swiss multicentre prospective cohort study. SUBJECTS A total of 230 patients aged ≥65 years with non-high-risk PE. MAIN OUTCOME MEASURES The study end-point was a composite of PE-related complications, defined as PE-related death, recurrent venous thromboembolism or major bleeding during a follow-up of 30 days. The prognostic accuracy of the Pulmonary Embolism Severity Index (PESI), the Geneva Prognostic Score (GPS), the precursor of brain natriuretic peptide (NT-proBNP) and high-sensitivity cardiac troponin T (hs-cTnT) was determined using sensitivity, specificity, predictive values, receiver operating characteristic (ROC) curve analysis, logistic regression and reclassification statistics. RESULTS The overall complication rate during follow-up was 8.7%. hs-cTnT achieved the highest prognostic accuracy [area under the ROC curve: 0.75, 95% confidence interval (CI): 0.63-0.86, P < 0.001]. At the predefined cut-off values, the negative predictive values of the biomarkers were above 95%. For levels above the cut-off, the risk of complications increased fivefold for hs-cTnT [odds ratio (OR): 5.22, 95% CI: 1.49-18.25] and 14-fold for NT-proBNP (OR: 14.21, 95% CI: 1.73-116.93) after adjustment for both clinical scores and renal function. Reclassification statistics indicated that adding hs-cTnT to the GPS or the PESI significantly improved the prognostic accuracy of both clinical scores. CONCLUSION In elderly patients with non-high-risk PE, NT-proBNP or hs-cTnT could be an adequate alternative to clinical scores for identifying low-risk individuals suitable for outpatient management.
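The high negative predictive values are driven largely by the low (8.7%) complication rate; a sketch via Bayes' rule, where the prevalence is taken from the abstract but the sensitivity and specificity values are hypothetical assumptions:

```python
def npv(sensitivity, specificity, prevalence):
    """Negative predictive value via Bayes' rule."""
    true_negatives = specificity * (1 - prevalence)
    false_negatives = (1 - sensitivity) * prevalence
    return true_negatives / (true_negatives + false_negatives)

# Even a test with modest specificity clears 95% NPV at this prevalence.
print(npv(sensitivity=0.90, specificity=0.40, prevalence=0.087))
```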
Abstract:
BACKGROUND Although the possibility of bleeding during anticoagulant treatment may limit patients from taking part in physical activity, the association between physical activity and anticoagulation-related bleeding is uncertain. OBJECTIVES To determine whether physical activity is associated with bleeding in elderly patients taking anticoagulants. PATIENTS/METHODS In a prospective multicenter cohort study of 988 patients aged ≥65 years receiving anticoagulants for venous thromboembolism, we assessed patients' self-reported physical activity level. The primary outcome was the time to a first major bleeding, defined as fatal bleeding, symptomatic bleeding in a critical site, or bleeding causing a fall in hemoglobin or leading to transfusions. The secondary outcome was the time to a first clinically relevant non-major bleeding. We examined the association between physical activity level and time to a first bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS During a mean follow-up of 22 months, patients with a low, moderate, and high physical activity level had an incidence of major bleeding of 11.6, 6.3, and 3.1 events per 100 patient-years, and an incidence of clinically relevant non-major bleeding of 14.0, 10.3, and 7.7 events per 100 patient-years, respectively. A high physical activity level was significantly associated with a lower risk of major bleeding (adjusted sub-hazard ratio 0.40, 95%-CI 0.22-0.72). There was no association between physical activity and non-major bleeding. CONCLUSIONS A high level of physical activity is associated with a decreased risk of major bleeding in elderly patients receiving anticoagulant therapy.
Abstract:
CONTEXT Subclinical hypothyroidism has been associated with increased risk of coronary heart disease (CHD), particularly with thyrotropin levels of 10.0 mIU/L or greater. The measurement of thyroid antibodies helps predict the progression to overt hypothyroidism, but it is unclear whether thyroid autoimmunity independently affects CHD risk. OBJECTIVE The objective of the study was to compare the CHD risk of subclinical hypothyroidism with and without thyroid peroxidase antibodies (TPOAbs). DATA SOURCES AND STUDY SELECTION A MEDLINE and EMBASE search from 1950 to 2011 was conducted for prospective cohorts, reporting baseline thyroid function, antibodies, and CHD outcomes. DATA EXTRACTION Individual data of 38 274 participants from six cohorts for CHD mortality followed up for 460 333 person-years and 33 394 participants from four cohorts for CHD events. DATA SYNTHESIS Among 38 274 adults (median age 55 y, 63% women), 1691 (4.4%) had subclinical hypothyroidism, of whom 775 (45.8%) had positive TPOAbs. During follow-up, 1436 participants died of CHD and 3285 had CHD events. Compared with euthyroid individuals, age- and gender-adjusted risks of CHD mortality in subclinical hypothyroidism were similar among individuals with and without TPOAbs [hazard ratio (HR) 1.15, 95% confidence interval (CI) 0.87-1.53 vs HR 1.26, CI 1.01-1.58, P for interaction = .62], as were risks of CHD events (HR 1.16, CI 0.87-1.56 vs HR 1.26, CI 1.02-1.56, P for interaction = .65). Risks of CHD mortality and events increased with higher thyrotropin, but within each stratum, risks did not differ by TPOAb status. CONCLUSIONS CHD risk associated with subclinical hypothyroidism did not differ by TPOAb status, suggesting that biomarkers of thyroid autoimmunity do not add independent prognostic information for CHD outcomes.
Abstract:
BACKGROUND The early repolarization (ER) pattern is associated with an increased risk of arrhythmogenic sudden death. However, strategies for risk stratification of patients with the ER pattern are not fully defined. OBJECTIVES This study sought to determine the role of electrophysiology studies (EPS) in risk stratification of patients with ER syndrome. METHODS In a multicenter study, 81 patients with ER syndrome (age 36 ± 13 years, 60 males) and aborted sudden death due to ventricular fibrillation (VF) were included. EPS were performed following the index VF episode using a standard protocol. Inducibility was defined by the provocation of sustained VF. Patients were followed up by serial implantable cardioverter-defibrillator interrogations. RESULTS Despite a recent history of aborted sudden death, VF was inducible in only 18 of 81 (22%) patients. During follow-up of 7.0 ± 4.9 years, 6 of 18 (33%) patients with inducible VF during EPS experienced VF recurrences, whereas 21 of 63 (33%) patients who were noninducible experienced recurrent VF (p = 0.93). VF storm occurred in 3 patients from the inducible VF group and in 4 patients in the noninducible group. VF inducibility was not associated with maximum J-wave amplitude (VF inducible vs. VF noninducible; 0.23 ± 0.11 mV vs. 0.21 ± 0.11 mV; p = 0.42) or J-wave distribution (inferior, odds ratio [OR]: 0.96 [95% confidence interval (CI): 0.33 to 2.81]; p = 0.95; lateral, OR: 1.57 [95% CI: 0.35 to 7.04]; p = 0.56; inferior and lateral, OR: 0.83 [95% CI: 0.27 to 2.55]; p = 0.74), which have previously been demonstrated to predict outcome in patients with an ER pattern. CONCLUSIONS Our findings indicate that current programmed stimulation protocols do not enhance risk stratification in ER syndrome.
Abstract:
BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random-effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, comprising 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I² < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared with those at higher risk of bias (P for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship.
CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.
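Two-stage random-effects pooling of study-level log hazard ratios, as referenced in the methods, can be sketched with the DerSimonian-Laird estimator, one common choice (the abstract does not state which estimator was used, and the study-level inputs below are hypothetical):

```python
import math

def dersimonian_laird(log_effects, variances):
    """Pool study-level log effects with the DerSimonian-Laird
    random-effects estimator; returns (pooled HR, 95% CI, I^2 %)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci, i2

# Hypothetical study-level hazard ratios and variances of the log HRs:
hrs = [1.6, 1.4, 1.3, 1.7]
log_hrs = [math.log(h) for h in hrs]
variances = [0.04, 0.06, 0.09, 0.05]
pooled_hr, ci, i2 = dersimonian_laird(log_hrs, variances)
print(pooled_hr, ci, i2)
```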
Abstract:
BACKGROUND Heart failure with preserved ejection fraction (HFpEF) represents a growing health burden associated with substantial mortality and morbidity. Consequently, risk prediction is of highest importance. Endothelial dysfunction has been recently shown to play an important role in the complex pathophysiology of HFpEF. We therefore aimed to assess von Willebrand factor (vWF), a marker of endothelial damage, as potential biomarker for risk assessment in patients with HFpEF. METHODS AND RESULTS Concentrations of vWF were assessed in 457 patients with HFpEF enrolled as part of the LUdwigshafen Risk and Cardiovascular Health (LURIC) study. All-cause mortality was observed in 40% of patients during a median follow-up time of 9.7 years. vWF significantly predicted mortality with a hazard ratio (HR) per increase of 1 SD of 1.45 (95% confidence interval, 1.26-1.68; P<0.001) and remained a significant predictor after adjustment for age, sex, body mass index, N-terminal pro-B-type natriuretic peptide (NT-proBNP), renal function, and frequent HFpEF-related comorbidities (adjusted HR per 1 SD, 1.22; 95% confidence interval, 1.05-1.42; P=0.001). Most notably, vWF showed additional prognostic value beyond that achievable with NT-proBNP indicated by improvements in C-Statistic (vWF×NT-proBNP: 0.65 versus NT-proBNP: 0.63; P for comparison, 0.004) and category-free net reclassification index (37.6%; P<0.001). CONCLUSIONS vWF is an independent predictor of long-term outcome in patients with HFpEF, which is in line with endothelial dysfunction as potential mediator in the pathophysiology of HFpEF. In particular, combined assessment of vWF and NT-proBNP improved risk prediction in this vulnerable group of patients.
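The category-free net reclassification index reported above counts per-subject risk movement between the old and new models; a minimal sketch on toy data (not the study's data):

```python
def continuous_nri(old_risk, new_risk, events):
    """Category-free (continuous) net reclassification improvement.
    old_risk/new_risk: predicted risks per subject; events: 1/0 outcomes."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, event in zip(old_risk, new_risk, events):
        if event:
            n_e += 1
            up_e += new > old      # events should be reclassified upward
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old     # non-events should move downward
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Toy data: the new model raises risk for most events and lowers it
# for all non-events.
old = [0.2, 0.3, 0.4, 0.5, 0.2, 0.3]
new = [0.35, 0.45, 0.35, 0.6, 0.1, 0.2]
ev = [1, 1, 1, 1, 0, 0]
print(continuous_nri(old, new, ev))
```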
Abstract:
High concentrations of HDL cholesterol are considered to indicate efficient reverse cholesterol transport and to protect from atherosclerosis. However, HDL has been suggested to be dysfunctional in ESRD. Hence, our main objective was to investigate the effect of HDL cholesterol on outcomes in maintenance hemodialysis patients with diabetes. Moreover, we investigated the associations between the major protein components of HDL (apoA1, apoA2, and apoC3) and end points. We performed an exploratory, post hoc analysis with 1255 participants (677 men and 578 women) of the German Diabetes Dialysis study. The mean age was 66.3 years and the mean body mass index was 28.0 kg/m². The primary end point was a composite of cardiac death, myocardial infarction, and stroke. The secondary end point included all-cause mortality. The mean duration of follow-up was 3.9 years. A total of 31.3% of the study participants reached the primary end point and 49.1% died from any cause. Quartiles of HDL cholesterol, apoA1, and apoC3 were not related to the end points. However, there was a trend toward an inverse association between apoA2 and all-cause mortality. The hazard ratio for death from any cause in the fourth quartile compared with the first quartile of apoA2 was 0.63 (95% confidence interval, 0.40 to 0.89). The lack of an association between HDL cholesterol and cardiovascular risk may support the concept of dysfunctional HDL in hemodialysis. The possible beneficial effect of apoA2 on survival requires confirmation in future studies.