68 results for Proportional Hazards Models


Relevance:

80.00%

Publisher:

Abstract:

Background Non-AIDS defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we described the incidence of NADC from 2004 to 2010, together with subsequent mortality and its predictors. Methods Individuals were followed from 1st January 2004 or enrolment in the study (whichever was later) until the earliest of a new NADC, 1st February 2010, death, or six months after the patient’s last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient’s death, 1st February 2010 or six months after the patient’s last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1st February 2010. Time trends for lung cancer, anal cancer and Hodgkin’s lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared with other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
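The headline incidence estimate can be reproduced from the reported counts alone. Below is a minimal sketch (plain Python with a normal approximation to the Poisson rate; not the authors' code) that recovers the 4.98/1000 PY figure and its 95% CI from 880 events over 176,775 person-years:

```python
import math

def incidence_rate_ci(events, person_years, per=1000.0, z=1.96):
    """Incidence rate per `per` person-years with a normal-approximation 95% CI."""
    rate = events / person_years * per
    se = math.sqrt(events) / person_years * per  # Poisson SE of the event count
    return rate, rate - z * se, rate + z * se

rate, lo, hi = incidence_rate_ci(880, 176_775)
print(f"{rate:.2f}/1000 PY (95% CI {lo:.2f}, {hi:.2f})")  # 4.98/1000 PY (95% CI 4.65, 5.31)
```

An exact Poisson interval would differ slightly, but with 880 events the normal approximation matches the reported interval to two decimals.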


In Switzerland, group-housing for breeding rabbit does is not explicitly required by law, but label programmes, as well as the general public and animal welfare groups, are advocating it. Although group-housing is of great benefit to rabbits, which are gregarious by nature, the establishment of a social hierarchy within the group can lead to stress and lesions. In the present epidemiological study, lesions were scored twice on 30% of the breeding does on all 28 commercial Swiss farms with group-housed breeding does. Additionally, a detailed questionnaire was completed with all producers to determine risk factors potentially associated with lesions. Data were analysed using hierarchical proportional odds models. About 33% of the does examined had lesions, including wounds that were almost healed and small scratches. Severe lesions were counted on 9% of the animals. Seasonal differences in lesion scores were identified, with more extensive lesions in summer than in spring. Fewer lesions occurred on farms on which mastitis was more common. More lesions were found on farms where the does were isolated between littering and artificial insemination than on farms without isolation. According to the producers, most of the aggression occurred directly after the isolation phase, when the does were regrouped. We conclude that lesions in group-housed breeding does might be reduced by appropriate reproductive management.
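The proportional odds models used here act on cumulative logits of the ordered lesion score. As a minimal sketch (illustrative thresholds and effect sizes, not the study's estimates), a proportional odds model converts one linear predictor plus ordered cut-points into category probabilities:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ordinal_probs(eta, cuts):
    """Category probabilities under a cumulative-logit (proportional odds) model.

    eta  : linear predictor (e.g. season or farm effects)
    cuts : ordered thresholds separating the K lesion-score categories
    """
    cum = [sigmoid(c - eta) for c in cuts] + [1.0]  # cumulative P(score <= k)
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Hypothetical: a larger eta (e.g. summer) shifts mass toward severer scores
print(ordinal_probs(0.0, [-1.0, 0.5, 2.0]))
print(ordinal_probs(1.0, [-1.0, 0.5, 2.0]))
```

The "proportional odds" assumption is visible in the code: the same eta shifts every cumulative threshold by the same amount.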


BACKGROUND Patients with prior coronary artery bypass graft surgery (CABG) who present with an acute coronary syndrome have a high risk for recurrent events. Whether intensive antiplatelet therapy with ticagrelor might be beneficial compared with clopidogrel is unknown. In this substudy of the PLATO trial, we studied the effects of randomized treatment dependent on history of CABG. METHODS Patients participating in PLATO were classified according to whether they had undergone prior CABG. The trial's primary and secondary end points were compared using Cox proportional hazards regression. RESULTS Of the 18,613 study patients, 1,133 (6.1%) had prior CABG. Prior-CABG patients had more high-risk characteristics at study entry and a 2-fold increase in clinical events during follow-up, but less major bleeding. The primary end point (composite of cardiovascular death, myocardial infarction, and stroke) was reduced to a similar extent by ticagrelor among patients with (19.6% vs 21.4%; adjusted hazard ratio [HR], 0.91 [0.67, 1.24]) and without (9.2% vs 11.0%; adjusted HR, 0.86 [0.77, 0.96]; P(interaction) = .73) prior CABG. Major bleeding was similar with ticagrelor versus clopidogrel among patients with (8.1% vs 8.7%; adjusted HR, 0.89 [0.55, 1.47]) and without (11.8% vs 11.4%; HR, 1.08 [0.98, 1.20]; P(interaction) = .46) prior CABG. CONCLUSIONS Prior-CABG patients presenting with acute coronary syndrome are a high-risk cohort for death and recurrent cardiovascular events but have a lower risk for major bleeding. Similar to the results in no-prior-CABG patients, ticagrelor was associated with a reduction in ischemic events without an increase in major bleeding.
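The reported interaction P-value can be approximated from the two subgroup hazard ratios and their confidence intervals alone. A hedged sketch of that Wald-type check (a back-of-the-envelope approximation, not the trial's actual Cox model with an interaction term):

```python
import math

def se_from_ci(lo, hi, z95=1.96):
    """Standard error of log(HR) implied by a 95% confidence interval."""
    return (math.log(hi) - math.log(lo)) / (2 * z95)

def interaction_p(hr1, ci1, hr2, ci2):
    """Two-sided Wald test that two subgroup log hazard ratios are equal."""
    diff = math.log(hr1) - math.log(hr2)
    se = math.sqrt(se_from_ci(*ci1) ** 2 + se_from_ci(*ci2) ** 2)
    z = diff / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Prior-CABG HR 0.91 (0.67-1.24) vs no-prior-CABG HR 0.86 (0.77-0.96)
p = interaction_p(0.91, (0.67, 1.24), 0.86, (0.77, 0.96))
print(round(p, 2))  # 0.73, in line with the reported P(interaction) = .73
```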


INTRODUCTION: We evaluated treatment patterns of elderly patients with stage IIIA (N2) non-small-cell lung cancer (NSCLC). METHODS: The use of surgery, chemotherapy, and radiation for patients with stage IIIA (T1-T3N2M0) NSCLC in the Surveillance, Epidemiology, and End Results-Medicare database from 2004 to 2007 was analyzed. Treatment variability was assessed using a multivariable logistic regression model that included treatment, patient, tumor, and census tract variables. Overall survival was analyzed using the Kaplan-Meier approach and Cox proportional hazard models. RESULTS: The most common treatments for 2958 patients with stage IIIA (N2) NSCLC were radiation with chemotherapy (n = 1065, 36%), no treatment (n = 534, 18%), and radiation alone (n = 383, 13%). Surgery was performed in 709 patients (24%): 235 patients (8%) had surgery alone, 40 patients (1%) had surgery with radiation, 222 patients (8%) had surgery with chemotherapy, and 212 patients (7%) had surgery, chemotherapy, and radiation. Younger age (p < 0.0001), lower T-status (p < 0.0001), female sex (p = 0.04), and living in a census tract with a higher median income (p = 0.03) predicted surgery use. Older age (p < 0.0001) was the only factor that predicted that patients received no therapy. The 3-year overall survival was 21.8 ± 1.5% for all patients, 42.1 ± 3.8% for patients who had surgery, and 15.4 ± 1.5% for patients who did not have surgery. Increasing age, higher T-stage and Charlson Comorbidity Index, and not having surgery, radiation, or chemotherapy were all risk factors for worse survival (all p values < 0.001). CONCLUSIONS: Treatment of elderly patients with stage IIIA (N2) NSCLC is highly variable and varies not only with specific patient and tumor characteristics but also with regional income level.
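Overall survival here is summarised with the Kaplan-Meier product-limit estimator. A self-contained sketch of the estimator itself, on toy data (a generic implementation, not the study's SEER-Medicare analysis code):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time for each patient
    events : 1 if the patient died at that time, 0 if censored
    Returns a list of (event time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)      # events at this time
        ties = sum(1 for tt, _ in data if tt == t)        # everyone leaving at this time
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= ties
        i += ties
    return curve

# Toy data: deaths at t=1, 2, 3; censoring at t=2 and 4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

By convention, a subject censored at the same time as a death is still counted in the risk set for that death, as above.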


We examined outcomes and trends in the use of surgery and radiation for patients with locally advanced esophageal cancer, for whom optimal treatment is not clear. Trends in surgery and radiation for patients with T1-T3N1M0 squamous cell carcinoma or adenocarcinoma of the mid or distal esophagus in the Surveillance, Epidemiology, and End Results database from 1998 to 2008 were analyzed using generalized linear models that included year as a predictor; the Surveillance, Epidemiology, and End Results database does not record chemotherapy data. Local treatment was classified as unimodal if patients had only surgery or radiation and bimodal if they had both. Five-year cancer-specific survival (CSS) and overall survival (OS) were analyzed using propensity-score adjusted Cox proportional hazards models. Overall 5-year survival for the 3295 patients identified (mean age 65.1 years, standard deviation 11.0) was 18.9% (95% confidence interval: 17.3-20.7). Local treatment was bimodal for 1274 (38.7%) and unimodal for 2021 (61.3%) patients; 1325 (40.2%) had radiation alone and 696 (21.1%) underwent only surgery. The use of bimodal therapy (32.8-42.5%, P = 0.01) and radiation alone (29.3-44.5%, P < 0.001) increased significantly from 1998 to 2008. Bimodal therapy predicted improved CSS (hazard ratio [HR]: 0.68, P < 0.001) and OS (HR: 0.58, P < 0.001) compared with unimodal therapy. For the first 7 months (before the survival curves crossed), CSS after radiation therapy alone was similar to that after surgery alone (HR: 0.86, P = 0.12), while OS was worse for surgery only (HR: 0.70, P = 0.001). However, after that initial timeframe, radiation therapy alone was associated with worse CSS (HR: 1.43, P < 0.001) and OS (HR: 1.46, P < 0.001). The use of radiation to treat locally advanced mid and distal esophageal cancers increased from 1998 to 2008. Survival was best when both surgery and radiation were used.
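Propensity-score adjustment starts from a model of treatment assignment. A toy sketch of one common variant, inverse-probability-of-treatment weighting (the coefficients and covariates below are made-up illustrations; the paper may equally have adjusted on the score as a covariate):

```python
import math

def propensity(x, coefs, intercept):
    """P(bimodal therapy | covariates) from a logistic model (toy coefficients)."""
    eta = intercept + sum(c * v for c, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-eta))

def iptw_weight(treated, ps):
    """Inverse-probability-of-treatment weight for an average-treatment-effect analysis."""
    return 1.0 / ps if treated else 1.0 / (1.0 - ps)

# Hypothetical patient: age 65, adenocarcinoma indicator 1 (illustrative only)
ps = propensity([65.0, 1.0], [-0.04, 0.8], 2.0)
w = iptw_weight(True, ps)
```

In the weighted sample, treated and untreated groups are balanced on the modelled covariates, so a Cox model fitted with these weights approximates the adjusted comparison the abstract reports.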


In the general population, HDL cholesterol (HDL-C) is associated with reduced cardiovascular events. However, recent experimental data suggest that the vascular effects of HDL can be heterogeneous. We examined the association of HDL-C with all-cause and cardiovascular mortality in the Ludwigshafen Risk and Cardiovascular Health study comprising 3307 patients undergoing coronary angiography. Patients were followed for a median of 9.9 years. Estimated GFR (eGFR) was calculated using the Chronic Kidney Disease Epidemiology Collaboration eGFR creatinine-cystatin C (eGFRcreat-cys) equation. The effect of increasing HDL-C serum levels was assessed using Cox proportional hazard models. In participants with normal kidney function (eGFR>90 ml/min per 1.73 m²), higher HDL-C was associated with reduced risk of all-cause and cardiovascular mortality and coronary artery disease severity (hazard ratio [HR], 0.51, 95% confidence interval [95% CI], 0.26-0.92 [P=0.03]; HR, 0.30, 95% CI, 0.13-0.73 [P=0.01]). Conversely, in patients with mildly (eGFR=60-89 ml/min per 1.73 m²) or more severely (eGFR<60 ml/min per 1.73 m²) reduced kidney function, higher HDL-C was not associated with lower risk for mortality (eGFR=60-89 ml/min per 1.73 m²: HR, 0.68, 95% CI, 0.45-1.04 [P=0.07]; HR, 0.84, 95% CI, 0.50-1.40 [P=0.50]; eGFR<60 ml/min per 1.73 m²: HR, 1.18, 95% CI, 0.60-1.81 [P=0.88]; HR, 0.82, 95% CI, 0.40-1.69 [P=0.60]). Moreover, Cox regression analyses revealed an interaction between HDL-C and eGFR in predicting all-cause and cardiovascular mortality (P=0.04 and P=0.02, respectively). We confirmed the lack of association between higher HDL-C and lower mortality in an independent cohort of patients with definite CKD (P=0.63). In summary, higher HDL-C levels were not associated with reduced mortality risk or coronary artery disease severity in patients with reduced kidney function. Indeed, abnormal HDL function might confound the outcome of HDL-targeted therapies in these patients.


Amyotrophic lateral sclerosis (ALS) has been associated with exposures in so-called 'electrical occupations'. It is unclear if this possible link may be explained by exposure to extremely low-frequency magnetic fields (ELF-MF) or by electrical shocks. We evaluated ALS mortality in 2000-2008 and exposure to ELF-MF and electrical shocks in the Swiss National Cohort, using job exposure matrices for occupations at censuses 1990 and 2000. We compared 2.2 million workers with high or medium vs. low exposure to ELF-MF and electrical shocks using Cox proportional hazard models. Results showed that mortality from ALS was higher in people who had medium or high ELF-MF exposure in both censuses (HR 1.55 (95% CI 1.11-2.15)), but closer to unity for electrical shocks (HR 1.17 (95% CI 0.83-1.65)). When both exposures were included in the same model, the HR for ELF-MF changed little (HR 1.56), but the HR for electric shocks was attenuated to 0.97. In conclusion, there was an association between exposure to ELF-MF and mortality from ALS among workers with a higher likelihood of long-term exposure.
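The high/medium vs. low exposure comparison is a Cox model with a single binary covariate, which makes the partial likelihood compact enough to write down. A didactic sketch (Breslow partial likelihood maximised by golden-section search over one parameter; toy data, not the Swiss National Cohort):

```python
import math

def cox_loglik(beta, times, events, x):
    """Breslow partial log-likelihood for one covariate (assumes untied event times)."""
    n = len(times)
    ll = 0.0
    for i in range(n):
        if events[i]:
            # risk set: everyone still under observation at this event time
            risk = [j for j in range(n) if times[j] >= times[i]]
            ll += beta * x[i] - math.log(sum(math.exp(beta * x[j]) for j in risk))
    return ll

def fit_log_hr(times, events, x, lo=-5.0, hi=5.0, tol=1e-8):
    """One-dimensional golden-section maximisation of the partial likelihood."""
    g = (math.sqrt(5) - 1) / 2
    while hi - lo > tol:
        c, d = hi - g * (hi - lo), lo + g * (hi - lo)
        if cox_loglik(c, times, events, x) < cox_loglik(d, times, events, x):
            lo = c
        else:
            hi = d
    return (lo + hi) / 2

# Toy data: exposed subjects (x=1) tend to die earlier -> estimated HR > 1
beta = fit_log_hr([1, 2, 3, 4, 5, 6], [1] * 6, [1, 0, 1, 0, 1, 0])
print(math.exp(beta))  # estimated hazard ratio, greater than 1
```

Note the baseline hazard cancels out of every term, which is what lets Cox regression estimate the hazard ratio without modelling the hazard itself.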


BACKGROUND AND AIMS We investigated the association between significant liver fibrosis, determined by the AST-to-platelet ratio index (APRI), and all-cause mortality among HIV-infected patients prescribed antiretroviral therapy (ART) in Zambia. METHODS Among HIV-infected adults who initiated ART, we categorized baseline APRI scores according to established thresholds for significant hepatic fibrosis (APRI ≥1.5) and cirrhosis (APRI ≥2.0). Using multivariable logistic regression, we identified risk factors for elevated APRI, including demographic characteristics, body mass index (BMI), HIV clinical and immunologic status, and tuberculosis. In the subset tested for hepatitis B surface antigen (HBsAg), we investigated the association of hepatitis B virus co-infection with APRI score. Using Kaplan-Meier analysis and Cox proportional hazards regression, we determined the association of elevated APRI with death during ART. RESULTS Among 20,308 adults in the analysis cohort, 1,027 (5.1%) had significant liver fibrosis at ART initiation, including 616 (3.0%) with cirrhosis. Risk factors for significant fibrosis or cirrhosis included male sex, BMI <18, WHO clinical stage 3 or 4, CD4+ count <200 cells/mm³, and tuberculosis. Among the 237 (1.2%) who were tested, HBsAg-positive patients had four times the odds (adjusted odds ratio, 4.15; 95% CI, 1.71-10.04) of significant fibrosis compared with HBsAg-negative patients. Both significant fibrosis (adjusted hazard ratio 1.41, 95% CI, 1.21-1.64) and cirrhosis (adjusted hazard ratio 1.57, 95% CI, 1.31-1.89) were associated with increased all-cause mortality. CONCLUSION Liver fibrosis may be a risk factor for mortality during ART among HIV-infected individuals in Africa. APRI is an inexpensive and potentially useful test for liver fibrosis in resource-constrained settings.
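APRI itself is a simple bedside formula: AST relative to its upper limit of normal, divided by the platelet count (in 10^9/L), times 100. A small sketch using the thresholds quoted in the abstract (the example values are a hypothetical patient, not study data):

```python
def apri(ast, ast_uln, platelets):
    """AST-to-platelet ratio index: (AST / upper limit of normal) / platelets (10^9/L) * 100."""
    return (ast / ast_uln) / platelets * 100.0

def apri_category(score):
    """Thresholds used in the study: >=1.5 significant fibrosis, >=2.0 cirrhosis."""
    if score >= 2.0:
        return "cirrhosis"
    if score >= 1.5:
        return "significant fibrosis"
    return "below threshold"

# Hypothetical patient: AST 80 U/L (ULN 40 U/L), platelets 100 x 10^9/L
print(apri_category(apri(80, 40, 100)))  # cirrhosis (APRI = 2.0)
```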


BACKGROUND Extensive coronary artery disease (CAD) is associated with a higher risk of adverse outcomes. In this substudy of the PLATO trial, we examined the effects of randomized treatment on outcome events and safety in relation to the extent of CAD. METHODS Patients were classified according to the presence of extensive CAD (defined as 3-vessel disease, left main disease, or prior coronary artery bypass graft surgery). The trial's primary and secondary end points were compared using Cox proportional hazards regression. RESULTS Among 15,388 study patients for whom the extent of CAD was known, 4,646 (30%) had extensive CAD. Patients with extensive CAD had more high-risk characteristics and experienced more clinical events during follow-up. They were less likely to undergo percutaneous coronary intervention (58% vs 79%, P < .001) but more likely to undergo coronary artery bypass graft surgery (16% vs 2%, P < .001). Ticagrelor, compared with clopidogrel, reduced the composite of cardiovascular death, myocardial infarction, and stroke in patients with extensive CAD (14.9% vs 17.6%, hazard ratio [HR] 0.85 [0.73-0.98]), similar to its reduction in those without extensive CAD (6.8% vs 8.0%, HR 0.85 [0.74-0.98], P(interaction) = .99). Major bleeding was similar with ticagrelor vs clopidogrel among patients with (25.7% vs 25.5%, HR 1.02 [0.90-1.15]) and without (7.3% vs 6.4%, HR 1.14 [0.98-1.33], P(interaction) = .24) extensive CAD. CONCLUSIONS Patients with extensive CAD have higher rates of recurrent cardiovascular events and bleeding. Ticagrelor reduced ischemic events to a similar extent in patients both with and without extensive CAD, with bleeding rates similar to clopidogrel.


OBJECTIVES Fontan failure (FF) represents a growing and challenging indication for paediatric orthotopic heart transplantation (OHT). The aim of this study was to identify predictors of the best mid-term outcome in OHT after FF. METHODS A twenty-year multi-institutional retrospective analysis of OHT for FF was performed. RESULTS Between 1991 and 2011, 61 patients, mean age 15.0 ± 9.7 years, underwent OHT for a failing atriopulmonary connection (17 patients, 27.8%) or total cavopulmonary connection (44 patients, 72.2%). Modes of FF included arrhythmia (14.8%), complex obstructions in the Fontan circuit (16.4%), protein-losing enteropathy (PLE) (22.9%), impaired ventricular function (31.1%) or a combination of the above (14.8%). The mean time interval between Fontan completion and OHT was 10.7 ± 6.6 years. Early FF occurred in 18%, requiring OHT 0.8 ± 0.5 years after Fontan. The hospital mortality rate was 18.3%, mainly secondary to infection (36.4%) and graft failure (27.3%). The mean follow-up was 66.8 ± 54.2 months. The overall Kaplan-Meier survival estimate was 81.9 ± 1.8% at 1 year, 73 ± 2.7% at 5 years and 56.8 ± 4.3% at 10 years. The Kaplan-Meier 5-year survival estimate was 82.3 ± 5.9% in late FF and 32.7 ± 15.0% in early FF (P = 0.0007). Late FF with poor ventricular function exhibited a 91.5 ± 5.8% 5-year OHT survival. PLE was cured in 77.7% of hospital survivors, but the 5-year Kaplan-Meier survival estimate in PLE was 46.3 ± 14.4 vs 84.3 ± 5.5% in non-PLE (P = 0.0147). Cox proportional hazards regression identified early FF (P = 0.0005), complex Fontan pathway obstruction (P = 0.0043) and PLE (P = 0.0033) as independent predictors of 5-year mortality. CONCLUSIONS OHT is an excellent surgical option for late FF with impaired ventricular function. Protein-losing enteropathy improves with OHT, but it negatively affects the mid-term OHT outcome, mainly through early infective complications.


OBJECTIVES To report on trends in tuberculosis ascertainment among HIV patients in a rural HIV cohort in Tanzania, and to assess the impact of a bundle of services implemented in December 2012, consisting of three components: (i) integration of HIV and tuberculosis services; (ii) GeneXpert for tuberculosis diagnosis; and (iii) electronic data collection. DESIGN Retrospective cohort study of patients enrolled in the Kilombero Ulanga Antiretroviral Cohort (KIULARCO), Tanzania. METHODS HIV patients without a prior history of tuberculosis enrolled in the KIULARCO cohort between 2005 and 2013 were included. Cox proportional hazard models were used to estimate rates and predictors of tuberculosis ascertainment. RESULTS Of 7114 HIV-positive patients enrolled, 5123 (72%) had no history of tuberculosis. Of these, 66% were female, median age was 38 years, median baseline CD4+ cell count was 243 cells/µl, and 43% had WHO clinical stage 3 or 4. During follow-up, 421 incident tuberculosis cases were notified, with an estimated incidence of 3.6 per 100 person-years (p-y) [95% confidence interval (CI) 3.26-3.97]. The incidence rate varied over time and increased significantly from 2.96 to 43.98 cases per 100 p-y after the introduction of the bundle of services in December 2012. Four independent predictors of tuberculosis ascertainment were identified: poor clinical condition at baseline (hazard ratio (HR) 3.89, 95% CI 2.87-5.28), WHO clinical stage 3 or 4 (HR 2.48, 95% CI 1.88-3.26), being antiretroviral-naïve (HR 2.97, 95% CI 2.25-3.94), and registration in 2013 (HR 6.07, 95% CI 4.39-8.38). CONCLUSION The integration of tuberculosis and HIV services, together with comprehensive electronic data collection and the use of GeneXpert, dramatically increased the ascertainment of tuberculosis in this rural African HIV cohort.


INTRODUCTION Community-acquired pneumonia (CAP) is the most common infectious reason for admission to the Intensive Care Unit (ICU). The GenOSept study was designed to determine genetic influences on sepsis outcome. Phenotypic data were recorded using a robust clinical database, allowing a contemporary analysis of the clinical characteristics, microbiology, outcomes and independent risk factors in patients with severe CAP admitted to ICUs across Europe. METHODS Kaplan-Meier analysis was used to determine mortality rates. A Cox proportional hazards (PH) model was used to identify variables independently associated with 28-day and six-month mortality. RESULTS Data from 1166 patients admitted to 102 centres across 17 countries were extracted. Median age was 64 years; 62% were male. The mortality rate at 28 days was 17%, rising to 27% at six months. Streptococcus pneumoniae was the commonest organism isolated (28% of cases), with no organism identified in 36%. Independent risk factors associated with an increased risk of death at six months included APACHE II score (hazard ratio, HR, 1.03; confidence interval, CI, 1.01-1.05), bilateral pulmonary infiltrates (HR 1.44; CI 1.11-1.87) and ventilator support (HR 3.04; CI 1.64-5.62). Lower haematocrit, pH and urine volume on day one were all associated with a worse outcome. CONCLUSIONS The mortality rate in patients with severe CAP admitted to European ICUs was 27% at six months. Streptococcus pneumoniae was the commonest organism isolated. In many cases the infecting organism was not identified. Ventilator support, the presence of diffuse pulmonary infiltrates, and lower haematocrit, urine volume and pH on admission were independent predictors of a worse outcome.


INTRODUCTION Faecal peritonitis (FP) is a common cause of sepsis and admission to the intensive care unit (ICU). The Genetics of Sepsis and Septic Shock in Europe (GenOSept) project is investigating the influence of genetic variation on the host response and outcomes in a large cohort of patients with sepsis admitted to ICUs across Europe. Here we report an epidemiological survey of the subset of patients with FP. OBJECTIVES To define the clinical characteristics, outcomes and risk factors for mortality in patients with FP admitted to ICUs across Europe. METHODS Data were extracted from electronic case report forms. Phenotypic data were recorded using a detailed, quality-assured clinical database. The primary outcome measure was 6-month mortality, and patients were followed for 6 months. Kaplan-Meier analysis was used to determine mortality rates. Cox proportional hazards regression analysis was employed to identify independent risk factors for mortality. RESULTS Data for 977 FP patients admitted to 102 centres across 16 countries between 29 September 2005 and 5 January 2011 were extracted. The median age was 69.2 years (IQR 58.3-77.1), with a male preponderance (54.3%). The most common causes of FP were perforated diverticular disease (32.1%) and surgical anastomotic breakdown (31.1%). The ICU mortality rate at 28 days was 19.1%, increasing to 31.6% at 6 months. The cause of FP, pre-existing co-morbidities and the time from estimated onset of symptoms to surgery did not affect survival. The strongest independent risk factors associated with an increased rate of death at 6 months were age, higher APACHE II score, acute renal and cardiovascular dysfunction within 1 week of admission to ICU, hypothermia, lower haematocrit and bradycardia on day 1 of ICU stay. CONCLUSIONS In this large cohort of patients admitted to European ICUs with FP, 6-month mortality was 31.6%. The most consistent predictors of mortality across all time points were increased age, development of acute renal dysfunction during the first week of admission, lower haematocrit and hypothermia on day 1 of ICU admission.


INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P <0.001) and renal function (total daily urine output: HR =1.02; 95% CI, 1.01 to 1.03; P <0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and renal component of the SOFA score), GCS component of the SOFA score, total SOFA score and worsening thrombocytopaenia were also independently associated with secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5. 
Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements, were not retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.


BACKGROUND Whether screening for thrombophilia is useful for patients after a first episode of venous thromboembolism (VTE) is a controversial issue. However, the impact of thrombophilia on the risk of recurrence may vary depending on the patient's age at the time of the first VTE. PATIENTS AND METHODS Of 1221 VTE patients (42% males) registered in the MAISTHRO (MAin-ISar-THROmbosis) registry, 261 experienced VTE recurrence during a 5-year follow-up after the discontinuation of anticoagulant therapy. RESULTS Thrombophilia was more common among patients with VTE recurrence than those without (58.6% vs. 50.3%; p = 0.017). Stratifying patients by age at the time of their initial VTE, Cox proportional hazards analyses adjusted for age, sex and the presence or absence of established risk factors revealed that a heterozygous prothrombin (PT) G20210A mutation (hazard ratio (HR) 2.65; 95% confidence interval (CI) 1.71-4.12; p < 0.001), homozygosity/double heterozygosity for the factor V Leiden and/or PT mutation (HR 2.35; 95% CI 1.09-5.07; p = 0.030), and antithrombin deficiency (HR 2.12; 95% CI 1.12-4.10; p = 0.021) predicted recurrent VTE in patients aged 40 years or older, whereas lupus anticoagulants (HR 3.05; 95% CI 1.40-6.66; p = 0.005) increased the risk of recurrence in younger patients. Subgroup analyses revealed an increased risk of recurrence for a heterozygous factor V Leiden mutation only in young females without hormonal treatment, whereas the predictive value of a heterozygous PT mutation was restricted to males over the age of 40 years. CONCLUSIONS Our data do not support preferentially selecting younger patients for thrombophilia testing after a first venous thromboembolic event.
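Where only a hazard ratio and its 95% CI are reported, the corresponding Wald z statistic and p-value can be recovered on the log scale. A sketch of that back-calculation applied to the prothrombin-mutation estimate above (an approximation that inherits the rounding of the published interval):

```python
import math

def wald_from_hr_ci(hr, lo, hi, z95=1.96):
    """Recover the Wald z statistic and two-sided p-value of a hazard ratio from its 95% CI."""
    se = (math.log(hi) - math.log(lo)) / (2 * z95)  # SE of log(HR)
    z = math.log(hr) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# PT G20210A: HR 2.65, 95% CI 1.71-4.12, reported p < 0.001
z, p = wald_from_hr_ci(2.65, 1.71, 4.12)
print(round(z, 2), p < 0.001)  # consistent with the reported significance
```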