170 results for RETROSPECTIVE ANALYSIS
Abstract:
OBJECTIVE: Thrombocytosis is an adverse prognostic factor in many types of cancer. We investigated whether increased pre-treatment platelet counts provide prognostic information specifically in patients with stage III and IV serous ovarian cancer, the most common clinical presentation of ovarian cancer. METHODS: The platelet count at diagnosis of stage III and IV serous ovarian adenocarcinoma was evaluated in 91 patients for whom complete follow-up data on progression and survival were available. Overall survival and progression-free survival of patients with normal platelet counts (150-350×10⁹/L) were compared with those of patients with thrombocytosis (>350×10⁹/L) by χ² and log-rank tests. RESULTS: The median age of the patients was 66 years. Of the 91 patients, 52 (57.1%) had normal platelet counts (median, 273×10⁹/L; range, 153-350) at diagnosis and 39 (42.9%) had thrombocytosis (median, 463×10⁹/L; range, 354-631). In the group with normal platelet counts, 24 of the 52 patients had died, with a median survival of 43 months (range, 3-100). In the group with thrombocytosis, 24 of the 39 patients had died, with a median survival of 23 months (range, 4-79). In the entire group of 91 patients there was a statistically significant difference in overall survival and progression-free survival between the two groups (log-rank test P=0.02 and P=0.007, respectively). CONCLUSION: In this retrospective analysis of stage III and IV ovarian cancer patients, thrombocytosis at the time of diagnosis had prognostic value for overall survival and progression-free survival.
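For readers who want to reproduce this kind of two-group comparison on their own data, a minimal sketch of a Kaplan-Meier and log-rank analysis with Python's lifelines package is shown below; the file name and column names (platelets, os_months, death) are hypothetical and do not come from the study.

```python
# Minimal sketch: log-rank comparison of overall survival between a
# normal-platelet group and a thrombocytosis group (>350 x 10^9/L).
# File and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("ovarian_cohort.csv")          # hypothetical file
thromb = df["platelets"] > 350                  # thrombocytosis flag

result = logrank_test(
    df.loc[~thromb, "os_months"], df.loc[thromb, "os_months"],
    event_observed_A=df.loc[~thromb, "death"],
    event_observed_B=df.loc[thromb, "death"],
)
print(result.p_value)

# Kaplan-Meier median survival per group
km = KaplanMeierFitter()
for label, grp in df.groupby(thromb):
    km.fit(grp["os_months"], grp["death"], label=f"thrombocytosis={label}")
    print(label, km.median_survival_time_)
```

The same pattern applies to progression-free survival by substituting the progression time and event columns.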
Abstract:
PURPOSE: To identify risk factors associated with mortality in patients with severe community-acquired pneumonia (CAP) caused by S. pneumoniae who require intensive care unit (ICU) management, and to assess the prognostic value of these risk factors at the time of admission. METHODS: Retrospective analysis of all consecutive patients with CAP caused by S. pneumoniae who were admitted to the 32-bed medico-surgical ICU of a community and referral university hospital between 2002 and 2011. Univariate and multivariate analyses were performed on variables available at admission. RESULTS: Among the 77 adult patients with severe CAP caused by S. pneumoniae who required ICU management, 12 died (observed mortality rate 15.6 %). Univariate analysis indicated that septic shock and low C-reactive protein (CRP) values at admission were associated with an increased risk of death. In a multivariate model, after adjustment for age and gender, septic shock [odds ratio (OR) 4.96, 95 % confidence interval 1.11-22.25; p = 0.036] and CRP (OR 0.99, 0.98-0.99; p = 0.034) remained significantly associated with death. Finally, we assessed the discriminative ability of CRP to predict mortality by computing its receiver operating characteristic curve. The CRP cut-off value giving the best sensitivity and specificity for predicting hospital mortality was 169.5 mg/L, with an area under the curve of 0.72 (0.55-0.89). CONCLUSIONS: The mortality of patients with S. pneumoniae CAP requiring ICU management was much lower than predicted by severity scores. The presence of septic shock and a CRP value at admission <169.5 mg/L predicted a fatal outcome.
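The ROC-based cut-off described above can be illustrated with a short sketch using scikit-learn and Youden's J statistic; the cohort file and column names are hypothetical, and because low CRP was associated with death in this cohort, the score is negated so that higher values indicate higher predicted risk.

```python
# Minimal sketch: deriving a CRP cut-off for hospital mortality from a
# ROC curve using Youden's J. Data and column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("cap_icu_cohort.csv")          # hypothetical file
score = -df["crp_admission"]                    # invert: low CRP -> high risk
y = df["hospital_death"]                        # 1 = died, 0 = survived

fpr, tpr, thresholds = roc_curve(y, score)
auc = roc_auc_score(y, score)

j = tpr - fpr                                   # Youden's J at each threshold
best = np.argmax(j)
best_crp_cutoff = -thresholds[best]             # undo the sign inversion
print(f"AUC={auc:.2f}, CRP cut-off={best_crp_cutoff:.1f} mg/L")
```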
Abstract:
BACKGROUND/OBJECTIVES: Preoperative nutrition has been shown to reduce morbidity after major gastrointestinal (GI) surgery in selected patients at risk. In a recently performed randomized trial (NCT00512213), however, almost half of the patients did not consume the recommended dose of the nutritional intervention. The present study aimed to identify risk factors for noncompliance. SUBJECTS/METHODS: Demographic (n=5) and nutritional (n=21) parameters for this retrospective analysis were obtained from a prospectively maintained database. The outcome of interest was compliance with the allocated intervention (ingestion of ≥11 of 15 preoperative oral nutritional supplement units). Uni- and multivariate analyses of potential risk factors for noncompliance were performed. RESULTS: The final analysis included 141 patients with complete data sets. Fifty-nine patients (42%) were considered noncompliant. Univariate analysis identified low C-reactive protein levels (P=0.015), decreased recent food intake (P=0.032) and, as a trend, low hemoglobin (P=0.065) and low pre-albumin (P=0.056) levels as risk factors for decreased compliance. However, none of these was retained as an independent risk factor after multivariate analysis. Interestingly, 17 potential explanatory parameters, such as upper GI cancer, weight loss, reduced appetite or co-morbidities, did not show any significant correlation with reduced intake of nutritional supplements. CONCLUSIONS: Reduced compliance with preoperative nutritional interventions remains a major issue because the expected benefit depends on the actual intake. Seemingly obvious reasons could not be retained as valid explanations. Compliance thus seems to be primarily a question of will and information; the importance of nutritional supplementation needs to be emphasized through specific patient education.
Abstract:
BACKGROUND: Normobaric oxygen therapy is frequently applied in neurocritical care; however, whether supplemental FiO2 has beneficial cerebral effects is still controversial. We examined, in patients with severe traumatic brain injury (TBI), the effect of incremental FiO2 on cerebral excitotoxicity, quantified by cerebral microdialysis (CMD) glutamate. METHODS: This was a retrospective analysis of a database of severe TBI patients monitored with CMD and brain tissue oxygen (PbtO2). The relationship of FiO2, categorized into four separate ranges (<40, 41-60, 61-80, and >80 %), with CMD glutamate was examined using ANOVA with Tukey's post hoc test. RESULTS: A total of 1,130 CMD samples from 36 patients, monitored for a median of 4 days, were examined. After adjusting for brain (PbtO2, intracranial pressure, cerebral perfusion pressure, lactate/pyruvate ratio, Marshall CT score) and systemic (PaCO2, PaO2, hemoglobin, APACHE score) covariates, high FiO2 was associated with a progressive increase in CMD glutamate [8.8 (95 % confidence interval 7.4-10.2) µmol/L at FiO2 < 40 % vs. 12.8 (10.9-14.7) µmol/L at 41-60 % FiO2, 19.3 (15.6-23) µmol/L at 61-80 % FiO2, and 22.6 (16.7-28.5) µmol/L at FiO2 > 80 %; multivariate-adjusted p < 0.05]. The FiO2 threshold for an increase in CMD glutamate was lower for samples with normal PbtO2 than for those with low PbtO2 (<20 mmHg) (FiO2 > 40 % vs. FiO2 > 60 %). Hyperoxia (PaO2 > 150 mmHg) was also associated with increased CMD glutamate (adjusted p < 0.001). CONCLUSIONS: Incremental normobaric FiO2 levels were associated with increased cerebral excitotoxicity in patients with severe TBI, independently of PbtO2 and other important cerebral and systemic determinants. These data suggest that supra-normal oxygen levels may aggravate secondary brain damage after severe TBI.
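A minimal sketch of the named analysis step (one-way ANOVA with Tukey's post hoc test across FiO2 categories) is given below; it omits the covariate adjustment reported in the study, and the data frame and column names are hypothetical.

```python
# Minimal sketch: one-way ANOVA of CMD glutamate across FiO2 categories
# plus Tukey's HSD post hoc comparisons. Data and column names are
# hypothetical; the study's covariate adjustment is not reproduced.
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("cmd_samples.csv")             # hypothetical file
bins = [0, 40, 60, 80, 100]
labels = ["<=40", "41-60", "61-80", ">80"]
df["fio2_cat"] = pd.cut(df["fio2"], bins=bins, labels=labels)
df = df.dropna(subset=["fio2_cat", "glutamate"])

groups = [g["glutamate"].values for _, g in df.groupby("fio2_cat", observed=True)]
print(f_oneway(*groups))                        # overall F-test

tukey = pairwise_tukeyhsd(df["glutamate"], df["fio2_cat"].astype(str))
print(tukey.summary())                          # pairwise comparisons
```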
Abstract:
BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality-control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a university hospital. Inclusion criteria: receipt of intravenous micronutrient supplements and/or TE sampling during the ICU stay. TE samplings were triggered by a computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients out of 2406 consecutive admissions met the inclusion criteria, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS II scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: Monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This approach led to a cost reduction compared with an automated sampling prescription.
Abstract:
BACKGROUND: The aims of the study were to evaluate the prevalence of acute coronary syndrome (ACS) among patients presenting with atypical chest pain who are evaluated for acute aortic syndrome (AAS) or pulmonary embolism (PE) with computed tomographic angiography (CTA), and to discuss the rationale for the use of a triple rule-out (TRO) protocol for triaging these patients. METHODS: This study is a retrospective analysis of patients presenting with atypical chest pain and evaluated with thoracic CTA for suspicion of AAS/PE. Two physicians reviewed patient files for demographic characteristics, initial CT findings, and final clinical diagnosis. Patients were classified according to CTA findings into AAS, PE and other diagnoses, and according to final clinical diagnosis into AAS, PE, ACS and other diagnoses. RESULTS: Four hundred and sixty-seven patients were evaluated: 396 (84.8%) for clinical suspicion of PE and 71 (15.2%) for suspicion of AAS. The prevalence of ACS and AAS was low among the PE patients: 5.5% and 0.5%, respectively (P = 0.0001), while the prevalence of ACS and PE was 18.3% and 5.6% among AAS patients (P = 0.14 and P = 0.34, respectively). CONCLUSION: The prevalence of ACS and AAS among patients clinically suspected of having PE is limited, while the prevalence of ACS and PE among patients clinically suspected of having AAS is substantial. Accordingly, patients suspected of PE could be evaluated with a dedicated PE CTA protocol, while those suspected of AAS should still be triaged using the TRO protocol.
Abstract:
BACKGROUND: Due to the underlying diseases and the need for immunosuppression, patients after lung transplantation are particularly at risk for gastrointestinal (GI) complications that may negatively influence long-term outcome. The present study assessed the incidence and impact of GI complications after lung transplantation and aimed to identify risk factors. METHODS: A retrospective analysis was performed of all 227 consecutive single- and double-lung transplantations performed at the University Hospitals of Lausanne and Geneva between January 1993 and December 2010. Logistic regression was used to test the effect of potentially influential variables on the binary outcomes of overall, severe, and surgery-requiring complications, followed by a multiple logistic regression model. RESULTS: The final analysis included 205 patients; 22 patients were excluded due to re-transplantation, multiorgan transplantation, or incomplete datasets. GI complications were observed in 127 patients (62 %). Gastro-esophageal reflux disease was the most commonly observed complication (22.9 %), followed by inflammatory or infectious colitis (20.5 %) and gastroparesis (10.7 %). Major GI complications (Dindo/Clavien III-V) were observed in 83 (40.5 %) patients and were fatal in 4 patients (2.0 %). Multivariate analysis identified double-lung transplantation (p = 0.012) and the early (1993-1998) transplantation period (p = 0.008) as independent risk factors for developing major GI complications. Forty-three (21 %) patients required surgery, such as colectomy, cholecystectomy, and fundoplication in 6.8, 6.3, and 3.9 % of the patients, respectively. Multivariate analysis identified a Charlson comorbidity index of ≥3 as an independent risk factor for developing GI complications requiring surgery (p = 0.015). CONCLUSION: GI complications after lung transplantation are common. Outcome was rather encouraging in the setting of our transplant center.
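As an illustration of the modelling approach named above (logistic regression on a binary complication outcome, followed by a multiple model), a minimal sketch with statsmodels is shown; the predictors and column names are hypothetical stand-ins, not the study's actual variable list.

```python
# Minimal sketch: multiple logistic regression for a binary outcome
# (major GI complication yes/no). Variable and file names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lung_tx_cohort.csv")          # hypothetical file

model = smf.logit(
    "major_gi_complication ~ double_lung + early_period + charlson_ge3 + age",
    data=df,
).fit()

print(model.summary())
# Odds ratios with 95% confidence intervals
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```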
Abstract:
BACKGROUND: Several guidelines recommend computed tomography scans for populations at high risk for lung cancer. The number of individuals evaluated for peripheral pulmonary lesions (PPL) will probably increase, and with it the number of non-surgical biopsies. Combining a guidance method with a target confirmation technique has been shown to achieve the highest diagnostic yield, but the utility of bronchoscopy with radial probe endobronchial ultrasound under fluoroscopic guidance, without a guide sheath, has not been reported. METHODS: We conducted a retrospective analysis of bronchoscopy procedures with radial probe endobronchial ultrasound under fluoroscopic guidance for the investigation of PPL, performed by experienced bronchoscopists with no specific previous training in this particular technique. Operator learning curves and radiological predictors were assessed for all consecutive patients examined during the first year of application of the technique. RESULTS: Fifty-one PPL were investigated. The diagnostic yield (DY) and visualization yield were 72.5% and 82.3%, respectively. The diagnostic yield was 64.0% for PPL ≤20 mm and 80.8% for PPL >20 mm. No false-positive results were recorded. The learning curve across all diagnostic tools showed a DY of 72.7% for the first sub-group of patients, 81.8% for the second, 72.7% for the third, and 81.8% for the last. CONCLUSION: Bronchoscopy with radial probe endobronchial ultrasound under fluoroscopic guidance is safe and simple to perform, even without specific prior training, and the diagnostic yield is high for PPL both >20 mm and ≤20 mm. Based on these findings, this method could be introduced as a first-line procedure for the investigation of PPL, particularly in centers with limited resources.
Abstract:
Extensive defects of the pelvis and genitoperineal region are a reconstructive challenge. We discuss a consecutive series of 25 reconstructions with the pedicled anterolateral thigh (ALT) flap including a muscle portion of the vastus lateralis (VL) in 23 patients from October 1999 to September 2012. Only surface defects larger than 100 cm² and reconstructions by composite ALT + VL flaps were included in this retrospective analysis. Of the 23 patients, 19 underwent oncologic resection, whereas 4 presented with Fournier gangrene. Three patients did not reach 6 months of follow-up and were excluded from further data analysis. Among the remaining 20 patients (22 reconstructions), the average follow-up period was 14 months (range, 10-18 months). The patients' average age was 60 years. The average size of the defect was 182 cm². Postoperative complications included 1 (4.5%) flap necrosis out of 22 raised flaps, 1 partial flap necrosis after venous congestion, and 2 cases in which a complementary reconstructive procedure was performed due to a remaining defect or partial flap failure. In 6 cases (27%), peripheral wound dehiscence was treated by debridement followed by split-thickness skin grafting or local advancement flaps. Defect size was significantly related to postoperative complications and increased hospital stay, especially in patients who had undergone preoperative radiotherapy. At the end of the follow-up period, long-term, satisfactory coverage was obtained in all patients without functional deficits. This consecutive series of composite ALT + VL flaps shows that, for extensive defects, the flap provides excellent and adjustable muscle bulk, is reliable with minimal donor-site morbidity, and can even be designed as a sensate flap.
Abstract:
OBJECTIVE: Inflammation-related epilepsy is increasingly recognized; however, studies on status epilepticus (SE) are very infrequent. We therefore aimed to determine the frequency of inflammatory etiologies in adult SE, and to assess related demographic features and outcomes. METHODS: This was a retrospective analysis of a prospective registry of adult patients with SE treated in our center from January 2008 to June 2014, excluding postanoxic causes. We classified SE episodes into 3 etiologic categories: infectious, autoimmune, and noninflammatory. Demographic and clinical variables were analyzed regarding their relationship to etiology and functional outcome. RESULTS: Among the 570 consecutive SE episodes, 33 (6%) were inflammatory (2.5% autoimmune; 3.3% infectious), without any change in frequency over the study period. Inflammatory SE episodes involved younger patients (mean age 53 vs 61 years, p = 0.015) and were more often refractory to initial antiepileptic treatment (58% vs 38%, odds ratio = 2.19, 95% confidence interval = 1.07-4.47, p = 0.041), despite similar clinical outcome. Subgroup analysis showed that, compared with infectious SE episodes, autoimmune SE involved younger adults (mean age 44 vs 60 years, p = 0.017) and was associated with lower morbidity (return to baseline condition in 71% vs 32%, odds ratio = 5.41, 95% confidence interval = 1.19-24.52, p = 0.043), without any difference in mortality. CONCLUSIONS: Despite increasing awareness, inflammatory SE etiologies were relatively rare; their occurrence in younger individuals and their higher refractoriness to treatment did not have any effect on outcome. Autoimmune SE episodes also occurred in younger patients, but tended to have better outcomes in survivors than infectious SE episodes.
Abstract:
INTRODUCTION: Hyperglycemia is a metabolic alteration associated with complications in major burn patients. The study aimed at evaluating the safety of general ICU glucose control protocols applied to major burn patients receiving prolonged ICU treatment. METHODS: A 15-year retrospective analysis of consecutive adult burn patients admitted to a single specialized centre. Exclusion criteria: death or length of stay <10 days, age <16 years. Variables: demographic variables, burned surface area (TBSA), severity scores, infections, ICU stay, outcome. Metabolic variables: total energy, carbohydrate and insulin delivery per 24 h, arterial blood glucose and CRP values. Four periods were analyzed: period 1, before the protocol; period 2, tight doctor-driven control; period 3, tight nurse-driven control; period 4, moderate nurse-driven control. RESULTS: A total of 229 patients, aged 45±20 years (mean±SD) and burned over 32±20% TBSA, were analyzed. SAPS II was 35±13. TBSA, Ryan and ABSI scores remained stable over time, while inhalation injury increased. A total of 28,690 blood glucose samples were analyzed: the median value remained unchanged, with a narrower distribution over time. After protocol initiation, the proportion of normoglycemic values increased from 34.7% to 65.9%, with a reduction of hypoglycemic events (no extreme hypoglycemia in period 4). Severe hyperglycemia persisted throughout, although it decreased in period 4 (to 9.25%). Energy and glucose deliveries decreased in periods 3 and 4 (p<0.0001). Infectious complications increased during the last 2 periods (p=0.01). CONCLUSION: A standardized ICU glucose control protocol improved glycemic control in adult burn patients, reducing glucose variability. Moderate glycemic control in burns was safe, specifically with respect to hypoglycemia, reducing the incidence of hypoglycemic events compared with the pre-protocol period. Hyperglycemia persisted, but at a lower level.
Abstract:
Background: Enteropathy-associated T-cell lymphoma (EATL) is a rare subtype of peripheral T-cell lymphoma characterized by a primarily intestinal localization and a frequent association with celiac disease. The prognosis is considered poor with conventional chemotherapy, and limited data are available on the efficacy of autologous stem cell transplantation (ASCT) in this lymphoma subtype. The primary objective was to study the outcome of ASCT as a consolidation or salvage strategy for EATL; the primary endpoints were overall survival (OS) and progression-free survival (PFS). Eligible patients were >18 years of age, had received ASCT between 2000 and 2010 for EATL confirmed by review of written histopathology reports, and had sufficient information on disease history and follow-up available. The search strategy used the EBMT database to identify patients potentially fulfilling the eligibility criteria. An additional questionnaire was sent to individual transplant centres to confirm the histological diagnosis (histopathology report or pathology review) and to obtain updated follow-up data. Patient and transplant characteristics were compared between groups using the χ² test or Fisher's exact test for categorical variables and the t-test or Mann-Whitney U-test for continuous variables. OS and PFS were estimated using the Kaplan-Meier product-limit estimate and compared by the log-rank test. Estimates for non-relapse mortality (NRM) and relapse or progression were calculated using cumulative incidence rates to accommodate competing risks and compared by Gray's test. Results: Altogether 138 patients were identified. Updated follow-up data were received for 74 patients (54 %) and histology reports for 54 patients (39 %). In ten patients the diagnosis of EATL could not be adequately verified; thus the final analysis included 44 patients. There were 24 males and 20 females with a median age of 56 (35-72) years at the time of transplant. Twenty-five patients (57 %) had a history of celiac disease. Disease stage was I in nine patients (21 %), II in 14 patients (33 %) and IV in 19 patients (45 %). Twenty-four patients (55 %) were in first complete remission (CR) or partial remission (PR) at the time of transplant. BEAM was used as the high-dose regimen in 36 patients (82 %), and all patients received peripheral blood grafts. The median follow-up for survivors was 46 (2-108) months from ASCT. Three patients died early from transplant-related causes, translating into a 2-year non-relapse mortality of 7 %. The relapse incidence at 4 years after ASCT was 39 %, with no events occurring beyond 2.5 years after ASCT. PFS and OS were 54 % and 59 % at four years, respectively. There was a trend toward better OS in patients transplanted in first CR or PR compared with more advanced disease status (70 % vs. 43 %, p=0.053). Of note, patients with a history of celiac disease had superior PFS (70 % vs. 35 %, p=0.02) and OS (70 % vs. 45 %, p=0.052), whilst age, gender, disease stage, B-symptoms at diagnosis and high-dose regimen were not associated with OS or PFS. Conclusions: This study shows, for the first time in a larger patient sample, that ASCT is feasible in selected patients with EATL and can yield durable disease control in a significant proportion of patients. Patients transplanted in first CR or PR appear to do better than those transplanted later. ASCT should be considered in EATL patients responding to initial therapy.
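A minimal sketch of the survival methodology named above (Kaplan-Meier/log-rank for OS and PFS, and cumulative incidence of relapse with non-relapse mortality as a competing risk) is given below using lifelines; the file and column names are hypothetical, and Gray's test itself is not included because it is typically run in R (e.g., cmprsk) rather than in lifelines.

```python
# Minimal sketch: PFS comparison by log-rank test and cumulative incidence
# of relapse with NRM as a competing risk (Aalen-Johansen estimator).
# File and column names are hypothetical.
import pandas as pd
from lifelines import AalenJohansenFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("eatl_asct.csv")               # hypothetical file
celiac = df["celiac_history"] == 1

# PFS: celiac-history vs no-celiac-history groups
print(logrank_test(
    df.loc[celiac, "pfs_months"], df.loc[~celiac, "pfs_months"],
    event_observed_A=df.loc[celiac, "pfs_event"],
    event_observed_B=df.loc[~celiac, "pfs_event"],
).p_value)

# Cumulative incidence of relapse; event codes: 0=censored, 1=relapse, 2=NRM
ajf = AalenJohansenFitter()
ajf.fit(df["months_from_asct"], df["event_code"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```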
Abstract:
BACKGROUND: Chronic lateral ankle instability accounts for 20% of ankle injuries. This study evaluates the functional outcome of the modified Broström-Gould technique using suture anchors, assessed with 4 different clinical scores. METHODS: A consecutive series of 41 patients was included, with a minimum follow-up of one year. Function was assessed using 4 clinical scores: the AOFAS hindfoot score, the FAAM, the CAIT and the CAIS. RESULTS: Of the 41 patients, 27 were very satisfied, 11 satisfied and 3 not satisfied. Ankle mobility returned to normal in 93% of patients. At follow-up the AOFAS score was 89/100 (37-100), the FAAM 85% (35-100%), the CAIT 20/30 (5-30), and the CAIS 74% (27-100%). CONCLUSION: The outcome of the modified Broström-Gould procedure is good, with a high satisfaction rate and good recovery of ankle mobility. The disparity among the scores points to the need for a standardized evaluation system.
Abstract:
Background: Following the discovery that mutant KRAS is associated with resistance to anti-epidermal growth factor receptor (EGFR) antibodies, the tumours of patients with metastatic colorectal cancer are now profiled for seven KRAS mutations before they receive cetuximab or panitumumab. However, most patients with KRAS wild-type tumours still do not respond. We studied the effect of other downstream mutations on the efficacy of cetuximab in, to our knowledge, the largest cohort to date of patients with chemotherapy-refractory metastatic colorectal cancer treated with cetuximab plus chemotherapy in the pre-KRAS selection era. Methods: 1022 tumour DNA samples (73 from fresh-frozen and 949 from formalin-fixed, paraffin-embedded tissue) from patients treated with cetuximab between 2001 and 2008 were gathered from 11 centres in seven European countries. 773 primary tumour samples had sufficient-quality DNA and were included in mutation frequency analyses; mass spectrometry genotyping of tumour samples for KRAS, BRAF, NRAS, and PIK3CA was done centrally. We analysed objective response, progression-free survival (PFS), and overall survival in molecularly defined subgroups of the 649 chemotherapy-refractory patients treated with cetuximab plus chemotherapy. Findings: 40.0% (299/747) of the tumours harboured a KRAS mutation, 14.5% (108/743) harboured a PIK3CA mutation (of which 68.5% [74/108] were located in exon 9 and 20.4% [22/108] in exon 20), 4.7% (36/761) harboured a BRAF mutation, and 2.6% (17/644) harboured an NRAS mutation. KRAS mutants did not derive benefit compared with wild types, with a response rate of 6.7% (17/253) versus 35.8% (126/352; odds ratio [OR] 0.13, 95% CI 0.07-0.22; p<0.0001), a median PFS of 12 weeks versus 24 weeks (hazard ratio [HR] 1.98, 1.66-2.36; p<0.0001), and a median overall survival of 32 weeks versus 50 weeks (1.75, 1.47-2.09; p<0.0001). In KRAS wild types, carriers of BRAF and NRAS mutations had a significantly lower response rate than did BRAF and NRAS wild types, with a response rate of 8.3% (2/24) in carriers of BRAF mutations versus 38.0% in BRAF wild types (124/326; OR 0.15, 95% CI 0.02-0.51; p=0.0012), and 7.7% (1/13) in carriers of NRAS mutations versus 38.1% in NRAS wild types (110/289; OR 0.14, 0.007-0.70; p=0.013). PIK3CA exon 9 mutations had no effect, whereas exon 20 mutations were associated with a worse outcome compared with wild types, with a response rate of 0.0% (0/9) versus 36.8% (121/329; OR 0.00, 0.00-0.89; p=0.029), a median PFS of 11.5 weeks versus 24 weeks (HR 2.52, 1.33-4.78; p=0.013), and a median overall survival of 34 weeks versus 51 weeks (3.29, 1.60-6.74; p=0.0057). Multivariate analysis and conditional inference trees confirmed that, if KRAS is not mutated, assessing BRAF, NRAS, and PIK3CA exon 20 mutations (in that order) gives additional information about outcome. Objective response rates in our series were 24.4% in the unselected population, 36.3% in the KRAS wild-type selected population, and 41.2% in the KRAS, BRAF, NRAS, and PIK3CA exon 20 wild-type population. Interpretation: While confirming the negative effect of KRAS mutations on outcome after cetuximab, we show that BRAF, NRAS, and PIK3CA exon 20 mutations are significantly associated with a low response rate. Objective response rates could be improved by additional genotyping of BRAF, NRAS, and PIK3CA exon 20 mutations in a KRAS wild-type population.
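As a worked check of one effect size reported above, the odds ratio for objective response in KRAS-mutant versus KRAS wild-type tumours can be recomputed from the counts given in the abstract (17/253 vs 126/352); the short sketch below uses SciPy's Fisher exact test as an illustrative method, and the paper's own confidence-interval calculation is not reproduced.

```python
# Worked check: sample odds ratio for response, KRAS mutant vs wild type,
# from the counts reported in the abstract.
from scipy.stats import fisher_exact

table = [[17, 253 - 17],        # KRAS mutant: responders, non-responders
         [126, 352 - 126]]      # KRAS wild type: responders, non-responders

odds_ratio, p_value = fisher_exact(table)
print(round(odds_ratio, 2), p_value)   # ~0.13, consistent with the reported OR
```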
Abstract:
De Gottardi A, Hilleret M-N, Gelez P, La Mura V, Guillaud O, Majno P, Hadengue A, Morel P, Zarski J-P, Fontana M, Moradpour D, Mentha G, Boillot O, Leroy V, Giostra E, Dumortier J. Injection drug use before and after liver transplantation: a retrospective multicenter analysis on incidence and outcome. Clin Transplant 2009. DOI: 10.1111/j.1399-0012.2009.01121.x. Background and aims: Injecting drug use (IDU) before and after liver transplantation (LT) is poorly described. The aim of this study was to quantify relapse and survival in this population and to describe the causes of mortality after LT. Methods: Past injection drug users were identified from the LT listing protocols of four centers in Switzerland and France. Data on survival and relapse were collected and used for uni- and multivariate analyses. Results: Between 1988 and 2006, we identified 59 patients with a past history of IDU. The mean age at transplantation was 42.4 yr and the majority of patients were men (84.7%). The indication for LT was viral cirrhosis in the vast majority of cases (91.5%), while alcoholic cirrhosis accounted for 5.1%. Substitution therapy was received by 16.9% of patients before LT, and 6.8% continued it after LT. Two patients (3.4%) relapsed into IDU after LT and died at 18 and 41 months. The mean follow-up was 51 months. Overall survival was 84%, 66%, and 61% at 1, 5, and 10 yr after transplantation, respectively. Conclusions: Documented IDU was rare in liver transplant recipients. Past IDU was not associated with poorer survival after LT, and relapse into IDU after LT occurred in 3.4% of patients.