75 results for Transplant Candidates
Abstract:
BACKGROUND After heart transplantation (HTx), the interindividual pharmacokinetic variability of immunosuppressive drugs represents a major therapeutic challenge due to the narrow therapeutic window between over-immunosuppression causing toxicity and under-immunosuppression leading to graft rejection. Although genetic polymorphisms have been shown to influence the pharmacokinetics of immunosuppressants, data in the context of HTx are scarce. We therefore assessed the role of genetic variation in CYP3A4, CYP3A5, POR, NR1I2, and ABCB1, genes acting jointly in immunosuppressive drug pathways, on tacrolimus (TAC) and ciclosporin (CSA) dose requirements in HTx recipients. METHODS Associations between 7 functional genetic variants and blood dose-adjusted trough (C0) concentrations of TAC and CSA at 1, 3, 6, and 12 months after HTx were evaluated in cohorts of 52 and 45 patients, respectively. RESULTS Compared with CYP3A5 nonexpressors (*3/*3 genotype), CYP3A5 expressors (*1/*3 or *1/*1 genotype) required approximately 2.2- to 2.6-fold higher daily TAC doses to reach the targeted C0 concentration at all studied time points (P ≤ 0.003). Additionally, carriers of the POR*28 variant showed higher dose-adjusted TAC-C0 concentrations at all time points, with significant differences at 3 (P = 0.025) and 6 months (P = 0.047) after HTx. No significant associations were observed between the genetic variants and the CSA dose requirement. CONCLUSIONS The CYP3A5*3 variant has a major influence on the required TAC dose in HTx recipients, whereas POR*28 may additionally contribute to the observed variability. These results support the importance of genetic markers in TAC dose optimization after HTx.
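The dose-adjustment arithmetic behind this abstract can be made concrete with a minimal sketch. This is purely illustrative: the function names and all numbers below are assumptions for the example, not values from the study; real TAC dosing additionally depends on nonlinear pharmacokinetics and clinical judgment.

```python
def dose_adjusted_c0(trough_ng_ml: float, daily_dose_mg: float) -> float:
    """Dose-adjusted trough concentration, (ng/mL)/mg.

    Lower values mean the patient clears the drug faster, so a higher
    dose is needed to hit the same target trough.
    """
    return trough_ng_ml / daily_dose_mg


def required_daily_dose(target_c0_ng_ml: float, dose_adjusted: float) -> float:
    """Daily dose (mg) expected to reach the target trough, assuming
    dose-proportional (linear) pharmacokinetics -- a simplification."""
    return target_c0_ng_ml / dose_adjusted


# Hypothetical numbers: a CYP3A5 expressor metabolizes TAC faster, so the
# dose-adjusted C0 is lower and the required daily dose proportionally higher.
nonexpressor = dose_adjusted_c0(trough_ng_ml=10.0, daily_dose_mg=4.0)   # 2.5
expressor = dose_adjusted_c0(trough_ng_ml=10.0, daily_dose_mg=10.0)     # 1.0

fold = required_daily_dose(10.0, expressor) / required_daily_dose(10.0, nonexpressor)
print(fold)  # 2.5-fold higher dose for the expressor in this made-up example
```

The 2.2- to 2.6-fold figure reported in the abstract is exactly this kind of ratio, estimated across the two genotype groups at each time point.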
Abstract:
Abstract Conclusions: Specific requests for cochlear implantation by persons with psychogenic hearing loss are a relatively new phenomenon. A number of features seem to be over-represented in this group of patients. The existence of these requests stresses the importance of auditory brainstem response (ABR) measurements before cochlear implantation. Objective: To describe the phenomenon of patients with psychogenic hearing loss specifically requesting cochlear implantation, and to gain first insights into the characteristics of this group. Methods: Analysis of all cases seen between 2004 and 2013 at the University Hospital of Bern, Switzerland. Results: Four cochlear implant candidates with psychogenic hearing loss were identified. All were female, aged 23-51 years. Hearing thresholds ranged from 86 dB to 112 dB HL (pure-tone average, 500-4000 Hz). ABRs and otoacoustic emissions (OAEs) showed bilaterally normal hearing in two subjects, and hearing thresholds between 30 and 50 dB in the other two. Three subjects suffered from depression and one from a pathologic fear of cancer. Three had a history of five or more previous surgeries. Three were smokers, and three reported other close family members with hearing loss. All four were hearing aid users at the time of presentation.
Abstract:
This study analysed the outcome of 563 children with aplastic anaemia (AA) aged 0-12 years reported to the Severe Aplastic Anaemia Working Party database of the European Society for Blood and Marrow Transplantation, according to treatment received. Overall survival (OS) after upfront human leucocyte antigen-matched family donor (MFD) haematopoietic stem cell transplantation (HSCT) or immunosuppressive treatment (IST) was 91% vs. 87% (P = 0.18). Event-free survival (EFS) after upfront MFD HSCT or IST was 87% vs. 33% (P = 0.001). Ninety-one of 167 patients (55%) failed front-line IST and underwent rescue HSCT. The OS of this rescue group was 83%, compared with 91% for upfront MFD HSCT patients and 97% for those who did not fail up-front IST (P = 0.017). Rejection was 2% for both MFD HSCT and HSCT after IST failure (P = 0.73). Acute graft-versus-host disease (GVHD) grade II-IV was 8% with MFD grafts vs. 25% for HSCT after IST failure (P < 0.0001). Chronic GVHD was 6% in MFD HSCT vs. 20% in HSCT after IST failure (P < 0.0001). MFD HSCT is an excellent therapy for children with AA. IST has a high failure rate but remains a reasonable first-line choice when MFD HSCT is not available, because its high OS preserves access to HSCT, which is a very good rescue option.
Abstract:
Prospective cohort studies significantly contribute to answering specific research questions in a defined population. Since 2008, the Swiss Transplant Cohort Study (STCS) has systematically enrolled >95% of all transplant recipients in Switzerland, collecting predefined data at determined time points. Designed as an open cohort, the STCS has included >3900 patients to date, with a median follow-up of 2.96 years (IQR 1.44-4.73). This review highlights some relevant findings in the field of transplant-associated infections gained by the STCS so far. Three key general aspects have crystallized: (i) well-run cohort studies are a powerful tool for conducting genetic studies, which depend crucially on a meticulously described phenotype; (ii) long-term real-life observations add a distinct layer of information that cannot be obtained in randomized studies; (iii) the systematic collection of data, close interdisciplinary collaboration, and continuous analysis of key outcome data such as infectious disease endpoints can improve patient care.
Abstract:
Building on theories of impression formation based on faces, this research investigates the impact of job candidates’ facial age appearance on hiring as well as the underlying mechanism. In an experiment, participants decided whether to hire a fictitious candidate aged 50 years, 30 years or without age information. The candidate’s age was signaled either via chronological information (varied by date of birth) or via facial age appearance (varied by a photograph on the résumé). Findings showed that candidates with older-appearing faces—but not chronologically older candidates—triggered impressions of low health and fitness, compared to younger-appearing candidates. These impressions reduced perceptions of person-job fit, which lowered hiring probabilities for older-appearing candidates. These findings provide the first evidence that trait impressions from faces are a determinant of age discrimination in personnel selection. They call for an extension of current models of age discrimination by integrating the effects of face-based trait impressions, particularly with respect to health and fitness.
Abstract:
BACKGROUND Kidney recipients maintaining prolonged allograft survival in the absence of immunosuppressive drugs, and without evidence of rejection, are thought to be exceptional. The ERA-EDTA-DESCARTES working group, together with Nantes University, launched a Europe-wide survey to identify new patients, describe them, and estimate their frequency for the first time. METHODS Seventeen coordinators distributed a questionnaire to 256 transplant centres in 28 countries in order to report as many 'operationally tolerant' patients (TOL; defined as having a serum creatinine <1.7 mg/dL and proteinuria <1 g/day or g/g creatinine despite at least 1 year without any immunosuppressive drug) and 'almost tolerant' patients (minimally immunosuppressed (MIS) patients receiving low-dose steroids) as possible. We recorded their number and the total number of kidney transplants performed at each centre to calculate their frequency. RESULTS One hundred and forty-seven questionnaires were returned, and we identified 66 TOL (61 with complete data) and 34 MIS patients. Of the 61 TOL patients, 26 were previously described by the Nantes group and 35 new patients are presented here. Most of them were noncompliant patients. At data collection, 31/35 patients were alive and 22/31 were still TOL. Of the remaining 9/31, 2 were restarted on immunosuppressive drugs and 7 had rising creatinine, of whom 3 resumed dialysis. Considering all patients, 10-year death-censored graft survival after immunosuppression weaning reached 85% in TOL patients and 100% in MIS patients. With 218 913 kidney recipients surveyed, the cumulative incidences of operational tolerance and almost tolerance were estimated at 3 and 1.5 per 10 000 kidney recipients, respectively. CONCLUSIONS In kidney transplantation, operational tolerance and almost tolerance are infrequent findings associated with excellent long-term death-censored graft survival.
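The frequency estimate at the end of this abstract is simple arithmetic: identified cases divided by the surveyed population, scaled to 10 000 recipients. A short sketch using the counts reported above (the function name is an assumption for illustration):

```python
def per_10000(cases: int, population: int) -> float:
    """Cumulative incidence expressed per 10 000 recipients."""
    return cases / population * 10_000


surveyed = 218_913                # kidney recipients covered by the survey
tol = per_10000(66, surveyed)     # 'operationally tolerant' patients
mis = per_10000(34, surveyed)     # minimally immunosuppressed ('almost tolerant')

print(round(tol, 1), round(mis, 1))  # 3.0 1.6 -- reported as "3 and 1.5 per 10 000"
```

The small discrepancy for the MIS group (1.55 lands between the abstract's "1.5" and conventional one-decimal rounding) is just a rounding choice, not a different calculation.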
Abstract:
BACKGROUND Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. METHODS A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence of and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was analyzed. RESULTS Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) than in patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was longer in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) than with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CONCLUSIONS CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared with patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and of appropriate therapeutic strategies for preventing asymptomatic CMV infection.
Abstract:
In a randomized, open-label trial, everolimus was compared to cyclosporine in 115 de novo heart transplant recipients. Patients were assigned within 5 days posttransplant to low-exposure everolimus (3–6 ng/mL) with reduced-exposure cyclosporine (n = 56), or standard-exposure cyclosporine (n = 59), with both mycophenolate mofetil and corticosteroids. In the everolimus group, cyclosporine was withdrawn after 7–11 weeks and everolimus exposure increased (6–10 ng/mL). The primary efficacy end point, measured GFR at 12 months posttransplant, was significantly higher with everolimus versus cyclosporine (mean ± SD: 79.8 ± 17.7 mL/min/1.73 m2 vs. 61.5 ± 19.6 mL/min/1.73 m2; p < 0.001). Coronary intravascular ultrasound showed that the mean increase in maximal intimal thickness was smaller (0.03 mm [95% CI 0.01, 0.05 mm] vs. 0.08 mm [95% CI 0.05, 0.12 mm], p = 0.03), and the incidence of cardiac allograft vasculopathy (CAV) was lower (50.0% vs. 64.6%, p = 0.003), with everolimus versus cyclosporine at month 12. Biopsy-proven acute rejection after weeks 7–11 was more frequent with everolimus (p = 0.03). Left ventricular function was not inferior with everolimus versus cyclosporine. Cytomegalovirus infection was less common with everolimus (5.4% vs. 30.5%, p < 0.001); the incidence of bacterial infection was similar. In conclusion, everolimus-based immunosuppression with early elimination of cyclosporine markedly improved renal function after heart transplantation. Since postoperative safety was not jeopardized and development of CAV was attenuated, this strategy may benefit long-term outcome.
Abstract:
AIM Predictors of renal recovery following conversion from calcineurin inhibitor- to proliferation signal inhibitor-based therapy are lacking. We hypothesized that plasma NGAL (P-NGAL) could predict improvement in glomerular filtration rate (GFR) after conversion to everolimus. PATIENTS & METHODS P-NGAL was measured in 88 cardiac transplantation patients (median 5 years post-transplant) with renal dysfunction who were randomized to continuation of conventional calcineurin inhibitor-based immunosuppression or switching to an everolimus-based regimen. RESULTS P-NGAL correlated with measured GFR (mGFR) at baseline (R2 = 0.21; p < 0.001). Randomization to everolimus improved mGFR after 1 year (median [25th-75th percentiles]: ΔmGFR 5.5 [-0.5 to 11.5] vs. -1 [-7 to 4] ml/min/1.73 m2; p = 0.006). Baseline P-NGAL predicted mGFR after 1 year (R2 = 0.18; p < 0.001), but this association disappeared after controlling for baseline mGFR. CONCLUSION P-NGAL and GFR correlate with renal dysfunction in long-term heart transplantation recipients. P-NGAL did not predict improvement of renal function after conversion to everolimus-based immunosuppression.
Abstract:
BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®) is often used to discriminate whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test (a GEP score from 0 to 40) was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may be used to independently predict future clinical events; however, further studies were recommended. Here we performed an analysis of an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability in predicting future events was evaluated by the area under the receiver operating characteristic curve (AUC ROC). The negative predictive value (NPV) and positive predictive value (PPV), including 95% confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17%. Events occurred at a median of 391 (interquartile range 376) days after the final GEP test. The AUC ROC of GEP score variability for predicting a composite event was 0.72 (95% CI 0.6-0.8). The NPV for a GEP score variability of 0.6 was 97% (95% CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4% (95% CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
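The variability metric defined in the methods is just the sample standard deviation of a patient's four GEP scores. A minimal sketch (the two thresholds come from the abstract; the score series themselves are invented for illustration):

```python
from statistics import stdev


def gep_variability(scores):
    """Within-patient GEP score variability: sample standard deviation of
    four GEP scores (each on the test's 0-40 scale) collected >=315 days
    post-transplantation, as defined in the study."""
    assert len(scores) == 4, "the study used exactly four scores per patient"
    return stdev(scores)


# Hypothetical score series for two patients
stable = gep_variability([32, 32, 33, 32])    # SD = 0.5
variable = gep_variability([28, 31, 34, 37])  # SD ~ 3.87

# Operating points reported in the abstract: variability of 0.6 carried a
# 97% NPV for the composite event; variability of 1.5 carried a 35.4% PPV.
print(stable <= 0.6, variable >= 1.5)  # True True
```

Note that `stdev` computes the sample (n-1 denominator) standard deviation; whether the study used the sample or population form is not stated in the abstract, so the sample form here is an assumption.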
Abstract:
Hematopoietic stem cell transplantation (HSCT) plays a central role in patients with malignant and, increasingly, nonmalignant conditions. As the number of transplants increases and the survival rate improves, long-term complications are important to recognize and treat to maintain quality of life. Sexual dysfunction is a commonly described but often underestimated complication after HSCT. Conditioning regimens, generalized or genital graft-versus-host disease, medications, and cardiovascular complications, as well as psychosocial problems, are known to contribute significantly to physical and psychological sexual dysfunction. Moreover, it is often a difficult topic for patients, their significant others, and health care providers to discuss. Early recognition and management of sexual dysfunction after HSCT can lead to improved quality of life and outcomes for patients and their partners. This review focuses on the risk factors for and treatment of sexual dysfunction after transplantation and provides guidance on how to approach and manage a patient with sexual dysfunction after HSCT.
Abstract:
BACKGROUND Racial disparities in kidney transplantation in children have been found in the United States, but have not been studied before in Europe. STUDY DESIGN Cohort study. SETTING & PARTICIPANTS Data were derived from the ESPN/ERA-EDTA Registry, an international pediatric renal registry collecting data from 36 European countries. This analysis included 1,134 young patients (aged ≤19 years) from 8 medium- to high-income countries who initiated renal replacement therapy (RRT) in 2006 to 2012. FACTOR Racial background. OUTCOMES & MEASUREMENTS Differences between racial groups in access to kidney transplantation, transplant survival, and overall survival on RRT were examined using Cox regression analysis while adjusting for age at RRT initiation, sex, and country of residence. RESULTS 868 (76.5%) patients were white; 59 (5.2%), black; 116 (10.2%), Asian; and 91 (8.0%), from other racial groups. After a median follow-up of 2.8 (range, 0.1-3.0) years, we found that black (HR, 0.49; 95% CI, 0.34-0.72) and Asian (HR, 0.54; 95% CI, 0.41-0.71) patients were less likely to receive a kidney transplant than white patients. These disparities persisted after adjustment for primary renal disease. Transplant survival rates were similar across racial groups. Asian patients had higher overall mortality risk on RRT compared with white patients (HR, 2.50; 95% CI, 1.14-5.49). Adjustment for primary kidney disease reduced the effect of Asian background, suggesting that part of the association may be explained by differences in the underlying kidney disease between racial groups. LIMITATIONS No data for socioeconomic status, blood group, and HLA profile. CONCLUSIONS We believe this is the first study examining racial differences in access to and outcomes of kidney transplantation in a large European population. We found important differences with less favorable outcomes for black and Asian patients. 
Further research is required to address the barriers to optimal treatment among racial minority groups.
Abstract:
BACKGROUND Nocardiosis is a rare, life-threatening opportunistic infection affecting 0.04% to 3.5% of patients after solid organ transplantation (SOT). The aim of this study was to identify risk factors for Nocardia infection after SOT and to describe the presentation of nocardiosis in these patients. METHODS We performed a retrospective case-control study of adult patients diagnosed with nocardiosis after SOT between 2000 and 2014 in 36 centers in five European countries (France, Belgium, Switzerland, the Netherlands, Spain). Two control subjects per case were matched by institution, transplant date and transplanted organ. A multivariable analysis was performed using conditional logistic regression to identify risk factors for nocardiosis. RESULTS One hundred and seventeen cases of nocardiosis and 234 control patients were included. Nocardiosis occurred at a median of 17.5 (range 2-244) months after transplantation. In multivariable analysis, high calcineurin inhibitor trough levels in the month before diagnosis (OR = 6.11 [2.58-14.51]), use of tacrolimus (OR = 2.65 [1.17-6.00]) and corticosteroid dose (OR = 1.12 [1.03-1.22]) at the time of diagnosis, patient age (OR = 1.04 [1.02-1.07]) and length of stay in the intensive care unit after SOT (OR = 1.04 [1.00-1.09]) were independently associated with the development of nocardiosis; low-dose cotrimoxazole prophylaxis was not found to prevent nocardiosis. Nocardia farcinica was more frequently associated with brain, skin and subcutaneous tissue infections than were other Nocardia species. Among the 30 cases with central nervous system nocardiosis, 13 (43.3%) had no neurological symptoms. CONCLUSIONS We identified five risk factors for nocardiosis after SOT. Low-dose cotrimoxazole prophylaxis was not found to prevent Nocardia infection. These findings may help improve management of transplant recipients.