Abstract:
Introduction: The thalidomide-dexamethasone (TD) regimen has provided encouraging results in relapsed MM. To improve results, bortezomib (Velcade) has been added to the combination in previous phase II studies, the so-called VTD regimen. In January 2006, the European Group for Blood and Marrow Transplantation (EBMT) and the Intergroupe Francophone du Myélome (IFM) initiated a prospective, randomized, parallel-group, open-label, phase III multicenter study comparing VTD (arm A) with TD (arm B) for MM patients progressing or relapsing after autologous transplantation. Patients and Methods: Inclusion criteria: patients in first progression or relapse after at least one autologous transplantation, including those who had received bortezomib or thalidomide before transplant. Exclusion criteria: subjects with neuropathy above grade 1 or nonsecretory MM. The primary study end point was time to progression (TTP). Secondary end points included safety, response rate, progression-free survival (PFS) and overall survival (OS). Treatment was scheduled as follows: bortezomib 1.3 mg/m2 was given as an i.v. bolus on days 1, 4, 8 and 11, followed by a 10-day rest period (days 12 to 21), for 8 cycles (6 months), and then on days 1, 8, 15 and 22, followed by a 20-day rest period (days 23 to 42), for 4 cycles (6 months). In both arms, thalidomide was scheduled at 200 mg/day orally for one year and dexamethasone at 40 mg/day orally for four days every three weeks for one year. Patients reaching remission could proceed to a new stem cell harvest. However, transplantation, either autologous or allogeneic, could only be performed in patients who completed the planned one-year treatment period. Response was assessed by EBMT criteria, with the additional category of near complete remission (nCR). Adverse events were graded by the NCI-CTCAE, Version 3.0. The trial was based on a group sequential design, with 4 planned interim analyses and one final analysis, allowing stopping for efficacy as well as futility.
The overall alpha and power were set equal to 0.025 and 0.90, respectively. The test for decision making was based on the comparison of the cause-specific hazards of relapse/progression, estimated in a Cox model stratified on the number of previous autologous transplantations. The cumulative incidence of relapse/progression was estimated using the appropriate nonparametric estimator, and groups were compared with the Gray test. PFS and OS probabilities were estimated by Kaplan-Meier curves and compared by the log-rank test. An interim safety analysis was performed when the first hundred patients had been included; the safety committee recommended that the trial continue. Results: As of 1 July 2010, 269 patients had been enrolled in the study: 139 in France (IFM 2005-04 study), 21 in Italy, 38 in Germany, 19 in Switzerland (a SAKK study), 23 in Belgium, 8 in Austria, 8 in the Czech Republic, 11 in Hungary, 1 in the UK and 1 in Israel. One hundred and sixty-nine patients were male and 100 female; the median age was 61 yrs (range 29-76). One hundred and thirty-six patients were randomized to receive VTD and 133 to receive TD. The current analysis is based on the 246 patients (124 in arm A, 122 in arm B) included in the second interim analysis, carried out when 134 events had been observed. Following this analysis, the trial was stopped because of the significant superiority of VTD over TD. Follow-up of the remaining patients was too short for them to contribute to the analysis. The number of previous autologous transplants was one in 63 vs 60 patients and two or more in 61 vs 62 patients in arm A vs B, respectively. The median follow-up was 25 months. The median TTP was 20 months vs 15 months in arms A and B respectively, with a cumulative incidence of relapse/progression at 2 years of 52% (95% CI: 42%-64%) vs 70% (95% CI: 61%-81%) (p=0.0004, Gray test). The same superiority of arm A was also observed when stratifying on the number of previous autologous transplantations.
At 2 years, PFS was 39% (95% CI: 30%-51%) vs 23% (95% CI: 16%-34%) (A vs B, p=0.0006, log-rank test). OS in the first two years was comparable in the two groups. Conclusion: VTD resulted in significantly longer TTP and PFS in patients relapsing after ASCT. Analyses of response and safety data are ongoing and results will be presented at the meeting.
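The PFS and OS comparisons above rely on the Kaplan-Meier estimator. As a rough illustration of how that estimator works, here is a minimal pure-Python sketch applied to hypothetical follow-up data (not the trial's data; the actual analyses also used a stratified Cox model, the Gray test for cumulative incidence, and the log-rank test):

```python
# Minimal Kaplan-Meier estimator, for illustration only.
# The input data below are hypothetical, NOT taken from the trial.

def kaplan_meier(times, events):
    """Return (time, survival probability) pairs for right-censored data.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g. progression) was observed, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        removed = sum(1 for tt, _ in data if tt == t)  # events + censored at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk                # product-limit step
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical times to progression (months) and event indicators.
times = [3, 5, 8, 12, 15, 15, 20, 24, 24, 30]
events = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
for t, s in curve:
    print(f"t={t:>2} months  S(t)={s:.3f}")
```

Note that the survival probability drops only at observed event times; censored subjects simply leave the risk set without producing a drop, which is what allows patients with incomplete follow-up to contribute to the estimate.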
Abstract:
BACKGROUND: To determine the extent to which major histoincompatibilities are recognized after bone marrow transplantation, we characterized the specificity of the cytotoxic T lymphocytes isolated during graft-versus-host disease. We studied three patients transplanted with marrow from donors who were histoincompatible for different types of HLA antigens. METHODS: Patient 1 was mismatched for one "ABDR-antigen" (HLA-A2 versus A3). Two patients were mismatched for antigens that would usually not be taken into account by standard selection procedures: patient 2 was mismatched for an "HLA-A subtype" (A*0213 versus A*0201), whereas patient 3 was mismatched for HLA-C (HLA-C*0501 versus HLA-C*0701). All three HLA class I mismatches were detected by a pretransplant cytotoxic precursor test. RESULTS: Analysis of the specificity of the cytotoxic T lymphocyte clones isolated after transplantation showed that the incompatibilities detected by the pretransplant cytotoxic precursor assay were the targets recognized during graft-versus-host disease. CONCLUSIONS: Independent of whether the incompatibility consisted of a "full" mismatch, a "subtype" mismatch, or an HLA-C mismatch, all clones recognized the incompatible HLA molecule. In addition, some of these clones had undergone antigen selection and were clearly of higher specificity than the ones established before transplantation, indicating that they had been participating directly in the antihost immune response.
Abstract:
Purpose: The exact role of individual T-cell subsets in the development of rejection is not clearly defined. Given their distinct phenotypes, effector functions and trafficking patterns, naïve (CD45RBhiCD44lo) and memory (CD45RBloCD44hi) T cells may play distinct roles in anti-donor immunity after transplantation. Furthermore, only the CD4+CD45RBlo population contains CD4+CD25+ T cells, a subset with suppressive functions that plays a major role in the maintenance of peripheral tolerance. The aim of this work was to study the contribution of these individual subsets to alloresponses via the direct and indirect pathways using a murine experimental model. Methods and materials: Purified naïve or memory CD4+ T cells were adoptively transferred into lymphopenic mice undergoing a skin allograft. Donor-to-recipient MHC combinations were chosen in order to study the direct and the indirect pathways of allorecognition separately. Graft survival and the in vivo expansion, effector function and trafficking of the transferred T cells were assessed at different time points after transplantation. Results: We found that the cross-reactive CD4+CD45RBlo memory T-cell pool was heterogeneous and contained cells with regulatory potential, in both the CD4+CD25+ and CD4+CD25- populations. CD4+ T cells capable of inducing strong primary alloreactive responses in vitro and rejection of a first allograft in vivo were mainly contained within the CD45RBhi naïve CD4+ T-cell compartment. CD4+CD45RBlo T cells proliferated less abundantly in response to allogeneic stimulation than their naïve counterparts, both in vitro and in vivo, and allowed prolonged allograft survival even after depletion of the CD4+CD25+ subset. Interestingly, CD4+CD25-CD45RBlo T cells were capable of prolonging allograft survival mainly when the indirect pathway was the only mechanism of allorecognition.
The indirect pathway response, which was shown to drive true chronic rejection and contribute to chronic allograft dysfunction, was predominantly mediated by naïve CD4+ T cells. Conclusion: This work provides new insights into the mechanisms that drive allograft rejection and should help develop new clinical immunosuppressive protocols. In particular, our results highlight the importance of selectively targeting individual T-cell subsets to prevent graft rejection but at the same time maintain immune protective responses to common pathogens.
Abstract:
BACKGROUND: In heart transplantation, antibody-mediated rejection (AMR) is diagnosed and graded on the basis of immunopathologic (C4d-CD68) and histopathologic criteria found on endomyocardial biopsies (EMB). Because some pathologic AMR (pAMR) grades may be associated with clinical AMR, and because humoral responses may be affected by the intensity of immunosuppression during the first posttransplantation year, we investigated the incidence and positive predictive values (PPV) of C4d-CD68 and pAMR grades for clinical AMR as a function of time. METHODS: All 564 EMB from 40 adult heart recipients were graded for pAMR during the first posttransplantation year. Clinical AMR was diagnosed by the simultaneous occurrence of pAMR on EMB, donor-specific antibodies and allograft dysfunction. RESULTS: One patient demonstrated clinical AMR at postoperative day 7 and one at 6 months (1-year incidence 5%). C4d-CD68 was found on 4.7% of EMB, with a "decrescendo" pattern over time (7% during the first 4 months vs. 1.2% during the last 8 months; P < 0.05). Histopathologic criteria of AMR occurred on 10.3% of EMB with no particular time pattern. Only the infrequent (1.4%) pAMR2 grade (simultaneous histopathologic and immunopathologic markers) was predictive of clinical AMR, particularly after the initial postoperative period (first 4 months and last 8 months: PPV = 33%-100%; P < 0.05). CONCLUSION: In the first posttransplantation year, AMR immunopathologic and histopathologic markers were relatively frequent, but only their simultaneous occurrence (pAMR2) was predictive of clinical AMR. Furthermore, posttransplantation time may modulate the occurrence of C4d-CD68 on EMB and thus the incidence of pAMR2 and its relevance to the diagnosis of clinical AMR.
Abstract:
Fractures due to osteoporosis are one of the major complications after heart transplantation, occurring mostly during the first 6 months after the graft, with an incidence ranging from 18% to 50% for vertebral fractures. Bone mineral density (BMD) decreases dramatically following the graft, at trabecular as well as cortical sites. This is explained by the relatively high doses of glucocorticoids used during the months following the graft, and by a long-term increase in bone turnover which is probably due to cyclosporine. There is some evidence for a beneficial effect of antiresorptive treatments on BMD after heart transplantation. The aim of this study was to assess prospectively the effect on BMD of a 3-year treatment of quarterly infusions of 60 mg of pamidronate, combined with 1 g calcium and 1000 U vitamin D per day, in osteoporotic heart transplant recipients, and that of treatment with calcium and vitamin D in heart transplant recipients without osteoporosis. BMD of the lumbar spine and the femoral neck was measured by dual-energy X-ray absorptiometry in all patients every 6 months for 2 years and after 3 years. Seventeen patients (1 woman, 16 men) aged 46+/-4 years (mean +/- SEM) received only calcium and vitamin D. A significant decrease in BMD was observed 6 months after the graft, at the lumbar spine (-6.6%) as well as at the femoral neck (-7.8%). After 2 years, BMD tended to recover at the lumbar spine, whereas the loss persisted after 3 years at the femoral neck. Eleven patients (1 woman and 10 men) aged 46+/-4 years (mean +/- SEM) started treatment with pamidronate on average 6 months after the graft, because they had osteoporosis of the lumbar spine and/or femoral neck (BMD T-score below -2.5 SD). Over the whole treatment period, a continuous increase in BMD at the lumbar spine was noticed, reaching 18.3% after 3 years (14.3% compared with the BMD at the time of the graft).
BMD at the femoral neck decreased by 3.4% in the first year but had recovered fully after 3 years of treatment. In conclusion, 3 years of treatment with pamidronate given every 3 months to patients with existing osteoporosis led to a significant increase in lumbar spine BMD and prevented loss at the femoral neck. However, since some of these patients were treated up to 14 months after the transplant, they may already have passed through the phase of most rapid bone loss. In patients who were not osteoporotic at baseline, treatment with calcium and vitamin D alone was not able to prevent the rapid bone loss that occurs immediately after transplantation.
Abstract:
BACKGROUND: Macrophage migration inhibitory factor (MIF) is a proinflammatory cytokine produced by many tissues, including pancreatic beta-cells. METHODS: This study investigates the impact of MIF on islet transplantation using MIF knock-out (MIFko) mice. RESULTS: Early islet function, assessed with a syngeneic marginal islet mass transplant model, was enhanced when using MIFko islets (P<0.05 compared with wild-type [WT] controls). This result was supported by the increased in vitro resistance of MIFko islets to apoptosis (terminal deoxynucleotidyl transferase-mediated dUTP nick-end labeling (TUNEL) assay) and by improved glucose metabolism (lower blood glucose levels, reduced glucose areas under the curve and higher insulin release during intraperitoneal glucose challenges, and in vitro in the absence of MIF; P<0.01). The beneficial impact of MIFko islets was insufficient to delay allogeneic islet rejection. However, the rejection of WT islet allografts was marginally delayed, by 6 days, in MIFko recipients compared with WT recipients (P<0.05). This effect is consistent with the lower activity of MIF-deficient macrophages, assessed in vitro and in vivo by cotransplantation of islets and macrophages. Leukocyte infiltration of the graft and donor-specific lymphocyte activity (mixed lymphocyte reaction, interferon-gamma ELISPOT) were similar in both groups. CONCLUSION: These data indicate that targeting MIF has the potential to improve early function after syngeneic islet transplantation, but has only a marginal impact on allogeneic rejection.
Abstract:
Background: The pathogenic role of anti-HLA antibodies (AHA) after kidney transplantation is well established. However, their significance after liver transplantation remains unclear. The aim of our study was to determine the prevalence and significance of AHA after liver transplantation. Methods: Between January 2007 and November 2007, all liver transplant recipients who were more than 6 months posttransplantation and followed regularly at our transplant outpatient clinic (n = 95) were screened for AHA. All clinical and electronic records were reviewed. Serum samples were tested using multiplex technology (Luminex). A liver biopsy had been performed in 55 of the 95 patients on clinical grounds, but no routine protocol biopsies were performed. Immunosuppression was calcineurin inhibitor-based in 90 patients and sirolimus-based in 4 patients, and one patient had no anti-rejection therapy (operationally tolerant recipient). Results: The mean time from transplantation to the study was 85 months (range 6-248 months). Overall, AHA were found in 23/95 (24.2%) patients (5 had anti-class I alone, 13 anti-class II alone, and 4 had both anti-class I and II). However, only 4/95 patients (4.2%) had donor-specific antibodies (DSA) (one anti-class I and 3 anti-class II). Twenty-one of 95 patients (22.1%) had a history of past or current biopsy-proven or radiological biliary complications (chronic rejection, ischemic cholangitis, ischemic-type biliary lesions or biliary anastomosis stricture). Among patients with AHA, 4/23 (17.4%) had biliary complications, versus 17/72 (23.6%) of patients without AHA (NS). Among patients with DSA, 3/4 (75%) had biliary complications (two with biopsy-proven chronic rejection in association with biliary strictures and one with ischemic cholangitis following hepatic artery thrombosis), versus 1/19 (5.3%) patients with AHA but no DSA (p = 0.009) and 16/72 (22.2%) patients without AHA (p = 0.046).
In patients with DSA, immunosuppression did not differ from that in patients without DSA. Conclusions: We found a 24% AHA prevalence. The presence of DSA, but not of AHA, was significantly associated with an increased incidence of biliary complications, including chronic liver allograft rejection. The exact mechanisms and possible causal relationship linking DSA to biliary complications remain to be studied. Larger prospective trials are thus needed to further define the role of AHA, and in particular of DSA, after liver transplantation.
Abstract:
We conducted a 12-year retrospective study to determine the effects that the community respiratory-virus species and the localization of respiratory-tract virus infection have on severe airflow decline, a serious and fatal complication occurring after hematopoietic cell transplantation (HCT). Of 132 HCT recipients with respiratory-tract virus infection during the initial 100 days after HCT, 50 (38%) developed airflow decline ≤1 year after HCT. Lower-respiratory-tract infection with parainfluenza (odds ratio [OR], 17.9 [95% confidence interval {CI}, 2.0-160]; P=.01) and respiratory syncytial virus (OR, 3.6 [95% CI, 1.0-13]; P=.05) independently increased the risk of development of airflow decline ≤1 year after HCT. The airflow decline was immediately detectable after infection and was strongest for lower-respiratory-tract infection with parainfluenza virus; it stabilized during the months after the respiratory-tract virus infection, but, at ≤1 year after HCT, the initial lung function was not restored. Thus, community respiratory virus-associated airflow decline seems to be specific to viral species and infection localization.
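The odds ratios above summarize 2x2 associations between infection type and airflow decline. A crude odds ratio with a Woolf (logit) 95% confidence interval can be computed as sketched below; the counts here are hypothetical, not the study's, and the published ORs come from a multivariable model rather than this crude calculation:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf (logit) 95% CI for a 2x2 table.

    a: exposed with outcome      b: exposed without outcome
    c: unexposed with outcome    d: unexposed without outcome
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: lower-tract infection vs. airflow decline.
or_, lo, hi = odds_ratio_ci(20, 10, 30, 72)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The very wide interval reported for parainfluenza (2.0-160) is typical of this logit method when one cell of the table is small: the standard error of log(OR) is dominated by the reciprocal of the smallest count.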
Abstract:
Rationale: The success of a kidney or liver transplantation encompasses not only the surgical procedure and its post-transplant medical management, but also, for the recipient, the promise of a new quality of life, of which returning to work is a part. Return to work after transplantation is, however, rarely studied in the literature and apparently little discussed at the medical level, whether before or after transplantation, even though, from a medico-socio-economic point of view, it is an important subject given the costs of prolonged sick leave or a disability pension. Background: After a successful transplant, there is in theory no longer any limiting factor and there are few transplant-related medical contraindications to a gradual resumption of professional activity. In reality, however, the rate of return to work after transplantation is generally low, varying between 30% and 60% according to the literature. This prompted the author of the present study to investigate the reasons. The hypothesis was that certain occupational factors ("working or not before the transplant", "holding a qualification or not"), individual factors ("age", "gender", "type of organ transplanted") or medical factors ("acute medical complications within 6 months post-transplant") influence, negatively or positively, the return to work of transplant recipients. The study involved a small volunteer cohort of kidney and liver transplant recipients operated on between 1993 and 2003 and followed since then at the organ transplantation center (CTO) of the CHUV. A medical questionnaire designed specifically for the study was administered to them face to face. Conclusion: The study shows that the rate of return to work after kidney or liver transplantation is 39% and that there are indeed factors that can influence return to work after transplantation. Thus, being "under 45 years of age", "holding a qualification (diploma)" and "having worked during the two years preceding the transplant" are significant factors for return to work after transplantation. Perspective: As a marker of quality of life, return to work after transplantation should be encouraged and facilitated whenever possible, drawing on the known factors that influence it positively or negatively. To this end, it is essential that a dialogue on the subject begin as early as the pre-transplant stage between patients awaiting transplantation and medical-social staff, so that appropriate socio-occupational measures can be implemented as early as possible, as is already done in Germany or the United States, for example.
Abstract:
BACKGROUND: Stem cell labeling with iron oxide (ferumoxide) particles allows labeled cells to be detected by magnetic resonance imaging (MRI) and is commonly used to track stem cell engraftment. However, the validity of MRI for distinguishing surviving ferumoxide-labeled cells from other sources of MRI signal, for example, macrophages containing ferumoxides released from nonsurviving cells, has not been thoroughly investigated. We sought to determine the relationship between the persistence of iron-dependent MRI signals and cell survival 3 weeks after injection of syngeneic or xenogeneic ferumoxide-labeled stem cells (cardiac-derived stem cells) in rats. METHODS AND RESULTS: We studied nonimmunoprivileged human and rat cardiac-derived stem cells and human mesenchymal stem cells doubly labeled with ferumoxides and beta-galactosidase and injected intramyocardially into immunocompetent Wistar-Kyoto rats. Animals were imaged at 2 days and 3 weeks after stem cell injection in a clinical 3-T MRI scanner. At 2 days, injection sites of xenogeneic and syngeneic cells (cardiac-derived stem cells and mesenchymal stem cells) were identified by MRI as large intramyocardial signal voids that persisted at 3 weeks (50% to 90% of the initial signal). Histology (at 3 weeks) revealed the presence of iron-containing macrophages at the injection site, identified by CD68 staining, but very few or no beta-galactosidase-positive stem cells in the animals transplanted with syngeneic or xenogeneic cells, respectively. CONCLUSIONS: The persistence of a significant iron-dependent MRI signal derived from ferumoxide-containing macrophages, despite few or no viable stem cells 3 weeks after transplantation, indicates that MRI of ferumoxide-labeled cells does not reliably report long-term stem cell engraftment in the heart.
Abstract:
OBJECTIVE: To assess the accuracy of a semiautomated 3D volume reconstruction method for organ volume measurement by postmortem MRI. METHODS: This prospective study was approved by the institutional review board and the infants' parents gave their consent. Postmortem MRI was performed in 16 infants (1 month to 1 year of age) at 1.5 T within 48 h of their sudden death. Virtual organ volumes were estimated using the Myrian software. Real volumes were recorded at autopsy by water displacement. The agreement between virtual and real volumes was quantified using Bland and Altman's method. RESULTS: There was good agreement between virtual and real volumes for the brain (mean difference: -0.03% (-13.6 to +7.1)), liver (+8.3% (-9.6 to +26.2)) and lungs (+5.5% (-26.6 to +37.6)). For the kidneys, spleen and thymus, the MRI/autopsy volume ratio was close to 1 (kidney: 0.87±0.1; spleen: 0.99±0.17; thymus: 0.94±0.25), but with less good agreement. For the heart, the MRI/real volume ratio was 1.29±0.76, possibly due to residual blood within the heart. The virtual volumes of the adrenal glands were significantly underestimated (p=0.04), possibly due to their very small size during the first year of life. Interobserver and intraobserver variation was 10% or less for all organs except the thymus (15.9% and 12.6%, respectively) and adrenal glands (69% and 25.9%). CONCLUSIONS: Virtual volumetry may provide significant information concerning the macroscopic features of the main organs and help pathologists in sampling organs that are more likely to yield histological findings.
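The agreement analysis above follows Bland and Altman's method: compute the per-subject difference between the two measurements, then report the mean difference (bias) and the 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch, using hypothetical volumes rather than the study's data:

```python
import math

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n                                     # mean difference
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical organ volumes (mL): MRI estimate vs. autopsy water displacement.
mri = [102, 98, 110, 95, 105]
autopsy = [100, 100, 105, 96, 103]
bias, loa_low, loa_high = bland_altman(mri, autopsy)
print(f"bias = {bias:+.1f} mL, limits of agreement [{loa_low:.1f}, {loa_high:.1f}] mL")
```

The study reports differences as percentages of the autopsy volume; the same function applies after converting each difference to a percentage of the reference measurement.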