933 results for MARROW-TRANSPLANTATION
Abstract:
Laboratory of Medical Investigation (LIM37), University of Sao Paulo School of Medicine
Abstract:
Introduction. Posttransplant thrombotic microangiopathy (TMA)/hemolytic uremic syndrome (HUS) can occur as a recurrent or de novo disease. Methods. A retrospective single-center observational study was performed to examine the incidence and outcomes of de novo TMA/HUS among transplantations performed between 2000 and 2010. Recurrent HUS and antibody-mediated rejections were excluded. Results. Seventeen (1.1%) of 1549 kidney transplant recipients fulfilled criteria for de novo TMA. The mean follow-up was 572 days (range, 69-1769). Maintenance immunosuppression was prednisone, tacrolimus (TAC), and mycophenolic acid in 14 (82%) patients. Mean age at onset was 40 +/- 15 years, and serum creatinine was 6.1 +/- 4.1 mg/dL. TMA occurred at a median of 25 days (range, 1-1755) after transplantation. Nine (53%) patients developed TMA within 1 month of transplantation and only 12% after 1 year. Clinical features were anemia (hemoglobin < 10 g/dL) in 9 (53%) patients, thrombocytopenia in 7 (41%), and increased lactate dehydrogenase in 12 (70%). Decreased haptoglobin was observed in 64% and schistocytes in 35%. Calcineurin inhibitor (CNI) withdrawal or reduction was the first step in the management of 10/15 (66%) patients, and 6 (35%) received fresh frozen plasma (FFP) and/or plasmapheresis. TAC was successfully reintroduced in six patients after a median of 17 days. Eight (47%) patients needed dialytic support after TMA diagnosis, and 75% of these remained on dialysis. At 4 years of follow-up, death-censored graft survival was worse in the TMA group (43.0% versus 85.6%, log-rank P = .001; hazard ratio = 3.74), and there was no difference in patient survival (53.1% versus 82.2%, log-rank P = .24). Conclusion. De novo TMA after kidney transplantation is a rare but severe condition with poor graft outcomes. The syndrome may not be fully manifested, and clinical suspicion is essential for early diagnosis and treatment, based mainly on CNI withdrawal and FFP infusion and/or plasmapheresis.
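The 4-year death-censored graft-survival comparison reported above rests on standard time-to-event analysis. As a purely illustrative sketch (not the authors' code; function name and toy data are mine), a Kaplan-Meier estimator can be written in plain Python, treating graft loss as the event and other exits from follow-up as censoring:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve as a list of (time, S(t)) steps.

    times  : follow-up time for each graft (e.g. days)
    events : 1 = graft loss (death-censored), 0 = censored observation
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data if tt == t and e == 0)  # censored at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            steps.append((t, surv))
        n_at_risk -= d + c
        i += d + c
    return steps

# Toy data (4 grafts): losses at days 1, 2 and 4, one censored at day 3.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

The log-rank comparison between the TMA and non-TMA curves would then be computed over the pooled event times; dedicated libraries handle that step in practice.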
Abstract:
Background. This study evaluated the influence of circulating anti-HLA antibodies on the outcomes of 97 liver allografts from deceased donors. Methods. Human leukocyte antigen (HLA) antibody screening was performed by both complement-dependent cytotoxicity (CDC) and multiparameter Luminex microsphere-based assays (Luminex assay). Results. The agreement between T- and B-cell CDC and Luminex assays was 67% and 77% for pre- and posttransplant specimens, respectively. Graft dysfunction was not associated with either positive pretransplant CDC or Luminex panel-reactive antibody (PRA) values. Likewise, positive posttransplant T- or B-cell CDC PRA values were not associated with graft dysfunction. In contrast, posttransplant Luminex PRA values were significantly higher among patients with graft dysfunction than among subjects with good outcomes (P = .017). Conclusion. Posttransplant monitoring of HLA antibodies with the Luminex methodology allowed identification of patients at high risk of poor graft outcomes.
Abstract:
Transplanted individuals in operational tolerance (OT) maintain long-term stable graft function after completely stopping immunosuppression. Understanding the mechanisms involved in OT can provide valuable information about pathways to human transplantation tolerance. Here we report that operationally tolerant individuals display quantitative and functional preservation of the B-cell compartment in renal transplantation. OT individuals exhibited normal numbers of circulating total B cells, naive, memory, and regulatory B cells (Bregs), as well as a preserved B-cell receptor repertoire, similar to healthy individuals. In addition, OT individuals also displayed a conserved capacity to activate the cluster of differentiation 40 (CD40)/signal transducer and activator of transcription 3 (STAT3) signaling pathway in Bregs, in contrast with chronic rejection. Rather than expansion or higher activation, we show that it is the preservation of the B-cell compartment that favors OT. Online address: http://www.molmed.org doi: 10.2119/molmed.2011.00281
Abstract:
Introduction. The number of patients with terminal heart failure has increased beyond the available organs, leading to a high mortality rate on the waiting list. The use of marginal and expanded-criteria donors has increased due to the heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years for donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143), and the mean follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis examined the following factors for an impact on survival: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age showed significance; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed lower late survival rates (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.
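The donor-age cut-off above was identified with logistic regression. As a toy sketch of that technique only (made-up data, a rescaled age variable, and a hand-rolled fitter; this is not the study's model), in plain Python:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(event) = 1 / (1 + exp(-(a + b*x))) by batch gradient descent."""
    a = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += (p - y) / n      # gradient w.r.t. intercept
            gb += (p - y) * x / n  # gradient w.r.t. slope
        a -= lr * ga
        b -= lr * gb
    return a, b

# Hypothetical data: late death coded 1, donor age rescaled as
# (age - 40) / 10 so the candidate cut-off sits at x = 0.
ages = [20, 25, 30, 35, 45, 50, 55, 60]
died = [0, 0, 0, 0, 1, 1, 1, 1]
a, b = fit_logistic([(x - 40) / 10 for x in ages], died)
# A positive slope b means estimated mortality risk rises with donor age.
```

In practice a statistics package would also report the P value and confidence interval for the slope, which is what supports the cut-off claim in the abstract.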
Abstract:
Background: Hepatitis B virus (HBV) infection is a major cause of morbidity and mortality worldwide. Chronic hepatitis B infection is associated with an increased risk of cirrhosis, hepatic decompensation, and hepatocellular carcinoma. Our aim was to analyze, through a mathematical model, the potential long-term impact (that is, decades after vaccination) of the anti-HBV vaccine on the number of liver transplantations (LT). Methods: The model assumed that the prevalence of HBV infection was 0.5% and that approximately 20% of all liver transplantations carried out in the state of Sao Paulo are due to HBV infection. Results: The theoretical model suggests that a vaccination program covering 80% of the target population would achieve at most an approximately 14% reduction in the LT program. Conclusion: Increasing vaccination coverage against HBV in the state of Sao Paulo would have a relatively low impact on the number of liver transplantations. In addition, this impact would take several decades to materialize because of the long period over which HBV-related liver failure develops.
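The headline figure (about a 14% maximum reduction with 80% coverage) is consistent with a simple steady-state calculation. The sketch below is an illustration of that arithmetic only, not the authors' model; the 90% vaccine efficacy is an assumed value, not a parameter stated in the abstract:

```python
def max_lt_reduction(hbv_share, coverage, efficacy):
    """Long-run fractional reduction in total liver transplants (LT)
    once vaccinated cohorts reach the ages at which HBV-related liver
    failure would otherwise appear.

    hbv_share : fraction of LT attributable to HBV (0.20 per the abstract)
    coverage  : vaccination coverage of the target population (0.80)
    efficacy  : vaccine efficacy -- assumed here, not given in the abstract
    """
    return hbv_share * coverage * efficacy

# With an assumed ~90% efficacy, 80% coverage caps the reduction near 14%.
print(round(max_lt_reduction(0.20, 0.80, 0.90), 3))  # about 0.144
```

The full model in the paper would additionally account for the decades-long delay before vaccinated cohorts reach the ages at which HBV cirrhosis leads to transplantation.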
Abstract:
Introduction. Endomyocardial biopsy (EMB) plays an important role in allograft surveillance to screen for acute rejection episodes after heart transplantation (HT), to diagnose cardiomyopathies (CMP) of unknown cause, or to reveal a cardiac tumor. However, the procedure is not risk free. Objective. The main objective of this study was to describe our experience with EMB over the last 33 years, comparing procedural risk between HT and non-HT patients. Method. We retrospectively analyzed the data of 5347 EMBs performed from 1978 to 2011 (33 years). We performed 3564 (66.7%) for surveillance of acute rejection episodes after HT, 1777 (33.2%) for CMP diagnosis, and 6 (0.1%) for cardiac tumor identification. Results. The complications of EMB were divided into 2 groups to facilitate analysis: major complications, associated with a potential risk of death, and minor complications. The variables that showed a significant difference in the HT group were tricuspid injury (P = .049) and coronary fistula (P < .0001). Among the non-HT cohort they were insufficient fragment (P < .0001), major complications (P < .0001), and total complications (P < .0001). Conclusions. EMB can be accomplished with a low risk of complications and high effectiveness in diagnosing CMP and rejection after HT. However, the risk is greater among patients with CMP because of their anatomic characteristics. Children also constitute a risk group for EMB because of their small size in addition to their heart disease. The risk of injury to the tricuspid valve was higher in the HT group.
Abstract:
Adipose tissue-derived stem cells (ASCs) are an attractive source of stem cells with regenerative properties similar to those of bone marrow stem cells. Here, we analyze the role of ASCs in reducing the progression of kidney fibrosis. Progressive renal fibrosis was induced by unilateral clamping of the renal pedicle in mice for 1 h, after which the kidney was immediately reperfused. Four hours after surgery, 2 x 10(5) ASCs were administered intraperitoneally, and mice were followed for 24 h posttreatment and then at intervals over the next 6 weeks. In addition, animals were treated with 2 x 10(5) ASCs at 6 weeks after reperfusion and sacrificed 4 weeks later to study the effect of ASCs when interstitial fibrosis was already present. At 24 h after reperfusion, ASC-treated animals showed reduced renal dysfunction and enhanced regenerative tubular processes. Renal mRNA expression of IL-6 and TNF was decreased in ASC-treated animals, whereas IL-4, IL-10, and HO-1 expression increased, despite a lack of ASCs in the kidneys as determined by SRY analysis. As expected, untreated kidneys shrank at 6 weeks, whereas the kidneys of ASC-treated animals remained normal in size, showed less collagen deposition, and showed decreased staining for FSP-1, type I collagen, and Hypoxyprobe. The renal protection seen in ASC-treated animals was accompanied by reduced serum levels of TNF-alpha, KC, RANTES, and IL-1 alpha. Surprisingly, treatment with ASCs at 6 weeks, when animals already showed established fibrosis, also ameliorated functional parameters, with less tissue fibrosis observed and reduced mRNA expression of type I collagen and vimentin. ASC therapy can improve functional parameters and reduce the progression of renal fibrosis at early and later times after injury, mostly through early modulation of the inflammatory response and reduction of hypoxia, thereby reducing the epithelial-mesenchymal transition.
Abstract:
OBJECTIVE: Poor sleep quality is one of the factors that adversely affect patient quality of life after kidney transplantation, and sleep disorders represent a significant cardiovascular risk factor. The objective of this study was to investigate the prevalence of changes in sleep quality and their outcomes in kidney transplant recipients and to analyze the variables affecting sleep quality in the first years after renal transplantation. METHODS: Kidney transplant recipients were evaluated at two time points after a successful transplantation: between three and six months (Phase 1) and between 12 and 15 months (Phase 2). The following tools were used for assessment: the Pittsburgh Sleep Quality Index; the quality of life questionnaire Short-Form-36; the Hospital Anxiety and Depression scale; the Karnofsky scale; and assessments of social and demographic data. RESULTS: The prevalence of poor sleep was 36.7% in Phase 1 and 38.3% in Phase 2 of the study. There were no significant differences between patients with and without changes in sleep quality between the two phases. We found no changes in sleep patterns throughout the study. Both the physical and mental health scores worsened from Phase 1 to Phase 2. CONCLUSION: Sleep quality in kidney transplant recipients did not change during the first year after a successful renal transplantation.
Abstract:
Background: Pulmonary hypertension is associated with a worse prognosis after cardiac transplantation. The pulmonary hypertension reversibility test with sodium nitroprusside (SNP) is associated with a high rate of systemic arterial hypotension, ventricular dysfunction of the transplanted graft, and a high rate of disqualification from transplantation. Objective: This study aimed to compare the effects of sildenafil (SIL) and SNP on hemodynamic, neurohormonal, and echocardiographic variables during the pulmonary reversibility test. Methods: The patients simultaneously underwent right cardiac catheterization, echocardiography, BNP measurement, and venous blood gas analysis before and after receiving either SNP (1 - 2 mu g/kg/min) or SIL (100 mg, single dose). Results: Both drugs reduced pulmonary hypertension, but SNP caused significant systemic hypotension (mean blood pressure - MBP: 85.2 vs. 69.8 mm Hg; p < 0.001). Both drugs reduced cardiac dimensions and improved left cardiac function (SNP: 23.5 vs. 24.8%, p = 0.02; SIL: 23.8 vs. 26%, p < 0.001) and right cardiac function (SIL: 6.57 +/- 2.08 vs. 8.11 +/- 1.81 cm/s, p = 0.002; SNP: 6.64 +/- 1.51 vs. 7.72 +/- 1.44 cm/s, p = 0.003), measured by left ventricular ejection fraction and tissue Doppler, respectively. Sildenafil, unlike SNP, improved venous oxygen saturation, measured on venous blood gas analysis. Conclusion: Sildenafil and SNP are vasodilators that significantly reduce pulmonary hypertension and cardiac dimensions, in addition to improving biventricular function. Sodium nitroprusside, unlike SIL, was associated with systemic arterial hypotension and worsened venous oxygen saturation. (Arq Bras Cardiol 2012;99(3):848-856)
Abstract:
Purpose: To analyze the outcomes of deceased-donor kidney recipients given priority in allocation because of lack of access for dialysis, and to compare these data with those obtained from non-prioritized deceased-donor kidney transplant recipients. Materials and Methods: We reviewed the electronic charts of 31 patients who underwent kidney transplantation with priority in the transplantation program because of lack of access for dialysis from January 2005 to December 2008. Rates of immunological and surgical complications and graft and patient survival rates were analyzed. These data were compared with those obtained from 100 regular patients who underwent kidney transplantation without allocation priority during the same period. Results: The overall surgical complication rate was 25.8% in the patients with priority in allocation and 27% in the non-prioritized patients. There was no statistically significant difference in surgical complication (p = 1.0), immunological complication (p = 0.21), or graft survival (p = 0.19) rates between the groups. However, the patient survival rate was significantly worse in prioritized patients (p = 0.05). Conclusions: Patients given priority in allocation owing to lack of access for dialysis have a higher mortality rate than non-prioritized patients.
Abstract:
OBJECTIVE: To analyze the nutritional status of pediatric patients after orthotopic liver transplantation and its relationship with short-term clinical outcome. METHOD: Anthropometric evaluation of 60 children and adolescents after orthotopic liver transplantation during the first 24 hours in a tertiary pediatric intensive care unit. Nutritional status was determined from the Z score for the following indices: weight/age, height/age or length/age, weight/height or weight/length, body mass index/age, arm circumference/age, and triceps skinfold/age. The severity of liver disease was evaluated using whichever of two models was appropriate to the patient's age: 1. Pediatric End-stage Liver Disease; 2. Model for End-Stage Liver Disease. RESULTS: We found undernutrition in 50.0% by height/age; 27.3% by weight/age; 11.1% by weight/height or weight/length; 10.0% by body mass index/age; 61.6% by arm circumference/age; and 51.0% by triceps skinfold/age. There was no correlation between nutritional status and Pediatric End-stage Liver Disease scores or mortality. We found a negative correlation between arm circumference/age and length of hospitalization. CONCLUSION: Children with chronic liver diseases experience a significant degree of undernutrition, which makes nutritional support an important aspect of therapy. Despite the difficulties in assessment, anthropometric evaluation of the upper limbs is useful for evaluating the nutritional status of children before or after liver transplantation.
Abstract:
The Epstein-Barr virus (EBV) is associated with a large spectrum of lymphoproliferative diseases. Traditional methods of EBV detection include the immunohistochemical identification of viral proteins and DNA probes to the viral genome in tumoral tissue. The present study explored the detection of the EBV genome, using the BALF5 gene, in the bone marrow or blood mononuclear cells of patients with diffuse large B-cell lymphoma (DLBCL) and related its presence to clinical variables and risk factors. The results show that EBV detection, in 21.5% of patients, was not associated with age, gender, staging, B symptoms, international prognostic index scores, or any analytical parameters, including lactate dehydrogenase (LDH) or beta-2 microglobulin (B2M). The majority of patients were treated with R-CHOP-like regimens (rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisolone, or an equivalent combination) and some with CHOP-like chemotherapy. Response rates [complete response (CR) + partial response (PR)] were not significantly different between EBV-negative and -positive cases, at 93.2 and 88.9%, respectively. The survival rate was also similar in the two groups, with 5-year overall survival (OS) rates of 64.3 and 76.7%, respectively. However, when the treatment groups were analyzed separately, there was a trend in EBV-positive patients toward a worse prognosis among those treated with CHOP-like regimens that was not identified in patients treated with R-CHOP-like regimens. We conclude that EBV detection in the bone marrow and blood mononuclear cells of DLBCL patients occurs at the same frequency as EBV detection in tumoral lymphoma tissue, but is not associated with risk factors, response rate, or survival in patients treated mainly with immunochemotherapy plus rituximab. These results also suggest that the addition of rituximab to chemotherapy improves the prognosis associated with EBV detection in DLBCL.
Abstract:
The usefulness of stress myocardial perfusion scintigraphy for cardiovascular (CV) risk stratification in chronic kidney disease remains controversial. We tested the hypothesis that different clinical risk profiles influence the test. We assessed the prognostic value of myocardial scintigraphy in 892 consecutive renal transplant candidates classified into four risk groups: very high (aged >= 50 years, diabetes, and CV disease), high (two factors), intermediate (one factor), and low (no factor). The incidence of CV events and death was 20% and 18%, respectively (median follow-up 22 months). An altered stress test was associated with an increased probability of cardiovascular events only in intermediate-risk (one risk factor) patients [30.3% versus 10%, hazard ratio (HR) 2.37, confidence interval (CI) 1.69-3.33, P < 0.0001]. Low-risk patients did well regardless of scan results. In patients with two or three risk factors, an altered stress test did not add to the already increased CV risk. Myocardial scintigraphy was related to overall mortality only in intermediate-risk patients (HR 2.8, CI 1.5-5.1, P = 0.007). CV risk stratification based on myocardial stress testing is useful only in patients with just one risk factor. Screening may avoid unnecessary testing in 60% of patients, help stratify the risk of events, and provide an explanation for the inconsistent performance of myocardial scintigraphy.
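The four-group stratification described above reduces to counting risk factors per candidate. As a trivial sketch (function name is mine, not from the paper):

```python
def risk_group(age, diabetes, cv_disease):
    """Stratify a renal transplant candidate by number of risk factors
    (age >= 50 years, diabetes, known cardiovascular disease)."""
    n = int(age >= 50) + int(diabetes) + int(cv_disease)
    return ["low", "intermediate", "high", "very high"][n]

# Per the abstract, only the "intermediate" group (exactly one factor)
# gains prognostic information from stress myocardial scintigraphy.
```

Such a rule would route only one-factor candidates to scintigraphy, which is how the authors estimate that screening could be avoided in 60% of patients.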
Abstract:
Background. Lung transplantation has become a standard procedure for some end-stage lung diseases, but primary graft dysfunction (PGD) is an inherent problem that impacts early and late outcomes. The aim of this study was to define the incidence and risk factors of PGD and its impact on mechanical ventilation time and mortality rates in a retrospective cohort of lung transplantations performed in a single institution. Methods. We performed a retrospective study of 118 lung transplantations performed between January 2003 and July 2010. The most severe form of PGD (grade III), as defined at 48 and 72 hours, was examined for risk factors by multivariable logistic regression models using donor, recipient, and transplant variables. Results. The overall incidence of PGD was 19.8% at 48 hours and 15.4% at 72 hours. According to the multivariable analysis, risk factors associated with PGD were donor smoking history at 48 hours (adjusted odds ratio [OR], 4.83; 95% confidence interval [CI], 1.236-18.896; P = .022) and older donor age at 72 hours (adjusted OR, 1.046; 95% CI, 0.997-1.098; P = .022). Operative mortality was 52.9% among patients with PGD at 48 hours versus 20.3% without (P = .012). At 72 hours, the mortality rates were 58.3% versus 21.2% (P = .013). The 90-day mortality was also higher among patients with PGD. Mechanical ventilation time was longer in patients with PGD grade III at 48 hours: a mean of 72 versus 24 hours (P = .001). When PGD was defined at 72 hours, the mean ventilation time was even longer: 151 versus 24 hours (P < .001). The mean overall survival for patients who developed PGD at 48 hours was 490.9 days versus 1665.5 days for subjects without PGD (P = .001). Considering PGD only at 72 hours, the mean survival was 177.7 days for the PGD group and 1628.9 days for the other patients (P < .001). Conclusion. PGD showed an important impact on operative and 90-day mortality rates, mechanical ventilation time, and overall survival among lung transplant patients. PGD at 72 hours was a better predictor of lung transplant outcomes than PGD at 48 hours. Donor smoking history and advanced donor age were risk factors for the development of PGD.