911 results for ALLOGRAFT RECIPIENTS
Abstract:
Numerous steatotic livers are discarded for transplantation because of their poor tolerance to ischemia-reperfusion (I/R). We examined whether tauroursodeoxycholic acid (TUDCA), a known inhibitor of endoplasmic reticulum (ER) stress, protects steatotic and nonsteatotic liver grafts preserved for 6 h in University of Wisconsin (UW) solution and then transplanted. The protective mechanisms of TUDCA were also examined. Neither unfolded protein response (UPR) induction nor ER stress was evident in steatotic or nonsteatotic liver grafts after 6 h in UW preservation solution. TUDCA protected only steatotic liver grafts and did so through a mechanism independent of ER stress: it reduced peroxisome proliferator-activated receptor gamma (PPAR gamma) and damage. When PPAR gamma was activated, TUDCA did not reduce damage. Both TUDCA, which inhibited PPAR gamma, and PPAR gamma antagonist treatment up-regulated toll-like receptor 4 (TLR4), specifically the TIR domain-containing adaptor inducing IFN beta (TRIF) pathway. TLR4 agonist treatment reduced damage in steatotic liver grafts. When TLR4 action was inhibited, PPAR gamma antagonists did not protect steatotic liver grafts. In conclusion, TUDCA reduced PPAR gamma, which in turn up-regulated the TLR4 pathway, thus protecting steatotic liver grafts. TLR4-activating strategies could reduce the inherent risk of failure of steatotic liver grafts after transplantation.
Abstract:
Autogenous bone grafts are considered the gold standard in bone regeneration because of their osteogenic activity; however, the limited availability of intraoral donor sites and the need to meet patients' demands call for an alternative. Two male patients underwent implant surgery in two stages, with a 6-month interval between them: the first stage was exodontia and placement of a demineralized bone matrix (DBM) graft into the socket; the second was drilling with a 2-mm internal diameter trephine in the center of the alveolar ridge previously grafted with DBM, followed by implant placement. The samples were analyzed using histological techniques. Very mature bone was observed 6 months after DBM graft placement in the sockets, showing DBM to be a good bone graft alternative.
Abstract:
Aim: This randomized, controlled, clinical study compared two surgical techniques for root coverage with the acellular dermal matrix graft (ADMG) to evaluate which procedure provides better root coverage and a greater amount of keratinized tissue. Materials and Methods: Fifteen pairs of bilateral Miller Class I or II gingival recessions were treated; in each pair, one recession was randomly assigned to the test group and the contralateral recession to the control group. The ADMG was used in both groups. In the control group, the graft and flap were positioned at the level of the cemento-enamel junction (CEJ); in the test group, the graft was positioned 1 mm apical to the CEJ and the flap 1 mm coronal to the CEJ. Clinical parameters were recorded before surgery and after 6 months. The gingival recession area, a new parameter, was measured on standardized photographs using a dedicated device and software. Results: There were statistically significant differences favouring the proposed technique for all parameters except the amount of keratinized tissue at 6 months. Conclusions: The proposed test technique is more suitable for root coverage procedures with ADMG, and the new parameter evaluated appears valuable for root coverage analysis. (ClinicalTrials.gov Identifier: NCT01175720).
Abstract:
OBJECTIVE: We present a prospective study of a cyclosporin microemulsion used to treat idiopathic nephrotic syndrome in ten children with normal renal function who had cyclosporin trough levels between 50 and 150 ng/ml and achieved complete remission with cyclosporin. The aim was to compare the pharmacokinetic parameters of cyclosporin in idiopathic nephrotic syndrome during remission and relapse of the nephrotic state. METHOD: The pharmacokinetic profile of cyclosporin was evaluated as the 12-hour area under the time-concentration curve (AUC0-12) using seven time-point samples. This procedure was performed for each patient during remission and relapse with the same cyclosporin dose in mg/kg/day. The 12-hour area under the time-concentration curve was calculated using the trapezoidal rule. All pharmacokinetic parameters and the abbreviated 4-hour area under the time-concentration curve were correlated with the 12-hour area under the time-concentration curve. ClinicalTrials.gov: NCT01616446. RESULTS: There were no significant differences in any pharmacokinetic parameter of cyclosporin between remission and relapse, even when the data were normalized by dose. The best correlation with the 12-hour area under the time-concentration curve was obtained with the 4-hour area under the time-concentration curve, in both remission and relapse, followed by the 2-hour post-dose cyclosporin level (C2) in both disease states. CONCLUSIONS: These data indicate that the same parameters used for cyclosporin therapeutic monitoring estimated during the nephrotic state can also be used during remission. Larger controlled studies are needed to confirm these findings.
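As a minimal sketch of the trapezoidal AUC0-12 calculation described above, the code below sums trapezoid areas over successive time-concentration pairs; the seven sampling times and concentration values are hypothetical placeholders, not data from the study.

def trapezoidal_auc(times_h, concs_ng_ml):
    # Linear trapezoidal rule: sum of 0.5 * (C_i + C_{i+1}) * (t_{i+1} - t_i)
    return sum(
        0.5 * (concs_ng_ml[i] + concs_ng_ml[i + 1]) * (times_h[i + 1] - times_h[i])
        for i in range(len(times_h) - 1)
    )

# Hypothetical seven time-point profile over 12 hours (concentrations in ng/ml)
times = [0, 1, 2, 4, 6, 8, 12]
concs = [95, 620, 540, 310, 190, 130, 90]
auc_0_12 = trapezoidal_auc(times, concs)            # full 12-hour AUC
auc_0_4 = trapezoidal_auc(times[:4], concs[:4])     # abbreviated 4-hour AUC
print(auc_0_12, auc_0_4)                            # units: ng*h/ml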
Abstract:
Background. Organ transplant recipients with refractory rejection or intolerance to the prescribed immunosuppressant may respond to rescue therapy with tacrolimus. We sought to evaluate the clinical outcomes of children undergoing heart transplantation who required conversion from cyclosporine-based, steroid-free therapy to a tacrolimus-based regimen. Methods. We performed a prospective, observational, cohort study of 28 children who underwent conversion from cyclosporine-based, steroid-free therapy to tacrolimus-based therapy for refractory or late rejection or intolerance to cyclosporine. Results. There was complete resolution of refractory rejection episodes and adverse side effects in all patients. The incidence rate (x100) of rejection episodes before and after conversion was 7.98 and 2.11, respectively (P <= .0001). Mortality among patients on tacrolimus was 25% after a mean period of 60 months after conversion. Conclusion. Tacrolimus is effective as rescue therapy for refractory rejection and is a therapeutic option for pediatric patients.
Abstract:
We performed a comparative study evaluating cellular infiltrates and anti-inflammatory cytokine production at different time points after syngeneic or allogeneic skin transplantation. We observed early IL-10 production in syngeneic grafts compared with allografts. This observation prompted us to investigate the role of IL-10 in isograft acceptance. To do so, we used IL-10 KO and WT mice to perform syngeneic transplantation in which IL-10 was absent from either the graft or the recipient. The majority of syngeneic grafts derived from IL-10 KO donors did not engraft or were only partially accepted, whereas IL-10 KO mice transplanted with skin from WT donors accepted the graft. We evaluated the IL-10 producers in the transplanted skin and observed that epithelial cells were the major source. Taken together, our data show that production of IL-10 by donor cells, but not by the recipient, is a determinant of graft acceptance and strongly suggest that production of this cytokine by keratinocytes immediately upon transplantation is necessary for isograft survival. J. Leukoc. Biol. 92: 259-264; 2012.
Abstract:
BACKGROUND: The characteristics of blood recipients, including the diagnoses associated with transfusion and posttransfusion survival, have not been reported in Brazil. The goals of this analysis were: 1) to describe blood utilization according to clinical diagnosis and patient characteristics and 2) to determine the factors associated with survival of blood recipients. STUDY DESIGN AND METHODS: A retrospective cross-sectional analysis was conducted on all inpatients in 2004. Data came from three sources: the first two files consisted of data on patient characteristics, clinical diagnosis, and transfusion, and were used to compare transfused and nontransfused patients; the third file was used to determine recipient survival up to 3 years after transfusion. Logistic regression was conducted among transfused patients to examine characteristics associated with survival. RESULTS: In 2004, a total of 30,779 patients were admitted, of whom 3835 (12.4%) were transfused. These patients had 10,479 transfusion episodes comprising 39,561 transfused components: 16,748 (42%) red blood cells, 15,828 (40%) platelets (PLTs), and 6190 (16%) plasma. The median number of components transfused was three (range, 1-656) per patient admission. In-hospital mortality differed between admissions with and without transfusion (24% vs. 4%). After 1 year, 56% of transfusion recipients were alive. In the multivariable model of factors associated with mortality after transfusion, the most significant factors in descending order were hospital ward, increasing age, increasing number of components transfused, and type of components received. CONCLUSION: Ward and transfusion are markers of underlying medical conditions and are associated with the probability of survival. PLT transfusions are common and likely reflect the types of patients treated. This comprehensive blood utilization study, the first of its kind in Brazil, can help in developing transfusion policy analyses in South America.
Abstract:
Background: The sieve analysis for the Step trial found evidence that breakthrough HIV-1 sequences from MRKAd5/HIV-1 Gag/Pol/Nef vaccine recipients were more divergent from the vaccine insert than placebo sequences in regions with predicted epitopes. We linked the viral sequence data with immune response and acute viral load data to explore mechanisms for, and consequences of, the observed sieve effect. Methods: Ninety-one male participants (37 placebo and 54 vaccine recipients) were included; viral sequences were obtained at the time of HIV-1 diagnosis. T-cell responses were measured 4 weeks after the second vaccination and at the first or second week post-diagnosis. Acute viral load was obtained at RNA-positive, antibody-negative visits. Findings: Vaccine recipients had a greater magnitude of post-infection CD8+ T-cell response than placebo recipients (median 1.68% vs 1.18%; p = 0.04) and greater breadth of post-infection response (median 4.5 vs 2; p = 0.06). Viral sequences from vaccine recipients were marginally more divergent from the insert than placebo sequences in regions of Nef targeted by pre-infection immune responses (p = 0.04; Pol p = 0.13; Gag p = 0.89). Magnitude and breadth of pre-infection responses did not correlate with the distance of the viral sequence to the insert (p > 0.50). Acute log viral load trended lower in vaccine versus placebo recipients (estimated mean 4.7 vs 5.1), but the difference was not significant (p = 0.27). Nor was acute viral load associated with the distance of the viral sequence to the insert (p > 0.30). Interpretation: Despite evidence of anamnestic responses, the sieve effect was not well explained by the available measures of T-cell immunogenicity. Sequence divergence from the vaccine was not significantly associated with acute viral load. While point estimates suggested weak vaccine suppression of viral load, the result was not significant, and more viral load data would be needed to detect suppression.
Abstract:
To prevent rejection of kidney transplants, patients must be kept on immunosuppressive therapy for a long time, which includes the use of drugs such as cyclosporine, azathioprine, cyclophosphamide, and prednisone. These drugs reduce the general immune response of transplant patients and thus increase their susceptibility to infections; they also increase the potential for developing lesions. Therefore, oral hygiene in kidney transplant recipients contributes to maintenance of the transplanted organ and its function, and an investigation of oral lesions in this population is warranted. The aim of this study was to investigate oral lesions in a group of 21 kidney transplant patients under immunosuppressive therapy seen over a 1-year period in the Nephrology Department of the Federal University of Sergipe, Brazil. Data related to sex, age, etiology of renal disease, type of renal transplant, time elapsed after transplantation, immunosuppressive treatment, use of concomitant agents, and presence of oral lesions were obtained. All patients received a kidney transplant from a living donor, and the mean posttransplantation follow-up time was 31.6 months; 71.5% used triple immunosuppressive therapy with cyclosporine A, azathioprine, and prednisone. Ten patients were also treated with calcium-channel blockers. Of the 21 transplant patients, 17 (81%) presented oral lesions. Gingival overgrowth was the most common alteration, followed by candidiasis and superficial ulcers. One case of spindle cell carcinoma of the lower lip was observed. The oral cavity can harbor a variety of manifestations related to renal transplantation and immunosuppressive therapy.
Abstract:
Purpose. To report a single-center experience with elective surgical patients as living kidney donors. Methods. We retrospectively analyzed a prospective database of 458 living kidney donors from September 2005 to May 2011. Fifteen of them (3.2%) were elective surgical patients who simultaneously underwent living donor nephrectomy. We reviewed age, gender, operative time, intraoperative blood transfusion, intra- and postoperative complications, and length of hospital stay. Recipients were evaluated for delayed graft function. The 443 patients who underwent living donor nephrectomy alone composed the control group. Results. In the elective surgical patient group, the mean (range) operative time was 155 (90 to 310) minutes and the mean (range) length of hospital stay was 3 (2 to 9) days. One recipient (6.7%) displayed delayed graft function. In the regular living kidney donor group, the mean (range) operative time was 100 (70 to 150) minutes, the mean (range) length of hospital stay was 3 (2 to 5) days, and delayed graft function was observed in 5.6% of recipients. Only operative time (P = .03) differed significantly between the groups. Conclusions. Elective surgical patients are potential living kidney donors whose elective procedure may be performed at the same time as donor nephrectomy.
Abstract:
Objective: The purpose of this study was to investigate the effect of low-level laser therapy (LLLT) on chronic kidney disease (CKD) in a model of unilateral ureteral obstruction (UUO). Background data: Regardless of its etiology, CKD involves progressive widespread tissue fibrosis, tubular atrophy, and loss of kidney function. This process also occurs in kidney allografts. At present, effective therapies for this condition are lacking. We investigated the effects of LLLT on the interstitial fibrosis that occurs after experimental UUO in rats. Methods: The occluded kidney of half of the 32 Wistar rats that underwent UUO received a single intraoperative dose of LLLT (AlGaAs laser, 780 nm, 22.5 J/cm², 30 mW, 0.75 W/cm², 30 s at each of nine points). After 14 days, renal fibrosis was assessed by Sirius red staining under polarized light. Immunohistochemical analyses quantified the renal tissue cells expressing fibroblast (FSP-1) and myofibroblast (alpha-SMA) markers. Reverse transcriptase polymerase chain reaction (RT-PCR) was performed to determine the mRNA expression of interleukin (IL)-6, monocyte chemotactic protein-1 (MCP-1), transforming growth factor (TGF)-beta 1, and Smad3. Results: The UUO animals treated with LLLT had less fibrosis than the untreated UUO animals, as well as decreased expression of inflammatory and pro-fibrotic markers. Conclusions: For the first time, we show that LLLT had a protective effect against renal interstitial fibrosis. It is conceivable that, by attenuating inflammation, LLLT can prevent tubular activation and transdifferentiation, the two processes that mainly drive the renal fibrosis of the UUO model.
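As a consistency check on the irradiation parameters quoted above (an arithmetic note added here, not part of the original abstract), the fluence per point follows directly from the irradiance and exposure time: 0.75 W/cm² x 30 s = 22.5 J/cm². The implied beam area is 0.030 W / 0.75 W/cm² = 0.04 cm², so each of the nine points received approximately 0.030 W x 30 s = 0.9 J of energy.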
Abstract:
OBJECTIVE: The significance of pretransplant donor-specific antibodies for long-term patient outcomes is a subject of debate. This study evaluated the impact of the presence or absence of donor-specific antibodies after kidney transplantation on short- and long-term graft outcomes. METHODS: We analyzed the frequency and dynamics of pretransplant donor-specific antibodies following renal transplantation in a randomized trial conducted from 2002 to 2004 and correlated these findings with patient outcomes through 2009. Transplants were performed against a negative complement-dependent T- and B-cell crossmatch. Pre- and posttransplant sera were available from 94 of the 118 patients (80%). Antibodies were detected using a solid-phase single-bead assay (Luminex®), and all tests were performed simultaneously. RESULTS: Sixteen patients exhibited pretransplant donor-specific antibodies, but only 3 of them (19%) developed antibody-mediated rejection, and 2 of these experienced early graft losses. Excluding these 2 losses, 6 of 14 patients exhibited donor-specific antibodies at the final follow-up exam, whereas 8 (57%) exhibited complete clearance of the donor-specific antibodies. Five other patients developed "de novo" posttransplant donor-specific antibodies. Death-censored graft survival was similar in patients with pretransplant donor-specific and non-donor-specific antibodies after a mean follow-up of 70 months. CONCLUSION: Pretransplant donor-specific antibodies with a negative complement-dependent cytotoxicity crossmatch are associated with a risk of antibody-mediated rejection, although survival rates are similar once patients have passed the first months after receiving the graft. Our data also suggest that early posttransplant donor-specific antibody monitoring should increase knowledge of antibody dynamics and their impact on long-term graft outcome.
Abstract:
Background. Renal transplantation remains the optimal treatment for patients with end-stage renal disease. Urinary lithiasis is an unusual urologic complication of renal transplantation, with an incidence of <1%. Today, recipients of kidneys from deceased donors are more likely to receive grafts with undiagnosed lithiasis; this does not occur with living donors, owing to screening with computerized tomography. Objective. The aim of this study was to evaluate the incidence, diagnosis, and therapeutic management of renal lithiasis in transplanted kidneys at a single institution. Methods. We reviewed the medical records of 1,313 patients who underwent kidney transplantation from February 1968 to February 2011. Results. Among the grafts, 17 (1.29%) had nephrolithiasis; the recipients were 9 women and 8 men, aged 32 to 63 years (mean, 45.6 years). Fifteen patients received kidneys from cadaveric donors and only 2 from living related donors. Two stones, both located inside the ureter, were identified during transplant surgery (11.7%). Three instances of lithiasis were incidentally diagnosed by ultrasound during graft evaluation within 7 days after surgery (17.6%); all 3 were in the calyces. The 12 remaining patients had their stones diagnosed later (70.58%): 6 in the calyces, 3 in the renal pelvis, and 3 inside the ureter. Conclusions. Urinary lithiasis is a rare complication of renal transplantation. In most patients the condition is painless. The diagnostic and treatment options for graft urolithiasis are similar to those for patients with nephrolithiasis in the general population. Extracorporeal shock wave lithotripsy (ESWL) was the most common treatment method.
Abstract:
Introduction. Tricuspid regurgitation (TR) is the most common valvular dysfunction found after heart transplantation (HTx). It may be related to the endomyocardial biopsies (EMB) performed for allograft rejection surveillance. Objective. This investigation evaluated the presence of tricuspid valve tissue fragments obtained during routine EMB performed after HTx and its possible effect on short-term and long-term hemodynamic status. Method. This single-center review included prospectively collected and retrospectively analyzed data. From 1985 to 2010, 417 patients underwent 3550 EMB after HTx. All myocardial specimens were reviewed by 2 observers to identify the presence of tricuspid valve tissue, with a third observer consulted in doubtful cases. Echocardiographic and hemodynamic parameters were considered for valvular functional damage analysis only in cases in which tricuspid tissue was inadvertently removed during EMB. Results. The 417 HTx patients underwent 3550 EMB, yielding 17,550 myocardial specimens. Tricuspid valve tissue was observed in 12 patients (2.9%), corresponding to 0.07% of the removed fragments. Comparison of the echocardiographic and hemodynamic parameters of these patients before and after biopsy showed increased TR in 2 cases (2/12; 16.7%), quantified as moderate and without progression in the long term. Only the right atrial pressure showed a significant increase (P = .0420) after tricuspid injury; however, no patient had clinically significant worsening of functional class, and surgical intervention was not required. Conclusions. Histological evidence of chordal tissue in EMB specimens is a real-world problem of relatively low frequency. Traumatic tricuspid valve injury due to EMB rarely leads to severe valvular regurgitation, and only a minority of patients develop significant clinical symptoms. Hemodynamic and echocardiographic alterations are likewise uncommon.