152 results for Cadaveric Kidney-transplantation
Abstract:
Background. Vascular calcification (VC) is commonly seen in patients with chronic kidney disease (CKD). Elevated levels of phosphate and parathormone (PTH) are considered nontraditional risk factors for VC. It has been shown that, in vitro, phosphate transforms vascular smooth muscle cells (VSMCs) into calcifying cells, evidenced by upregulated expression of runt-related transcription factor 2 (Runx2), whereas PTH is protective against VC. In addition, Runx2 has been detected in calcified arteries of CKD patients. However, the in vivo effect of phosphate and PTH on Runx2 expression remains unknown. Methods. Wistar rats were submitted to parathyroidectomy, 5/6 nephrectomy (Nx) and continuous infusion of 1-34 rat PTH (at physiological or supraphysiological rates) or were sham-operated. Diets varied only in phosphate content, which was low (0.2%) or high (1.2%). Biochemical, histological, immunohistochemical and immunofluorescence analyses were performed. Results. Nephrectomized animals receiving high-PTH infusion presented VC regardless of the phosphate intake level. However, phosphate overload combined with normal PTH infusion induced phenotypic changes in VSMCs, as evidenced by upregulated aortic expression of Runx2. High-PTH infusion promoted histological changes in the expression of osteoprotegerin and type I collagen in calcified arteries. Conclusions. Phosphate, by itself, is a potential pathogenic factor for VC. Notably, phosphate overload, even without VC, was associated with overexpression of Runx2 in VSMCs. The mineral imbalance often seen in patients with CKD should be corrected.
Abstract:
Objective. Endomyocardial biopsy (EMB), which is used to monitor for rejection, may cause tricuspid regurgitation (TR) after orthotopic heart transplantation (OHT). The purpose of this investigation was to examine the occurrence of tricuspid valve tissue in myocardial specimens obtained by routine EMB performed after OHT. Patients and Methods. From January 2000 to July 2008, 125 of the patients who underwent OHT survived more than 1 month. Their follow-up varied from 1 month to 8.5 years (mean, 5.1 +/- 3.7 years). EMB was the gold-standard examination, and myocardial scintigraphy with gallium served as a screen to routinely monitor rejection. Results. A total of 428 EMB, each including 4 to 7 fragments and totaling 1715 fragments, were reviewed for this study. The number of EMB per patient varied from 3 to 8 (mean, 4.6 +/- 3.5). Histopathological analysis of these fragments showed tricuspid tissue in 4 patients (3.2%), among whom only 1 showed aggravation of TR. Conclusions. EMB remains the standard method to diagnose rejection after OHT. It can be performed with low risk. Reducing the number of EMB by using gallium myocardial scintigraphy or other alternative methods, as well as adopting special care during the biopsy, can significantly minimize trauma to the tricuspid valve.
Abstract:
Objective. Arrhythmogenic right ventricular dysplasia (ARVD) is a myocardial disease of familial origin in which the myocardium is replaced by fibrofatty tissue, predominantly in the right ventricle. Herein we present the clinical courses of 4 patients with ARVD who underwent orthotopic heart transplantation. Patients and Methods. Among 358 adult patients undergoing heart transplantation, 4 (1.1%) displayed ARVD. The main indication for transplantation was the progression to heart failure associated with arrhythmias. All 4 patients displayed rapid, severe courses leading to heart failure with left ventricular involvement and uncontrolled arrhythmias. Results. In all cases the transplantation was performed using a bicaval technique with prophylactic tricuspid valve annuloplasty. One patient developed hyperacute rejection and infection, leading to death on the 7th day after surgery. The other 3 cases showed a good evolution with clinical remission of the symptoms. Pathological study of the explanted hearts confirmed the presence of the disease. Conclusions. ARVD is a serious cardiomyopathy that can lead to malignant arrhythmias, severe ventricular dysfunction with right ventricular predominance, and sudden cardiac death. Orthotopic heart transplantation must always be considered in advanced cases of ARVD with malignant arrhythmias or refractory congestive heart failure, with or without uncontrolled arrhythmias, because it is the only way to remit the symptoms and the disease.
Abstract:
Background and objectives Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD. Design, setting, participants, & measurements Seventy-two nondialyzed CKD patients (age 52 +/- 11.7 years, 70% male, 42% diabetics, creatinine clearance 40.4 +/- 18.2 ml/min per 1.73 m(2)) were studied. VBD and CAC were quantified by computed tomography. Results CAC > 10 Agatston units (AU) was observed in 50% of the patients (median 120 AU [interquartile range 32 to 584 AU]), and a calcification score >= 400 AU was found in 19% (736 [527 to 1012] AU). VBD (190 +/- 52 Hounsfield units) correlated inversely with age (r = -0.41, P < 0.001) and calcium score (r = -0.31, P = 0.01), and no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, or diabetes. Patients in the lowest tertile of VBD had a markedly increased calcium score in comparison to the middle and highest tertile groups. In the multiple logistic regression analysis adjusting for confounding variables, low VBD was independently associated with the presence of CAC. Conclusions Low VBD was associated with CAC in nondialyzed CKD patients. The authors suggest that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD. Clin J Am Soc Nephrol 6: 1456-1462, 2011. doi: 10.2215/CJN.10061110
Abstract:
Purpose: The aim of this study was to evaluate the relationship between the timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries, enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). RRT timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea > 24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine > 309 µmol/L vs 71.4% for creatinine <= 309 µmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and hospital stay and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
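The crude odds ratios and 95% confidence intervals quoted in abstracts like the one above follow the standard 2x2-table calculation with a Woolf (log) confidence interval. A minimal sketch, using hypothetical cell counts for illustration only (not the study's actual data):

```python
import math

def crude_odds_ratio(a, b, c, d):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = deaths among exposed,   b = survivors among exposed,
    c = deaths among unexposed, d = survivors among unexposed."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) is sqrt of the summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 120/200 deaths in the "early" stratum,
# 130/200 deaths in the "late" stratum
or_, lo, hi = crude_odds_ratio(120, 80, 130, 70)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that straddles 1.0, as here, corresponds to a non-significant crude difference like the urea stratification reported above.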
Abstract:
Anorectal transplantation is a valid procedure for the treatment of anorectal dysfunction; however, the lack of a suitable animal model has hampered the development of this method. We describe a simple technique for anorectal transplantation in the rat and compare this procedure with colostomy. The anorectal segment, including the skin surrounding the anus, was freed by abdominal and perineal dissection. In a heterotopically transplanted group the segment was exteriorized by the formation of an anus through an abdominal incision. In an orthotopically transplanted group the segment was replaced in its original position and reimplanted by suturing. In another group a distal colostomy was performed. A sham-treated control group (simulated surgical procedure) was also included. Changes in behavior, characteristics of the stool, body weight and survival rate were assessed by daily clinical examination. Moribund animals, those with a weight loss of more than 30%, and those surviving at 1 month were killed by an overdose of anesthetic. The results were analyzed using the Mann-Whitney, Student's t and chi-squared tests, and p < 0.05 was considered significant. Within 4 days after the operation, animals submitted to orthotopic or heterotopic transplantation had achieved normal defecation, body weight gain and a clinical evolution similar to the sham-treated group. The overall mortality in these groups was 4.16%. In contrast, colostomized animals showed a high incidence of diarrhea, intestinal obstruction, stress posture and violent behavior (p <= 0.05), and a mortality rate of 58.33%. Autotransplantation in the rat is a simple technique that achieves a high rate of success and a better clinical evolution than colostomy. This model may ultimately enable further research into anorectal transplantation.
Abstract:
Eight hundred and seventy-nine patients with acute kidney injury (AKI) were retrospectively studied over one year and eleven months to evaluate urine volume as a risk factor for death. They were divided into five groups according to the 24 h urine volume (UV): anuric (UV <= 50 mL/24 h, group 1), oliguric (UV > 50 mL/24 h and < 400 mL/24 h, group 2), and non-oliguric (UV >= 400 mL/24 h). The non-oliguric group was subdivided into three subgroups: UV >= 400 mL/24 h and <= 1000 mL/24 h (group 3, reference group), UV > 1000 mL/24 h and <= 2000 mL/24 h (group 4), and UV > 2000 mL/24 h (group 5). A linear tendency test (Mantel extension) showed a significant increase in mortality with decreasing UV (p < 0.001), confirmed by multivariate analysis. Anuric and oliguric patients had a 95% and 76% higher risk of death, respectively, compared to the reference group (p < 0.05). Patients in groups 4 and 5 presented a reduced risk of death of 50% and 70%, respectively (p = 0.004 and p = 0.001). In conclusion, urine volume was a strong independent factor for mortality in this cohort of AKI patients.
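The urine-volume stratification above is a simple threshold scheme. A sketch of that grouping logic as stated in the abstract (the function name and the choice of a `>= 400` boundary for group 3 are illustrative, not from the paper):

```python
def uv_group(uv_ml_per_24h):
    """Map a 24 h urine volume (mL) to the study's five groups.
    Group 1: anuric (<= 50); group 2: oliguric (> 50 and < 400);
    groups 3-5: non-oliguric, split at 1000 and 2000 mL/24 h
    (group 3 is the reference group)."""
    if uv_ml_per_24h <= 50:
        return 1
    if uv_ml_per_24h < 400:
        return 2
    if uv_ml_per_24h <= 1000:
        return 3
    if uv_ml_per_24h <= 2000:
        return 4
    return 5

print([uv_group(v) for v in (30, 200, 800, 1500, 2500)])
```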
Abstract:
Hepatic artery thrombosis (HAT) is the main cause of graft loss in pediatric living-related liver transplantation (LTx). Revascularization of the graft by thrombectomy and re-anastomosis has been reported to be effective for graft salvage in cases of HAT and should be attempted when potential donors are not available for emergency re-transplantation. Immediate complications secondary to revascularization attempts in cases of HAT have not been described. Late complications are mainly related to biliary tree ischemia. We report the case of a child who experienced intimal hepatic artery dissection, which extended into the intra-hepatic branches of the artery, after a thrombectomy with a Fogarty balloon catheter in an attempt to restore arterial flow after HAT. This complication led to acute deterioration of the graft and the need for emergency re-transplantation.
Abstract:
Recently, even mild AKI has been considered a risk factor for mortality in different scenarios. We conducted a retrospective analysis of the risk factors for two distinct definitions of AKI after elective repair of aortic aneurysms. Logistic regression was carried out to identify independent risk factors for AKI (defined as a >= 25% or >= 50% increase in baseline SCr within 48 h after surgery; AKI 25% and AKI 50%, respectively) and for mortality. Of the 77 patients studied (mean age 68 +/- 10, 83% male), 57% developed AKI 25% and 33.7% developed AKI 50%. There were no differences between the AKI and control groups regarding comorbidities and diameter of the aneurysms. However, AKI patients more frequently needed supra-renal aortic cross-clamping and were more severely ill. Overall in-hospital mortality was 27.3%, and it was markedly higher in those requiring supra-renal aortic cross-clamping. The risk factors for AKI 25% were supra-renal aortic cross-clamping (odds ratio 5.51, 95% CI 1.05-36.12, p = 0.04) and duration of operation (OR 6.67, 95% CI 2.23-19.9, p < 0.001). For AKI 50%, in addition to those factors, post-operative use of vasoactive drugs remained an independent factor (OR 6.13, 95% CI 1.64-22.8, p = 0.005). The risk factors associated with mortality were the need for supra-renal aortic cross-clamping (OR 9.6, 95% CI 1.37-67.88, p = 0.02), development of AKI 50% (OR 8.84, 95% CI 1.31-59.39, p = 0.02), baseline GFR lower than 49 mL/min (OR 17.07, 95% CI 2.00-145.23, p = 0.009), and serum glucose > 118 mg/dL in the post-operative period (OR 19.99, 95% CI 2.32-172.28, p = 0.006). An increase of at least 50% in baseline SCr is a common event after surgical repair of aortic aneurysms, particularly when supra-renal aortic cross-clamping is needed. Along with baseline moderate chronic renal failure, AKI is an independent factor contributing to the high mortality found in this scenario.
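The two AKI definitions used above reduce to a check on the relative rise in serum creatinine. A minimal sketch of that classification (the function name is illustrative; the cutoffs come from the abstract):

```python
def aki_definition(baseline_scr, postop_scr):
    """Classify AKI by the relative rise in serum creatinine (SCr)
    within 48 h of surgery, returning the most severe definition met:
    'AKI 50%' for a >= 50% rise, 'AKI 25%' for a >= 25% rise."""
    rise = (postop_scr - baseline_scr) / baseline_scr
    if rise >= 0.50:
        return "AKI 50%"
    if rise >= 0.25:
        return "AKI 25%"
    return "no AKI"

# e.g. a baseline SCr of 1.0 mg/dL rising to 1.3 mg/dL is a 30% increase
print(aki_definition(1.0, 1.3))
```

Note that every AKI 50% patient also meets the AKI 25% criterion, which is why the 57% and 33.7% incidences above are nested rather than additive.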
Abstract:
Acute kidney injury (AKI) is now well recognized as an independent risk factor for increased morbidity and mortality, particularly when dialysis is needed. Although renal replacement therapy (RRT) has been used in AKI for more than five decades, there is no standard methodology to predict which AKI patients will need dialysis and which will recover renal function without requiring dialysis. The lack of consensus on what parameters should guide the decision to start dialysis has led to wide variation in dialysis utilization. A contributing factor is the lack of studies in the modern era evaluating the relationship between the timing of dialysis initiation and outcomes. Although listed as one of the top priorities in research on AKI, timing of dialysis initiation has not been included as a factor in large, randomized controlled trials in this area. In this review we discuss the criteria that have been used to define early vs. late initiation in previous studies on dialysis initiation. In addition, we propose a patient-centered approach to defining early and late initiation that could serve as a framework for managing patients and for future studies in this area.
Abstract:
Background/Aims: The aim of this study was to compare the splanchnic non-hepatic hemodynamics and the metabolic changes during orthotopic liver transplantation between the conventional-with-bypass and the piggyback methods. Methodology: A prospective, consecutive series of 59 primary transplants was analyzed. Oxygen consumption and glucose, potassium, and lactate metabolism were quantitatively estimated from blood samples from the radial artery and portal vein, collected up to 120 minutes after graft reperfusion. Mean arterial pressure, portal venous pressure, portal venous blood flow, and splanchnic vascular resistance were also measured or calculated at the postreperfusion collection times. Results: There was a greater increase in portal venous blood flow (p=0.05) and a lower splanchnic vascular resistance (p=0.04) in the piggyback group. Mean arterial pressure and portal venous pressure were similar in both groups. Oxygen, glucose and potassium consumption were higher in the piggyback group, but none of the metabolic parameters differed significantly between groups. Conclusions: The study detected a higher portal venous blood flow and a lower splanchnic vascular resistance associated with the piggyback technique. After graft reperfusion, no difference in the splanchnic non-hepatic metabolic parameters was observed between the conventional-with-bypass and the piggyback methods of orthotopic liver transplantation.
Abstract:
Aim: A positive effect of liver transplantation on health-related quality of life (HRQOL) has been well documented in previous studies using generic instruments. Our aim was to re-evaluate different aspects of HRQOL before and after liver transplantation with a relatively new questionnaire, the "Liver Disease Quality of Life" (LDQOL) instrument. Methods: The LDQOL and the Short Form 36 (SF-36) questionnaires were applied to ambulatory patients, either on the transplant list (n=65) or 6 months to 5 years after liver transplant (n=61). The aetiology of cirrhosis, comorbidities, model for end-stage liver disease (MELD) and Child-Pugh scores, and recurrence of liver disease after liver transplantation were analysed using the Mann-Whitney and Kruskal-Wallis tests. Results: In patients awaiting liver transplantation, MELD scores >= 15 and Child-Pugh class C were associated with significantly worse HRQOL on both the SF-36 and the LDQOL questionnaires. HRQOL in pretransplant patients was significantly worse in those with cirrhosis owing to hepatitis C (n=30) when compared with other aetiologies (n=35) in 2/7 domains of the SF-36 and in 7/12 domains of the LDQOL. Significant deterioration of HRQOL after recurrence of hepatitis C post-transplant was detected with the LDQOL questionnaire, although not demonstrated with the SF-36. The statistically significant differences were in the LDQOL domains symptoms of liver disease, concentration, memory and health distress. Conclusions: The LDQOL, a specific instrument for measuring HRQOL, showed greater accuracy in relation to liver symptoms and could demonstrate, with better reliability, impairments before and after liver transplantation.
Abstract:
A study was carried out to evaluate the feasibility of autologous adipose-derived stem cell (ADSC) transplantation into the urethral walls of female rabbits as an alternative for intrinsic urethral regeneration. The inguinal fat pads of 12 adult female New Zealand rabbits were harvested and processed to obtain the stromal vascular fraction (SVF). The SVF was plated to isolate ADSCs. Before urethral injection, cells were labeled with the DiI marker. The urethral wall was injected with 1 x 10(7) autologous cells or saline (sham). The urethra was harvested at 2, 4, and 8 weeks to identify DiI-labeled cells. At 2 and 4 weeks, the ADSCs formed a nodule localized in the urethral submucosa. At 8 weeks, the ADSCs had spread from the initial injection site and integrated with the urethral wall. This is the first study to demonstrate successful autologous ADSC transplantation of this kind. It confirms that ADSCs can survive and integrate within the urethral wall.
Abstract:
IRI is closely related to sepsis in the ITx setting. A complete understanding of the mechanisms involved in IRI development may improve outcomes. Orthotopic ITx without immunosuppression was performed in order to characterize IRI-associated mucosal damage. Twenty pigs underwent ITx. Two groups were assigned to different CI times: G1, 90 min, and G2, 180 min. Euro-Collins was used as the preservation solution. Jejunal fragments were collected at donor laparotomy, 30 min, and 3 days after reperfusion. IRI assessment involved histopathologic analysis, quantification of MPO-positive cells through immunohistochemical studies, quantification of epithelial apoptotic cells using TUNEL staining, and quantification of IL-6, ET-1, Bak, and Bcl-XL gene expression by RT-PCR. Neutrophilic infiltration increased in a similar fashion in both groups, but lasted longer in G2. Apoptosis detected by TUNEL staining increased, and expression of the anti-apoptotic gene Bcl-XL decreased significantly, in G1 3 days after surgery. Endothelin-1 and IL-6 gene expression increased 30 min after the procedure and returned to baseline 3 days after surgery. In conclusion, IL-6 and ET-1 are involved early in the development of intestinal IRI. Apoptosis was more frequently detected in G1 grafts by both TUNEL staining and RT-PCR.