997 results for renal fibrosis
Abstract:
Objectives We sought to determine whether the quantitative assessment of myocardial fibrosis (MF), either by histopathology or by contrast-enhanced magnetic resonance imaging (ce-MRI), could help predict long-term survival after aortic valve replacement. Background Severe aortic valve disease is characterized by progressive accumulation of interstitial MF. Methods Fifty-four patients scheduled to undergo aortic valve replacement were examined by ce-MRI. Delayed-enhanced images were used for the quantitative assessment of MF. In addition, interstitial MF was quantified by histological analysis of myocardial samples obtained during open-heart surgery and stained with picrosirius red. The ce-MRI study was repeated 27 +/- 22 months after surgery to assess left ventricular functional improvement, and all patients were followed for 52 +/- 17 months to evaluate long-term survival. Results There was a good correlation between the amount of MF measured by histopathology and by ce-MRI (r = 0.69, p < 0.001). In addition, the amount of MF demonstrated a significant inverse correlation with the degree of left ventricular functional improvement after surgery (r = -0.42, p = 0.04 for histopathology; r = -0.47, p = 0.02 for ce-MRI). Kaplan-Meier analyses revealed that higher degrees of MF accumulation were associated with worse long-term survival (chi-square = 6.32, p = 0.01 for histopathology; chi-square = 5.85, p = 0.02 for ce-MRI). On multivariate Cox regression analyses, patient age and the amount of MF were found to be independent predictors of all-cause mortality. Conclusions The amount of MF, either by histopathology or by ce-MRI, is associated with the degree of left ventricular functional improvement and all-cause mortality late after aortic valve replacement in patients with severe aortic valve disease. (J Am Coll Cardiol 2010; 56: 278-87) (c) 2010 by the American College of Cardiology Foundation
Abstract:
Background: We tested the hypothesis that the universal application of myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification in renal transplant candidates (RTC) compared with SPECT being restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n=363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 patients (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modestly increased risk of events (log-rank p = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [95% CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank p = 0.008; HR=3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank p = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank p = 0.02; HR=3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting death by any cause in RTCs and for selecting patients for invasive coronary testing.
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach to discontinuing continuous renal replacement therapy may affect patient outcomes; however, there is a lack of information on how and under what conditions continuous renal replacement therapy is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) than patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine. The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of stopping continuous renal replacement therapy was the most important predictor of successful discontinuation, especially when it occurred without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
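As a rough illustration of what the reported odds ratios imply, the sketch below converts them into multiplicative changes in the odds of successful discontinuation. It uses only the per-unit odds ratios quoted in the abstract; the model intercept is not reported, so absolute probabilities cannot be recovered, and the example deltas (1000 mL/day, 100 μmol/L) are hypothetical.

```python
# Sketch: interpreting the per-unit odds ratios reported in the abstract.
# The intercept of the logistic model is not published, so only relative
# odds (not absolute probabilities) can be computed from these numbers.

def odds_multiplier(or_per_unit: float, delta_units: float) -> float:
    """Multiplicative change in odds for a change of delta_units predictor units."""
    return or_per_unit ** delta_units

# Urine output: OR = 1.078 per 100 mL/day increase.
# A hypothetical extra 1000 mL/day is 10 units of 100 mL/day.
urine_factor = odds_multiplier(1.078, 10)   # ~2.12: odds roughly double

# Creatinine: OR = 0.996 per umol/L increase.
# A hypothetical extra 100 umol/L lowers the odds of success.
creatinine_factor = odds_multiplier(0.996, 100)  # ~0.67: odds fall by about a third

print(round(urine_factor, 2), round(creatinine_factor, 2))
```

Because logistic-regression coefficients are additive on the log-odds scale, the odds ratio for a multi-unit change is simply the per-unit odds ratio raised to the number of units.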
Abstract:
OBJECTIVE. The purposes of this study were to use the myocardial delayed enhancement technique of cardiac MRI to investigate the frequency of unrecognized myocardial infarction (MI) in patients with end-stage renal disease, to compare the findings with those of ECG and SPECT, and to examine factors that may influence the utility of these methods in the detection of MI. SUBJECTS AND METHODS. We prospectively performed cardiac MRI, ECG, and SPECT to detect unrecognized MI in 72 patients with end-stage renal disease at high risk of coronary artery disease but without a clinical history of MI. RESULTS. Fifty-six patients (78%) were men (mean age, 56.2 +/- 9.4 years) and 16 (22%) were women (mean age, 55.8 +/- 11.4 years). The mean left ventricular mass index was 103.4 +/- 27.3 g/m², and the mean ejection fraction was 60.6% +/- 15.5%. Myocardial delayed enhancement imaging depicted unrecognized MI in 18 patients (25%). ECG findings were abnormal in five patients (7%), and SPECT findings were abnormal in 19 patients (26%). ECG findings were false-negative in 14 cases and false-positive in one case. The accuracy, sensitivity, and specificity of ECG were 79.2%, 22.2%, and 98.1% (p = 0.002). SPECT findings were false-negative in six cases and false-positive in seven cases. The accuracy, sensitivity, and specificity of SPECT were 81.9%, 66.7%, and 87.0% (not significant). During a period of 4.9-77.9 months, 19 cardiac deaths were documented, but no statistical significance was found in survival analysis. CONCLUSION. Cardiac MRI with myocardial delayed enhancement can depict unrecognized MI in patients with end-stage renal disease. ECG and SPECT had low sensitivity in the detection of MI. Infarct size and left ventricular mass can influence the utility of these methods in the detection of MI.
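The accuracy, sensitivity, and specificity figures above can be reconstructed from the counts given in the abstract, taking delayed enhancement (MI in 18 of 72 patients) as the reference standard; a minimal sketch:

```python
def diagnostics(tp: int, fn: int, tn: int, fp: int) -> tuple:
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Reference standard: delayed enhancement, MI in 18 of 72 patients.
# ECG: 14 false negatives -> 4 true positives; 1 false positive -> 53 true negatives.
ecg = diagnostics(tp=4, fn=14, tn=53, fp=1)
# SPECT: 6 false negatives -> 12 true positives; 7 false positives -> 47 true negatives.
spect = diagnostics(tp=12, fn=6, tn=47, fp=7)

print(f"ECG:   sens {ecg[0]:.1%}, spec {ecg[1]:.1%}, acc {ecg[2]:.1%}")
print(f"SPECT: sens {spect[0]:.1%}, spec {spect[1]:.1%}, acc {spect[2]:.1%}")
```

These counts reproduce the abstract's figures (ECG 22.2%/98.1%/79.2%; SPECT 66.7%/87.0%/81.9%), which is a useful consistency check on the reported confusion matrices.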
Abstract:
Purpose: The aim of this study is to evaluate the relationship between timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). Renal replacement therapy timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea > 24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine > 309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and stay in hospital and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Burkholderia cepacia complex isolates obtained by microbiological culture of respiratory samples from Brazilian CF patients were studied by recA-based PCR, screened by specific PCR for virulence markers, and genotyped by RAPD. Forty-one isolates of B. cepacia complex were identified by culture, with confirmation of identity and genomovar determination obtained for 32 isolates and a predominance of B. cenocepacia (53.1%). Virulence markers were not consistently found among isolates. Genotyping did not identify identical patterns among different patients. B. cenocepacia was the most prevalent B. cepacia complex member among our patients, and cross-infection does not seem to occur among them. (c) 2008 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.
Abstract:
Hepatitis C virus (HCV) is a major cause of hepatic disease and of liver transplantation worldwide. Mannan-binding lectin (MBL), encoded by the MBL2 gene, can have an important role as an opsonin and complement-activating molecule in HCV persistence and liver injury. We assessed the MBL2 polymorphism in 102 Euro-Brazilian patients with moderate and severe chronic hepatitis C, paired for gender and age with 102 HCV-seronegative healthy individuals. Six common single nucleotide polymorphisms in the MBL2 gene, three in the promoter (H/L, X/Y and P/Q) and three in exon 1 (A, the wild-type, and B, C or D, also known as O), were evaluated using real-time polymerase chain reaction with fluorescent hybridization probes. The concentration of MBL in plasma was measured by enzyme-linked immunosorbent assay. The frequency of the YA/YO genotype was significantly higher in the HCV patients compared with the controls (P = 0.022). On the other hand, the genotypes associated with low levels of MBL (XA/XA, XA/YO and YO/YO) were decreased significantly in the patients with severe fibrosis (stage F4), when compared with the patients with moderate fibrosis (stage F2) (P = 0.04) and to the control group (P = 0.011). Furthermore, MBL2 genotypes containing X or O mutations were found to be associated with non-responsiveness to peginterferon and ribavirin treatment (P = 0.023). MBL2 polymorphisms may therefore be associated not only with the development of chronic hepatitis C, but also with its clinical evolution and response to treatment.
Abstract:
Acute kidney injury (AKI) is now well recognized as an independent risk factor for increased morbidity and mortality, particularly when dialysis is needed. Although renal replacement therapy (RRT) has been used in AKI for more than five decades, there is no standard methodology to predict which AKI patients will need dialysis and who will recover renal function without requiring dialysis. The lack of consensus on what parameters should guide the decision to start dialysis has led to a wide variation in dialysis utilization. A contributing factor is the lack of studies in the modern era evaluating the relationship of timing of dialysis initiation and outcomes. Although listed as one of the top priorities in research on AKI, timing of dialysis initiation has not been included as a factor in large, randomized controlled trials in this area. In this review we discuss the criteria that have been used to define early vs. late initiation in previous studies on dialysis initiation. In addition, we propose a patient-centered approach to defining early and late initiation that could serve as a framework for managing patients and for future studies in this area.
Abstract:
In animal models, interstitial angiotensin II (ang II) and the AT1 receptor (AT1R) are key mediators of renal inflammation and fibrosis in progressive chronic nephropathies. We hypothesized that these molecules were overexpressed in patients with progressive glomerulopathies. In this observational retrospective study, we described the expression of ang II and AT1R by immunohistochemistry in kidney biopsies of 7 patients with minimal change disease (MCD) and 25 patients with progressive glomerulopathies (PGP). Proteinuria, serum albumin, and serum creatinine were not statistically different between MCD and PGP patients. Total expression of ang II and AT1R was not statistically different between MCD (108.7 +/- 11.5 and 73.2 +/- 13.6 cells/mm², respectively) and PGP patients (100.7 +/- 9.0 and 157.7 +/- 13.8 cells/mm², respectively; p>0.05). Yet, interstitial expression of ang II and AT1R (91.6 +/- 16.0 and 45.6 +/- 5.4 cells/mm², respectively) was higher in patients with PGP than in those with MCD (22.0 +/- 4.1 and 17.9 +/- 2.9 cells/mm², respectively, p<0.05), as was the proportion of interstitial fibrosis (11.0 +/- 0.7% versus 6.1 +/- 1.2%, p<0.05). In patients with MCD, ang II and AT1R expression predominated in the tubular compartment (52% and 36% of the positive cells, respectively). In those with PGP, interstitial expression of ang II and AT1R predominated (58% and 45%, respectively). In conclusion, interstitial expression of ang II and AT1R is increased in patients with progressive glomerulopathies. The relationship between these findings and interstitial fibrosis and disease progression in humans warrants further investigation.
Abstract:
Arteriovenous fistulae involving the renal artery and inferior vena cava are rare. We report the case of a 47-year-old woman with a chronic arteriovenous fistula between the right renal artery and the inferior vena cava due to penetrating trauma. Another finding was a vena cava aneurysm caused by the fistula. The patient was successfully treated with a covered stent in the renal artery. Diagnosis and postoperative control were documented with CT scan. Endovascular techniques may be an effective and minimally invasive option for treatment and renal preservation in renal-cava arteriovenous fistulae.