9 results for predictive regression
Abstract:
INTRODUCTION AND AIMS: Adult orthotopic liver transplantation (OLT) is associated with considerable blood product requirements. The aim of this study was to assess the ability of preoperative information to predict intraoperative red blood cell (RBC) transfusion requirements among adult liver recipients. METHODS: Preoperative variables with previously demonstrated relationships to intraoperative RBC transfusion were identified from the literature: sex, age, pathology, prothrombin time (PT), factor V, hemoglobin (Hb), and platelet count (Plt). These variables were then retrospectively collected from 758 consecutive adult patients undergoing OLT from 1997 to 2007. Relationships between these variables and intraoperative blood transfusion requirements were examined by both univariate analysis and multiple linear regression analysis. RESULTS: Univariate analysis confirmed significant associations between RBC transfusion and PT, factor V, Hb, Plt, pathology, and age (all P < .001). However, stepwise backward multivariate analysis excluded Plt and factor V from the multiple linear regression model. The variables included in the final predictive model were PT, Hb, age, and pathology. Patients suffering from liver carcinoma required more blood products than those suffering from other pathologies. Yet the overall predictive power of the final model was limited (R² = .308; adjusted R² = .30). CONCLUSION: Preoperative variables have limited predictive power for intraoperative RBC transfusion requirements even when significant statistical associations exist, identifying only a small portion of the observed total transfusion variability.
Preoperative PT, Hb, age, and liver pathology appear to be the most significant predictive factors, but other factors, such as severity of liver disease, surgical technique, medical experience in liver transplantation, and other uncontrollable human variables, may play important roles in determining the final transfusion requirements.
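The model described above fits transfusion requirements as a linear combination of preoperative variables by ordinary least squares. A minimal sketch of OLS via the normal equations follows; the predictor values and the outcome are synthetic illustrations, not study data:

```python
# Hedged sketch: multiple linear regression by ordinary least squares,
# solving the normal equations (X'X) beta = X'y by Gaussian elimination.
# All variable values below are hypothetical, not from the study.

def ols_fit(X, y):
    n, p = len(X), len(X[0])
    xtx = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
           for a in range(p)]
    xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    # Gaussian elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    # Back substitution
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, p))) / xtx[r][r]
    return beta

# Columns: intercept, PT (s), Hb (g/dL), age (years) -- hypothetical
X = [[1, 14, 12, 50], [1, 18, 9, 60], [1, 16, 10, 45],
     [1, 20, 8, 70], [1, 15, 11, 55], [1, 17, 10, 65]]
# Synthetic, noise-free outcome so the fit recovers known coefficients
y = [0.5 * r[1] - 0.8 * r[2] + 0.05 * r[3] - 1 for r in X]
beta = ols_fit(X, y)
```

With noise-free synthetic data the fitted coefficients recover the generating values exactly; with real clinical data the residual variance is what drives the low R² reported above.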
Abstract:
Objectives: To characterize the epidemiology and risk factors for acute kidney injury (AKI) after pediatric cardiac surgery in our center, to determine its association with poor short-term outcomes, and to develop a logistic regression model that will predict the risk of AKI for the study population. Methods: This single-center, retrospective study included consecutive pediatric patients with congenital heart disease who underwent cardiac surgery between January 2010 and December 2012. Exclusion criteria were a history of renal disease, dialysis or renal transplantation. Results: Of the 325 patients included (median age three years; range, 1 day to 18 years), AKI occurred in 40 (12.3%) on the first postoperative day. Overall mortality was 4% (13 patients), nine of whom were in the AKI group. AKI was significantly associated with length of intensive care unit stay, length of mechanical ventilation and in-hospital death (p<0.01). Patients' age and postoperative serum creatinine, blood urea nitrogen and lactate levels were included in the logistic regression model as predictor variables. The model accurately predicted AKI in this population, with a maximum combined sensitivity of 82.1% and specificity of 75.4%. Conclusions: AKI is common and is associated with poor short-term outcomes in this setting. Younger age and higher postoperative serum creatinine, blood urea nitrogen and lactate levels were powerful predictors of renal injury in this population. The proposed model could be a useful tool for risk stratification of these patients.
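A sensitivity/specificity pair like the one reported comes from choosing a cutoff on the model's predicted probability of AKI. A minimal sketch of how a cutoff turns probabilities into the two metrics; the probabilities and labels below are invented for illustration:

```python
# Hedged sketch: sensitivity and specificity at a probability cutoff.
# probs/labels are hypothetical, not study data (label 1 = AKI).

def sens_spec(probs, labels, cutoff):
    tp = sum(1 for p, l in zip(probs, labels) if p >= cutoff and l == 1)
    fn = sum(1 for p, l in zip(probs, labels) if p < cutoff and l == 1)
    tn = sum(1 for p, l in zip(probs, labels) if p < cutoff and l == 0)
    fp = sum(1 for p, l in zip(probs, labels) if p >= cutoff and l == 0)
    return tp / (tp + fn), tn / (tn + fp)

probs  = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1, 0.35, 0.15]
labels = [1,   1,   1,   1,   0,   0,   0,   0,   0,    0]
sens, spec = sens_spec(probs, labels, cutoff=0.5)
```

Sweeping the cutoff over all predicted probabilities and maximizing the combined sensitivity and specificity is how a "maximum combined" operating point like 82.1%/75.4% is typically selected.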
Abstract:
INTRODUCTION: The significant risk of sudden arrhythmic death in patients with congestive heart failure and electromechanical ventricular dyssynchrony has led to increased use of combined cardiac resynchronization therapy defibrillator (CRT-D) devices. OBJECTIVES: To evaluate the echocardiographic variables in patients undergoing CRT-D that predict the occurrence of appropriate therapies (AT) for ventricular tachyarrhythmia. METHODS: We analyzed 38 consecutive patients (mean age 60 ± 12 years, 63% male) with echocardiographic evaluation before and 6 months after CRT-D implantation. Patients with AT were identified over a mean follow-up of 471 ± 323 days. A standard echocardiographic study was performed, including tissue Doppler imaging (TDI). Responders were defined as patients with an improvement of ≥1 NYHA class in the first six months, and reverse remodeling as a decrease in left ventricular end-systolic volume of ≥15% and/or an increase in left ventricular ejection fraction of >25%. RESULTS: The responder rate was 74%, and the reverse remodeling rate was 55%. AT occurred in 21% of patients, who presented with greater left ventricular end-diastolic internal diameter (LVEDD) before implantation (86 ± 8 vs. 76 ± 11 mm, p = 0.03) and at 6 months (81 ± 8 vs. 72 ± 14 mm, p = 0.08), and with greater left ventricular end-systolic internal diameter (66 ± 14 vs. 56 ± 14 mm, p = 0.03) and lower ejection fraction (24 ± 6 vs. 34 ± 14%, p = 0.08) at 6 months. In the group with AT, the responder rate was lower (38 vs. 83%, p = 0.03), without significant differences in reverse remodeling (38% for the AT group vs. 60%, p = 0.426) or in the other variables. By univariate analysis, predictors of AT were LVEDD before implantation and E' after implantation. Age, gender, ischemic etiology, use of antiarrhythmic drugs, reverse remodeling and the other echocardiographic parameters did not predict AT.
In multivariate logistic regression analysis, both LVEDD before implantation (OR 1.24, 95% CI 1.04-1.48, p = 0.019) and postimplantation E' (OR 0.27, 95% CI 0.09-0.76, p = 0.014) remained as independent predictors of AT. CONCLUSIONS: In patients undergoing CRT-D, episodes of ventricular tachyarrhythmia occur with high incidence, independently of echocardiographic response, with LVEDD before implantation and E' after implantation as the only independent predictors of AT in the medium-term. These results highlight the importance of combined devices with defibrillation capability.
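The odds ratios quoted for LVEDD and E' are exponentiated logistic regression coefficients, with the confidence interval computed on the log-odds scale and then exponentiated. A small sketch; the coefficient and standard error below are back-calculated to be roughly consistent with the reported LVEDD result, not values taken from the paper:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative beta/se chosen to reproduce OR 1.24 (95% CI 1.04-1.48);
# these are not the study's fitted values.
beta, se = math.log(1.24), 0.090
or_, lo, hi = odds_ratio_ci(beta, se)
```

An OR of 1.24 per millimeter of LVEDD compounds quickly: a 10 mm larger ventricle multiplies the odds of AT by roughly 1.24¹⁰ ≈ 8.6 under this model.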
Abstract:
BACKGROUND: To optimize the noninvasive evaluation of bone remodeling, we evaluated, besides routine serum markers, serum levels of several cytokines involved in bone turnover. METHODS: A transiliac bone biopsy was performed in 47 hemodialysis patients. Serum levels of intact parathyroid hormone (iPTH; 1-84), total alkaline phosphatases (tAP), calcium, phosphate and aluminum (Al) were measured. Circulating levels of interleukin-6 (IL-6), IL-1 receptor antagonist (IL-1Ra) and soluble IL-6 receptor (sIL-6r) were determined using ELISA. Circulating IL-1beta, IL-6, IL-8, IL-10, IL-12p70 and tumor necrosis factor-alpha (TNF-alpha) were simultaneously quantified by flow cytometric immunoassay. RESULTS: Patients with low/normal bone formation rate (L/N-BFR) had significantly lower serum iPTH (p<0.001) and tAP (p<0.008) and significantly higher Al (p<0.025) than patients with high BFR. Serum calcium and phosphorus, however, did not differ (p=NS). An iPTH >300 pg/mL in association with tAP >120 U/L showed low sensitivity (58.8%) and low negative predictive value (44.0%) for the diagnosis of high BFR disease. An iPTH <300 pg/mL in association with normal or low tAP (<120 U/L) was associated with low sensitivity (66.7%) but high specificity (97.1%) for the diagnosis of L/N-BFR. Serum IL-1, IL-6, IL-12p70 and TNF-alpha were positively correlated with BFR, and serum IL-1Ra and IL-10 with bone area; in multiple regression analysis, tAP and IL-6 were independently predictive of BFR. CONCLUSIONS: Significant associations were found between several circulating cytokines and bone histomorphometry in dialysis patients. The usefulness of these determinations in the noninvasive evaluation of bone remodeling needs to be confirmed in larger dialysis populations.
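The sensitivity, specificity and negative predictive value quoted for the iPTH/tAP cutoffs all derive from a single 2x2 table of test result versus biopsy-proven bone formation rate. A minimal sketch with hypothetical counts (not the study's table):

```python
# Hedged sketch: the four standard diagnostic metrics from a 2x2 table.
# tp/fp/fn/tn counts below are hypothetical illustrations.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # detected / all diseased
        "specificity": tn / (tn + fp),  # ruled out / all non-diseased
        "ppv": tp / (tp + fp),          # true positives / test positives
        "npv": tn / (tn + fn),          # true negatives / test negatives
    }

m = diagnostic_metrics(tp=20, fp=1, fn=10, tn=16)
```

Note the pattern in the abstract: a cutoff combination can have high specificity yet low sensitivity (or low NPV), because the four metrics are driven by different cells of the same table.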
Abstract:
INTRODUCTION: A growing body of evidence shows the prognostic value of oxygen uptake efficiency slope (OUES), a cardiopulmonary exercise test (CPET) parameter derived from the logarithmic relationship between O(2) consumption (VO(2)) and minute ventilation (VE) in patients with chronic heart failure (CHF). OBJECTIVE: To evaluate the prognostic value of a new CPET parameter - peak oxygen uptake efficiency (POUE) - and to compare it with OUES in patients with CHF. METHODS: We prospectively studied 206 consecutive patients with stable CHF due to dilated cardiomyopathy - 153 male, aged 53.3±13.0 years, 35.4% of ischemic etiology, left ventricular ejection fraction 27.7±8.0%, 81.1% in sinus rhythm, 97.1% receiving ACE-Is or ARBs, 78.2% beta-blockers and 60.2% spironolactone - who performed a first maximal symptom-limited treadmill CPET, using the modified Bruce protocol. In 33% of patients an implantable cardioverter-defibrillator (ICD) or cardiac resynchronization therapy defibrillator (CRT-D) was implanted during follow-up. Peak VO(2), percentage of predicted peak VO(2), VE/VCO(2) slope, OUES and POUE were analyzed. OUES was calculated using the formula VO(2) (l/min) = OUES × log(10)(VE) + b. POUE was calculated as pVO(2) (l/min) / log(10)peakVE (l/min). Correlation coefficients between the studied parameters were obtained. The prognostic value of each variable, adjusted for age, was evaluated through Cox proportional hazards models, and R2 percent (R2%) and V index (V6) were used as measures of the predictive accuracy for events of each of these variables. Receiver operating characteristic (ROC) curves from logistic regression models were used to determine the cut-offs for OUES and POUE. RESULTS: pVO(2): 20.5±5.9; percentage of predicted peak VO(2): 68.6±18.2; VE/VCO(2) slope: 30.6±8.3; OUES: 1.85±0.61; POUE: 0.88±0.27.
During a mean follow-up of 33.1±14.8 months, 45 (21.8%) patients died, 10 (4.9%) underwent urgent heart transplantation and in three patients (1.5%) a left ventricular assist device was implanted. All variables proved to be independent predictors of this combined event; however, the VE/VCO(2) slope was most strongly associated with events (HR 11.14). In this population, POUE was associated with a higher risk of events than OUES (HR 9.61 vs. 7.01) and was also a better predictor of events (R2%: 28.91 vs. 22.37). CONCLUSION: POUE was more strongly associated with death, urgent heart transplantation and implantation of a left ventricular assist device and proved to be a better predictor of events than OUES. These results suggest that this new parameter can increase the prognostic value of CPET in patients with CHF.
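Per the formula in the abstract, OUES is the slope of a least-squares fit of VO(2) against log10(VE) over the whole exercise test. A minimal sketch with synthetic exercise data (the slope and intercept are chosen arbitrarily, not taken from the study):

```python
import math

# Hedged sketch: OUES as the slope 'a' of VO2 = a*log10(VE) + b,
# fitted by simple least squares. The VE/VO2 samples are hypothetical.

def oues(ve, vo2):
    x = [math.log10(v) for v in ve]
    n = len(x)
    mx, my = sum(x) / n, sum(vo2) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, vo2))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b

ve  = [20, 30, 45, 60, 80, 100]                 # minute ventilation, l/min
vo2 = [1.8 * math.log10(v) - 1.5 for v in ve]   # synthetic O2 uptake, l/min
a, b = oues(ve, vo2)
```

POUE, by contrast, needs only the two peak values (pVO(2) divided by log10 of peak VE), which is why it is simpler to compute than the full regression slope.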
Abstract:
Our purpose was to determine the impact of histological factors observed in zero-time biopsies on early post-transplant kidney allograft function. We specifically wanted to compare the semi-quantitative Banff classification of zero-time biopsies with quantification of percentage cortical area fibrosis. Sixty-three zero-time deceased-donor allograft biopsies were retrospectively scored semi-quantitatively using the Banff classification. By adding the individual chronic parameters, a Banff Chronic Sum (BCS) score was generated. The percentage of cortical area staining with Picro Sirius Red (%PSR) was assessed and calculated with a computer program. A negative linear regression between %PSR and GFR at 3 years post-transplantation was established (Y = 62.08 − 4.6412X; p=0.022). Significant negative correlations were found between GFR at 3 years and arteriolar hyalinosis (rho=-0.375; p=0.005), chronic interstitial damage (rho=-0.296; p=0.02), chronic tubular damage (rho=-0.276; p=0.04), chronic vascular damage (rho=-0.360; p=0.007) and BCS (rho=-0.413; p=0.002). However, no correlation was found between %PSR and Ci, Ct or BCS. In multivariate linear regression, the negative predictive factors of 3-year GFR were BCS in the histological model, and donor kidney age, recipient age and black race in the clinical model. The BCS seems a good and easy-to-perform tool, available to every pathologist, with significant short-term predictive value. %PSR predicts short-term kidney function in univariate analysis but involves extra, time-consuming work beyond routine practice. We think that %PSR should be regarded as a research instrument.
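The rho values above are Spearman rank correlations; for tie-free data the coefficient reduces to 1 − 6·Σd²/(n(n²−1)), where d is the per-pair rank difference. A minimal sketch with invented paired scores (not biopsy data):

```python
# Hedged sketch: Spearman rank correlation for tie-free data,
# using the shortcut formula 1 - 6*sum(d^2)/(n*(n^2-1)).

def spearman_rho(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Perfectly opposite rankings give rho = -1
x = [1, 2, 3, 4, 5]
y = [10, 8, 6, 4, 2]
rho = spearman_rho(x, y)
```

Because only ranks matter, Spearman correlation suits ordinal Banff scores, where the distance between adjacent grades is not meaningful.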
Abstract:
INTRODUCTION: There are several risk scores for stratification of patients with ST-segment elevation myocardial infarction (STEMI), the most widely used of which are the TIMI and GRACE scores. However, these are complex and require several variables. The aim of this study was to obtain a reduced model with fewer variables and similar predictive and discriminative ability. METHODS: We studied 607 patients (mean age 62 years, SD 13; 76% male) who were admitted with STEMI and underwent successful primary angioplasty. Our endpoints were all-cause in-hospital and 30-day mortality. Considering all variables from the TIMI and GRACE risk scores, multivariate logistic regression models were fitted to the data to identify the variables that best predicted death. RESULTS: Compared to the TIMI score, the GRACE score had better predictive and discriminative performance for in-hospital mortality, with similar results for 30-day mortality. After data modeling, the variables with the highest predictive ability were age, serum creatinine, heart failure and the occurrence of cardiac arrest. The new predictive model was compared with the GRACE risk score after internal validation using 10-fold cross-validation. A similar discriminative performance was obtained and some improvement was achieved in the estimates of the probability of death (increased for patients who died and decreased for those who did not). CONCLUSION: It is possible to simplify risk stratification scores for STEMI and primary angioplasty using only four variables (age, serum creatinine, heart failure and cardiac arrest). This simplified model maintained good predictive and discriminative performance for short-term mortality.
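The internal validation step relies on 10-fold cross-validation: the cohort is split into ten folds, each fold serving exactly once as the held-out test set while the model is refitted on the rest. A minimal sketch of the index bookkeeping (sequential fold assignment for clarity; real use would shuffle patient indices first):

```python
# Hedged sketch: k-fold cross-validation index splits.
# n=20 and k=10 are illustrative, not the study's cohort size.

def kfold_indices(n, k):
    folds = [list(range(i, n, k)) for i in range(k)]
    for test in folds:
        test_set = set(test)
        train = [i for i in range(n) if i not in test_set]
        yield train, test

splits = list(kfold_indices(n=20, k=10))
```

Each patient contributes to the test error exactly once, so the cross-validated estimate of discriminative performance is less optimistic than refitting and evaluating on the full cohort.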
Abstract:
OBJECTIVE: Intensive image surveillance after endovascular aneurysm repair is generally recommended due to continued risk of complications. However, patients at lower risk may not benefit from this strategy. We evaluated the predictive value of the first postoperative computed tomography angiography (CTA) characteristics for aneurysm-related adverse events as a means of patient selection for risk-adapted surveillance. METHODS: All patients treated with the Low-Permeability Excluder Endoprosthesis (W. L. Gore & Assoc, Flagstaff, Ariz) at a tertiary institution from 2004 to 2011 were included. First postoperative CTAs were analyzed for the presence of endoleaks, endograft kinking, distance from the lowermost renal artery to the start of the endograft, and for proximal and distal sealing length using center lumen line reconstructions. The primary end point was freedom from aneurysm-related adverse events. Multivariable Cox regression was used to test postoperative CTA characteristics as independent risk factors, which were subsequently used as selection criteria for low-risk and high-risk groups. Estimates for freedom from adverse events were obtained using Kaplan-Meier survival curves. RESULTS: Included were 131 patients. The median follow-up was 4.1 years (interquartile range, 2.1-6.1). During this period, 30 patients (23%) sustained aneurysm-related adverse events. Seal length <10 mm and presence of endoleak were significant risk factors for this end point. Patients were subsequently categorized as low-risk (proximal and distal seal length ≥10 mm and no endoleak, n = 62) or high-risk (seal length <10 mm or presence of endoleak, or both; n = 69). During follow-up, four low-risk patients (3%) and 26 high-risk patients (19%) sustained events (P < .001). Four secondary interventions were required in three low-risk patients, and 31 secondary interventions in 23 high-risk patients. Sac growth was observed in two low-risk patients and in 15 high-risk patients. 
The 5-year estimates for freedom from aneurysm-related adverse events were 98% for the low-risk group and 52% for the high-risk group. In the low-risk group, 81.7 imaging examinations were needed for each diagnosis, compared with 8.2 in the high-risk group. CONCLUSIONS: Our results suggest that the first postoperative CTA provides important information for risk stratification after endovascular aneurysm repair when the Excluder endoprosthesis is used. In patients with adequate seal and no endoleaks, the risk of aneurysm-related adverse events was significantly reduced, meaning that a large number of the imaging examinations performed in this group were unnecessary. Adjusting the imaging protocol beyond 30 days and up to 5 years, based on individual patients' risk, may result in more efficient and rational postoperative surveillance.
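The 5-year freedom-from-event figures are Kaplan-Meier estimates: at each event time the survival probability is multiplied by (1 − events/at-risk), and censored patients leave the risk set without causing a step. A minimal sketch of the product-limit estimator with hypothetical follow-up times (event=1 means an aneurysm-related adverse event, event=0 means censored):

```python
# Hedged sketch: Kaplan-Meier product-limit estimator.
# times/events below are hypothetical follow-up data.

def kaplan_meier(times, events):
    data = sorted(zip(times, events))
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        n = sum(1 for tt, e in data if tt >= t)             # at risk at t
        if d:
            s *= 1 - d / n
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)
    return curve

curve = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 0])
```

Censoring is why the estimator is needed here: patients with short follow-up still inform the early part of the curve without being counted as event-free for five years.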
Abstract:
OBJECTIVE:Endograft mural thrombus has been associated with stent graft or limb thrombosis after endovascular aneurysm repair (EVAR). This study aimed to identify clinical and morphologic determinants of endograft mural thrombus accumulation and its influence on thromboembolic events after EVAR. METHODS: A prospectively maintained database of patients treated by EVAR at a tertiary institution from 2000 to 2012 was analyzed. Patients treated for degenerative infrarenal abdominal aortic aneurysms and with available imaging for thrombus analysis were considered. All measurements were performed on three-dimensional center-lumen line computed tomography angiography (CTA) reconstructions. Patients with thrombus accumulation within the endograft's main body with a thickness >2 mm and an extension >25% of the main body's circumference were included in the study group and compared with a control group that included all remaining patients. Clinical and morphologic variables were assessed for association with significant thrombus accumulation within the endograft's main body by multivariate regression analysis. Estimates for freedom from thromboembolic events were obtained by Kaplan-Meier plots. RESULTS: Sixty-eight patients (16.4%) presented with endograft mural thrombus. Median follow-up time was 3.54 years (interquartile range, 1.99-5.47 years). In-graft mural thrombus was identified on 30-day CTA in 22 patients (32.4% of the study group), on 6-month CTA in 8 patients (11.8%), and on 1-year CTA in 17 patients (25%). Intraprosthetic thrombus progressively accumulated during the study period in 40 patients of the study group (55.8%). Overall, 17 patients (4.1%) presented with endograft or limb occlusions, 3 (4.4%) in the thrombus group and 14 (4.1%) in the control group (P = .89). Thirty-one patients (7.5%) received an aortouni-iliac (AUI) endograft. Two endograft occlusions were identified among AUI devices (6.5%; overall, 0.5%). 
None of these patients showed thrombotic deposits in the main body, nor were any outflow abnormalities identified on the immediately preceding CTA. Estimated freedom from thromboembolic events at 5 years was 95% in both groups (P = .97). Endograft thrombus accumulation was associated with >25% proximal aneurysm neck thrombus coverage at baseline (odds ratio [OR], 1.9; 95% confidence interval [CI], 1.1-3.3), neck length ≤ 15 mm (OR, 2.4; 95% CI, 1.3-4.2), proximal neck diameter ≥ 30 mm (OR, 2.4; 95% CI, 1.3-4.6), AUI (OR, 2.2; 95% CI, 1.8-5.5), or polyester-covered stent grafts (OR, 4.0; 95% CI, 2.2-7.3) and with main component "barrel-like" configuration (OR, 6.9; 95% CI, 1.7-28.3). CONCLUSIONS: Mural thrombus formation within the main body of the endograft is related to different endograft configurations, main body geometry, and device fabric but appears to have no association with the occurrence of thromboembolic events over time.
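Odds ratios like those quoted for neck anatomy and graft fabric can be computed directly from a 2x2 exposure/outcome table, with the Woolf (log-scale) 95% confidence interval. A minimal sketch; the counts are hypothetical, merely chosen so the output lands near the reported polyester-graft estimate (OR 4.0, 95% CI 2.2-7.3):

```python
import math

# Hedged sketch: odds ratio with Woolf 95% CI from a 2x2 table.
# a = exposed with outcome, b = exposed without,
# c = unexposed with outcome, d = unexposed without.
# The counts below are hypothetical, not study data.

def odds_ratio(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio(a=30, b=38, c=40, d=203)
```

The interval is symmetric on the log scale, which is why published CIs such as 2.2-7.3 look skewed around the point estimate of 4.0.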