13 results for thermohaline stratification
Abstract:
INTRODUCTION: There are several risk scores for stratification of patients with ST-segment elevation myocardial infarction (STEMI), the most widely used of which are the TIMI and GRACE scores. However, these are complex and require several variables. The aim of this study was to obtain a reduced model with fewer variables and similar predictive and discriminative ability. METHODS: We studied 607 patients (age 62 years, SD=13; 76% male) who were admitted with STEMI and underwent successful primary angioplasty. Our endpoints were all-cause in-hospital and 30-day mortality. Considering all variables from the TIMI and GRACE risk scores, multivariate logistic regression models were fitted to the data to identify the variables that best predicted death. RESULTS: Compared to the TIMI score, the GRACE score had better predictive and discriminative performance for in-hospital mortality, with similar results for 30-day mortality. After data modeling, the variables with highest predictive ability were age, serum creatinine, heart failure and the occurrence of cardiac arrest. The new predictive model was compared with the GRACE risk score, after internal validation using 10-fold cross validation. A similar discriminative performance was obtained and some improvement was achieved in estimates of probabilities of death (increased for patients who died and decreased for those who did not). CONCLUSION: It is possible to simplify risk stratification scores for STEMI and primary angioplasty using only four variables (age, serum creatinine, heart failure and cardiac arrest). This simplified model maintained a good predictive and discriminative performance for short-term mortality.
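As an illustration of the modeling described in this abstract, below is a minimal Python sketch of fitting a four-variable logistic regression and estimating its discrimination with 10-fold cross-validation; the input file and all column names are hypothetical, and this is not the authors' actual code.

```python
# Sketch: reduced logistic model for short-term mortality, internally
# validated with 10-fold cross-validation (hypothetical column names).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score

df = pd.read_csv("stemi_cohort.csv")  # hypothetical dataset
X = df[["age", "creatinine", "heart_failure", "cardiac_arrest"]]
y = df["death_30d"]

model = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

# Out-of-fold predicted probabilities give an internally validated
# estimate of discrimination (area under the ROC curve).
prob = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
print("Cross-validated AUC:", roc_auc_score(y, prob))
```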
Abstract:
INTRODUCTION: The ProACS risk score is an early and simple risk stratification score developed for all-cause in-hospital mortality in acute coronary syndromes (ACS) from a Portuguese nationwide ACS registry. Our center only recently participated in the registry and was not included in the cohort used for developing the score. Our objective was to perform an external validation of this risk score for short- and long-term follow-up. METHODS: Consecutive patients admitted to our center with ACS were included. Demographic and admission characteristics, as well as treatment and outcome data, were collected. The ProACS risk score variables are age (≥72 years), systolic blood pressure (≤116 mmHg), Killip class (2/3 or 4) and ST-segment elevation. We calculated the ProACS, Global Registry of Acute Coronary Events (GRACE) and Canada Acute Coronary Syndrome (C-ACS) risk scores for each patient. RESULTS: A total of 3170 patients were included, with a mean age of 64±13 years; 62% had ST-segment elevation myocardial infarction. All-cause mortality was 5.7% in hospital and 10.3% at one-year follow-up. The ProACS risk score showed good discriminative ability for all considered outcomes (area under the receiver operating characteristic curve >0.75) and a good fit, similar to C-ACS but lower than the GRACE risk score and slightly lower than in the original development cohort. The ProACS risk score provided good differentiation between patients at low, intermediate and high mortality risk in both short- and long-term follow-up (p<0.001 for all comparisons). CONCLUSIONS: The ProACS score is valid in external cohorts for risk stratification in ACS. It can be applied very early, at first medical contact, but should subsequently be complemented by the GRACE risk score.
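For illustration only, the sketch below validates a simple additive score built from the four ProACS criteria listed above; the actual ProACS point weights are not given in this abstract, so one point per criterion is assumed, and the dataset and column names are hypothetical.

```python
# Sketch: external validation of an additive score from the four ProACS
# criteria (one point per criterion assumed; real weights may differ).
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("acs_registry.csv")  # hypothetical registry extract
score = (
    (df["age"] >= 72).astype(int)
    + (df["systolic_bp"] <= 116).astype(int)
    + df["killip_class"].isin([2, 3, 4]).astype(int)
    + df["st_elevation"].astype(int)
)

# Discrimination for all-cause in-hospital mortality.
print("AUC:", roc_auc_score(df["inhosp_death"], score))
```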
Abstract:
INTRODUCTION: Conventional risk stratification after acute myocardial infarction is usually based on the extent of myocardial damage and its clinical consequences. However, more aggressive therapeutic strategies, both pharmacological and invasive, are now used with the aim of changing the course of the disease. OBJECTIVES: To evaluate whether the number of drugs administered can influence survival in these patients, based on recent clinical trials that demonstrated the survival benefit of each drug after acute coronary events. METHODS: This was a retrospective analysis of 368 consecutive patients admitted to our ICU during 2002 for acute coronary syndrome. A score of 1 to 4 was attributed to each patient according to the number of secondary prevention drugs administered (antiplatelets, beta-blockers, angiotensin-converting enzyme inhibitors and statins), regardless of the specific combination. We evaluated mortality at 30-day follow-up. RESULTS: Mean age was 65 ± 13 years, 68% were male, and 43% had ST-segment elevation acute myocardial infarction. Thirty-day mortality for scores 1 to 4 was 36.8%, 15.6%, 7.8% and 2.5%, respectively (p < 0.001). The use of only one or two drugs was associated with a significant increase in the risk of death at 30 days (OR 4.10, 95% CI 1.69-9.93, p = 0.002), after correction for other variables. There was a 77% risk reduction associated with the use of three or four vs. one or two drugs. The other independent predictors of death were diabetes, Killip class on admission and renal insufficiency. CONCLUSIONS: The use of a greater number of secondary prevention drugs in patients with acute coronary syndromes was associated with improved survival. The score was a powerful predictor of mortality at 30-day follow-up.
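A minimal sketch of the adjusted analysis described above (odds ratio for receiving only one or two secondary prevention drugs, corrected for the other predictors); the dataset and column names are assumptions, not the authors' data.

```python
# Sketch: adjusted odds ratio for death at 30 days when only one or two
# secondary prevention drugs were used (hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("acs_2002_cohort.csv")  # hypothetical dataset
df["few_drugs"] = (df["n_prevention_drugs"] <= 2).astype(int)

model = smf.logit(
    "death_30d ~ few_drugs + diabetes + killip_class + renal_insufficiency",
    data=df,
).fit()

print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```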
Abstract:
INTRODUCTION: Adults with repaired tetralogy of Fallot (TOF) may be at risk for progressive right ventricular (RV) dilatation and dysfunction, which is commonly associated with arrhythmic events. In frequently volume-overloaded patients with congenital heart disease, tissue Doppler imaging (TDI) is particularly useful for assessing RV function. However, it is not known whether RV TDI can predict outcome in this population. OBJECTIVE: To evaluate whether RV TDI parameters are associated with supraventricular arrhythmic events in adults with repaired TOF. METHODS: We studied 40 consecutive patients with repaired TOF (mean age 35 ± 11 years, 62% male) referred for routine echocardiographic exam between 2007 and 2008. The following echocardiographic measurements were obtained: left ventricular (LV) ejection fraction, LV end-systolic volume, LV end-diastolic volume, RV fractional area change, RV end-systolic area, RV end-diastolic area, left and right atrial volumes, mitral E and A velocities, RV myocardial performance index (Tei index), tricuspid annular plane systolic excursion (TAPSE), myocardial isovolumic acceleration (IVA), pulmonary regurgitation color flow area, TDI basal lateral, septal and RV lateral peak diastolic and systolic annular velocities (E'l, A'l, S'l, E's, A's, S's, E'rv, A'rv, S'rv), strain, strain rate and tissue tracking of the same segments. QRS duration on resting ECG, total duration of Bruce treadmill exercise stress test and presence of exercise-induced arrhythmias were also analyzed. The patients were subsequently divided into two groups: Group 1, 12 patients with previously documented supraventricular arrhythmias (atrial tachycardia, fibrillation or flutter), and Group 2 (control group), 28 patients with no previous arrhythmic events. Univariate and multivariate analyses were used to assess the statistical association between the studied parameters and arrhythmic events. RESULTS: Patients with previous events were older (41 ± 14 vs. 31 ± 6 years, p = 0.005), had wider QRS (173 ± 20 vs. 140 ± 32 ms, p = 0.01) and lower maximum heart rate on treadmill stress testing (69 ± 35 vs. 92 ± 9%, p = 0.03). All patients were in NYHA class I or II. Clinical characteristics including age at corrective surgery, previous palliative surgery and residual defects did not differ significantly between the two groups. Left and right cardiac chamber dimensions and ventricular and valvular function as evaluated by conventional Doppler parameters were also not significantly different. Right ventricular strain and strain rate were similar between the groups. However, right ventricular myocardial TDI systolic (Sa: 5.4 ± 2 vs. 8.5 ± 3, p = 0.004) and diastolic indices and velocities (Ea, Aa, septal E/Ea, and RV free wall tissue tracking) were significantly reduced in patients with arrhythmias compared to the control group. Multivariate linear regression analysis identified RV early diastolic velocity as the sole variable independently associated with arrhythmic history (RV Ea: 4.5 ± 1 vs. 6.7 ± 2 cm/s, p = 0.01). A cut-off for RV Ea of <6.1 cm/s identified patients in the arrhythmic group with 86% sensitivity and 59% specificity (AUC = 0.8). CONCLUSIONS: Our results suggest that TDI may detect RV dysfunction in patients with apparently normal function as assessed by conventional echocardiographic parameters. Reduction in RV early diastolic velocity appears to be an early abnormality and is associated with the occurrence of arrhythmic events.
TDI may be useful in risk stratification of patients with repaired tetralogy of Fallot.
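The cut-off reported above can be derived from an ROC curve; the sketch below shows one common way to do this (the Youden index), using hypothetical column names and treating lower RV Ea as higher risk.

```python
# Sketch: ROC-based cut-off for RV Ea; lower velocity indicates higher risk,
# so the negated velocity is used as the classification score.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("tof_echo.csv")  # hypothetical dataset
y = df["arrhythmia"]              # 1 = previous supraventricular arrhythmia
score = -df["rv_ea_cm_s"]

fpr, tpr, thresholds = roc_curve(y, score)
print("AUC:", roc_auc_score(y, score))

best = np.argmax(tpr - fpr)  # Youden index: max(sensitivity + specificity - 1)
print("Cut-off: RV Ea <", -thresholds[best], "cm/s")
print("Sensitivity:", tpr[best], "Specificity:", 1 - fpr[best])
```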
Abstract:
Cardiopulmonary exercise testing (CPET) is an objective method for assessment of functional capacity and for prognostic stratification of patients with chronic heart failure (CHF). In this study, we analyzed the prognostic value of a recently described CPET-derived parameter, the minute ventilation to carbon dioxide production slope normalized for peak oxygen consumption (VE/VCO2 slope/pVO2). METHODS: We prospectively studied 157 patients with stable CHF and dilated cardiomyopathy who performed maximal CPET using the modified Bruce protocol. The prognostic value of VE/VCO2 slope/pVO2 was determined and compared with traditional CPET parameters. RESULTS: During follow-up, 37 patients died and 12 were transplanted. Mean follow-up in surviving patients was 29.7 months (range 12-36). Cox multivariate analysis revealed that VE/VCO2 slope/pVO2 had the greatest prognostic power of all the parameters studied. A VE/VCO2 slope/pVO2 of ≥2.2 identified patients at higher risk. CONCLUSION: Normalization of the ventilatory response to exercise for peak oxygen consumption appears to increase the prognostic value of CPET in patients with CHF.
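A minimal sketch of the Cox analysis described above, using the lifelines package; the composite endpoint coding (death or transplantation) and all column names are assumptions made for illustration.

```python
# Sketch: multivariable Cox model including the normalized ventilatory
# response (VE/VCO2 slope divided by peak VO2); hypothetical column names.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("chf_cpet.csv")  # hypothetical dataset
df["ve_vco2_per_pvo2"] = df["ve_vco2_slope"] / df["peak_vo2"]

cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "event", "ve_vco2_per_pvo2", "peak_vo2"]],
    duration_col="followup_months",
    event_col="event",  # 1 = death or transplantation
)
cph.print_summary()

# Dichotomizing at the reported threshold of 2.2 flags higher-risk patients.
df["high_risk"] = (df["ve_vco2_per_pvo2"] >= 2.2).astype(int)
```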
Abstract:
Peripheral vascular disease usually results from a systemic process in which atherothrombosis develops in different vascular territories sharing common risk factors. It is therefore common to find coexistent, often subclinical, coronary artery disease, which is responsible for most perioperative morbidity and mortality in patients undergoing vascular surgery. Adequate preoperative risk stratification must be performed, taking into account the patient's clinical manifestations, risk factors, comorbidities, functional capacity and global left ventricular systolic function. Each patient should be assigned to one of three subgroups (low, intermediate or high risk), which may indicate the need for further testing, most often aimed at detecting coronary artery disease, and helps predict short-, medium- and long-term outcome. This strategy is important; together with better medical, surgical and anesthetic care, it has contributed to the marked improvement in surgical results in recent years. This paper reviews the state of the art regarding the guidelines to follow and the results of several studies on this subject. The role of methods for detecting coronary ischemia, using either nuclear or echocardiographic techniques, is highlighted.
Abstract:
OBJECTIVE: Intensive imaging surveillance after endovascular aneurysm repair is generally recommended due to continued risk of complications. However, patients at lower risk may not benefit from this strategy. We evaluated the predictive value of the first postoperative computed tomography angiography (CTA) characteristics for aneurysm-related adverse events as a means of patient selection for risk-adapted surveillance. METHODS: All patients treated with the Low-Permeability Excluder Endoprosthesis (W. L. Gore & Assoc, Flagstaff, Ariz) at a tertiary institution from 2004 to 2011 were included. First postoperative CTAs were analyzed for the presence of endoleaks, endograft kinking, distance from the lowermost renal artery to the start of the endograft, and for proximal and distal sealing length using center lumen line reconstructions. The primary end point was freedom from aneurysm-related adverse events. Multivariable Cox regression was used to test postoperative CTA characteristics as independent risk factors, which were subsequently used as selection criteria for low-risk and high-risk groups. Estimates for freedom from adverse events were obtained using Kaplan-Meier survival curves. RESULTS: A total of 131 patients were included. The median follow-up was 4.1 years (interquartile range, 2.1-6.1). During this period, 30 patients (23%) sustained aneurysm-related adverse events. Seal length <10 mm and presence of endoleak were significant risk factors for this end point. Patients were subsequently categorized as low-risk (proximal and distal seal length ≥10 mm and no endoleak, n = 62) or high-risk (seal length <10 mm or presence of endoleak, or both; n = 69). During follow-up, four low-risk patients (3%) and 26 high-risk patients (19%) sustained events (P < .001). Four secondary interventions were required in three low-risk patients, and 31 secondary interventions in 23 high-risk patients. Sac growth was observed in two low-risk patients and in 15 high-risk patients. The 5-year estimates for freedom from aneurysm-related adverse events were 98% for the low-risk group and 52% for the high-risk group. To diagnose each event, 81.7 imaging examinations were needed in the low-risk group versus 8.2 in the high-risk group. CONCLUSIONS: Our results suggest that the first postoperative CTA provides important information for risk stratification after endovascular aneurysm repair when the Excluder endoprosthesis is used. In patients with adequate seal and no endoleaks, the risk of aneurysm-related adverse events was significantly reduced, which means that a large number of imaging examinations were unnecessary. Adjusting the imaging protocol beyond 30 days and up to 5 years, based on individual patients' risk, may result in more efficient and rational postoperative surveillance.
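The risk-factor testing and survival estimates described above could look roughly like the following lifelines sketch; the dataset, column names and event coding are assumptions.

```python
# Sketch: Cox regression for first-CTA characteristics and Kaplan-Meier
# freedom from aneurysm-related adverse events by risk group.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("evar_cohort.csv")  # hypothetical dataset

cph = CoxPHFitter()
cph.fit(
    df[["years", "event", "seal_lt_10mm", "endoleak", "kinking"]],
    duration_col="years",
    event_col="event",  # 1 = aneurysm-related adverse event
)
cph.print_summary()

# Risk groups as defined above: high risk = seal <10 mm or any endoleak.
high_risk = (df["seal_lt_10mm"] == 1) | (df["endoleak"] == 1)
for label, group in (("low risk", df[~high_risk]), ("high risk", df[high_risk])):
    km = KaplanMeierFitter()
    km.fit(group["years"], group["event"], label=label)
    freedom_5y = km.survival_function_at_times(5).iloc[0]
    print(label, "5-year freedom from events:", freedom_5y)
```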
Abstract:
Conservative surgery for rectal cancer, and even the so-called ‘wait and see’ approach, is increasingly used, since 10–20% of patients achieve a complete pathological response by the time of surgery. But what can we tell our patients about the risks? Standard surgery with mesorectal excision gives a local recurrence rate of <2%, with a postoperative death rate of 2–8% (which may reach 30% at 6 months in patients over 85), but low anterior resection entails some deterioration in bowel function, and in low cancers a permanent stoma may be required. A long-term impact on urinary and sexual function is also possible. Distant metastasis rates seem to be identical with the standard and conservative approaches. The conservative approach is difficult to evaluate because surgery for low rectal cancer is not clearly standardized. Rullier et al. attempted to clarify this and found identical results for recurrence (5–9%) and 5-year disease-free survival (70%) for coloanal anastomosis and intersphincteric resection. Other series have found higher local recurrence than with the standard approach, functional results may be worse and, in some situations, salvage therapy is compromised or carries more complications. In this context, functional outcomes are very important, but most studies measure bowel function incompletely in the setting of a conservative approach. In 2005, Temple et al. surveyed 122 of 184 patients after sphincter-preserving surgery and found incomplete evacuation in 96.9%, clustering in 94.4%, food affecting frequency in 93.2% and gas incontinence in 91.8%, and proposed systematic evaluation with a specific questionnaire. Regarding the ‘wait and see’ approach for complete clinical responders, it was first advocated by Habr-Gama for tumors up to 7 cm, with a low locoregional failure rate of 4.6%, 5-year overall survival of 96% and disease-free survival of 72%; one fifth of patients failed in the first year. A Dutch trial had identical results, but others reported worse recurrence rates; in other series 25% of patients could not be salvaged even with abdominoperineal resection, and 30% developed subsequent metastatic disease, a rate that appears similar for ‘wait and see’ and operated patients. In a recent review, Glynne-Jones considers that the ‘wait and see’ studies evaluated are heterogeneous in staging, inclusion criteria, design and follow-up after chemoradiation, and that there is a suggestion that patients who progress while under observation fare worse than those resected. He proposes long-term observational studies with more uniform inclusion criteria. We are now at a point where we may be more aggressive in early cancer and neoadjuvant treatment in order to be more conservative in subsequent treatment, but we need better stratification of patients, better evaluation of results and clearer prognostic markers.
Abstract:
Food allergy (FA) prevalence data in infants and preschool-age children are sparse, and proposed risk factors lack confirmation. In this study, 19 children’s day care centers (DCCs) in two main Portuguese cities were selected after stratification and cluster analysis. An ISAAC (International Study of Asthma and Allergies in Childhood)-derived health questionnaire was applied to a sample of children attending the DCCs. Outcomes were parent-reported FA and anaphylaxis. Logistic regression was used to explore potential risk factors for reported FA. Of the 2228 distributed questionnaires, 1217 were included in the analysis (54.6%). The children’s median age was 3.5 years, and 10.8% were described as ever having had FA. Current FA was reported in 5.7%. Three reports (0.2%) compatible with anaphylaxis were identified. Reported parental history of FA, personal history of atopic dermatitis and preterm birth increased the odds of reported current FA. A high prevalence of parental-perceived FA in preschool-age children was identified. Risk factor identification may enable better prevention.
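The regression step mentioned above might resemble this sketch, which estimates adjusted odds ratios for the reported risk factors; the file and column names are hypothetical, and clustering by day care center is ignored for simplicity.

```python
# Sketch: logistic regression for parent-reported current food allergy
# (hypothetical column names; DCC-level clustering not accounted for).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dcc_questionnaires.csv")  # hypothetical dataset
model = smf.logit(
    "current_fa ~ parental_fa + atopic_dermatitis + preterm_birth",
    data=df,
).fit()

print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```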
Abstract:
The quality of care can be improved by the development and implementation of evidence-based treatment guidelines. Different national guidelines for chronic obstructive pulmonary disease (COPD) exist in Europe, and relevant differences may exist among them. This was an evaluation of COPD treatment guidelines published in Europe and Russia in the past 7 years. Each guideline was reviewed in detail, and information about the most important aspects of patient diagnosis, risk stratification and pharmacotherapy was extracted following a standardised process. Guidelines were available from the Czech Republic, England and Wales, Finland, France, Germany, Italy, Poland, Portugal, Russia, Spain and Sweden. The treatment goals, criteria for COPD diagnosis, consideration of comorbidities in treatment selection and support for use of long-acting bronchodilators were similar across treatment guidelines. There were differences in the measures used for stratification of disease severity, consideration of patient phenotypes, criteria for the use of inhaled corticosteroids and recommendations for other medications (e.g. theophylline and mucolytics) in addition to bronchodilators. There is generally good agreement on treatment goals, criteria for diagnosis of COPD and use of long-acting bronchodilators as the cornerstone of treatment among guidelines for COPD management in Europe and Russia. However, there are differences in the definitions of patient subgroups and other recommended treatments.
Abstract:
Objectives: To characterize the epidemiology and risk factors for acute kidney injury (AKI) after pediatric cardiac surgery in our center, to determine its association with poor short-term outcomes, and to develop a logistic regression model to predict the risk of AKI in the study population. Methods: This single-center, retrospective study included consecutive pediatric patients with congenital heart disease who underwent cardiac surgery between January 2010 and December 2012. Exclusion criteria were a history of renal disease, dialysis or renal transplantation. Results: Of the 325 patients included (median age three years, range 1 day to 18 years), AKI occurred in 40 (12.3%) on the first postoperative day. Overall mortality was 13 patients (4%), nine of whom were in the AKI group. AKI was significantly associated with length of intensive care unit stay, length of mechanical ventilation and in-hospital death (p<0.01). Patients’ age and postoperative serum creatinine, blood urea nitrogen and lactate levels were included in the logistic regression model as predictor variables. The model accurately predicted AKI in this population, with a maximum combined sensitivity of 82.1% and specificity of 75.4%. Conclusions: AKI is common and is associated with poor short-term outcomes in this setting. Younger age and higher postoperative serum creatinine, blood urea nitrogen and lactate levels were powerful predictors of renal injury in this population. The proposed model could be a useful tool for risk stratification of these patients.
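A sketch of the type of predictive model described above, reporting sensitivity and specificity at the threshold maximizing their combination; the file and column names are hypothetical.

```python
# Sketch: logistic regression model for postoperative AKI and its best
# combined sensitivity/specificity point (hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_curve

df = pd.read_csv("pediatric_cardiac_surgery.csv")  # hypothetical dataset
model = smf.logit(
    "aki ~ age_years + postop_creatinine + postop_bun + postop_lactate",
    data=df,
).fit()
prob = model.predict(df)

fpr, tpr, thresholds = roc_curve(df["aki"], prob)
best = np.argmax(tpr - fpr)  # point with maximum combined sens/spec
print("Sensitivity:", tpr[best], "Specificity:", 1 - fpr[best])
```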
Abstract:
INTRODUCTION: New scores have been developed and validated in the US for in-hospital mortality risk stratification in patients undergoing coronary angioplasty: the National Cardiovascular Data Registry (NCDR) risk score and the Mayo Clinic Risk Score (MCRS). We sought to validate these scores in a European population with acute coronary syndrome (ACS) and to compare their predictive accuracy with that of the GRACE risk score. METHODS: In a single-center ACS registry of patients undergoing coronary angioplasty, we used the area under the receiver operating characteristic curve (AUC), a graphical representation of observed vs. expected mortality, and net reclassification improvement (NRI)/integrated discrimination improvement (IDI) analysis to compare the scores. RESULTS: A total of 2148 consecutive patients were included, mean age 63 years (SD 13), 74% male and 71% with ST-segment elevation ACS. In-hospital mortality was 4.5%. The GRACE score showed the best AUC (0.94, 95% CI 0.91-0.96) compared with NCDR (0.87, 95% CI 0.83-0.91, p=0.0003) and MCRS (0.85, 95% CI 0.81-0.90, p=0.0003). In model calibration analysis, GRACE showed the best predictive power. With GRACE, patients were more often correctly classified than with MCRS (NRI 78.7, 95% CI 59.6-97.7; IDI 0.136, 95% CI 0.073-0.199) or NCDR (NRI 79.2, 95% CI 60.2-98.2; IDI 0.148, 95% CI 0.087-0.209). CONCLUSION: The NCDR and Mayo Clinic risk scores are useful for risk stratification of in-hospital mortality in a European population of patients with ACS undergoing coronary angioplasty. However, the GRACE score is still to be preferred.
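The reclassification metrics used above (NRI and IDI) can be computed directly from two sets of predicted probabilities; the functions below are a generic, category-free implementation for illustration, not the study's code, and the variable names in the usage comment are hypothetical.

```python
# Sketch: integrated discrimination improvement (IDI) and category-free net
# reclassification improvement (NRI) for two risk models' predicted
# probabilities of in-hospital death.
import numpy as np

def idi(p_new, p_old, y):
    """Difference in discrimination slopes between the two models."""
    y = np.asarray(y, dtype=bool)
    p_new, p_old = np.asarray(p_new), np.asarray(p_old)
    return (p_new[y].mean() - p_new[~y].mean()) - (
        p_old[y].mean() - p_old[~y].mean()
    )

def nri(p_new, p_old, y):
    """Category-free NRI: net upward movement of predicted risk in events
    plus net downward movement in non-events."""
    y = np.asarray(y, dtype=bool)
    up = np.asarray(p_new) > np.asarray(p_old)
    down = np.asarray(p_new) < np.asarray(p_old)
    return (up[y].mean() - down[y].mean()) + (down[~y].mean() - up[~y].mean())

# Usage with hypothetical arrays of predicted probabilities and 0/1 deaths:
# print(idi(p_grace, p_ncdr, died), nri(p_grace, p_ncdr, died))
```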
Abstract:
PURPOSE: In this prospective, multicenter, 14-day inception cohort study, we investigated the epidemiology, patterns of infections, and outcome in patients admitted to the intensive care unit (ICU) as a result of severe acute respiratory infections (SARIs). METHODS: All patients admitted to one of 206 participating ICUs during two study weeks, one in November 2013 and the other in January 2014, were screened. SARI was defined as possible, probable, or microbiologically confirmed respiratory tract infection with recent onset dyspnea and/or fever. The primary outcome parameter was in-hospital mortality within 60 days of admission to the ICU. RESULTS: Among the 5550 patients admitted during the study periods, 663 (11.9 %) had SARI. On admission to the ICU, Gram-positive and Gram-negative bacteria were found in 29.6 and 26.2 % of SARI patients but rarely atypical bacteria (1.0 %); viruses were present in 7.7 % of patients. Organ failure occurred in 74.7 % of patients in the ICU, mostly respiratory (53.8 %), cardiovascular (44.5 %), and renal (44.6 %). ICU and in-hospital mortality rates in patients with SARI were 20.2 and 27.2 %, respectively. In multivariable analysis, older age, greater severity scores at ICU admission, and hematologic malignancy or liver disease were independently associated with an increased risk of in-hospital death, whereas influenza vaccination prior to ICU admission and adequate antibiotic administration on ICU admission were associated with a lower risk. CONCLUSIONS: Admission to the ICU for SARI is common and associated with high morbidity and mortality rates. We identified several risk factors for in-hospital death that may be useful for risk stratification in these patients.
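The multivariable analysis described above could be sketched as follows, with adjusted odds ratios above 1 flagging risk factors and below 1 (e.g. influenza vaccination, adequate initial antibiotics) flagging protective associations; the dataset and all column names are assumptions.

```python
# Sketch: multivariable logistic regression for in-hospital death within
# 60 days of ICU admission in SARI patients (hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sari_icu.csv")  # hypothetical dataset
model = smf.logit(
    "hosp_death ~ age + severity_score + hematologic_malignancy"
    " + liver_disease + flu_vaccinated + adequate_antibiotics",
    data=df,
).fit()

print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```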