808 results for Mortality Risk
Abstract:
Background: The role of an impaired estimated glomerular filtration rate (eGFR) at hospital admission in the outcome of acute kidney injury (AKI) after acute myocardial infarction (AMI) has been underreported. The aim of this study was to assess the influence of an admission eGFR<60 mL/min/1.73 m² on the incidence and early and late mortality of AMI-associated AKI. Methods: A prospective study of 828 AMI patients was performed. AKI was defined as a serum creatinine increase of >= 50% from the time of admission (RIFLE criteria) in the first 7 days of hospitalization. Patients were divided into subgroups according to their eGFR upon hospital admission (MDRD formula, mL/min/1.73 m²) and the development of AKI: eGFR >= 60 without AKI, eGFR<60 without AKI, eGFR >= 60 with AKI and eGFR<60 with AKI. Results: Overall, 14.6% of the patients in this study developed AKI. The admission eGFR had no impact on the incidence of AKI. However, the admission eGFR was associated with the outcome of AMI-associated AKI. The adjusted hazard ratios (AHR, Cox multivariate analysis) for 30-day mortality were 2.00 (95% CI 1.11-3.61) for eGFR<60 without AKI, 4.76 (95% CI 2.45-9.26) for eGFR >= 60 with AKI and 6.27 (95% CI 3.20-12.29) for eGFR<60 with AKI. Only an admission eGFR of <60 with AKI was significantly associated with a 30-day to 1-year mortality hazard (AHR 3.05, 95% CI 1.50-6.19). Conclusions: AKI development was associated with an increased early mortality hazard in AMI patients with either preserved or impaired admission eGFR. Only the combination of impaired admission eGFR and AKI was associated with an increased hazard for late mortality among these patients.
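For illustration, the sketch below (not taken from the paper) applies one published version of the abbreviated 4-variable MDRD equation and the abstract's AKI criterion (a >=50% creatinine rise from admission within 7 days) to assign the four study subgroups; the coefficients, function names, and example values are assumptions.

```python
# Sketch only: abbreviated 4-variable MDRD eGFR and the study's AKI criterion.
# Coefficients follow one common published MDRD variant; the paper does not
# state which version it used, so treat these constants as assumptions.

def egfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m² from serum creatinine (mg/dL)."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.210
    return egfr

def study_subgroup(admission_scr: float, peak_scr_week1: float,
                   age: float, female: bool, black: bool) -> str:
    """Assign one of the four subgroups used in the abstract."""
    egfr = egfr_mdrd(admission_scr, age, female, black)
    aki = peak_scr_week1 >= 1.5 * admission_scr  # >=50% rise within 7 days (RIFLE 'Risk')
    return ("eGFR>=60" if egfr >= 60 else "eGFR<60") + (" with AKI" if aki else " without AKI")

# Hypothetical example values:
print(study_subgroup(admission_scr=1.4, peak_scr_week1=2.3,
                     age=68, female=False, black=False))  # -> 'eGFR<60 with AKI' here
```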
Abstract:
Background Androgen suppression therapy and radiotherapy are used to treat locally advanced prostate cancer. Three years of androgen suppression confers a small survival benefit compared with 6 months of therapy in this setting, but is associated with more toxic effects. Early identification of men in whom radiotherapy and 6 months of androgen suppression is insufficient for cure is important. Thus, we assessed whether prostate-specific antigen (PSA) values can act as an early surrogate for prostate cancer-specific mortality (PCSM). Methods We systematically reviewed randomised controlled trials that showed improved overall and prostate cancer-specific survival with radiotherapy and 6 months of androgen suppression compared with radiotherapy alone and measured lowest PSA concentrations (PSA nadir) and those immediately after treatment (PSA end). We assessed a cohort of 734 men with localised or locally advanced prostate cancer from two eligible trials in the USA and Australasia that randomly allocated participants between Feb 2, 1996, and Dec 27, 2001. We used Prentice criteria to assess whether reported PSA nadir or PSA end concentrations of more than 0.5 ng/mL were surrogates for PCSM. Findings Men treated with radiotherapy and 6 months of androgen suppression in both trials were significantly less likely to have PSA end and PSA nadir values of more than 0.5 ng/mL than were those treated with radiotherapy alone (p<0.0001). Presence of the candidate surrogates (ie, PSA end and PSA nadir values >0.5 ng/mL), alone and when assessed in conjunction with the randomised treatment group, was associated with an increased risk of PCSM in the US trial (PSA nadir p=0.0016; PSA end p=0.017) and the Australasian trial (PSA nadir p<0.0001; PSA end p=0.0012). In both trials, the randomised treatment group was no longer associated with PCSM (p >= 0.20) when the candidate surrogates were included in the model. Therefore, both PSA metrics satisfied the Prentice criteria for surrogacy. Interpretation After radiotherapy and 6 months of androgen suppression, men with PSA end values exceeding 0.5 ng/mL should be considered for long-term androgen suppression, and those with localised or locally advanced prostate cancer with PSA nadir values exceeding 0.5 ng/mL should be considered for inclusion in randomised trials investigating the use of drugs that have extended survival in castration-resistant metastatic prostate cancer.
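A minimal sketch of how the key Prentice condition reported above can be checked in code is shown here, on simulated data rather than the trial data: the treatment effect on PCSM should lose significance once the candidate surrogate enters a Cox model. All column names and the data-generating step are assumptions.

```python
# Illustrative check of the Prentice condition: Cox model with treatment alone
# vs. treatment plus the candidate surrogate (PSA nadir > 0.5 ng/mL).
# Simulated data; variable names are assumptions, not the trials' analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 400
treatment = rng.integers(0, 2, n)                                  # 1 = RT + 6 months AST, 0 = RT alone
psa_gt_0_5 = rng.binomial(1, np.where(treatment == 1, 0.2, 0.5))   # surrogate, less likely with AST
hazard = 0.02 * np.exp(1.2 * psa_gt_0_5)                           # mortality driven only via the surrogate
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(60.0, n)
df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "pcsm": (event_time <= censor_time).astype(int),
    "treatment": treatment,
    "psa_gt_0_5": psa_gt_0_5,
})

m1 = CoxPHFitter().fit(df[["time", "pcsm", "treatment"]], "time", "pcsm")
m2 = CoxPHFitter().fit(df[["time", "pcsm", "treatment", "psa_gt_0_5"]], "time", "pcsm")
print(m1.summary["p"])  # treatment alone: expected to be associated with PCSM
print(m2.summary["p"])  # with the surrogate included, the treatment term should weaken
```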
Abstract:
Background: Exposure to fine fractions of particulate matter (PM2.5) is associated with increased hospital admissions and mortality for respiratory and cardiovascular disease in children and the elderly. This study aims to estimate the toxicological risk of PM2.5 from biomass burning in children and adolescents between the ages of 6 and 14 in Tangara da Serra, a municipality of the Subequatorial Brazilian Amazon. Methods: Risk assessment methodology was applied to estimate the risk quotient in two exposure scenarios defined by local seasonality. The potential dose of PM2.5 was estimated using Monte Carlo simulation, stratifying the population by age, gender, asthma and Body Mass Index (BMI). Results: Male asthmatic children under the age of 8 with a normal BMI had the highest risk quotient among the subgroups. The overall average potential dose of PM2.5 was 1.95 µg/kg·day (95% CI: 1.62-2.27) in the dry scenario and 0.32 µg/kg·day (95% CI: 0.29-0.34) in the rainy scenario. During the dry season, children and adolescents showed a toxicological risk from PM2.5 at 2.07 µg/kg·day (95% CI: 1.85-2.30). Conclusions: Children and adolescents living in the Subequatorial Brazilian Amazon region were exposed to high levels of PM2.5, resulting in toxicological risk from this multi-pollutant. The toxicological risk quotients of children in this region were comparable to or higher than those of children living in metropolitan regions with PM2.5 air pollution above the limits recommended for human health.
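A minimal Monte Carlo sketch of this kind of potential-dose and risk-quotient calculation follows; the dose expression, the distributions, the exposure time, and the reference dose are illustrative assumptions, not the study's inputs.

```python
# Generic potential-dose Monte Carlo: concentration x inhalation rate x exposure
# fraction / body weight, then a hazard quotient against a reference dose.
# Every distribution and constant here is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

pm25 = rng.lognormal(mean=np.log(25.0), sigma=0.5, size=n)                    # µg/m³, dry scenario (assumed)
inhalation = np.clip(rng.normal(10.0, 1.5, size=n), 5.0, None)                # m³/day, child (assumed)
exposure_fraction = np.clip(rng.normal(4.0, 1.0, size=n), 1.0, None) / 24.0   # hours outdoors per day (assumed)
body_weight = np.clip(rng.normal(30.0, 6.0, size=n), 15.0, None)              # kg (assumed)

dose = pm25 * inhalation * exposure_fraction / body_weight   # potential dose, µg/kg·day
hazard_quotient = dose / 10.0                                # 10 µg/kg·day: hypothetical reference dose

lo, hi = np.percentile(dose, [2.5, 97.5])
print(f"potential dose: mean {dose.mean():.2f} µg/kg·day (95% simulation interval {lo:.2f}-{hi:.2f})")
print(f"fraction of simulations with hazard quotient > 1: {(hazard_quotient > 1).mean():.1%}")
```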
Abstract:
Background: Percutaneous coronary intervention (PCI) has increased as the initial revascularization strategy in chronic coronary artery disease. Consequently, more patients undergoing coronary artery bypass grafting (CABG) have a history of coronary stenting. Objective: To evaluate the impact of previous PCI on in-hospital mortality after CABG in patients with multivessel coronary artery disease. Methods: Between May/2007 and June/2009, 1099 consecutive patients underwent CABG on cardiopulmonary bypass. Patients with no previous PCI (n=938, 85.3%) were compared with patients with previous PCI (n=161, 14.6%). Logistic regression models and propensity score matching analysis were used to assess the risk-adjusted impact of previous PCI on in-hospital mortality. Results: Both groups were similar, except that patients with previous PCI were more likely to have unstable angina (16.1% vs. 9.9%, p=0.019). In-hospital mortality after CABG was higher in patients with previous PCI (9.3% vs. 5.1%, p=0.034), despite comparable EuroSCORE and 2000 Bernstein-Parsonnet risk scores. In multivariate logistic regression analysis, previous PCI emerged as an independent predictor of postoperative in-hospital mortality (odds ratio 1.94, 95% CI 1.02-3.68, p=0.044), as strong as diabetes (odds ratio 1.86, 95% CI 1.07-3.24, p=0.028). After propensity score matching based on preoperative risk factors, in-hospital mortality remained higher among patients with previous PCI (odds ratio 3.46, 95% CI 1.10-10.93, p=0.034). Conclusions: Previous PCI in patients with multivessel coronary artery disease is an independent risk factor for in-hospital mortality after CABG. This fact must be considered when PCI is indicated as the initial alternative in patients with more severe coronary artery disease. (Arq Bras Cardiol 2012;99(1):586-595)
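The propensity-score adjustment described above can be sketched as follows; the covariate names, the caliper, and greedy 1:1 matching are assumptions about a typical implementation, not the authors' exact procedure.

```python
# Sketch of propensity-score matching for previous PCI: a logistic propensity
# model on preoperative covariates, then greedy 1:1 nearest-neighbour matching.
# Column names and the caliper are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_match(df: pd.DataFrame, treatment: str, covariates: list[str],
                     caliper: float = 0.05) -> pd.DataFrame:
    """Return the matched sample (matching without replacement on the propensity score)."""
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])
    treated = df[df[treatment] == 1]
    controls = df[df[treatment] == 0].copy()
    pairs = []
    for _, row in treated.iterrows():
        dist = (controls["ps"] - row["ps"]).abs()
        if len(dist) and dist.min() <= caliper:
            j = dist.idxmin()
            pairs.append(row)
            pairs.append(controls.loc[j])
            controls = controls.drop(index=j)  # match each control at most once
    return pd.DataFrame(pairs)

# Usage with hypothetical columns:
# matched = propensity_match(surgery_df, "previous_pci",
#                            ["age", "diabetes", "lvef", "euroscore"])
# In-hospital mortality can then be compared between groups within `matched`.
```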
Abstract:
OBJECTIVES: Although elderly persons with chronic atrial fibrillation have more comorbidities that could limit indications for the chronic use of anticoagulants, few studies have focused on the risk of falls within this particular group. To evaluate the predictors of the risk of falls among elderly patients with chronic atrial fibrillation, a cross-sectional, observational study was performed. METHODS: Of 295 consecutive patients aged 60 years or older with a history of atrial fibrillation who were enrolled within the last 2 years in the cardiogeriatrics outpatient clinic of the Instituto do Coracao do Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo, 107 took part in this study. Their mean age was 77.9 +/- 6.4 years, and 62 were female. They were divided into two groups: a) no history of falls in the previous year and b) a history of one or more falls in the previous year. Data regarding the history of falls and social, demographic, anthropometric, and clinical information were collected. Multidimensional assessment instruments and questionnaires were applied. RESULTS: At least one fall was reported by 55 patients (51.4%). Among them, 27 (49.1%) reported recurrent falls, with body lesions in 90.4% and fractures in 9.1% of the cases. Multivariate logistic regression showed that self-reported difficulty maintaining balance, use of amiodarone, and diabetes were independent variables associated with the risk of falls, with a sensitivity of 92.9% and a specificity of 44.9%. CONCLUSION: In a group of elderly patients with chronic atrial fibrillation who were relatively independent and able to attend an outpatient clinic, the occurrence of falls, often recurrent and with clinical consequences, was high. Difficulty maintaining balance, the use of amiodarone and a diagnosis of diabetes mellitus were independent predictors of the risk of falls. Thus, simple clinical data predicted falls better than objective functional tests.
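A short sketch of how sensitivity and specificity such as those reported above can be computed from a logistic model is shown below; the predictor and outcome arrays, and the 0.5 probability threshold, are assumptions rather than the study's specification.

```python
# Sensitivity and specificity of a logistic model at a chosen probability threshold.
# Inputs are assumed: X columns might be [balance_difficulty, amiodarone, diabetes] (0/1),
# and `fell` is 1 if the patient reported at least one fall in the previous year.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

def sensitivity_specificity(X: np.ndarray, fell: np.ndarray, threshold: float = 0.5):
    model = LogisticRegression().fit(X, fell)
    predicted = (model.predict_proba(X)[:, 1] >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(fell, predicted).ravel()
    return tp / (tp + fn), tn / (tn + fp)   # (sensitivity, specificity)
```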
Abstract:
Objective: The objective was to evaluate the cardiovascular profile of first-episode psychosis patients in Sao Paulo, Brazil, an issue that has not been sufficiently explored in low-/middle-income countries. Method: A cross-sectional study was performed 1 to 3 years after an initial, larger survey that assessed first-episode psychosis in Sao Paulo. We evaluated cardiovascular risk factors and lifestyle habits using standard clinical examination and laboratory evaluation. Results: Of 151 contacted patients, 82 agreed to participate (mean age=35 years; 54% female). The following diagnoses were found: 20.7% were obese, 29.3% had hypertension, 39.0% had dyslipidemia, 19.5% had metabolic syndrome, and 1.2% had a >20% 10-year risk of coronary heart disease based on the Framingham score. Also, 72% were sedentary, 25.6% were current smokers, and 7.3% reported heavy alcohol intake. Conclusion: Compared to other samples, ours presented a distinct profile of higher rates of hypertension and diabetes (possibly due to dietary habits) and lower rates of smoking and alcohol intake (possibly due to higher dependence on social support). Indirect comparison with healthy, age-matched Brazilians revealed that our sample had higher frequencies of hypertension, diabetes and metabolic syndrome. Therefore, we confirmed a high cardiovascular risk in first-episode psychosis in Brazil. Transcultural studies are needed to investigate to what extent lifestyle contributes to such increased risk. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
Background: The combined effect of diabetes and stroke on disability and mortality remains largely unexplored in Brazil and Latin America. Previous studies have been based primarily on data from developed countries. This study addresses the empirical gap by evaluating the combined impact of diabetes and stroke on disability and mortality in Brazil. Methods: The sample was drawn from two waves of the Survey on Health and Well-being of the Elderly, which followed 2,143 older adults in Sao Paulo, Brazil, from 2000 to 2006. Disability was assessed via measures of activities of daily living (ADL) limitations, severe ADL limitations, and receiving assistance to perform these activities. Logistic and multinomial regression models controlling for sociodemographic and health conditions were used to address the influence of diabetes and stroke on disability and mortality. Results: By itself, the presence of diabetes did not increase the risk of disability or the need for assistance; however, diabetes was related to increased risks when assessed in combination with stroke. After controlling for demographic, social and health conditions, individuals who had experienced stroke but not diabetes were 3.4 times more likely to have ADL limitations than those with neither condition (95% CI 2.26-5.04). This elevated risk more than doubled for those suffering from a combination of diabetes and stroke (OR 7.34, 95% CI 3.73-14.46). Similar effects from the combination of diabetes and stroke were observed for severe ADL limitations (OR 19.75, 95% CI 9.81-39.76) and receiving ADL assistance (OR 16.57, 95% CI 8.39-32.73). Over time, older adults who had experienced a stroke were at higher risk of remaining disabled (RRR 4.28, 95% CI 1.53-11.95) and of mortality (RRR 3.42, 95% CI 1.65-7.09). However, risks were even higher for those who had experienced both diabetes and stroke. Diabetes was associated with higher mortality. Conclusions: Findings indicate that a combined history of stroke and diabetes has a great impact on disability prevalence and mortality among older adults in Sao Paulo, Brazil.
Abstract:
The Simplified Acute Physiology Score II (SAPS II) and the Logistic Organ Dysfunction System (LODS) are instruments used to classify Intensive Care Unit (ICU) inpatients according to the severity of their condition and risk of death, and to evaluate the quality of nursing care. The objective of this study was to evaluate and compare the performance of SAPS II and LODS in predicting the mortality of patients admitted to the ICU. The participants were 600 patients from four ICUs located in Sao Paulo, Brazil. Receiver Operating Characteristic (ROC) curves were used to compare the performance of the indexes. Results: The areas under the ROC curves of LODS (0.69) and SAPS II (0.71) indicated moderate discriminatory capacity to identify death or survival. No statistically significant difference was found between these areas (p=0.26). In conclusion, SAPS II and LODS were equivalent in estimating the risk of death of ICU patients.
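A sketch of this kind of discrimination comparison is given below. The abstract does not state the exact test used for comparing the areas (DeLong and Hanley-McNeil are common choices), so a bootstrap on the AUC difference is used here as one reasonable substitute, with all inputs assumed.

```python
# Compare the AUCs of two severity scores against observed ICU death, with a
# bootstrap interval for the AUC difference. Inputs (death, saps2, lods) are
# assumed arrays of outcomes (0/1) and scores for the same patients.
import numpy as np
from sklearn.metrics import roc_auc_score

def compare_aucs(death, saps2, lods, n_boot=2000, seed=0):
    death, saps2, lods = map(np.asarray, (death, saps2, lods))
    auc_saps2 = roc_auc_score(death, saps2)
    auc_lods = roc_auc_score(death, lods)
    rng = np.random.default_rng(seed)
    diffs = []
    n = len(death)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        if death[idx].min() == death[idx].max():
            continue  # a resample must contain both deaths and survivors
        diffs.append(roc_auc_score(death[idx], saps2[idx]) -
                     roc_auc_score(death[idx], lods[idx]))
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return auc_saps2, auc_lods, (lo, hi)   # bootstrap 95% interval for the difference
```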
Abstract:
Background. The link between endogenous estrogen, coronary artery disease (CAD), and death in postmenopausal women is uncertain. We analyzed the association between death and blood levels of estrone in postmenopausal women with known CAD or with a high risk factor score for CAD. Methods. The study included 251 postmenopausal women aged 50-90 years who were not on estrogen therapy. Fasting blood samples for estrone and heart disease risk factors were collected at baseline. Women were grouped according to their estrone levels (<15 and >= 15 pg/mL). Fatal events were recorded over 5.8 +/- 1.4 years of follow-up. Results. The Kaplan-Meier survival curve showed a significant trend (P = 0.039) of greater all-cause mortality in women with low estrone levels (< 15 pg/mL). A Cox multivariate regression model adjusted for body mass index, diabetes, dyslipidemia, family history, and estrone showed estrone (OR = 0.45; P = 0.038) to be the only independent predictor of all-cause mortality. A multivariate regression model adjusted for age, body mass index, hypertension, diabetes, dyslipidemia, family history, and estrone showed that only age (OR = 1.06; P = 0.017) was an independent predictor of all-cause mortality. Conclusions. Postmenopausal women with known CAD or with a high risk factor score for CAD and low estrone levels (< 15 pg/mL) had increased all-cause mortality.
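The survival comparison can be sketched as below with a standard two-group log-rank test (the abstract reports a trend test, so this is a simplification); the column names and data frame are assumptions.

```python
# Kaplan-Meier curves by estrone group (< 15 vs >= 15 pg/mL) with a log-rank test.
# `df` is an assumed data frame with columns: estrone (pg/mL), years (follow-up), died (0/1).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_estrone_groups(df: pd.DataFrame) -> float:
    low = df[df["estrone"] < 15]
    high = df[df["estrone"] >= 15]
    km = KaplanMeierFitter()
    for name, group in (("estrone < 15", low), ("estrone >= 15", high)):
        km.fit(group["years"], group["died"], label=name)
        print(name, "median survival:", km.median_survival_time_)
    test = logrank_test(low["years"], high["years"], low["died"], high["died"])
    return test.p_value   # the abstract reports a trend P = 0.039 for its own data
```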
Abstract:
Background: The ankle-brachial index (ABI) can assess peripheral artery disease and predict mortality in prevalent patients on hemodialysis. However, ABI has not yet been tested in incident patients, who present significant mortality. Typically, ABI is measured by Doppler, which is not always available, limiting its use in most patients. We therefore hypothesized that ABI, evaluated by a simplified method, can predict mortality in an incident hemodialysis population. Methodology/Principal Findings: We studied 119 patients with ESRD who had started hemodialysis three times weekly. ABI was calculated by using two oscillometric blood pressure devices simultaneously. Patients were followed until death or the end of the study. ABI was categorized into two groups: normal (0.9-1.3) or abnormal (<0.9 or >1.3). There were 33 deaths during a median follow-up of 12 months (range 3 to 24 months). Age (per year; hazard ratio 1.026; p = 0.014) and abnormal ABI (hazard ratio 3.664; p = 0.001) were independently related to mortality in multiple regression analysis. Conclusions: An easy and inexpensive technique to measure ABI was tested and shown to predict mortality. Both low and high ABI were associated with mortality in incident patients on hemodialysis. This technique allows nephrologists to identify high-risk patients and provides an opportunity for early intervention that could alter the natural course of disease in this population.
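A minimal sketch of the simplified ABI calculation and the study's categorization follows; which limb readings are paired is not stated in the abstract, so the convention below is an assumption.

```python
# ABI as the ratio of ankle to brachial systolic pressure from the two
# simultaneous oscillometric readings, categorised as in the study
# (normal 0.9-1.3, abnormal otherwise). Example values are hypothetical.

def ankle_brachial_index(ankle_systolic: float, brachial_systolic: float) -> float:
    return ankle_systolic / brachial_systolic

def abi_category(abi: float) -> str:
    return "normal" if 0.9 <= abi <= 1.3 else "abnormal"

abi = ankle_brachial_index(ankle_systolic=110, brachial_systolic=140)
print(round(abi, 2), abi_category(abi))   # 0.79 abnormal
```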
Abstract:
Introduction. The number of patients with terminal heart failure has grown faster than the number of available organs, leading to a high mortality rate on the waiting list. Use of marginal and expanded-criteria donors has increased due to the donor heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years with regard to donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered to be nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis assessed the impact of the following factors on survival: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age showed significance; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years were associated with lower late survival rates (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for reduced survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.
Abstract:
de Araujo CC, Silva JD, Samary CS, Guimaraes IH, Marques PS, Oliveira GP, do Carmo LGRR, Goldenberg RC, Bakker-Abreu I, Diaz BL, Rocha NN, Capelozzi VL, Pelosi P, Rocco PRM. Regular and moderate exercise before experimental sepsis reduces the risk of lung and distal organ injury. J Appl Physiol 112: 1206-1214, 2012. First published January 19, 2012; doi:10.1152/japplphysiol.01061.2011.
Physical activity modulates inflammation and immune response in both normal and pathologic conditions. We investigated whether regular and moderate exercise before the induction of experimental sepsis reduces the risk of lung and distal organ injury and improves survival. One hundred twenty-four BALB/c mice were randomly assigned to two groups: sedentary (S) and trained (T). Animals in the T group ran on a motorized treadmill at moderate intensity, 5% grade, 30 min/day, 3 times a week for 8 wk. Cardiac adaptation to exercise was evaluated using echocardiography. Systolic volume and left ventricular mass were increased in the T group compared with the S group. Both T and S groups were further randomized either to sepsis induced by cecal ligation and puncture surgery (CLP) or to sham operation (control). After 24 h, lung mechanics and histology; the degree of cell apoptosis in lung, heart, kidney, liver, and small intestine villi; and interleukin (IL)-6, KC (the murine functional homolog of IL-8), IL-1 beta, IL-10, and cell counts in bronchoalveolar lavage fluid (BALF), peritoneal lavage fluid (PLF), and plasma were measured. In CLP animals, the T group compared with the S group showed: 1) improved survival; 2) reduced lung static elastance, alveolar collapse, collagen and elastic fiber content, numbers of neutrophils in BALF, PLF, and plasma, as well as lung and distal organ cell apoptosis; and 3) increased IL-10 in BALF and plasma, with reduced IL-6, KC, and IL-1 beta in PLF. In conclusion, regular and moderate exercise before the induction of sepsis reduced the risk of lung and distal organ damage, thus increasing survival.
Abstract:
Background: Several models have been designed to predict survival in patients with heart failure. These models, while available and widely used both for risk stratification and for deciding among treatment options at the individual level, have several limitations. Specifically, some clinical variables that influence prognosis may have effects that change over time. Statistical models that accommodate this characteristic may help in evaluating prognosis. The aim of the present study was to analyze and quantify the impact of modeling heart failure survival allowing for covariates with time-varying effects known to be independent predictors of overall mortality in this clinical setting. Methodology: Survival data from an inception cohort of five hundred patients diagnosed with heart failure functional class III and IV between 2002 and 2004 and followed up to 2006 were analyzed using the Cox proportional hazards model, variations of the Cox model, and the Aalen additive model. Principal Findings: One hundred and eighty-eight (188) patients died during follow-up. For the patients under study, age, serum sodium, hemoglobin, serum creatinine, and left ventricular ejection fraction were significantly associated with mortality. Evidence of a time-varying effect was suggested for the last three. Both high hemoglobin and high LV ejection fraction were associated with a reduced risk of dying, with a stronger initial effect. High creatinine, associated with an increased risk of dying, also showed a stronger initial effect. The impact of age and sodium was constant over time. Conclusions: The current study points to the importance of evaluating covariates with time-varying effects in heart failure models. The analysis performed suggests that variations of the Cox and Aalen models constitute a valuable tool for identifying these variables. Incorporating covariates with time-varying effects into heart failure prognostic models may reduce bias and increase the specificity of such models.
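One way to examine such time-varying effects in code is with the Aalen additive model, one of the model families the study mentions; the sketch below uses lifelines, with the column names, the penalizer, and the interpretation notes as assumptions rather than the authors' specification.

```python
# Aalen additive model: the slope of each cumulative regression function shows
# how a covariate's effect evolves over follow-up. `df` is an assumed data frame
# with the listed covariates plus follow-up time ("months") and death indicator ("died").
import pandas as pd
from lifelines import AalenAdditiveFitter

def fit_aalen(df: pd.DataFrame) -> pd.DataFrame:
    covariates = ["age", "sodium", "hemoglobin", "creatinine", "lvef"]
    aaf = AalenAdditiveFitter(coef_penalizer=0.1)
    aaf.fit(df[covariates + ["months", "died"]], duration_col="months", event_col="died")
    # A roughly linear cumulative coefficient suggests a constant effect (age and
    # sodium in the abstract); early steepness that later flattens suggests a
    # strong initial effect (hemoglobin, creatinine, and LVEF in the abstract).
    return aaf.cumulative_hazards_
```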
Abstract:
The aim of the present study is to provide an ecologically relevant assessment of the ecotoxicological effects of pesticide applications in agricultural areas in the tropics, using an integrated approach with information gathered from soil and aquatic compartments. Carbofuran, an insecticide/nematicide used widely on sugarcane crops, was selected as a model substance. To evaluate the toxic effects of pesticide spraying on soil biota, as well as the potential indirect effects on aquatic biota resulting from surface runoff and/or leaching, field and laboratory trials (the latter using a cost-effective simulator of pesticide applications) were performed. Standard ecotoxicological tests were performed with soil (Eisenia andrei, Folsomia candida, and Enchytraeus crypticus) and aquatic (Ceriodaphnia silvestrii) organisms, using serial dilutions of soil, eluate, leachate, and runoff samples. Among soil organisms, sensitivity was found to be E. crypticus < E. andrei < F. candida. Among the aqueous extracts, mortality of C. silvestrii was extreme in runoff samples, whereas eluates were by far the least toxic samples. A generally higher toxicity was found in the bioassays performed with samples from the field trial, indicating the need for improvements in the laboratory simulator. However, the tool developed proved to be valuable in evaluating the toxic effects of pesticide spraying in soils and the potential risks for aquatic compartments. Environ. Toxicol. Chem. 2012;31:437-445. (C) 2011 SETAC
Abstract:
Background and aim of the study: The natriuretic peptides, brain natriuretic peptide (BNP) and its N-terminal prohormone (NT-proBNP), can be used as diagnostic and prognostic markers for aortic stenosis (AS). However, the association between BNP, NT-proBNP, and long-term clinical outcomes in patients with severe AS remains uncertain. Methods: A total of 64 patients with severe AS were prospectively enrolled in the study and underwent clinical and echocardiographic assessments at baseline. Blood samples were drawn for plasma BNP and NT-proBNP analyses. The primary outcome was death from any cause over a six-year follow-up period. Cox proportional hazards modeling was used to examine the association between natriuretic peptides and long-term mortality, adjusting for important clinical factors. Results: During a mean period of 1,520 +/- 681 days, 51 patients (80%) underwent aortic valve replacement and 13 patients (20%) were medically managed without surgical intervention. Mortality rates were 13.7% in the surgical group and 62% in the medically managed group (p<0.001). Patients with higher plasma BNP (>135 pg/ml) and NT-proBNP (>1,150 pg/ml) levels at baseline had a greater risk of long-term mortality (hazard ratio [HR] 3.2, 95% confidence interval [CI] 1.1-9.1; HR 4.3, 95% CI 1.4-13.5, respectively). After adjusting for important covariates, both BNP and NT-proBNP remained independently associated with long-term mortality (HR 2.9, 95% CI 1.5-5.7; HR 1.8, 95% CI 1.1-3.1, respectively). Conclusion: In patients with severe AS, plasma BNP and NT-proBNP levels were associated with long-term mortality. The use of these biomarkers to guide treatment might represent an interesting approach that deserves further evaluation. The Journal of Heart Valve Disease 2012;21:331-336