743 results for In-hospital Mortality
Abstract:
Background: The ankle-brachial index (ABI) can assess peripheral artery disease and predict mortality in prevalent patients on hemodialysis. However, ABI has not yet been tested in incident patients, who present significant mortality. Typically, ABI is measured by Doppler, which is not always available, limiting its use in most patients. We therefore hypothesized that ABI, evaluated by a simplified method, can predict mortality in an incident hemodialysis population. Methodology/Principal Findings: We studied 119 patients with ESRD who had started hemodialysis three times weekly. ABI was calculated using two oscillometric blood pressure devices simultaneously. Patients were followed until death or the end of the study. ABI was categorized into two groups: normal (0.9-1.3) or abnormal (<0.9 and >1.3). There were 33 deaths during a median follow-up of 12 months (range, 3 to 24 months). Age (per year) (hazard ratio, 1.026; p = 0.014) and abnormal ABI (hazard ratio, 3.664; p = 0.001) were independently related to mortality in a multiple regression analysis. Conclusions: An easy and inexpensive technique to measure ABI was tested and shown to be significant in predicting mortality. Both low and high ABI were associated with mortality in incident patients on hemodialysis. This technique allows nephrologists to identify high-risk patients and provides the opportunity for early intervention that could alter the natural course of disease in this population.
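The ABI cut-offs reported above (normal 0.9-1.3, abnormal otherwise) amount to a simple classification rule. A minimal sketch in Python; the function name and the example pressures are illustrative, not taken from the study:

```python
def classify_abi(ankle_systolic_mmhg: float, brachial_systolic_mmhg: float) -> str:
    """Classify the ankle-brachial index (ABI) using the cut-offs in the
    abstract: 0.9-1.3 is normal; anything outside that range is abnormal."""
    abi = ankle_systolic_mmhg / brachial_systolic_mmhg
    return "normal" if 0.9 <= abi <= 1.3 else "abnormal"

# Hypothetical readings: ankle 110 mmHg / brachial 130 mmHg gives ABI ~0.85,
# which falls below the 0.9 lower cut-off.
print(classify_abi(110, 130))  # abnormal
```

Note that both tails are flagged: a high ABI (>1.3, often from arterial calcification) counts as abnormal, consistent with the abstract's finding that both low and high ABI predicted mortality.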
Abstract:
Objective: To validate the 2000 Bernstein-Parsonnet (2000BP) and additive EuroSCORE (ES) models for predicting mortality in patients who underwent coronary bypass surgery and/or heart valve surgery at the Heart Institute, University of Sao Paulo (InCor/HC-FMUSP). Methods: A prospective observational design. We analyzed 3000 consecutive patients who underwent coronary bypass surgery and/or heart valve surgery between May 2007 and July 2009 at the InCor/HC-FMUSP. Mortality was calculated with the 2000BP and ES models. The correlation between estimated and observed mortality was validated by calibration and discrimination tests. Results: There were significant differences in the prevalence of risk factors between the study population, 2000BP, and ES. Patients were stratified into five groups for 2000BP and three for the ES. In the validation of the models, the ES showed good calibration (P = 0.396), whereas the 2000BP (P = 0.047) proved inadequate. In discrimination, the area under the ROC curve was good for both models: ES (0.79) and 2000BP (0.80). Conclusion: In this validation, 2000BP proved questionable and ES appropriate for predicting mortality in patients who underwent coronary bypass surgery and/or heart valve surgery at the InCor/HC-FMUSP.
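The discrimination statistic reported above (area under the ROC curve) can be computed with the rank-based Mann-Whitney identity: the AUC is the probability that a randomly chosen death received a higher predicted risk than a randomly chosen survivor. A minimal sketch with hypothetical toy labels and risk scores, not the study's data:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney identity: the fraction
    of (event, non-event) pairs in which the event has the higher score,
    counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]  # observed deaths
    neg = [s for l, s in zip(labels, scores) if l == 0]  # survivors
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data: 1 = died, 0 = survived; scores are hypothetical model risks.
print(auc([1, 1, 0, 0], [0.9, 0.4, 0.5, 0.1]))  # 0.75
```

An AUC of 0.5 is chance-level discrimination; the ~0.79-0.80 values reported for ES and 2000BP indicate good ability to rank patients by risk, independently of calibration.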
Preoperative risk factors for mediastinitis after cardiac surgery: analysis of 2768 patients
Abstract:
INTRODUCTION: Median longitudinal sternotomy is the access route most commonly used in the treatment of heart disease. Deep surgical wound infections after cardiovascular surgery are a serious complication with high treatment costs. Several studies have identified risk factors for the development of mediastinitis, with preoperative variables receiving particular attention. OBJECTIVE: The objective of this study is to identify preoperative risk factors for the development of mediastinitis in patients undergoing myocardial revascularization and valve replacement. METHODS: This observational study comprises a cohort of 2768 consecutively operated patients. The period considered for analysis was May 2007 to May 2009, and there were no exclusion criteria. Univariate and multivariate analyses of the 38 selected preoperative variables were performed using a logistic regression model. RESULTS: In this series, 35 (1.3%) patients developed mediastinitis and 19 (0.7%) developed associated osteomyelitis. Mean patient age was 59.9 ± 13.5 years and the mean EuroSCORE was 4.5 ± 3.6. In-hospital mortality was 42.8%. In the multivariate analysis, three variables were identified as independent predictors of mediastinitis: intra-aortic balloon pump (OR 5.41, 95% CI [1.83-16.01], P=0.002), hemodialysis (OR 4.87, 95% CI [1.41-16.86], P=0.012), and extracardiac vascular intervention (OR 4.39, 95% CI [1.64-11.76], P=0.003). CONCLUSION: This study demonstrated that the need for preoperative hemodynamic support with an intra-aortic balloon pump, hemodialysis, and extracardiac vascular intervention are risk factors for the development of mediastinitis after cardiac surgery.
Abstract:
Introduction. The number of patients with terminal heart failure has increased more than the number of available organs, leading to a high mortality rate on the waiting list. Use of marginal and expanded-criteria donors has increased due to the heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years for donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis examined the following factors for their impact on survival: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age showed significance; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed a lower late survival rate (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.
Abstract:
Although the prevalence of drug-drug interactions (DDIs) in elderly outpatients is high, many potential DDIs have no actual clinical effect, and data on the occurrence of DDI-related adverse drug reactions (ADRs) in elderly outpatients are scarce. This study aimed to determine the incidence and characteristics of DDI-related ADRs among elderly outpatients, as well as the factors associated with these reactions. A prospective cohort study was conducted between 1 November 2010 and 31 November 2011 in the primary public health system of the Ourinhos micro-region, Brazil. Patients aged ≥60 years with at least one potential DDI were eligible for inclusion. Causality, severity, and preventability of the DDI-related ADRs were assessed independently by four clinicians using validated methods; data were analysed using descriptive analysis and multiple logistic regression. A total of 433 patients completed the study. The incidence of DDI-related ADRs was 6 % (n = 30). Warfarin was the most commonly involved drug (37 % of cases), followed by acetylsalicylic acid (17 %), digoxin (17 %), and spironolactone (17 %). Gastrointestinal bleeding occurred in 37 % of the DDI-related ADR cases, followed by hyperkalemia (17 %) and myopathy (13 %). Multiple logistic regression showed that age ≥80 years [odds ratio (OR) 4.4; 95 % confidence interval (CI) 3.0-6.1, p < 0.01], a Charlson comorbidity index ≥4 (OR 1.3; 95 % CI 1.1-1.8, p < 0.01), consumption of five or more drugs (OR 2.7; 95 % CI 1.9-3.1, p < 0.01), and the use of warfarin (OR 1.7; 95 % CI 1.1-1.9, p < 0.01) were associated with the occurrence of DDI-related ADRs. With regard to severity, approximately 37 % of the DDI-related ADRs detected in our cohort necessitated hospital admission. All DDI-related ADRs could have been avoided (87 % were ameliorable and 13 % were preventable). The incidence of ADRs not related to DDIs was 10 % (n = 44).
The incidence of DDI-related ADRs in elderly outpatients is high; most events presented important clinical consequences and were preventable or ameliorable.
Abstract:
Introduction The primary end points of randomized clinical trials evaluating the outcome of therapeutic strategies for coronary artery disease (CAD) have included nonfatal acute myocardial infarction, the need for further revascularization, and overall mortality. Noncardiac causes of death may distort the interpretation of the long-term effects of coronary revascularization. Materials and methods This post-hoc analysis of the second Medicine, Angioplasty, or Surgery Study evaluates the causes of mortality among patients with multivessel CAD undergoing medical treatment, percutaneous coronary intervention, or surgical myocardial revascularization [coronary artery bypass graft surgery (CABG)] after a 6-year follow-up. Mortality was classified as cardiac or noncardiac death, and the causes of noncardiac death were reported. Results Patients were randomized into CABG and non-CABG groups (percutaneous coronary intervention plus medical treatment). No statistically significant difference was observed in overall mortality (P = 0.824). A significant difference in the distribution of causes of mortality was observed between the CABG and non-CABG groups (P = 0.003). In the CABG group, of the 203 randomized patients, the overall number of deaths was 34. Sixteen patients (47.1%) died of cardiac causes and 18 patients (52.9%) died of noncardiac causes. Of these, seven deaths (20.6%) were due to neoplasia. In the non-CABG group, comprising 408 patients, the overall number of deaths was 69. Fifty-three patients (77%) died of cardiac causes and 16 patients (23%) died of noncardiac causes. Only five deaths (7.2%) were due to neoplasia. Conclusion Different treatment options for multivessel coronary artery disease have similar overall mortality: CABG patients had the lowest incidence of cardiac death but the highest incidence of noncardiac causes of death, and specifically a higher tendency toward cancer-related deaths.
Coron Artery Dis 23:79-84 (C) 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins.
Abstract:
Objective: Sepsis is a common condition encountered in hospital environments. There is no effective treatment for sepsis, and it remains an important cause of death in intensive care units. This study aimed to discuss some methods that are available in clinics, as well as tests that have been recently developed, for the diagnosis of sepsis. Methods: A systematic review was performed through the analysis of the following descriptors: sepsis, diagnostic methods, biological markers, and cytokines. Results: The deleterious effects of sepsis are caused by an imbalance between the invasiveness of the pathogen and the ability of the host to mount an effective immune response. Consequently, the host's immune surveillance fails to eliminate the pathogen, allowing it to spread. Moreover, pro-inflammatory mediators are released and the coagulation and complement cascades are inappropriately activated, leading to dysfunction of multiple organs and systems. The difficulty in achieving full recovery of the patient is thus explainable. There is an increased incidence of sepsis worldwide due to factors such as the aging population, the larger number of surgeries, and the number of microorganisms resistant to existing antibiotics. Conclusion: The search for new diagnostic markers associated with increased risk of sepsis development, and for molecules that can be correlated with particular stages of sepsis, is becoming necessary. This would allow for earlier diagnosis, facilitate characterization of patient prognosis, and enable prediction of the possible evolution of each case. All other markers are regrettably confined to research units.
Abstract:
Background: Food and nutritional care quality must be assessed and scored so as to improve health institution efficacy. This study aimed to detect and compare actions related to food and nutritional care quality in public and private hospitals. Methods: Investigation of the Hospital Food and Nutrition Service (HFNS) of 37 hospitals by means of structured interviews assessing two quality control corpora, namely nutritional care quality (NCQ) and hospital food service quality (FSQ). The HFNS was also evaluated with respect to human resources per hospital bed and per meal produced. Results: Comparison between public and private institutions revealed a statistically significant difference in the number of hospital beds per HFNS staff member (p = 0.02) and per dietitian (p < 0.01). The mean compliance with NCQ criteria in public and private institutions was 51.8% and 41.6%, respectively. The percentage of public and private health institutions in conformity with FSQ criteria was 42.4% and 49.1%, respectively. Most of the actions comprising each corpus, NCQ and FSQ, varied considerably between the two types of institution. NCQ was positively influenced by hospital type (general) and the presence of a clinical dietitian. FSQ was affected by institution size: large and medium-sized hospitals performed significantly better than small ones. Conclusions: Food and nutritional care in hospitals is still incipient, and actions concerning both nutritional care and food service take place on an irregular basis. The design of indicators for food and nutritional care in hospitals is clearly mandatory, and guidelines for the development of actions as well as the qualification and assessment of nutritional care are urgently needed.
Abstract:
The innate and adaptive immune responses in neonates are usually functionally impaired compared with their adult counterparts. The qualitative and quantitative differences in the neonatal immune response put neonates at risk for the development of bacterial and viral infections, resulting in increased mortality. Newborns often exhibit decreased production of Th1-polarizing cytokines and are biased toward Th2-type responses. Studies aimed at understanding the plasticity of the immune response in the neonatal and early infant periods, or that seek to improve neonatal innate immune function with adjuvants or special formulations, are crucial for reducing the infectious disease burden in this susceptible group. Considerable work focused on identifying potential immunomodulatory therapies has been performed in murine models. This article highlights the strategies used in the emerging field of immunomodulation against bacterial and viral pathogens, focusing on preclinical studies carried out in animal models, with particular emphasis on neonatal-specific immune deficits.
Abstract:
Background: Due to the growing number of outbreaks of infection in hospital nurseries, it is essential to set up a sanitation program which ensures that an appropriate chemical agent is chosen and applied in the most effective way. Method: To evaluate the efficacy of each chemical agent, the minimum inhibitory concentration (MIC) was determined by the classic method of successive broth dilutions. The reference bacteria used were Bacillus subtilis var. globigii ATCC 9372, Bacillus stearothermophilus ATCC 7953, Escherichia coli ATCC 25922, and Staphylococcus aureus ATCC 25923. The strains of Enterobacter cloacae IAL 1976 (Adolfo Lutz Institute), Serratia marcescens IAL 1478, and Acinetobacter calcoaceticus IAL 124 (ATCC 19606) were isolated from material collected from babies involved in outbreaks of infection in hospital nurseries. Results: The MIC intervals that reduced bacterial populations by over 8 log10 were: 59 to 156 mg/L for quaternary ammonium compounds (QACs); 63 to 10000 mg/L for chlorhexidine digluconate; 1375 to 3250 mg/L for glutaraldehyde; 39 to 246 mg/L for formaldehyde; 43750 to 87500 mg/L for isopropanol or ethanol; 1250 to 6250 mg/L for iodine in polyvinylpyrrolidone complexes; 150 to 4491 mg/L for chlorine-releasing agents (CRAs); 469 to 2500 mg/L for hydrogen peroxide; and 2310 to 18500 mg/L for peracetic acid. Conclusions: Chlorhexidine showed no inhibitory activity against germinating spores. A. calcoaceticus showed resistance to the majority of the agents tested, followed by E. cloacae and S. marcescens.
Abstract:
Background: A typical purification system that provides purified water meeting ionic and organic chemical standards must be protected from microbial proliferation to minimize cross-contamination, for use in cleaning and preparations in pharmaceutical industries and in health environments. Methodology: Samples of water taken directly from the public distribution water tank and at twelve different stages of a typical purification system were analyzed for the identification of isolated bacteria. Two miniature kits were used: (i) an identification system (api 20 NE, Bio-Mérieux) for non-enteric and non-fermenting gram-negative rods; and (ii) an identification system (BBL Crystal, Becton Dickinson) for enteric and non-fermenting gram-negative rods. The efficiency of the chemical sanitizers used in the stages of the system against the bacteria isolated and identified in the sampled water was evaluated by the minimum inhibitory concentration (MIC) method. Results: The 78 isolated colonies were identified as belonging to the following bacterial genera: Pseudomonas, Flavobacterium, and Acinetobacter. According to the miniature kits used in the identification, the most frequently isolated species were P. aeruginosa (32.05%), P. picketti (Ralstonia picketti) (23.08%), P. vesicularis (12.82%), P. diminuta (11.54%), F. aureum (6.42%), P. fluorescens (5.13%), A. lwoffii (2.56%), P. putida (2.56%), P. alcaligenes (1.28%), P. paucimobilis (1.28%), and F. multivorum (1.28%). Conclusions: We found that careful identification of gram-negative non-fermenting bacteria isolated from drinking water and water purification systems is required, since the genus Pseudomonas comprises opportunistic pathogens that disperse and adhere easily to surfaces, forming biofilms that interfere with cleaning and disinfection procedures in hospital and industrial environments.
Abstract:
Background: Left ventricular free wall rupture occurs in up to 10% of in-hospital deaths following myocardial infarction. It is mainly associated with posterolateral myocardial infarction, and its antemortem diagnosis is rarely made. Contrast echocardiography has been increasingly used for the evaluation of myocardial perfusion in patients with acute myocardial infarction, with important prognostic implications. Here, we report its use for the detection of a mechanical complication following myocardial infarction. Case presentation: A 50-year-old man with acute myocardial infarction of the lateral wall underwent myocardial contrast echocardiography for the evaluation of myocardial perfusion on the third day post-infarction. A perfusion defect was detected in the lateral and inferior walls, as well as extrusion of contrast from the left ventricular cavity into the myocardium, forming a serpiginous duct extending from the endocardium to the epicardial region of the lateral wall, without communication with the pericardial space. Magnetic resonance imaging confirmed the diagnosis of impending rupture of the left ventricular free wall. While awaiting cardiac surgery, the patient developed cardiogenic shock and died. Anatomopathological findings were consistent with acute myocardial infarction of the lateral wall and a left ventricular free wall rupture at the infarct site. Conclusion: This case illustrates the early diagnosis of left ventricular free wall rupture by contrast echocardiography. Because it can be performed at the bedside, this imaging modality has the potential to identify this catastrophic condition in patients with acute myocardial infarction and to help treat these patients with emergent surgery.
Abstract:
Introduction Noninvasive ventilation (NIV), as a weaning-facilitating strategy in predominantly chronic obstructive pulmonary disease (COPD) mechanically ventilated patients, is associated with reduced ventilator-associated pneumonia, total duration of mechanical ventilation, length of intensive care unit (ICU) and hospital stay, and mortality. However, this benefit after planned extubation in patients with acute respiratory failure of various etiologies remains to be elucidated. The aim of this study was to determine the efficacy of NIV applied immediately after planned extubation, in contrast to an oxygen mask (OM), in patients with acute respiratory failure (ARF). Methods A randomized, prospective, controlled, unblinded clinical study was carried out over a 12-month period in a single 24-bed adult general ICU of a university hospital. Included patients had received at least 72 hours of mechanical ventilation due to acute respiratory failure and met extubation criteria after following the ICU weaning protocol. Patients were randomized immediately before elective extubation and allocated to one of the study groups: NIV or OM. We compared both groups regarding gas exchange at 15 minutes, 2 hours, and 24 hours after extubation, reintubation rate after 48 hours, duration of mechanical ventilation, ICU length of stay, and hospital mortality. Results Forty patients were randomized to receive NIV (20 patients) or OM (20 patients) after the following extubation criteria were met: pressure support (PSV) of 7 cm H2O, positive end-expiratory pressure (PEEP) of 5 cm H2O, oxygen inspiratory fraction (FiO2) ≤ 40%, arterial oxygen saturation (SaO2) ≥ 90%, and ratio of respiratory rate to tidal volume in liters (f/TV) < 105. Comparing the 20 patients (NIV) with the 18 patients (OM) who finished the study 48 hours after extubation, the reintubation rate was 5% in the NIV group and 39% in the OM group (P = 0.016).
The relative risk for reintubation was 0.13 (CI = 0.017 to 0.946). The absolute risk reduction for reintubation was 33.9%, and the number needed to treat was three. No difference was found in the length of ICU stay (P = 0.681). Hospital mortality was zero in the NIV group and 22.2% in the OM group (P = 0.041). Conclusions In this study population, NIV prevented reintubation within 48 hours when applied immediately after elective extubation in patients with more than 3 days of ARF, compared with the OM group. Trial registration: ISRCTN41524441.
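The effect measures above follow directly from the group sizes and rates given in the abstract. A sketch of the arithmetic, where the event counts (1 of 20 in the NIV group, 7 of 18 in the OM group) are inferred from the reported 5% and 39% rates rather than stated explicitly:

```python
# Reintubation within 48 h, per the abstract's reported percentages.
niv_events, niv_n = 1, 20   # inferred: 1/20 = 5% in the NIV group
om_events, om_n = 7, 18     # inferred: 7/18 ~ 39% in the OM group

risk_niv = niv_events / niv_n        # 0.05
risk_om = om_events / om_n           # ~0.389
relative_risk = risk_niv / risk_om   # ~0.13, matching the reported RR
arr = risk_om - risk_niv             # ~0.339 -> 33.9% absolute risk reduction
nnt = 1 / arr                        # ~2.95 -> 3 patients needed to treat

print(round(relative_risk, 2), round(arr * 100, 1), round(nnt))  # 0.13 33.9 3
```

The number needed to treat is simply the reciprocal of the absolute risk reduction: with a 33.9-point drop in reintubation risk, treating three patients with NIV prevents roughly one reintubation.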
Maternal and neonatal factors associated with meconium in the amniotic fluid in a normal birth center
Abstract:
OBJECTIVE: To analyze the frequency of, and the maternal and neonatal factors associated with, meconium in the amniotic fluid at delivery. METHODS: Cross-sectional study of 2,441 births in an in-hospital normal birth center in São Paulo, SP, Brazil, in March and April 2005. The association between meconium in the amniotic fluid and the independent variables (maternal age, parity, previous cesarean section, gestational age, obstetric history, oxytocin use during labor, cervical dilation at admission, type of current delivery, newborn weight, and 1st- and 5th-minute Apgar scores) was expressed as the prevalence ratio. RESULTS: Meconium in the amniotic fluid was observed in 11.9% of deliveries; 68.2% of these were vaginal deliveries and 38.8% cesarean sections. Meconium was associated with: primiparity (PR = 1.49; 95%CI 1.29;1.73), gestational age ≥ 41 weeks (PR = 5.05; 95%CI 1.93;13.25), oxytocin during labor (PR = 1.83; 95%CI 1.60;2.10), cesarean section (PR = 2.65; 95%CI 2.17;3.24), and 5th-minute Apgar score < 7 (PR = 2.96; 95%CI 2.94;2.99). Neonatal mortality was 1.6/1,000 live births; meconium in the amniotic fluid was found in 50% of neonatal deaths and was associated with higher rates of surgical delivery. CONCLUSIONS: Oxytocin use, poorer newborn condition immediately after delivery, and increased cesarean section rates were factors associated with meconium. Routine intrapartum use of oxytocin could be reconsidered given its association with meconium in the amniotic fluid.
Evaluation of surgical treatment of congenital heart disease in patients older than 16 years
Abstract:
BACKGROUND: The growing number of children with congenital heart disease surviving into adulthood demands better preparation of the professionals and institutions that care for them. OBJECTIVE: To describe the profile of patients older than 16 years with congenital heart disease who underwent surgery, and to analyze the risk factors predictive of in-hospital mortality. METHODS: One thousand five hundred and twenty patients (mean age 27 ± 13 years) were operated on between January 1986 and December 2010. A descriptive analysis of the epidemiological profile of the study population was performed, along with an analysis of risk factors for in-hospital mortality considering the complexity score, the year in which surgery was performed, whether or not the procedure was performed by a pediatric surgeon, and the presence of reoperation. RESULTS: There was a marked increase in the number of cases from the year 2000 onward. The mean complexity score was 5.4, and septal defects accounted for 45% of cases. Overall mortality was 7.7%, and the largest number of procedures (973, or 61.9%), including those of greater complexity, was performed by pediatric surgeons. Complexity (OR 1.5), reoperation (OR 2.17), and pediatric surgeon (OR 0.28) were independent risk factors influencing mortality. Multivariate analysis showed that the year in which surgery was performed (OR 1.03), complexity (OR 1.44), and pediatric surgeon (OR 0.28) influenced the outcome. CONCLUSION: There is a growing number of patients older than 16 years and, despite the large number of simple cases, the more complex ones were referred to pediatric surgeons, who had lower mortality, especially in more recent years. (Arq Bras Cardiol. 2012; [online] ahead of print)