116 results for In-hospital Mortality
Abstract:
PURPOSES Geriatric problems frequently go undetected in older patients in emergency departments (EDs), increasing their risk of adverse outcomes. We evaluated a novel emergency geriatric screening (EGS) tool designed to detect geriatric problems. BASIC PROCEDURES The EGS tool consisted of short validated instruments used to screen 4 domains (cognition, falls, mobility, and activities of daily living). Emergency geriatric screening was introduced for ED patients 75 years or older throughout a 4-month period. We analyzed the prevalence of abnormal EGS findings and whether EGS increased the number of EGS-related diagnoses made in the ED during the screening period, as compared with a preceding control period. MAIN FINDINGS Emergency geriatric screening was performed on 338 (42.5%) of 795 patients presenting during the screening period. It was not feasible in 175 patients (22.0%) because of life-threatening conditions and was not performed in 282 (35.5%) for logistical reasons. Emergency geriatric screening took less than 5 minutes to perform in most (85.8%) cases. Among screened patients, 285 (84.3%) had at least 1 abnormal EGS finding. In 270 of these patients, at least 1 abnormal EGS finding did not result in an ED diagnosis and was reported to subsequent care providers for further workup. During the screening period, 142 patients (42.0%) had at least 1 diagnosis listed within the 4 EGS domains, significantly more than the 29.3% in the control period (odds ratio 1.75; 95% confidence interval, 1.34-2.29; P<.001). Emergency geriatric screening predicted nursing home admission after the in-hospital stay (odds ratio for ≥3 vs <3 abnormal domains 12.13; 95% confidence interval, 2.79-52.72; P=.001). PRINCIPAL CONCLUSIONS The novel EGS tool is feasible, identifies previously undetected geriatric problems, and predicts determinants of subsequent care.
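The odds ratios with 95% confidence intervals reported in abstracts like this one (e.g. OR 1.75; 95% CI, 1.34-2.29) are conventionally computed from a 2x2 table with a Wald interval on the log scale. A minimal sketch of that calculation; the cell counts in the usage line are purely hypothetical, not data from the study:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI for a 2x2 table:
    a = exposed with event,   b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    All four cell counts must be nonzero."""
    or_point = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = exp(log(or_point) - z * se_log)
    upper = exp(log(or_point) + z * se_log)
    return or_point, lower, upper

# Hypothetical counts for illustration only (not from the study):
or_point, lower, upper = odds_ratio_ci(20, 80, 10, 90)
```

The interval is symmetric on the log-odds scale, which is why published CIs such as 1.34-2.29 are asymmetric around the point estimate.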
Abstract:
OBJECTIVE Blood-borne biomarkers reflecting atherosclerotic plaque burden have great potential to improve the clinical management of atherosclerotic coronary artery disease and acute coronary syndrome (ACS). APPROACH AND RESULTS Using data integration from gene expression profiling of coronary thrombi versus peripheral blood mononuclear cells and proteomic analysis of atherosclerotic plaque-derived secretomes versus healthy tissue secretomes, we identified fatty acid-binding protein 4 (FABP4) as a biomarker candidate for coronary artery disease. Its diagnostic and prognostic performance was validated in 3 different clinical settings: (1) a cross-sectional cohort of patients with stable coronary artery disease, patients with ACS, and healthy individuals (n=820); (2) a nested case-control cohort of patients with ACS with 30-day follow-up (n=200); and (3) a population-based nested case-control cohort of asymptomatic individuals with 5-year follow-up (n=414). Circulating FABP4 was marginally higher in patients with ST-segment-elevation myocardial infarction (24.9 ng/mL) than in controls (23.4 ng/mL; P=0.01). However, elevated FABP4 was associated with adverse secondary cerebrovascular or cardiovascular events during 30-day follow-up after index ACS, independent of age, sex, renal function, and body mass index (odds ratio, 1.7; 95% confidence interval, 1.1-2.5; P=0.02). Circulating FABP4 predicted adverse events with prognostic performance similar to that of the GRACE in-hospital risk score or N-terminal pro-brain natriuretic peptide. Finally, no significant difference in baseline FABP4 was found between asymptomatic individuals with and without coronary events during 5-year follow-up. CONCLUSIONS Circulating FABP4 may prove useful as a prognostic biomarker for risk stratification of patients with ACS.
Abstract:
BACKGROUND Bacterial meningitis (BM) is an infectious disease with high mortality and morbidity. Despite efficacious antibiotic therapy, neurological sequelae are often observed after the disease. Currently, the main challenge in BM treatment is to develop adjuvant therapies that reduce the occurrence of sequelae. In recent papers published by our group, we described associations between the single nucleotide polymorphisms (SNPs) AADAT +401C > T, APEX1 Asn148Glu, OGG1 Ser326Cys and PARP1 Val762Ala and BM. In this study, we analyzed the associations between the SNPs TNF -308G > A, TNF -857C > T and IL-8 -251A > T and BM, and investigated gene-gene interactions including the SNPs that we had published previously. METHODS The study was conducted with 54 BM patients and 110 healthy volunteers (the control group). Genotypes were determined via primer-introduced restriction analysis-polymerase chain reaction (PIRA-PCR) or polymerase chain reaction-based restriction fragment length polymorphism (PCR-RFLP) analysis. Allelic and genotypic frequencies were also associated with cytokine and chemokine levels, measured with the x-MAP method, and with cell counts. We analyzed gene-gene interactions among the SNPs using the generalized multifactor dimensionality reduction (GMDR) method. RESULTS We did not find a significant association between the SNPs TNF -857C > T or IL-8 -251A > T and the disease. However, a higher frequency of the variant allele TNF -308A was observed in the control group, together with altered cytokine levels relative to individuals with wild-type genotypes, suggesting a possible protective role. In addition, combined inter-gene interaction analysis indicated a significant association between certain genotypes and BM, mainly involving the alleles APEX1 148Glu, IL8 -251T and AADAT +401T. These genotypic combinations were shown to affect cyto/chemokine levels and cell counts in CSF samples from BM patients.
CONCLUSIONS This study revealed a significant association between genetic variability and altered inflammatory responses involving important pathways that are activated during BM. This knowledge may be useful for a better understanding of BM pathogenesis and the development of new therapeutic approaches.
Abstract:
OBJECTIVES The aim of this study was to assess the safety of the concurrent administration of clopidogrel and prasugrel loading doses in patients undergoing primary percutaneous coronary intervention. BACKGROUND Prasugrel is one of the preferred P2Y12 platelet receptor antagonists for patients with ST-segment elevation myocardial infarction (STEMI); however, its use has been evaluated clinically only in clopidogrel-naive patients. METHODS Between September 2009 and October 2012, a total of 2,023 STEMI patients were enrolled in the COMFORTABLE (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI]) and SPUM-ACS (Inflammation and Acute Coronary Syndromes) studies. Patients receiving a prasugrel loading dose were divided into 2 groups: 1) clopidogrel and a subsequent prasugrel loading dose; and 2) a prasugrel loading dose alone. The primary safety endpoint was Bleeding Academic Research Consortium types 3 to 5 bleeding in hospital at 30 days. RESULTS Of the 2,023 patients undergoing primary percutaneous coronary intervention, 427 (21.1%) received clopidogrel and a subsequent prasugrel loading dose, 447 (22.1%) received a prasugrel loading dose alone, and the remaining 1,149 received clopidogrel only. At 30 days, the primary safety endpoint was observed in 1.9% of those receiving clopidogrel and a subsequent prasugrel loading dose and in 3.4% of those receiving a prasugrel loading dose alone (adjusted hazard ratio [HR]: 0.57; 95% confidence interval [CI]: 0.25 to 1.30; p = 0.18). The HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly) bleeding score tended to be higher in prasugrel-treated patients (p = 0.076). The primary safety endpoint results, however, remained unchanged after adjustment for these differences (clopidogrel and a subsequent prasugrel loading dose vs. prasugrel alone; HR: 0.54 [95% CI: 0.23 to 1.27]; p = 0.16).
No differences in the composite of cardiac death, myocardial infarction, or stroke were observed at 30 days (adjusted HR: 0.66, 95% CI: 0.27 to 1.62, p = 0.36). CONCLUSIONS This observational, nonrandomized study of ST-segment elevation myocardial infarction patients suggests that the administration of a loading dose of prasugrel in patients pre-treated with a loading dose of clopidogrel is not associated with an excess of major bleeding events. (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI] [COMFORTABLE]; NCT00962416; and Inflammation and Acute Coronary Syndromes [SPUM-ACS]; NCT01000701).
Abstract:
BACKGROUND Living at higher altitude has been dose-dependently associated with a lower risk of ischaemic heart disease (IHD). Higher-altitude regions have different climatic, topographic and built-environment properties than lowland regions. It is unclear whether these environmental factors mediate or confound the association between altitude and IHD. We examined how much of the altitude-IHD association is explained by variation in exposure at the place of residence to sunshine, temperature, precipitation, aspect, slope and distance to main roads. METHODS We included 4.2 million individuals aged 40-84 at baseline living in Switzerland at altitudes of 195-2971 m above sea level (ie, the full range of residence), providing 77 127 IHD deaths. Mortality data for 2000-2008, sociodemographic/economic information and coordinates of residence were obtained from the Swiss National Cohort, a longitudinal, census-based record linkage study. Environmental information was modelled down to the residence level, and associations with IHD mortality were estimated using Weibull regression models. RESULTS In the model not adjusted for other environmental factors, IHD mortality decreased linearly with increasing altitude, resulting in a lower risk (HR 0.67, 95% CI 0.60 to 0.74) for those living >1500 m (vs <600 m). This association remained after adjustment for all other environmental factors (HR 0.74, 95% CI 0.66 to 0.82). CONCLUSIONS The benefit of living at higher altitude was only partially confounded by variations in climate, topography and built environment. Rather, physical environment factors appear to have independent effects and may affect cardiovascular health in a cumulative way. Inclusion of additional modifiable factors, as well as individual information on traditional IHD risk factors, in our combined environmental model could help to identify strategies for reducing inequalities in IHD mortality.
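The Weibull regression mentioned in this abstract is a parametric time-to-event model whose baseline survival function is the Weibull S(t). A minimal sketch of that function, with illustrative parameter values only (not estimates from the study):

```python
from math import exp

def weibull_survival(t, shape, scale):
    """Survival function of a Weibull time-to-event model:
    S(t) = exp(-(t / scale) ** shape).
    shape > 1 implies a hazard increasing over time;
    shape = 1 reduces to the exponential (constant-hazard) model."""
    return exp(-((t / scale) ** shape))

# Illustrative values: survival probability at t = 5 under an
# increasing-hazard model (shape = 2, scale = 10)
s5 = weibull_survival(5.0, 2.0, 10.0)
```

In a Weibull regression, covariates (here, altitude and the other environmental factors) typically act multiplicatively on the scale or hazard, which is what yields the hazard ratios reported above.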
Abstract:
The long-term risk associated with different coronary artery disease (CAD) presentations in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) is poorly characterized. We pooled patient-level data for women enrolled in 26 randomized clinical trials. Of the 11,577 women in the pooled database, 10,133 with a known clinical presentation received a DES. Of these, 5,760 (57%) presented with stable angina pectoris (SAP), 3,594 (35%) with unstable angina pectoris (UAP) or non-ST-segment-elevation myocardial infarction (NSTEMI), and 779 (8%) with ST-segment-elevation myocardial infarction (STEMI). A stepwise increase in 3-year crude cumulative mortality was observed across the spectrum from SAP to UAP/NSTEMI to STEMI (4.9% vs 6.1% vs 9.4%; p <0.01). Conversely, no differences in crude mortality rates were observed between 1 and 3 years across clinical presentations. After multivariable adjustment, STEMI was independently associated with a greater risk of 3-year mortality (hazard ratio [HR] 3.45; 95% confidence interval [CI] 1.99 to 5.98; p <0.01), whereas no difference was observed between UAP or NSTEMI and SAP (HR 0.99; 95% CI 0.73 to 1.34; p = 0.94). In women with acute coronary syndrome (ACS), use of new-generation DES was associated with a reduced risk of major adverse cardiac events (HR 0.58; 95% CI 0.34 to 0.98). The magnitude and direction of this effect were uniform between women with and without ACS (p for interaction = 0.66). In conclusion, in women across the clinical spectrum of CAD, STEMI was associated with a greater risk of long-term mortality, whereas the adjusted risk of mortality was similar between UAP or NSTEMI and SAP. New-generation DES provide improved long-term clinical outcomes irrespective of clinical presentation in women.
Abstract:
Pleural infection is a frequent clinical condition. Prompt treatment has been shown to reduce hospital costs, morbidity and mortality. Recent advances in treatment have been variably implemented in clinical practice. This statement reviews the latest developments and concepts to improve clinical management and stimulate further research. The European Association for Cardio-Thoracic Surgery (EACTS) Thoracic Domain and the EACTS Pleural Diseases Working Group established a team of thoracic surgeons to produce a comprehensive review of the available scientific evidence, with the aim of covering all aspects of surgical practice related to the treatment of pleural infection, in particular: surgical treatment of empyema in adults; surgical treatment of empyema in children; and surgical treatment of post-pneumonectomy empyema (PPE). In the management of Stage 1 empyema, prompt chest tube drainage of the pleural space is required. In patients with Stage 2 or 3 empyema who are fit enough to undergo an operative procedure, there is a demonstrated benefit of surgical debridement or decortication [possibly by video-assisted thoracoscopic surgery (VATS)] over tube thoracostomy alone in terms of treatment success and reduction in hospital stay. In children, a primary operative approach is an effective management strategy, associated with a lower mortality rate and reductions in tube thoracostomy duration, length of antibiotic therapy, reintervention rate and hospital stay. Intrapleural fibrinolytic therapy is a reasonable alternative to primary operative management. Uncomplicated PPE [without bronchopleural fistula (BPF)] can be effectively managed with minimally invasive techniques, including fenestration, pleural space irrigation and VATS debridement. PPE associated with BPF can be effectively managed with individualized open surgical techniques, including direct repair, myoplastic and thoracoplastic techniques.
Intrathoracic vacuum-assisted closure may be considered an adjunct to standard treatment. The current literature cements the role of VATS in the management of pleural empyema, although the choice of surgical approach still depends on the individual surgeon's preference.
Abstract:
BACKGROUND There has been little research on bathroom accidents. It is unknown whether the shower or bathtub is associated with particular dangers in different age groups, or whether there are specific risk factors for adverse outcomes. METHODS This cross-sectional analysis included all direct admissions to the Emergency Department at the Inselspital Bern, Switzerland, from 1 January 2000 to 28 February 2014 after accidents associated with the bathtub or shower. Time, age, location, mechanism and diagnosis were assessed and specific risk factors were examined. Patient groups with and without intracranial bleeding were compared with the Mann-Whitney U test. The association of risk factors with intracranial bleeding was investigated using univariate analysis with Fisher's exact test or logistic regression. The effects of different variables on cerebral bleeding were analysed by multivariate logistic regression. RESULTS A total of 280 patients with accidents associated with the bathtub or shower were included in our study. Of these, 235 (83.9%) suffered direct trauma by hitting an object, and traumatic brain injury (TBI) was detected in 28 patients (10%). Eight (29.6%) of the 27 patients with mild traumatic brain injury (GCS 13-15) exhibited intracranial haemorrhage. All patients with intracranial haemorrhage were older than 48 years and needed in-hospital treatment. Patients with intracranial haemorrhage were significantly older and had higher haemoglobin levels than the control group with TBI but without intracranial bleeding (p<0.05 for both). In univariate analysis, intracranial haemorrhage in patients with TBI was associated with direct trauma in general and with age (both p<0.05), but not with the mechanism of the fall, its location (shower or bathtub) or the gender of the patient. Multivariate logistic regression analysis identified only age as a risk factor for cerebral bleeding (p<0.05; OR 1.09, 95% CI 1.01 to 1.17). CONCLUSION In patients admitted to the ED after accidents associated with the bathtub or shower, direct trauma and age are risk factors for intracranial haemorrhage. Additional preventive efforts should be considered, especially in the elderly.
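The univariate analyses in the abstract above rely on Fisher's exact test for 2x2 tables. A self-contained sketch of the two-sided version of that test; the counts in the check are textbook values, not data from this study:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table."""
    n = a + b + c + d
    row1 = a + b  # first row total (fixed margin)
    col1 = a + c  # first column total (fixed margin)

    def table_prob(x):
        # hypergeometric probability that the top-left cell equals x
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = table_prob(a)
    x_min = max(0, col1 - (n - row1))
    x_max = min(row1, col1)
    # small tolerance guards against float ties when comparing probabilities
    return sum(table_prob(x) for x in range(x_min, x_max + 1)
               if table_prob(x) <= p_obs * (1 + 1e-9))
```

Because it enumerates exact hypergeometric probabilities rather than relying on a large-sample approximation, this test is the usual choice for small cell counts such as the 8-of-27 haemorrhage group here.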
Abstract:
The number of days spent in acute hospitals (DAH) at the end of life is regarded as an important quality-of-care indicator for cancer patients. We analysed DAH during the 90 days prior to death in patients from four Swiss cantons. Claims data from an insurance provider with about 20% market share, combined with patient record review, identified 2086 patients as dying of cancer. We calculated total DAH per patient. Multivariable generalised linear modelling was used to evaluate potential explanatory variables. Mean DAH was 26 days. In the multivariable model, use of complementary and alternative medicine (DAH = 33.9; +8.8 days compared with non-users) and canton of residence (for patients receiving anti-cancer therapy, Zürich DAH = 22.8 versus Basel DAH = 31.4; for other patients, Valais DAH = 22.7 versus Ticino DAH = 33.7) had the strongest influence. Age at death and days spent in other institutions were additional significant predictors. DAH during the last 90 days of life of cancer patients from four Swiss cantons is high compared with most other countries. Several factors influence DAH, and the resulting differences are likely to have a financial impact, as DAH is a major cost driver for end-of-life care. Whether these differences are supply- or demand-driven, and whether patients would prefer fewer days in hospital, remains to be established.
Abstract:
Changes in species composition in two 4-ha plots of lowland dipterocarp rainforest at Danum, Sabah, were measured over ten years (1986 to 1996) for trees greater than or equal to 10 cm girth at breast height (gbh). Each plot included a lower-slope to ridge gradient. The period lay between two drought events of moderate intensity, but the forest showed no large lasting responses, suggesting that its species were well adapted to this regime. Mortality and recruitment rates were not unusual in global or regional comparisons. The forest continued to aggrade from its relatively (for Sabah) low basal area in 1986 and, together with the very open upper canopy structure and an abundance of lianas, this suggests a forest in a late stage of recovery from a major disturbance, yet one continually affected by smaller recent setbacks. Mortality and recruitment rates were not related to population size in 1986, but across subplots recruitment was positively correlated with the density and basal area of the small trees (10 to <50 cm gbh) forming the dense understorey. Neither rate was related to topography. While species with larger mean gbh had greater relative growth rates (rgr) than smaller ones, subplot mean recruitment rates were correlated with rgr among small trees. Separating understorey species (typically the Euphorbiaceae) from overstorey species (typically the Dipterocarpaceae) showed marked differences in the change in mortality with increasing gbh: in the former it increased, in the latter it decreased. Forest processes are centred on this understorey quasi-stratum. The two replicate plots showed a high correspondence in the mortality, recruitment, population changes and growth rates of small trees for the 49 most abundant species common to both. Overstorey species had higher rgrs than understorey ones, but both showed considerable ranges in mortality and recruitment rates.
The supposed trade-off in traits, viz. slower rgr, shade tolerance and lower population turnover in the understorey group versus faster potential growth rate, high light responsiveness and high turnover in the overstorey group, was only partly met, as some understorey species were also very dynamic. The forest at Danum, under such a disturbance-recovery regime, can be viewed as being in a dynamic equilibrium in functional and structural terms. A second trade-off, shade tolerance versus drought tolerance, is suggested among the understorey species. A two-storey (or vertical-component) model is proposed in which the understorey-to-overstorey ratio of small stems (currently 2:1) is maintained by a major feedback process. The understorey appears to be an important part of this forest, conferring resilience against drought and protecting overstorey saplings in the long term. This view could be valuable for understanding forest responses to climate change, as drought frequency in Borneo is predicted to intensify in the coming decades.
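Tree mortality rates of the kind compared in this abstract are commonly annualized with the exponential model m = (ln N0 - ln Nt)/t. The abstract does not state which convention was used, so this sketch simply assumes that standard one, with made-up census counts:

```python
from math import log

def annual_mortality_pct(n_initial, survivors, years):
    """Exponential annualized mortality rate (% per year):
    m = (ln N0 - ln Nt) / t * 100, where `survivors` (Nt) is the
    number of the original N0 trees still alive after t years.
    This is the convention common in tropical forest dynamics
    studies; assumed here, not stated in the abstract."""
    return (log(n_initial) - log(survivors)) / years * 100.0

# Made-up example: 100 tagged trees, 81 alive after a 10-year census
rate = annual_mortality_pct(100, 81, 10)
```

The logarithmic form makes rates from censuses of different lengths directly comparable, which matters when contrasting the two drought-bracketed census intervals discussed above.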
Abstract:
BALB/c interleukin-4 (IL-4(-/-)) and IL-4 receptor-alpha (IL-4ralpha(-/-)) knockout (KO) mice were used to assess the roles of the IL-4 and IL-13 pathways during infection with the blood or liver stages of Plasmodium in murine malaria. Intraperitoneal infection with blood-stage erythrocytes of Plasmodium berghei (ANKA) resulted in 100% mortality within 24 days in BALB/c mice as well as in the mutant mouse strains. However, when infected intravenously with the sporozoite liver stage, 60 to 80% of IL-4(-/-) and IL-4ralpha(-/-) mice survived, whereas all BALB/c mice succumbed with high parasitemia. Compared with infected BALB/c controls, the surviving KO mice showed increased NK cell numbers and expression of inducible nitric oxide synthase (iNOS) in the liver and were able to eliminate parasites early during infection. In vivo blockade of NO resulted in 100% mortality of sporozoite-infected KO mice. In vivo depletion of NK cells also resulted in 80 to 100% mortality, with a significant reduction in gamma interferon (IFN-gamma) production in the liver. These results suggest that IFN-gamma-producing NK cells are critical for host resistance against the sporozoite liver stage, inducing production of NO, an effective killing effector molecule against Plasmodium. The absence of IL-4-mediated functions enhances this protective innate immune mechanism, resulting in immunity against P. berghei infection in these mice, with no major role for IL-13.