81 results for Infant mortality rate
Abstract:
Septic shock is characterized by increased vascular permeability and hypotension despite increased cardiac output. Numerous vasoactive cytokines are upregulated during sepsis, including angiopoietin 2 (ANG2), which increases vascular permeability. Here we report that mice engineered to inducibly overexpress ANG2 in the endothelium developed sepsis-like hemodynamic alterations, including systemic hypotension, increased cardiac output, and dilatory cardiomyopathy. Conversely, mice with cardiomyocyte-restricted ANG2 overexpression failed to develop hemodynamic alterations. Interestingly, the hemodynamic alterations associated with endothelial-specific overexpression of ANG2 and the loss of capillary-associated pericytes were reversed by intravenous injections of adeno-associated viruses (AAVs) transducing cDNA for angiopoietin 1, a TIE2 ligand that antagonizes ANG2, or AAVs encoding PDGFB, a chemoattractant for pericytes. To confirm the role of ANG2 in sepsis, we i.p. injected LPS into C57BL/6J mice, which rapidly developed hypotension, acute pericyte loss, and increased vascular permeability. Importantly, ANG2 antibody treatment attenuated LPS-induced hemodynamic alterations and reduced the mortality rate at 36 hours from 95% to 61%. These data indicate that ANG2-mediated microvascular disintegration contributes to septic shock and that inhibition of the ANG2/TIE2 interaction during sepsis is a potential therapeutic target.
Abstract:
BACKGROUND We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the effects observed so far, how the units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. METHODS Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability) and to perform data imaging (Plsek's p-charts and standardized mortality or morbidity ratio (SMR) charts). The collected data allow monitoring of a study collective of very low birth-weight (VLBW) infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. RESULTS 2025 VLBW live-births from 2009 to 2011, representing 96.1% of all VLBW live-births in Switzerland, displayed a similar mortality rate but better morbidity rates when compared with other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. CONCLUSIONS The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity.
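The p-chart and SMR monitoring described above reduces to two small computations. Below is a minimal Python sketch; all counts and rates are invented for illustration, since the network's actual risk model and control limits are not reproduced here.

```python
import numpy as np

# Sketch: 3-sigma p-chart limits for a unit's mortality proportion,
# as used in Plsek-style quality monitoring (illustrative numbers only).
def p_chart_limits(p_bar, n):
    """Lower/upper control limits for proportion p_bar at sample size n."""
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Standardized mortality ratio: observed deaths / expected deaths, where
# expected deaths come from a risk model summed over the unit's cases.
def smr(observed_deaths, expected_deaths):
    return observed_deaths / expected_deaths

network_rate = 0.12  # hypothetical network-wide VLBW mortality proportion
lcl, ucl = p_chart_limits(network_rate, n=80)
print(f"p-chart limits for a unit with 80 infants: [{lcl:.3f}, {ucl:.3f}]")
print("SMR:", smr(observed_deaths=9, expected_deaths=7.5))
```

A unit whose observed proportion falls outside its limits, or whose SMR departs clearly from 1, is flagged as having improvement potential.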
Abstract:
OBJECTIVES This study aimed to demonstrate that the presence of late gadolinium enhancement (LGE) is a predictor of death and other adverse events in patients with suspected cardiac sarcoidosis. BACKGROUND Cardiac sarcoidosis is the most important cause of patient mortality in systemic sarcoidosis, yielding a 5-year mortality rate between 25% and 66% despite immunosuppressive treatment. Other groups have shown that LGE may hold promise in predicting future adverse events in this patient group. METHODS We included 155 consecutive patients with systemic sarcoidosis who underwent cardiac magnetic resonance (CMR) for workup of suspected cardiac sarcoid involvement. The median follow-up time was 2.6 years. Primary endpoints were death, aborted sudden cardiac death, and appropriate implantable cardioverter-defibrillator (ICD) discharge. Secondary endpoints were ventricular tachycardia (VT) and nonsustained VT. RESULTS LGE was present in 39 patients (25.5%). The presence of LGE yields a Cox hazard ratio (HR) of 31.6 for death, aborted sudden cardiac death, or appropriate ICD discharge, and of 33.9 for any event. This is superior to functional or clinical parameters such as left ventricular (LV) ejection fraction (EF), LV end-diastolic volume, or presentation as heart failure, yielding HRs between 0.99 (per % increase LVEF) and 1.004 (presentation as heart failure), and between 0.94 and 1.2 for potentially lethal or other adverse events, respectively. Except for 1 patient dying from pulmonary infection, no patient without LGE died or experienced any event during follow-up, even if the LV was enlarged and the LVEF severely impaired. CONCLUSIONS Among our population of sarcoid patients with nonspecific symptoms, the presence of myocardial scar indicated by LGE was the best independent predictor of potentially lethal events, as well as other adverse events, yielding a Cox HR of 31.6 and of 33.9, respectively. These data support the necessity for future large, longitudinal follow-up studies to definitively establish LGE as an independent predictor of cardiac death in sarcoidosis, as well as to evaluate the incremental prognostic value of additional parameters.
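For readers unfamiliar with how such hazard ratios are obtained, here is a minimal Cox proportional-hazards sketch using the lifelines library. The column names and values are synthetic toy data, not the study's dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Toy data: follow-up time, event indicator (death/aborted SCD/ICD
# discharge), LGE presence, and LVEF. Values are illustrative only.
df = pd.DataFrame({
    "time_years":  [0.5, 1.2, 2.0, 2.6, 3.1, 1.8, 2.9, 0.9],
    "event":       [1,   0,   1,   1,   0,   1,   0,   0],
    "lge_present": [1,   0,   1,   0,   0,   1,   1,   0],
    "lvef":        [35,  60,  40,  55,  58,  30,  45,  62],
})
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="event")
print(cph.summary[["exp(coef)"]])  # exp(coef) is the hazard ratio
```

An exp(coef) of 31.6 for lge_present would mean a 31.6-fold higher instantaneous event risk with LGE, holding the other covariates fixed.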
Abstract:
BACKGROUND & AIMS Patients with cirrhosis hospitalized for an acute decompensation (AD) and organ failure are at risk for imminent death and considered to have acute-on-chronic liver failure (ACLF). However, there are no established diagnostic criteria for ACLF, so little is known about its development and progression. We aimed to identify diagnostic criteria of ACLF and describe the development of this syndrome in European patients with AD. METHODS We collected data from 1343 hospitalized patients with cirrhosis and AD from February to September 2011 at 29 liver units in 8 European countries. We used the organ failure and mortality data to define ACLF grades, assess mortality, and identify differences between ACLF and AD. We established diagnostic criteria for ACLF based on analyses of patients with organ failure (defined by the chronic liver failure-sequential organ failure assessment [CLIF-SOFA] score) and high 28-day mortality rate (>15%). RESULTS Of the patients assessed, 303 had ACLF when the study began, 112 developed ACLF, and 928 did not have ACLF. The 28-day mortality rate among patients who had ACLF when the study began was 33.9%, among those who developed ACLF was 29.7%, and among those who did not have ACLF was 1.9%. Patients with ACLF were younger and more frequently alcoholic, had more associated bacterial infections, and had higher numbers of leukocytes and higher plasma levels of C-reactive protein than patients without ACLF (P < .001). Higher CLIF-SOFA scores and leukocyte counts were independent predictors of mortality in patients with ACLF. In patients without a prior history of AD, ACLF was unexpectedly characterized by higher numbers of organ failures, leukocyte count, and mortality compared with ACLF in patients with a prior history of AD. CONCLUSIONS We analyzed data from patients with cirrhosis and AD to establish diagnostic criteria for ACLF and showed that it is distinct from AD, based not only on the presence of organ failure(s) and high mortality rate but also on age, precipitating events, and systemic inflammation. ACLF mortality is associated with loss of organ function and high leukocyte counts. ACLF is especially severe in patients with no prior history of AD.
Abstract:
Occasional strong droughts are an important feature of the climatic environment of tropical rain forest in much of Borneo. This paper compares the response of a lowland dipterocarp forest at Danum, Sabah, in a period of low (LDI) and a period of high (HDI) drought intensity (1986-96, 9.98 y; 1996-99, 2.62 y). Mean annual drought intensity was two-fold higher in the HDI than the LDI period (1997 v. 976 mm), and each period had one moderately strong main drought (viz. 1992, 1998). Mortality of 'all' trees ≥ 10 cm gbh (girth at breast height) and stem growth rates of 'small' trees 10-50 cm gbh were measured in sixteen 0.16-ha subplots (half on ridge, half on lower slope sites) within two 4-ha plots. These 10-50-cm trees were composed largely of true understorey species. A new procedure was developed to correct for the effect of differences in length of census interval when comparing tree mortality rates. Mortality rates of small trees declined slightly but not significantly between the LDI and HDI periods (1.53 to 1.48% y⁻¹); mortality of all trees showed a similar pattern. Relative growth rates declined significantly, by 23%, from the LDI to the HDI period (11.1 to 8.6 mm m⁻¹ y⁻¹); for absolute growth rates the decrease was 28% (2.45 to 1.77 mm y⁻¹). Neither mortality nor growth rates were significantly influenced by topography. For small trees, across subplots, absolute growth rate was positively correlated with mortality rate in the LDI period, but negatively correlated in the HDI period. There was no consistent pattern in the responses among the 19 most abundant species (n ≥ 50 trees), which included a proposed drought-tolerant guild. In terms of tree survival, the forest at Danum was resistant to increasing drought intensity, but showed decreased stem growth attributable to increasing water stress.
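The standard way to annualize mortality between two censuses, which the paper's census-interval correction builds on, is m = 1 - (N_t/N_0)^(1/t). A short sketch with hypothetical stem counts, chosen so the result lands near the reported small-tree rate:

```python
# Annualized tree mortality from two censuses. The paper develops its own
# census-interval correction; this is only the standard baseline formula.
def annual_mortality_pct(n0, nt, interval_years):
    """Percent per year: m = (1 - (nt/n0)**(1/t)) * 100."""
    return (1.0 - (nt / n0) ** (1.0 / interval_years)) * 100.0

# Hypothetical subplot: 400 stems at first census, 345 survivors 9.98 y later
print(f"{annual_mortality_pct(400, 345, 9.98):.2f} % per year")  # ~1.47
```

The correction matters because this simple rate is biased when census intervals differ in length, which is exactly the LDI (9.98 y) versus HDI (2.62 y) comparison made here.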
Abstract:
A marked increase in canine leptospirosis was observed in Switzerland over 10 years, with a peak incidence of 28.1 diagnosed cases/100,000 dogs/year in the most affected canton. With 95% of affected dogs living at altitudes <800 m, the disease presented a seasonal pattern associated with temperature (r² = 0.73) and rainfall (r² = 0.39), with >90% of cases diagnosed between May and October. The increasing yearly incidence, however, was only weakly correlated with climatic data, including the number of summer days (r² = 0.25) or rainy days (r² = 0.38). Serovars Australis and Bratislava showed the highest seropositivity rates, at 70.5% and 69.1%, respectively. The main clinical manifestations included renal (99.6%), pulmonary (76.7%), hepatic (26.0%), and hemorrhagic syndromes (18.2%), leading to a high mortality rate (43.3%). As in the human disease, liver involvement had the strongest association with a negative outcome (OR 16.3). Based on these data, canine leptospirosis presents features and severity similar to the human infection, for which it can therefore be considered a model. Its re-emergence in a temperate country with very high incidence rates in canines should thus be viewed as a warning and emphasizes the need for increased awareness in other species.
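The reported r² values are ordinary coefficients of determination from regressing case counts on climate variables. A toy sketch with invented monthly numbers:

```python
import numpy as np
from scipy import stats

# Illustrative only: monthly mean temperature vs. monthly case counts.
temp_c = np.array([1, 4, 9, 13, 17, 20, 22, 21, 17, 12, 6, 2])
cases  = np.array([1, 2, 4,  7, 12, 18, 22, 20, 15,  9, 3, 1])
res = stats.linregress(temp_c, cases)
print(f"r^2 = {res.rvalue**2:.2f}")
```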
Abstract:
Seed production, seed dispersal, and seedling recruitment are integral to forest dynamics, especially in masting species. Often these are studied separately, yet scarcely ever for species with ballistic dispersal, even though this mode of dispersal is common in legume trees of tropical African rain forests. Here, we studied two dominant main-canopy tree species, Microberlinia bisulcata and Tetraberlinia bifoliolata (Caesalpinioideae), in 25 ha of primary rain forest at Korup, Cameroon, during two successive masting events (2007/2010). In the vicinity of c. 100 and 130 trees of each species, 476/580 traps caught dispersed seeds, and beneath their crowns c. 57,000 pod valves per species were inspected to estimate tree-level fecundity. Seed production of trees increased non-linearly and asymptotically with increasing stem diameter. It was unequal within the two species' populations and differed strongly between years, fostering both spatial and temporal patchiness in seed rain. The M. bisulcata trees could begin seeding at 42-44 cm diameter: at a much larger size than could T. bifoliolata (25 cm). Nevertheless, per capita life-time reproductive capacity was c. five times greater in M. bisulcata than in T. bifoliolata, owing to the former's larger adult stature, lower mortality rate (despite a shorter life-time) and smaller seed mass. The two species displayed strong differences in their dispersal capabilities. Inverse modelling (IM) revealed that dispersal of M. bisulcata was best described by a lognormal kernel. Most seeds landed at 10-15 m from stems, with 1% of them going beyond 80 m (<100 m). The direct estimates of fecundity significantly improved the models fitted. The lognormal also described well the seedling recruitment distribution of this species in 121 ground plots. By contrast, the lower intensity of masting and more limited dispersal of the heavier-seeded T. bifoliolata prevented reliable IM. For this species, seed density as a function of distance to traps suggested a maximum dispersal distance of 40-50 m, and a correspondingly more aggregated seedling recruitment pattern ensued than for M. bisulcata. From this integrated field study, we conclude that the reproductive traits of M. bisulcata give it a considerable advantage over T. bifoliolata by better dispersing more seeds per capita to reach more suitable establishment sites; combined with other key traits, these explain its local dominance in the forest. Understanding the linkages between size at onset of maturity, individual fecundity, and dispersal capability can better inform the life-history strategies, and hence management, of co-occurring tree species in tropical forests.
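As a rough illustration of the inverse-modelling approach, the sketch below evaluates a 2D lognormal dispersal kernel and sums per-tree contributions to the expected seed count in one trap. Parameter names and values are assumptions; the study's fitted values are not reproduced here.

```python
import numpy as np

# 2D lognormal dispersal kernel: probability density per unit area at
# distance r from the source (standard form used in seed-shadow models).
def lognormal_kernel(r, mu, sigma):
    return np.exp(-((np.log(r) - mu) ** 2) / (2 * sigma ** 2)) / (
        (2 * np.pi) ** 1.5 * sigma * r ** 2)

# Expected seeds in one trap = trap area * sum over trees of
# (tree fecundity Q) * (kernel density at that tree's distance r).
def expected_seeds(trap_area, trees_rQ, mu, sigma):
    return trap_area * sum(Q * lognormal_kernel(r, mu, sigma)
                           for r, Q in trees_rQ)

trees = [(12.0, 5000), (30.0, 20000)]  # hypothetical (distance m, seeds)
print(expected_seeds(0.5, trees, mu=np.log(12), sigma=0.7))
```

Inverse modelling then adjusts mu, sigma, and the fecundity parameters to maximize the likelihood of the observed trap counts; mu near log(12) places the kernel's peak around the 10-15 m mode reported above.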
Abstract:
OBJECTIVES The purpose of this study was to investigate the survival effects of inferior vena cava filters in patients with venous thromboembolism (VTE) who had a significant bleeding risk. BACKGROUND The effectiveness of inferior vena cava filter use among patients with acute symptomatic VTE and known significant bleeding risk remains unclear. METHODS In this prospective cohort study of patients with acute VTE identified from the RIETE (Computerized Registry of Patients With Venous Thromboembolism), we assessed the association between inferior vena cava filter insertion for known significant bleeding risk and the outcomes of all-cause mortality, pulmonary embolism (PE)-related mortality, and VTE rates through 30 days after the initiation of VTE treatment. Propensity score matching was used to adjust for the likelihood of receiving a filter. RESULTS Of the 40,142 eligible patients who had acute symptomatic VTE, 371 underwent filter placement because of known significant bleeding risk. A total of 344 patients treated with a filter were matched with 344 patients treated without a filter. Propensity score-matched pairs showed a nonsignificant trend toward a lower risk of all-cause death with filter insertion compared with no insertion (6.6% vs. 10.2%; p = 0.12). The risk-adjusted PE-related mortality rate was lower for filter insertion than for no insertion (1.7% vs. 4.9%; p = 0.03). Risk-adjusted recurrent VTE rates were higher for filter insertion than for no insertion (6.1% vs. 0.6%; p < 0.001). CONCLUSIONS In patients presenting with VTE and a significant bleeding risk, inferior vena cava filter insertion, compared with anticoagulant therapy, was associated with a lower risk of PE-related death and a higher risk of recurrent VTE. However, given the limitations of the study design, these associations should not be interpreted as causal.
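A compact sketch of the propensity-score matching step, using scikit-learn with synthetic covariates. The RIETE analysis used a richer clinical model, and this toy version matches with replacement for brevity.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))           # synthetic patient covariates
treated = rng.integers(0, 2, size=1000)  # filter inserted (1) or not (0)

# Propensity score: modeled probability of receiving a filter.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 1:1 nearest-neighbour match of each treated patient to a control
# on the propensity score (with replacement, unlike a strict pairing).
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
_, matched = nn.kneighbors(ps[t_idx].reshape(-1, 1))
pairs = list(zip(t_idx, c_idx[matched.ravel()]))
print(f"{len(pairs)} matched pairs")
```

Outcomes are then compared within the matched pairs, which is how the 344-vs-344 comparison above was constructed.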
Abstract:
Purpose The aim was to test the impact of body mass index (BMI) and gender on infectious complications after polytrauma. Methods A total of 651 patients were included in this retrospective study, with an Injury Severity Score (ISS) ≥ 16 and age ≥ 16 years. The sample was subdivided into three groups: BMI < 25 kg/m², BMI 25-30 kg/m², and BMI > 30 kg/m², and a female and a male group. Infectious complications were observed for 31 days after admission. Data are given as means ± standard errors of the means. Analysis of variance, the Kruskal-Wallis test, χ² tests, and Pearson's correlation were used for the analyses, and the significance level was set at P < 0.05. Results The overall infection rates were 31.0% in the BMI < 25 kg/m² group, 29.0% in the BMI 25-30 kg/m² group, and 24.5% in the BMI > 30 kg/m² group (P = 0.519). The female patients developed significantly fewer infectious complications than the male patients (26.8 vs. 73.2%; P < 0.001). The incidence of death decreased significantly across the BMI groups (8.8 vs. 7.2 vs. 1.5%; P < 0.0001), and the female population had a significantly lower mortality rate (4.1 vs. 13.4%; P < 0.0001). Pearson's correlations between the Abbreviated Injury Scale (AIS) score and the corresponding infectious foci were not significant. Conclusion Higher BMI seems to be protective against polytrauma-associated death but not polytrauma-associated infections, and female gender protects against both polytrauma-associated infections and death. Understanding gender-specific immunomodulation could improve the outcome of polytrauma patients.
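The group comparison behind P = 0.519 is a χ² test on a 2 × 3 contingency table. The sketch below uses counts loosely back-calculated from the reported percentages, so the resulting P value only lands in the same non-significant ballpark.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative counts: infected vs. not infected across the three BMI
# groups, roughly matching the reported 31.0 / 29.0 / 24.5 % rates.
#                  <25     25-30    >30   kg/m^2
table = np.array([[ 93,     58,     13],   # infected
                  [207,    142,     40]])  # not infected
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.3f}")
```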
Abstract:
OBJECTIVES To report the mid-term results of aortic root replacement using a self-assembled biological composite graft, consisting of a vascular tube graft and a stented tissue valve. METHODS Between January 2005 and December 2011, 201 consecutive patients [median age 66 (interquartile range, IQR, 55-77) years, 31 female patients (15.4%), median logistic EuroSCORE 10 (IQR 6.8-23.2)] underwent aortic root replacement using a stented tissue valve for the following indications: annulo-aortic ectasia or ascending aortic aneurysm with aortic valve disease in 162 (76.8%) patients, active infective endocarditis in 18 (9.0%) and acute aortic dissection Stanford type A in 21 (10.4%). All patients underwent clinical and echocardiographic follow-up. We analysed survival and valve-related events. RESULTS The overall in-hospital mortality rate was 4.5%. One- and 5-year cardiac-related mortality rates were 3 and 6%, and overall survival was 95 ± 1.5 and 75 ± 3.6%, respectively. The rate of freedom from structural valve failure was 99% and 97 ± 0.4% at the 1- and 5-year follow-up, respectively. The incidence rates of prosthetic valve endocarditis were 3 and 4%, respectively. During a median follow-up of 28 (IQR 14-51) months, only 2 (1%) patients required valve-related redo surgery due to prosthetic valvular endocarditis, and none suffered from thromboembolic events. One per cent of patients showed structural valve deterioration without any clinical symptoms; none of the patients suffered more than mild aortic regurgitation. CONCLUSIONS Aortic root replacement using a self-assembled biological composite graft is an interesting option. Haemodynamic results are excellent, with freedom from structural valve failure. The need for reoperation is extremely low, but long-term results are necessary to prove the durability of this concept.
Abstract:
BACKGROUND The population-based effectiveness of thoracic endovascular aortic repair (TEVAR) versus open surgery for descending thoracic aortic aneurysm remains in doubt. METHODS Patients aged over 50 years, without a history of aortic dissection, undergoing repair of a thoracic aortic aneurysm between 2006 and 2011 were assessed using mortality-linked individual patient data from Hospital Episode Statistics (England). The principal outcomes were 30-day operative mortality, long-term survival (5 years) and aortic-related reinterventions. TEVAR and open repair were compared using crude and multivariable models that adjusted for age and sex. RESULTS Overall, 759 patients underwent thoracic aortic aneurysm repair, mainly for intact aneurysms (618, 81·4 per cent). Median ages of TEVAR and open cohorts were 73 and 71 years respectively (P < 0·001), with more men undergoing TEVAR (P = 0·004). For intact aneurysms, the operative mortality rate was similar for TEVAR and open repair (6·5 versus 7·6 per cent; odds ratio 0·79, 95 per cent confidence interval (c.i.) 0·41 to 1·49), but the 5-year survival rate was significantly worse after TEVAR (54·2 versus 65·6 per cent; adjusted hazard ratio 1·45, 95 per cent c.i. 1·08 to 1·94). After 5 years, aortic-related mortality was similar in the two groups, but cardiopulmonary mortality was higher after TEVAR. TEVAR was associated with more aortic-related reinterventions (23·1 versus 14·3 per cent; adjusted HR 1·70, 95 per cent c.i. 1·11 to 2·60). There were 141 procedures for ruptured thoracic aneurysm (97 TEVAR, 44 open), with TEVAR showing no significant advantage in terms of operative mortality. CONCLUSION In England, operative mortality for degenerative descending thoracic aneurysm was similar after either TEVAR or open repair. Patients who had TEVAR appeared to have a higher reintervention rate and worse long-term survival, possibly owing to cardiopulmonary morbidity and other selection bias.
Abstract:
BACKGROUND Renal cell carcinoma (RCC) is marked by a high mortality rate. To date, no robust risk stratification by clinical or molecular prognosticators of cancer-specific survival (CSS) has been established for early stages. Transcriptional profiling of small non-coding RNA gene products (miRNAs) seems promising for prognostic stratification. The expression of miR-21 and miR-126 was analysed in a large cohort of RCC patients, and a combined risk score (CRS) model was constructed based on the expression levels of both miRNAs. METHODS Expression of miR-21 and miR-126 was evaluated by qRT-PCR in tumour and adjacent non-neoplastic tissue in n = 139 clear cell RCC patients. The relation of miR-21 and miR-126 expression to various clinical parameters was assessed. Parameters were analysed by uni- and multivariate Cox regression. A factor derived from the z-score resulting from the Cox model was determined for both miRs separately, and a combined risk score (CRS) was calculated by multiplying the relative expression of miR-21 and miR-126 by this factor. The best-fitting Cox model was selected by relative goodness of fit with the Akaike information criterion (AIC). RESULTS RCC with and without miR-21 upregulation and miR-126 downregulation differed significantly in synchronous metastatic status and CSS. Upregulation of miR-21 and downregulation of miR-126 were independently prognostic. A combined risk score (CRS) based on the expression of both miRs showed high sensitivity and specificity in predicting CSS, and prediction was independent of any other clinico-pathological parameter. The association of CRS with CSS was successfully validated in a testing cohort containing patients at high and low risk for progressive disease. CONCLUSIONS A combined expression level of miR-21 and miR-126 accurately predicted CSS in two independent RCC cohorts and seems feasible for clinical application in assessing prognosis.
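A toy version of the CRS construction using lifelines: fit a Cox model on the two miRNA expression levels, report the partial AIC used for model selection, and weight each miRNA by its fitted coefficient. The data and the exact weighting are illustrative; the paper derives its factors from z-scores rather than raw coefficients.

```python
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Synthetic toy cohort: survival time (months), event, relative expression.
df = pd.DataFrame({
    "time":   [12, 30, 45, 8, 60, 24, 50, 15],
    "event":  [1,  0,  0,  1, 0,  1,  0,  1],
    "mir21":  [2.1, 0.8, 2.3, 2.5, 0.7, 1.2, 1.0, 2.2],
    "mir126": [0.4, 1.2, 0.6, 0.9, 1.3, 0.5, 1.0, 0.4],
})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print("partial AIC:", cph.AIC_partial_)  # basis for model selection

# Combined risk score: coefficient-weighted sum of the two expressions.
b21, b126 = cph.params_["mir21"], cph.params_["mir126"]
df["crs"] = b21 * df["mir21"] + b126 * df["mir126"]
print(df[["mir21", "mir126", "crs"]])
```

A higher CRS then flags patients at higher risk of cancer-specific death, and a cut-off can be chosen to trade off sensitivity against specificity.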
Abstract:
OBJECTIVES Fontan failure (FF) represents a growing and challenging indication for paediatric orthotopic heart transplantation (OHT). The aim of this study was to identify predictors of the best mid-term outcome in OHT after FF. METHODS A twenty-year multi-institutional retrospective analysis of OHT for FF was performed. RESULTS Between 1991 and 2011, 61 patients, mean age 15.0 ± 9.7 years, underwent OHT for a failing atriopulmonary connection (17 patients, 27.8%) or total cavopulmonary connection (44 patients, 72.2%). Modalities of FF included arrhythmia (14.8%), complex obstructions in the Fontan circuit (16.4%), protein-losing enteropathy (PLE) (22.9%), impaired ventricular function (31.1%) or a combination of the above (14.8%). The mean interval between Fontan completion and OHT was 10.7 ± 6.6 years. Early FF occurred in 18%, requiring OHT 0.8 ± 0.5 years after Fontan. The hospital mortality rate was 18.3%, mainly secondary to infection (36.4%) and graft failure (27.3%). The mean follow-up was 66.8 ± 54.2 months. The overall Kaplan-Meier survival estimate was 81.9 ± 1.8% at 1 year, 73 ± 2.7% at 5 years and 56.8 ± 4.3% at 10 years. The Kaplan-Meier 5-year survival estimate was 82.3 ± 5.9% in late FF and 32.7 ± 15.0% in early FF (P = 0.0007). Late FF with poor ventricular function exhibited a 91.5 ± 5.8% 5-year OHT survival. PLE was cured in 77.7% of hospital survivors, but the 5-year Kaplan-Meier survival estimate in PLE was 46.3 ± 14.4 vs 84.3 ± 5.5% in non-PLE (P = 0.0147). Cox proportional hazards analysis identified early FF (P = 0.0005), complex Fontan pathway obstruction (P = 0.0043) and PLE (P = 0.0033) as independent predictors of 5-year mortality. CONCLUSIONS OHT is an excellent surgical option for late FF with impaired ventricular function. Protein dispersion improves with OHT, but PLE negatively affects the mid-term OHT outcome, mainly owing to early infective complications.
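The survival figures quoted above are Kaplan-Meier estimates. For orientation, here is a minimal lifelines sketch with invented follow-up times (1 = death, 0 = censored):

```python
from lifelines import KaplanMeierFitter  # pip install lifelines

# Toy follow-up data in years, illustrative only.
times  = [0.2, 0.8, 1.5, 3.0, 4.2, 5.0, 6.5, 8.0, 9.1, 10.0]
events = [1,   1,   0,   1,   0,   1,   0,   1,   0,   0]
kmf = KaplanMeierFitter().fit(times, event_observed=events)
print(kmf.survival_function_)            # step function S(t)
print("S(5 y) =", float(kmf.predict(5.0)))
```

Fitting separate curves for early-FF and late-FF groups and comparing them (e.g. with a log-rank test) reproduces the kind of P = 0.0007 contrast reported above.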
Abstract:
BACKGROUND Community-acquired pneumonia (CAP) is the third-leading infectious cause of death worldwide. The standard treatment of CAP has not changed for the past fifty years, and its mortality and morbidity remain high despite adequate antimicrobial treatment. Systemic corticosteroids have anti-inflammatory effects and are therefore discussed as an adjunct treatment for CAP. Available studies show controversial results, and the question of the benefits and harms of adjunct corticosteroid therapy has not been conclusively resolved, particularly in the non-critical care setting. METHODS/DESIGN This randomized multicenter study compares treatment with 7 days of prednisone 50 mg with placebo in adult patients hospitalized with CAP, independent of severity. Patients are screened and enrolled within the first 36 hours of presentation after written informed consent is obtained. The primary endpoint will be time to clinical stability, which is assessed every 12 hours during hospitalization. Secondary endpoints will be, among others, all-cause mortality within 30 and 180 days, ICU stay, duration of antibiotic treatment, disease activity scores, side effects and complications, the value of adrenal function testing, and prognostic hormonal and inflammatory biomarkers to predict outcome and treatment response to corticosteroids. Eight hundred included patients will provide 85% power for the intention-to-treat analysis of the primary endpoint. DISCUSSION This largest-to-date double-blind, placebo-controlled multicenter trial investigates the effect of adjunct glucocorticoids in 800 patients with CAP requiring hospitalization. It aims to give conclusive answers about the benefits and risks of corticosteroid treatment in CAP. The inclusion of less severe CAP patients is expected to lead to a relatively low mortality rate, so a survival benefit might not be demonstrable. However, our study has adequate power for the clinically relevant endpoint of clinical stability. Because glucocorticoids are discontinued without tapering after seven days, the duration of glucocorticoid exposure is limited, which may reduce possible side effects. TRIAL REGISTRATION Registered 7 September 2009 on ClinicalTrials.gov: NCT00973154.
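The sample-size statement can be sanity-checked with a standard power calculation. The sketch below solves for the detectable standardized effect size at 85% power with 400 patients per arm; this is only an approximation, since the trial's primary endpoint is a time-to-event outcome rather than a simple two-sample comparison.

```python
from statsmodels.stats.power import TTestIndPower

# With nobs1 = 400 per arm, alpha = 0.05, power = 0.85, solve for the
# minimum detectable standardized effect size (Cohen's d). Approximate
# stand-in for the trial's actual time-to-clinical-stability calculation.
analysis = TTestIndPower()
d = analysis.solve_power(nobs1=400, power=0.85, alpha=0.05, ratio=1.0)
print(f"detectable standardized effect size d ~ {d:.3f}")
```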
Abstract:
OBJECTIVES/HYPOTHESIS To study the clinical evolution of a primary ear, nose, and throat infection complicated by septic thrombophlebitis of the internal jugular vein. STUDY DESIGN Retrospective case-control study. PATIENTS AND METHODS From 1998 to 2010, 23 patients at our institution were diagnosed with septic thrombosis of the internal jugular vein. Diagnostics included microbiologic analysis and imaging such as computed tomography, magnetic resonance imaging, and ultrasound. Therapy included broad-spectrum antibiotics, surgery of the primary infectious lesion, and postoperative anticoagulation. The patients were retrospectively analyzed. RESULTS The primary infection sites were the middle ear (11), oropharynx (8), sinus (3), and oral cavity (1). Fourteen patients needed intensive care unit treatment for a mean duration of 6 days. Seven patients were intubated, and two developed severe acute respiratory distress syndrome. An oropharyngeal primary infection site was most prone to a prolonged clinical evolution. Anticoagulation therapy was given to 90% of patients. All 23 patients survived the disseminated infection without consecutive systemic morbidity. CONCLUSION In the pre-antibiotic era, septic internal jugular vein thrombophlebitis was a highly fatal condition with a mortality rate of 90%. Modern imaging techniques allow early and often incidental diagnosis of this clinically hidden complication. Anticoagulation, intensive antibiotic therapy assisted by surgery of the primary infection site, and intensive supportive care can achieve remission rates of 100%. LEVEL OF EVIDENCE 3b. Laryngoscope, 2014.