857 results for Infant Mortality Rate


Relevance:

80.00%

Publisher:

Abstract:

Septic shock is characterized by increased vascular permeability and hypotension despite increased cardiac output. Numerous vasoactive cytokines are upregulated during sepsis, including angiopoietin 2 (ANG2), which increases vascular permeability. Here we report that mice engineered to inducibly overexpress ANG2 in the endothelium developed sepsis-like hemodynamic alterations, including systemic hypotension, increased cardiac output, and dilatory cardiomyopathy. Conversely, mice with cardiomyocyte-restricted ANG2 overexpression failed to develop hemodynamic alterations. Interestingly, the hemodynamic alterations associated with endothelial-specific overexpression of ANG2 and the loss of capillary-associated pericytes were reversed by intravenous injections of adeno-associated viruses (AAVs) transducing cDNA for angiopoietin 1, a TIE2 ligand that antagonizes ANG2, or AAVs encoding PDGFB, a chemoattractant for pericytes. To confirm the role of ANG2 in sepsis, we injected LPS intraperitoneally into C57BL/6J mice, which rapidly developed hypotension, acute pericyte loss, and increased vascular permeability. Importantly, ANG2 antibody treatment attenuated LPS-induced hemodynamic alterations and reduced the mortality rate at 36 hours from 95% to 61%. These data indicate that ANG2-mediated microvascular disintegration contributes to septic shock and that the ANG2/TIE2 interaction is a potential therapeutic target in sepsis.
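For readers who want the effect size behind the reported mortality figures, the short sketch below works through the absolute and relative risk reduction implied by the stated 36-hour rates; it is simple arithmetic on the abstract's numbers, not part of the study's analysis.

```python
# Arithmetic on the reported 36-hour mortality rates (illustrative only).
control_mortality = 0.95   # LPS alone
treated_mortality = 0.61   # LPS + ANG2 antibody

arr = control_mortality - treated_mortality   # absolute risk reduction = 0.34
rrr = arr / control_mortality                 # relative risk reduction ~ 36%
print(f"ARR = {arr:.2f}, RRR = {rrr:.1%}")
```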

Relevance:

80.00%

Publisher:

Abstract:

Birth defects are a leading cause of infant mortality in developed countries. They are also of increasing concern in many developing countries, such as China. However, the prevalence and causes of birth defects in China are inadequately understood. The purpose of the present study was to estimate the prevalence of birth defects among surviving children under seven years of age in Tianjin, China, and to investigate determinants of birth defects in the study area. The study took place in Tianjin, China in 1986 and involved 22,081 surviving children under seven years of age. Children with birth defects were ascertained through physical examinations by physicians during household visits, and ascertainment of birth defects was verified through multiple sources. Of the 22,081 surviving children, 524 had birth defects (23.7 per 1,000). The study noted a striking discrepancy in the prevalence of birth defects between urban and rural areas: 16.3 per 1,000 in the urban area and 33.2 per 1,000 in the rural area. Using the cases of birth defects ascertained from surviving children, a case-control study was carried out. First-trimester maternal flu was associated with increased risk of both major and minor birth defects in children after controlling for other maternal factors (adjusted odds ratio (OR) = 8.7, 95% confidence interval (CI) = 4.3-17.3; OR = 3.6, 95% CI = 1.7-7.5). This association could be biased by differential reporting of exposure between mothers of children with birth defects and mothers of children without defects. Maternal flu was also associated with congenital heart defects and polydactyly after controlling for other maternal factors (adjusted OR = 32.3, 95% CI = 13.3-78.3; adjusted OR = 5.5, 95% CI = 1.1-27.7). The associations remained when affected controls (children with similar birth defects other than congenital heart defects or polydactyly) were used (adjusted OR = 4.3, 95% CI = 1.2-15.3; OR = 1.4, 95% CI = 1.4-7.9). A weak association between first-trimester vaginal bleeding and selected groups of birth defects was found, but it may be confounded by other factors. Maternal smoking during pregnancy was modestly associated with cleft lip with or without cleft palate (OR = 1.4, 95% CI = 0.4-4.9), but this association may be due to chance. Some major limitations of this study warrant caution in interpreting the findings, especially any causal relationships.
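As a minimal illustration of how an odds ratio and its 95% confidence interval are obtained from a 2x2 case-control table (Woolf's method), consider the sketch below; the counts are hypothetical, and the study's reported estimates were additionally adjusted for other maternal factors.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases, c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts for first-trimester maternal flu vs. major birth defects
print(odds_ratio_ci(40, 20, 60, 260))   # crude OR ~ 8.7 with its 95% CI
```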

Relevance:

80.00%

Publisher:

Abstract:

The non-Hodgkin's lymphomas (NHLs) are neoplasms of the immune system. Currently, the etiology is known for less than 1% of the roughly 22,000 lymphoma cases newly diagnosed in the U.S.A. each year. The disease has a significant prevalence and a high mortality rate. Cell growth in lymphomas has been shown to be an important parameter in establishing the prognosis of aggressive NHL, as well as an integral part of the pathophysiology of the disease process. While many aggressive B-cell NHLs respond initially to chemotherapeutic regimens such as CHOP-bleo (adriamycin, vincristine, and bleomycin), relapse is common, and the patient is then often refractory to further salvage treatment regimens. To assess their potential to inhibit aggressive B-cell NHLs and induce apoptosis (also referred to as programmed cell death, PCD), two biological agents were proposed: liposomal all-trans retinoic acid (L-ATRA), a vitamin A derivative formulated in liposomes, and vitamin D3. Preliminary evidence indicates that L-ATRA may inhibit cell growth in these cells and may induce PCD as well. Detailed studies of these effects of L-ATRA and vitamin D3 were performed in recently established NHL B-cell lines and primary cell cultures. The gene regulation involved in the case of L-ATRA was also delineated.

Relevance:

80.00%

Publisher:

Abstract:

Cancer is a chronic disease that often necessitates recurrent hospitalizations, a costly pattern of medical care utilization. In chronically ill patients, most readmissions are for treatment of the same condition that caused the preceding hospitalization. There is concern that, rather than reducing costs, earlier discharge may shift costs from the initial hospitalization to emergency center visits. This is the first descriptive study to measure the incidence of emergency center visits (ECVs) after hospitalization at The University of Texas M. D. Anderson Cancer Center (UTMDACC), to identify the risk factors for and outcomes of these ECVs, and to compare 30-day all-cause mortality and costs for episodes of care with and without ECVs. We identified all hospitalizations at UTMDACC with admission dates from September 1, 1993 through August 31, 1997 that met the inclusion criteria. Data were obtained electronically, primarily from UTMDACC's institutional database. Demographic factors, clinical factors, duration of the index hospitalization, method of payment for care, and year of hospitalization were the variables determined for each hospitalization. The overall incidence of ECVs was 18%. Forty-five percent of ECVs resulted in hospital readmission (8% of all hospitalizations). In 1% of ECVs the patient died in the emergency center, and in the remaining 54% of ECVs the patient was discharged home. Risk factors for ECVs were marital status, type of index hospitalization, cancer type, and duration of the index hospitalization. The overall 30-day all-cause mortality rate was 8.6% for hospitalizations with an ECV and 5.3% for those without an ECV. In all subgroups, the 30-day all-cause mortality rate was higher for groups with ECVs than for those without. The most important factor increasing cost was having an ECV: in all patient subgroups, the cost per episode of care with an ECV was at least 1.9 times the cost per episode without one. The higher costs and poorer outcomes of episodes of care with ECVs and hospital readmissions suggest that interventions to avoid these ECVs or mitigate their costs are needed. Further research is also needed to better understand the methodological issues involved in studying health care utilization by cancer patients.
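The relationship between the 18% ECV incidence, the 45% readmission fraction, and the 8% overall readmission figure is straightforward arithmetic; a minimal sketch using only the percentages quoted above:

```python
# Illustrative arithmetic on the reported ECV figures.
ecv_incidence = 0.18        # fraction of hospitalizations followed by an ECV
readmit_given_ecv = 0.45    # fraction of ECVs that ended in readmission

readmit_overall = ecv_incidence * readmit_given_ecv
print(f"{readmit_overall:.0%} of all hospitalizations")   # ~8%, as reported
```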

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND We describe the setup of a neonatal quality improvement tool, list which peer-reviewed requirements it fulfils and which it does not, and report on the effects observed so far, how units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. METHODS Application of a prospective, longitudinal, national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability) and to display the data graphically (Plsek's p-charts and standardized mortality or morbidity ratio [SMR] charts). The collected data allow monitoring of a study population of very low birth-weight (VLBW) infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. RESULTS 2025 VLBW live births from 2009 to 2011, representing 96.1% of all VLBW live births in Switzerland, displayed a similar mortality rate but better morbidity rates when compared with other networks. Data quality is generally high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. CONCLUSIONS The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity.
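To make the two monitoring displays concrete, here is a minimal sketch of how a standardized mortality ratio and Shewhart p-chart control limits are typically computed; the numbers and function names are hypothetical, and this is not the network's actual implementation.

```python
import math

def smr(observed, expected):
    """Standardized mortality (or morbidity) ratio: observed events / expected events,
    where the expectation comes from network-wide, case-mix-adjusted rates."""
    return observed / expected

def p_chart_limits(p_bar, n, k=3):
    """Shewhart p-chart limits around the network-wide proportion p_bar
    for a unit reporting n infants (k-sigma limits, clipped to [0, 1])."""
    half_width = k * math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half_width), min(1.0, p_bar + half_width)

# Hypothetical unit: 60 VLBW infants, 9 observed vs. 6 expected deaths, network rate 10%
print(smr(9, 6))                  # 1.5 -> more deaths than expected
print(p_chart_limits(0.10, 60))   # a unit rate outside these limits flags improvement potential
```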

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES This study aimed to demonstrate that the presence of late gadolinium enhancement (LGE) is a predictor of death and other adverse events in patients with suspected cardiac sarcoidosis. BACKGROUND Cardiac sarcoidosis is the most important cause of patient mortality in systemic sarcoidosis, yielding a 5-year mortality rate between 25% and 66% despite immunosuppressive treatment. Other groups have shown that LGE may hold promise in predicting future adverse events in this patient group. METHODS We included 155 consecutive patients with systemic sarcoidosis who underwent cardiac magnetic resonance (CMR) for workup of suspected cardiac sarcoid involvement. The median follow-up time was 2.6 years. Primary endpoints were death, aborted sudden cardiac death, and appropriate implantable cardioverter-defibrillator (ICD) discharge. Secondary endpoints were ventricular tachycardia (VT) and nonsustained VT. RESULTS LGE was present in 39 patients (25.5%). The presence of LGE yielded a Cox hazard ratio (HR) of 31.6 for death, aborted sudden cardiac death, or appropriate ICD discharge, and of 33.9 for any event. This is superior to functional or clinical parameters such as left ventricular (LV) ejection fraction (EF), LV end-diastolic volume, or presentation as heart failure, which yielded HRs between 0.99 (per % increase in LVEF) and 1.004 (presentation as heart failure) for potentially lethal events, and between 0.94 and 1.2 for other adverse events. Except for 1 patient who died from pulmonary infection, no patient without LGE died or experienced any event during follow-up, even when the LV was enlarged and the LVEF severely impaired. CONCLUSIONS Among our population of sarcoidosis patients with nonspecific symptoms, the presence of myocardial scar indicated by LGE was the best independent predictor of potentially lethal events, as well as of other adverse events, yielding Cox HRs of 31.6 and 33.9, respectively. These data support the need for future large, longitudinal follow-up studies to definitively establish LGE as an independent predictor of cardiac death in sarcoidosis, as well as to evaluate the incremental prognostic value of additional parameters.
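As an illustration of how hazard ratios like those above are estimated, here is a hedged sketch of a Cox proportional-hazards fit on synthetic data using the Python lifelines package; the data, the effect size, and the use of lifelines are assumptions for demonstration, not the study's actual analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
lge = rng.integers(0, 2, n)                # 1 = late gadolinium enhancement present (toy data)
lvef = rng.normal(55 - 10 * lge, 8, n)     # assume lower EF when LGE is present
hazard = 0.05 * np.exp(1.5 * lge)          # toy hazard with an assumed LGE effect
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(4.0, n)

df = pd.DataFrame({
    "years": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "lge": lge,
    "lvef": lvef,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
print(cph.summary[["exp(coef)", "p"]])     # exp(coef) is the hazard ratio per covariate
```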

Relevance:

80.00%

Publisher:

Abstract:

I modeled the cumulative impact of hydroelectric projects, with and without commercial fishing weirs and water-control dams, on the production, survival to the sea, and potential fecundity of migrating female silver-phase American eels (Anguilla rostrata) in the Kennebec River basin, Maine. This river basin has 22 hydroelectric projects, 73 water-control dams, and 15 commercial fishing weir sites. The modeled area included an 8,324 km² segment of the drainage area between Merrymeeting Bay and the upper limit of American eel distribution in the basin. One set of inputs (assumed or real values) concerned population structure (i.e., population density and sex ratio changes throughout the basin, female length-class distribution, and drainage area between dams). Another set concerned factors influencing survival and potential fecundity of migrating American eels (i.e., pathway sequences through projects, survival rate per project by length-class, and the length-fecundity relationship). Under baseline conditions, about 402,400 simulated silver female American eels would be produced annually; reductions in their numbers due to dams and weirs would reduce the realized fecundity (i.e., the number of eggs produced by all females that survived the migration). Without weirs or water-control dams, about 63% of the simulated silver-phase American eels survived their spawning migration to the sea when the survival rate at each hydroelectric dam was 90%; 40% survived at 80% survival per dam, and 18% at 60% survival per dam. Removing the lowermost hydroelectric dam on the Kennebec River increased survival by 6.0-7.6% for the basin. Efficient commercial weirs reduced survival to the sea to 69-76% of what it would have been without weirs, regardless of survival rates at hydroelectric dams. Water-control dams had little impact on production in this basin because most were located in the upper reaches of tributaries. Sensitivity analysis led to the conclusion that small changes in population density and female length distribution had greater effects on survival and realized fecundity than similar changes in turbine survival rate; the latter became more important as turbine survival rate decreased. Therefore, it might be more fruitful to determine population distribution in basins of interest than to determine the mortality rate at each hydroelectric project.
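The core of such a cumulative-impact model is that survival to the sea along a migration pathway is the product of per-project survival rates, and realized fecundity sums the eggs of the survivors. A minimal sketch under assumed values; the length-fecundity function and all numbers below are hypothetical, not the model's calibrated inputs.

```python
import numpy as np

def survival_to_sea(per_dam_survival):
    """Cumulative survival along one downstream pathway: the product of the
    survival rates at each hydroelectric project the eel must pass."""
    return float(np.prod(per_dam_survival))

def realized_fecundity(lengths_mm, pathways, length_fecundity):
    """Total eggs from all females surviving migration; `length_fecundity` is a
    hypothetical length-to-eggs function standing in for the study's relationship."""
    return sum(survival_to_sea(p) * length_fecundity(l)
               for l, p in zip(lengths_mm, pathways))

# A pathway with three dams: ~73% survival at 90% per dam, ~22% at 60% per dam
print(survival_to_sea([0.9, 0.9, 0.9]), survival_to_sea([0.6, 0.6, 0.6]))
```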

Relevance:

80.00%

Publisher:

Abstract:

Cancers of the reproductive system are among the leading causes of mortality in women in the United States. While both genetic and environmental factors have been implicated in their etiology, the extent of the contribution of environmental factors to human disease remains controversial. To better address the role of environmental exposures in cancer etiology, there has been an increasing focus on the development of nontraditional, environmentally relevant models. Our research involves the development of one such model. Gonadal tumors have been described in the softshell clam (Mya arenaria) in Maine and the hardshell clam (Mercenaria spp.) in Florida. Prevalence of these tumors is as high as 40% in some populations in eastern Maine and 60% in some areas along the Indian River in Florida. The average tumor prevalence in Maine and Florida is approximately 20% and 11%, respectively. An association has been suggested between herbicide use and the incidence of gonadal tumors in the softshell clam in Maine. The role of environmental exposures in the development of the tumors in Mercenaria in Florida is unknown; however, there is evidence that genetic factors may contribute to their etiology. Epidemiologic studies of human populations in these same areas show a higher than average mortality rate due to cancers of the reproductive system in women, including both ovarian and breast cancer. The relationship, if any, among these observations is unknown. Our studies on the molecular basis of this disease in clams may provide additional information on environmental exposures and their possible link to cancer in clams and other organisms, including humans.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND & AIMS Patients with cirrhosis hospitalized for an acute decompensation (AD) and organ failure are at risk for imminent death and are considered to have acute-on-chronic liver failure (ACLF). However, there are no established diagnostic criteria for ACLF, so little is known about its development and progression. We aimed to identify diagnostic criteria for ACLF and describe the development of this syndrome in European patients with AD. METHODS We collected data from 1343 hospitalized patients with cirrhosis and AD from February to September 2011 at 29 liver units in 8 European countries. We used the organ failure and mortality data to define ACLF grades, assess mortality, and identify differences between ACLF and AD. We established diagnostic criteria for ACLF based on analyses of patients with organ failure (defined by the chronic liver failure-sequential organ failure assessment [CLIF-SOFA] score) and a high 28-day mortality rate (>15%). RESULTS Of the patients assessed, 303 had ACLF when the study began, 112 developed ACLF, and 928 did not have ACLF. The 28-day mortality rate was 33.9% among patients who had ACLF when the study began, 29.7% among those who developed ACLF, and 1.9% among those who did not have ACLF. Patients with ACLF were younger and more frequently alcoholic, had more associated bacterial infections, and had higher numbers of leukocytes and higher plasma levels of C-reactive protein than patients without ACLF (P < .001). Higher CLIF-SOFA scores and leukocyte counts were independent predictors of mortality in patients with ACLF. In patients without a prior history of AD, ACLF was unexpectedly characterized by higher numbers of organ failures, higher leukocyte counts, and higher mortality than ACLF in patients with a prior history of AD. CONCLUSIONS We analyzed data from patients with cirrhosis and AD to establish diagnostic criteria for ACLF and showed that it is distinct from AD, based not only on the presence of organ failure(s) and high mortality rate but also on age, precipitating events, and systemic inflammation. ACLF mortality is associated with loss of organ function and high leukocyte counts. ACLF is especially severe in patients with no prior history of AD.
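A 28-day mortality rate per group, as quoted above, is simply the proportion of patients in that group who die within 28 days of enrolment. A minimal pandas sketch on toy data; column names and values are hypothetical.

```python
import pandas as pd

# Toy patient-level data; in the study, `group` would come from the CLIF-SOFA-based
# ACLF definition and `died_28d` from 28-day follow-up.
df = pd.DataFrame({
    "group": ["ACLF at enrolment", "ACLF at enrolment", "developed ACLF",
              "developed ACLF", "no ACLF", "no ACLF"],
    "died_28d": [1, 0, 1, 0, 0, 0],
})
print(df.groupby("group")["died_28d"].mean().mul(100))   # 28-day mortality rate (%) per group
```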

Relevance:

80.00%

Publisher:

Abstract:

Occasional strong droughts are an important feature of the climatic environment of tropical rain forest in much of Borneo. This paper compares the response of a lowland dipterocarp forest at Danum, Sabah, in a period of low (LDI) and a period of high (HDI) drought intensity (1986-96, 9.98 y; 1996-99, 2.62 y). Mean annual drought intensity was two-fold higher in the HDI than the LDI period (1997 vs. 976 mm), and each period had one moderately strong main drought (viz. 1992, 1998). Mortality of 'all' trees ≥ 10 cm gbh (girth at breast height) and stem growth rates of 'small' trees 10-50 cm gbh were measured in sixteen 0.16-ha subplots (half on ridge, half on lower slope sites) within two 4-ha plots. These 10-50-cm trees were composed largely of true understorey species. A new procedure was developed to correct for the effect of differences in length of census interval when comparing tree mortality rates. Mortality rates of small trees declined slightly but not significantly between the LDI and HDI periods (1.53 to 1.48% y⁻¹); mortality of all trees showed a similar pattern. Relative growth rates declined significantly by 23% from the LDI to the HDI period (11.1 to 8.6 mm m⁻¹ y⁻¹); for absolute growth rates the decrease was 28% (2.45 to 1.77 mm y⁻¹). Neither mortality nor growth rates were significantly influenced by topography. For small trees, across subplots, absolute growth rate was positively correlated with mortality rate in the LDI period, but negatively correlated in the HDI period. There was no consistent pattern in the responses among the 19 most abundant species (n ≥ 50 trees), which included a proposed drought-tolerant guild. In terms of tree survival, the forest at Danum was resistant to increasing drought intensity, but showed decreased stem growth attributable to increasing water stress.
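For context on the mortality figures, the standard annual exponential mortality coefficient between two censuses is lambda = (ln N0 - ln Nt) / t. The sketch below computes it for hypothetical stem counts; the paper's census-interval correction is an additional adjustment that is not reproduced here.

```python
import math

def annual_mortality_rate(n0, nt, years):
    """Annual exponential mortality coefficient (% per year) between two censuses:
    100 * (ln N0 - ln Nt) / t. The census-interval correction developed in the
    paper is a further adjustment on top of this standard estimator."""
    return 100 * (math.log(n0) - math.log(nt)) / years

# Hypothetical counts: 500 stems falling to 430 over 9.98 y, and 500 to 481 over 2.62 y
print(annual_mortality_rate(500, 430, 9.98))   # ~1.5% per year
print(annual_mortality_rate(500, 481, 2.62))   # ~1.5% per year
```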

Relevance:

80.00%

Publisher:

Abstract:

A marked increase in canine leptospirosis was observed in Switzerland over 10 years, with a peak incidence of 28.1 diagnosed cases/100,000 dogs/year in the most affected canton. With 95% of affected dogs living at altitudes <800 m, the disease showed a seasonal pattern associated with temperature (r² = 0.73) and rainfall (r² = 0.39), with >90% of cases diagnosed between May and October. The increasing yearly incidence, however, was only weakly correlated with climatic data, including the number of summer days (r² = 0.25) or rainy days (r² = 0.38). Serovars Australis and Bratislava showed the highest seropositivity rates, at 70.5% and 69.1%, respectively. The main clinical manifestations included renal (99.6%), pulmonary (76.7%), hepatic (26.0%), and hemorrhagic (18.2%) syndromes, leading to a high mortality rate (43.3%). As in the human disease, liver involvement had the strongest association with a negative outcome (OR 16.3). Based on these data, canine leptospirosis presents features and severity similar to the human infection, for which it can therefore be considered a model. Its re-emergence in a temperate country, with very high incidence rates in dogs, should be viewed as a warning and emphasizes the need for increased awareness in other species.
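The seasonal association reported above is a squared Pearson correlation between a climate variable and case counts. A minimal sketch with hypothetical monthly values, not the study's data:

```python
from scipy.stats import pearsonr

# Hypothetical monthly series: mean temperature (deg C) and diagnosed canine cases.
temperature = [1, 3, 7, 11, 15, 19, 21, 20, 16, 11, 5, 2]
cases = [0, 1, 2, 4, 8, 12, 15, 14, 10, 5, 2, 1]

r, p_value = pearsonr(temperature, cases)
print(r ** 2)   # the r2 values quoted in the abstract are squared correlations of this kind
```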

Relevance:

80.00%

Publisher:

Abstract:

Seed production, seed dispersal, and seedling recruitment are integral to forest dynamics, especially in masting species. Often these are studied separately, yet scarcely ever for species with ballistic dispersal, even though this mode of dispersal is common in legume trees of tropical African rain forests. Here, we studied two dominant main-canopy tree species, Microberlinia bisulcata and Tetraberlinia bifoliolata (Caesalpinioideae), in 25 ha of primary rain forest at Korup, Cameroon, during two successive masting events (2007/2010). In the vicinity of c. 100 and 130 trees of each species, 476/580 traps caught dispersed seeds, and beneath their crowns c. 57,000 pod valves per species were inspected to estimate tree-level fecundity. Seed production of trees increased non-linearly and asymptotically with increasing stem diameter. It was unequal within the two species' populations and differed strongly between years, fostering both spatial and temporal patchiness in seed rain. The M. bisulcata trees could begin seeding at 42-44 cm diameter: at a much larger size than could T. bifoliolata (25 cm). Nevertheless, per capita life-time reproductive capacity was c. five times greater in M. bisulcata than in T. bifoliolata, owing to the former's larger adult stature, lower mortality rate (despite a shorter life-time) and smaller seed mass. The two species displayed strong differences in their dispersal capabilities. Inverse modelling (IM) revealed that dispersal of M. bisulcata was best described by a lognormal kernel. Most seeds landed at 10-15 m from stems, with 1% of them going beyond 80 m (<100 m). The direct estimates of fecundity significantly improved the fitted models. The lognormal kernel also described well the seedling recruitment distribution of this species in 121 ground plots. By contrast, the lower intensity of masting and more limited dispersal of the heavier-seeded T. bifoliolata prevented reliable IM. For this species, seed density as a function of distance to traps suggested a maximum dispersal distance of 40-50 m, and a correspondingly more aggregated seedling recruitment pattern ensued than for M. bisulcata. From this integrated field study, we conclude that the reproductive traits of M. bisulcata give it a considerable advantage over T. bifoliolata by better dispersing more seeds per capita to reach more suitable establishment sites, and that, combined with other key traits, they explain its local dominance in the forest. Understanding the linkages between size at onset of maturity, individual fecundity, and dispersal capability can better inform the life-history strategies, and hence management, of co-occurring tree species in tropical forests.
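Inverse modelling of seed rain typically expresses the expected seed count in a trap as the sum, over nearby trees, of each tree's fecundity weighted by a dispersal kernel evaluated at the tree-trap distance. The sketch below uses one common 2-D lognormal kernel parameterization; the exact functional form and parameters fitted in the study may differ, so treat this as an assumption-laden illustration.

```python
import numpy as np

def lognormal_kernel(r, scale, shape):
    """One common 2-D lognormal dispersal kernel (seed density per unit area at
    distance r from the source); the study's fitted parameterization may differ."""
    r = np.maximum(r, 1e-9)
    return np.exp(-(np.log(r / scale)) ** 2 / (2 * shape ** 2)) / (
        (2 * np.pi) ** 1.5 * shape * r ** 2)

def expected_seeds_in_trap(trap_xy, tree_xy, tree_fecundity, scale, shape, trap_area):
    """Expected seed count in one trap: trap area times the sum over trees of
    fecundity x kernel(tree-trap distance)."""
    distances = np.hypot(*(np.asarray(tree_xy) - np.asarray(trap_xy)).T)
    return trap_area * np.sum(np.asarray(tree_fecundity) * lognormal_kernel(distances, scale, shape))

# Hypothetical example: two trees near a 0.5 m^2 trap at the origin
print(expected_seeds_in_trap([0, 0], [[12, 5], [40, -20]], [8000, 3000],
                             scale=12.0, shape=0.8, trap_area=0.5))
```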

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES The purpose of this study was to investigate the survival effects of inferior vena cava filters in patients with venous thromboembolism (VTE) who had a significant bleeding risk. BACKGROUND The effectiveness of inferior vena cava filter use among patients with acute symptomatic VTE and known significant bleeding risk remains unclear. METHODS In this prospective cohort study of patients with acute VTE identified from the RIETE (Computerized Registry of Patients With Venous Thromboembolism), we assessed the association between inferior vena cava filter insertion for known significant bleeding risk and the outcomes of all-cause mortality, pulmonary embolism (PE)-related mortality, and VTE rates through 30 days after the initiation of VTE treatment. Propensity score matching was used to adjust for the likelihood of receiving a filter. RESULTS Of the 40,142 eligible patients who had acute symptomatic VTE, 371 underwent filter placement because of known significant bleeding risk. A total of 344 patients treated with a filter were matched with 344 patients treated without a filter. Propensity score-matched pairs showed a nonsignificant trend toward lower risk of all-cause death for filter insertion compared with no insertion (6.6% vs. 10.2%; p = 0.12). The risk-adjusted PE-related mortality rate was lower for filter insertion than no insertion (1.7% vs. 4.9%; p = 0.03). Risk-adjusted recurrent VTE rates were higher for filter insertion than for no insertion (6.1% vs. 0.6%; p < 0.001). CONCLUSIONS In patients presenting with VTE and a significant bleeding risk, inferior vena cava filter insertion, compared with anticoagulant therapy, was associated with a lower risk of PE-related death and a higher risk of recurrent VTE. However, limitations of the study design preclude inferring a causal relationship between filter insertion and outcomes.
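Propensity score matching, as used above, first models the probability of receiving a filter from baseline covariates and then pairs each treated patient with a control of similar score. A minimal sketch with scikit-learn; the column names are hypothetical, 1:1 matching here is with replacement, and this is not the registry's actual analysis code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_on_propensity(df, treatment_col, covariate_cols, caliper=0.05):
    """Pair each treated row with the nearest-control propensity score (with replacement),
    discarding pairs whose score difference exceeds the caliper."""
    model = LogisticRegression(max_iter=1000).fit(df[covariate_cols], df[treatment_col])
    ps = model.predict_proba(df[covariate_cols])[:, 1]
    treated_mask = df[treatment_col].to_numpy() == 1
    treated_idx = df.index[treated_mask]
    control_idx = df.index[~treated_mask]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated_mask].reshape(-1, 1))
    dist, pos = nn.kneighbors(ps[treated_mask].reshape(-1, 1))
    return [(t, control_idx[p[0]]) for t, d, p in zip(treated_idx, dist, pos) if d[0] <= caliper]

# Synthetic demo frame with hypothetical covariates 'age' and 'cancer'
rng = np.random.default_rng(1)
demo = pd.DataFrame({
    "age": rng.normal(65, 12, 300),
    "cancer": rng.integers(0, 2, 300),
})
demo["filter"] = (rng.random(300) < 0.2).astype(int)
print(len(match_on_propensity(demo, "filter", ["age", "cancer"])))
```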

Relevance:

80.00%

Publisher:

Abstract:

Purpose The aim was to test the impact of body mass index (BMI) and gender on infectious complications after polytrauma. Methods A total of 651 patients were included in this retrospective study, with an Injury Severity Score (ISS) ≥16 and age ≥16 years. The sample was subdivided into three BMI groups (<25 kg/m², 25-30 kg/m², and >30 kg/m²) and into a female and a male group. Infectious complications were observed for 31 days after admission. Data are given as means ± standard errors of the mean. Analysis of variance, the Kruskal-Wallis test, χ² tests, and Pearson's correlation were used for the analyses, and the significance level was set at P < 0.05. Results The overall infection rates were 31.0% in the BMI <25 kg/m² group, 29.0% in the BMI 25-30 kg/m² group, and 24.5% in the BMI >30 kg/m² group (P = 0.519). The female patients developed significantly fewer infectious complications than the male patients (26.8 vs. 73.2%; P < 0.001). The incidence of death decreased significantly across the BMI groups (8.8 vs. 7.2 vs. 1.5%; P < 0.0001), and the female population had a significantly lower mortality rate (4.1 vs. 13.4%; P < 0.0001). Pearson's correlations between the Abbreviated Injury Scale (AIS) score and the corresponding infectious foci were not significant. Conclusion Higher BMI seems to be protective against polytrauma-associated death but not polytrauma-associated infections, and female gender protects against both polytrauma-associated infections and death. Understanding gender-specific immunomodulation could improve the outcome of polytrauma patients.
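The group comparison of infection rates reported above (P = 0.519) is the kind of result a chi-square test on a contingency table produces. A hedged sketch with scipy; the counts are hypothetical, chosen only to roughly reproduce the quoted 31.0/29.0/24.5% rates, and the group sizes are not the study's.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x3 table: infection status (rows) by BMI group (<25, 25-30, >30 kg/m^2).
table = [
    [104, 58, 12],   # infected
    [232, 142, 37],  # not infected
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")   # a p > 0.05 mirrors the reported non-significance
```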

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES To report the mid-term results of aortic root replacement using a self-assembled biological composite graft, consisting of a vascular tube graft and a stented tissue valve. METHODS Between January 2005 and December 2011, 201 consecutive patients [median age 66 (interquartile range, IQR, 55-77) years, 31 female patients (15.4%), median logistic EuroSCORE 10 (IQR 6.8-23.2)] underwent aortic root replacement using a stented tissue valve for the following indications: annulo-aortic ectasia or ascending aortic aneurysm with aortic valve disease in 162 (76.8%) patients, active infective endocarditis in 18 (9.0%), and acute aortic dissection Stanford type A in 21 (10.4%). All patients underwent clinical and echocardiographic follow-up. We analysed survival and valve-related events. RESULTS The overall in-hospital mortality rate was 4.5%. One- and 5-year cardiac-related mortality rates were 3 and 6%, and overall survival was 95 ± 1.5 and 75 ± 3.6%, respectively. The rate of freedom from structural valve failure was 99% and 97 ± 0.4% at the 1- and 5-year follow-up, respectively. The incidence rates of prosthetic valve endocarditis were 3 and 4%, respectively. During a median follow-up of 28 (IQR 14-51) months, only 2 (1%) patients required valve-related redo surgery, both due to prosthetic valve endocarditis, and none suffered thromboembolic events. One percent of patients showed structural valve deterioration without any clinical symptoms; none of the patients suffered greater than mild aortic regurgitation. CONCLUSIONS Aortic root replacement using a self-assembled biological composite graft is an interesting option. Haemodynamic results are excellent, with freedom from structural valve failure. The need for reoperation is extremely low, but long-term results are necessary to prove the durability of this concept.
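"Freedom from structural valve failure" is a Kaplan-Meier estimate of event-free survival. A minimal sketch with the Python lifelines package on toy follow-up data; the column names, values, and the use of lifelines are assumptions, not the study's analysis.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Toy follow-up data: months to structural valve failure, or censoring at last contact.
df = pd.DataFrame({
    "months": [12, 28, 51, 14, 60, 33, 45, 9],
    "failure": [0, 0, 1, 0, 0, 0, 1, 0],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["failure"])
print(kmf.predict(12), kmf.predict(60))   # freedom from valve failure at 1 and 5 years
```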