129 results for Fire severity


Relevance: 20.00%

Abstract:

Triggering receptor expressed on myeloid cells-1 (TREM-1) is a potent amplifier of pro-inflammatory innate immune reactions. While TREM-1-amplified responses likely aid the detection and elimination of pathogens, excessive production of cytokines and oxygen radicals can also severely harm the host. Studies addressing the pathogenic role of TREM-1 during endotoxin-induced shock or microbial sepsis have so far mostly relied on the administration of TREM-1 fusion proteins or peptides representing part of the extracellular domain of TREM-1. However, binding of these agents to the as yet unidentified TREM-1 ligand could also impact signaling through alternative receptors. More importantly, controversial results have been obtained regarding the requirement of TREM-1 for microbial control. To unambiguously investigate the role of TREM-1 in homeostasis and disease, we generated mice deficient in Trem1. Trem1(-/-) mice are viable and fertile and show no alterations in the hematopoietic compartment. In CD4(+) T cell- and dextran sodium sulfate-induced models of colitis, Trem1(-/-) mice displayed significantly attenuated disease that was associated with reduced inflammatory infiltrates and diminished expression of pro-inflammatory cytokines. Trem1(-/-) mice also exhibited reduced neutrophilic infiltration and decreased lesion size upon infection with Leishmania major. Furthermore, reduced morbidity was observed in influenza virus-infected Trem1(-/-) mice. Importantly, while immune-associated pathologies were significantly reduced, Trem1(-/-) mice were as capable as Trem1(+/+) controls of controlling infections with L. major and influenza virus, but also with Legionella pneumophila. Our results not only demonstrate an unanticipated pathogenic impact of TREM-1 during viral and parasitic infection, but also indicate that therapeutic blocking of TREM-1 in distinct inflammatory disorders holds considerable promise by blunting excessive inflammation while preserving the capacity for microbial control.

Relevance: 20.00%

Abstract:

PRINCIPLES Accidents in agriculture are a problem of global importance. The hazards of working in agriculture are manifold (machines, animals, heights). We therefore assessed injury severity and mortality from accidents in farming. METHODS We retrospectively analysed all farming accidents treated over a 12-year period in the emergency department (ED) of our level I trauma centre. RESULTS Of 815 patients, 96.3% were male and 3.7% female (p <0.0001). A total of 70 patients (8.6%, 70/815) were severely injured. Patients with injuries to the chest were most likely to suffer from severe injuries (odds ratio [OR] 9.45, 95% confidence interval [CI] 5.59-16.00, p <0.0001), followed by patients with injuries to the abdomen (OR 7.06, 95% CI 3.22-15.43, p <0.0001) and patients with injuries to the head (OR 5.03, 95% CI 2.99-8.66, p <0.0001). Hospitalisation was associated with machine- and fall-related injuries (OR 22.39, 95% CI 1.95-4.14, p <0.0001 and OR 2.84, 95% CI 1.68-3.41, p <0.001, respectively). Patients suffering from a fall and patients with severe injury were more likely to die than others (OR 3.32, 95% CI 1.07-10.29, p <0.037 and OR 9.17, 95% CI 6.20-13.56, p <0.0001, respectively). Fall height correlated positively with the injury severity score, hospitalisation and mortality (all p <0.0001). CONCLUSION Injuries in agriculture are accompanied by substantial morbidity and mortality, and range from minor injuries to severe multiple injuries. Additional prospective studies should be conducted on injury severity, long-term disability and mortality.
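
For orientation, the odds ratios and 95% confidence intervals quoted above follow from standard 2×2 contingency analysis. Below is a minimal sketch of that calculation using the Woolf (log) method; the counts are hypothetical and are not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with approximate 95% CI (Woolf/log method) for a 2x2 table:
                      severe injury   no severe injury
    exposure present       a                 b
    exposure absent        c                 d
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    low = math.exp(math.log(or_) - z * se_log_or)
    high = math.exp(math.log(or_) + z * se_log_or)
    return or_, low, high

# Hypothetical counts: chest injury (yes/no) vs. severe injury (yes/no)
or_, low, high = odds_ratio_ci(a=30, b=45, c=40, d=700)
print(f"OR {or_:.2f} (95% CI {low:.2f}-{high:.2f})")
```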

Relevance: 20.00%

Abstract:

Disorganized behavior is a key symptom of schizophrenia, and its objective assessment is particularly challenging. Actigraphy has enabled the objective assessment of motor behavior in various settings. Reduced motor activity has been associated with negative syndrome scores, but simple motor activity analyses were not informative on other symptom dimensions. The analysis of movement patterns, however, could be more informative for assessing schizophrenia symptom dimensions. Here, we applied time-series analyses to actigraphy data from 100 schizophrenia spectrum disorder patients. Actigraphy recording intervals were set at 2 s. Data from 2 defined 60-min periods were analyzed, and partial autocorrelations of the actigraphy time series indicated the predictability of movements in each individual. Increased positive syndrome scores were associated with reduced predictability of movements but not with the overall amount of movement. Negative syndrome scores were associated with low activity levels but were unrelated to the predictability of movement. The factors disorganization and excitement were related to movement predictability, but emotional distress was not. Thus, the predictability of objectively assessed motor behavior may be a marker of positive symptoms and disorganized behavior, and could become relevant for translational research.
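
As a rough illustration of the movement-predictability measure described above (partial autocorrelation of an actigraphy time series recorded at 2-s intervals), the sketch below uses statsmodels on simulated activity counts; the data and lag range are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# Simulate one 60-min period of activity counts at 2-s epochs (1800 samples);
# real data would come from the actigraph export.
rng = np.random.default_rng(0)
noise = rng.normal(0, 10, 1800)
activity = np.empty(1800)
activity[0] = 50
for i in range(1, 1800):                 # weakly autocorrelated toy signal
    activity[i] = 0.6 * activity[i - 1] + 20 + noise[i]
activity = np.clip(activity, 0, None)

# Partial autocorrelations up to lag 10; the low-lag coefficients quantify how
# predictable each movement sample is from the immediately preceding ones.
pac = pacf(activity, nlags=10)
print(np.round(pac, 3))
```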

Relevance: 20.00%

Abstract:

OBJECTIVE Infection of pancreatic necrosis in necrotizing pancreatitis increases mortality in patients with acute pancreatitis. To examine the mechanisms underlying this clinical observation, we developed and tested a model in which primary infection of necrosis is achieved in taurocholate-induced pancreatitis in mice. METHODS Sterile necrosis of acute necrotizing pancreatitis was induced by retrograde injection of 4% taurocholate into the common bile duct of Balb/c mice. Primary infection of pancreatic necrosis was induced by coinjecting 10 colony-forming units of Escherichia coli. Animals were killed after 6, 12, 24, 48, and 120 hours, and pancreatic damage and the pancreatitis-associated systemic inflammatory response were assessed. RESULTS Mice with pancreatic acinar cell necrosis had an increased bacterial concentration in all tissues and showed sustained bacteremia. Acute pancreatitis was induced only by coinjection of taurocholate and not by bacterial infection alone. Infection of pancreatic necrosis increased pancreatic damage and the pulmonary vascular leak. Serum glucose concentrations, serving as a parameter of hepatic function, were reduced in mice with infected pancreatic necrosis. CONCLUSIONS Primary infection of pancreatic necrosis with E. coli increases both pancreatic damage and pulmonary and hepatic complications in acute necrotizing pancreatitis in mice.

Relevance: 20.00%

Abstract:

Treelines are expected to rise to higher elevations with climate warming; the rate and extent, however, are still largely unknown. Here we present the first multi-proxy palaeoecological study from the treeline in the northwestern Swiss Alps that covers the entire Holocene. We reconstructed climate, fire and vegetation dynamics at Iffigsee, an alpine lake at 2,065 m a.s.l., using seismic sedimentary surveys, loss on ignition, visible spectrum reflectance spectroscopy, and pollen, spore, macrofossil and charcoal analyses. Afforestation with Larix decidua and tree Betula (probably B. pendula) started at ~9,800 cal. b.p., more than 1,000 years later than at similar elevations in the Central and Southern Alps, indicating cooler temperatures and/or a high seasonality. The highest biomass production and a forest position of ~2,100–2,300 m a.s.l. are inferred for the Holocene Thermal Maximum, 7,000 to 5,000 cal. b.p. With the onset of pastoralism and transhumance at 6,800–6,500 cal. b.p., human impact became an important factor in the vegetation dynamics at Iffigsee. This early evidence of pastoralism is documented by the presence of grazing indicators (pollen, spores) as well as by a wealth of archaeological finds at the nearby mountain pass of Schnidejoch. Human and fire impact during the Neolithic and Bronze Ages led to the establishment of pastures and facilitated the expansion of Picea abies and Alnus viridis. We expect that in mountain areas with land abandonment, the treeline will react quickly to future climate warming by shifting to higher elevations, causing drastic changes in species distribution and composition as well as severe biodiversity losses.

Relevance: 20.00%

Abstract:

Biomarkers of blood lipid modification and oxidative stress have been associated with increased cardiovascular morbidity. We sought to determine whether these biomarkers were related to functional indices of stenosis severity among patients with stable coronary artery disease. We studied 197 consecutive patients with stable coronary artery disease due to single-vessel disease. A fractional flow reserve (FFR) ≤ 0.80 was taken as the index of a functionally significant lesion. Serum levels of secretory phospholipase A2 (sPLA2) activity, secretory phospholipase A2 type IIA (sPLA2-IIA), myeloperoxidase (MPO), lipoprotein-associated phospholipase A2 (Lp-PLA2), and oxidized low-density lipoprotein (OxLDL) were assessed using commercially available assays. Patients with FFR > 0.80 had lower sPLA2 activity, sPLA2-IIA, and OxLDL levels than patients with FFR ≤ 0.80 (21.25 [16.03-27.28] vs 25.85 [20.58-34.63] U/mL, p < 0.001; 2.0 [1.5-3.4] vs 2.6 [2.0-3.4] ng/mL, p < 0.01; and 53.0 [36.0-71.0] vs 64.5 [50-89.25], p < 0.001, respectively). Patients with FFR > 0.80 had similar Lp-PLA2 and MPO levels to those with FFR ≤ 0.80. sPLA2 activity and sPLA2-IIA significantly increased the area under the curve over baseline characteristics for predicting FFR ≤ 0.80 (from 0.67 to 0.77 (95% confidence interval [CI]: 0.69-0.85), p < 0.01, and from 0.67 to 0.77 (95% CI: 0.69-0.84), p < 0.01, respectively). Serum sPLA2 activity and sPLA2-IIA levels are thus related to the functional characteristics of coronary stenoses in patients with stable coronary artery disease.
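
The reported gain in discrimination (AUC rising from 0.67 to 0.77 after adding the sPLA2 measures) corresponds to comparing a logistic model of baseline characteristics with one that also includes the biomarker. A minimal scikit-learn sketch of that comparison on simulated data (not the study's dataset; a real analysis would use cross-validation or a hold-out set rather than in-sample AUC):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 197
baseline = rng.normal(size=(n, 2))              # e.g. age, sex (simulated)
biomarker = rng.normal(size=(n, 1))             # e.g. sPLA2 activity (simulated)
logit = 0.6 * baseline[:, 0] + 1.2 * biomarker[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # FFR <= 0.80 (yes/no)

def in_sample_auc(X, y):
    """Fit a logistic model and return the (optimistic) in-sample ROC AUC."""
    model = LogisticRegression().fit(X, y)
    return roc_auc_score(y, model.predict_proba(X)[:, 1])

auc_baseline = in_sample_auc(baseline, y)
auc_with_biomarker = in_sample_auc(np.hstack([baseline, biomarker]), y)
print(f"AUC baseline: {auc_baseline:.2f}, AUC + biomarker: {auc_with_biomarker:.2f}")
```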

Relevance: 20.00%

Abstract:

AIM The aim of this study was to evaluate whether coronary artery disease (CAD) severity exerts a gradient of risk in patients with aortic stenosis (AS) undergoing transcatheter aortic valve implantation (TAVI). METHODS AND RESULTS A total of 445 patients with severe AS undergoing TAVI were included in a prospective registry between 2007 and 2012. The preoperative SYNTAX score (SS) was determined from baseline coronary angiograms. In cases of revascularization prior to TAVI, the residual SS (rSS) was also determined. Clinical outcomes were compared between patients without CAD (n = 158), patients with low SS (0-22, n = 207), and patients with high SS (SS >22, n = 80). The pre-specified primary endpoint was the composite of cardiovascular death, stroke, or myocardial infarction (MI). At 1 year, CAD severity was associated with higher rates of the primary endpoint (no CAD: 12.5%, low SS: 16.1%, high SS: 29.6%; P = 0.016). This was driven by differences in cardiovascular mortality (no CAD: 8.6%, low SS: 13.6%, high SS: 20.4%; P = 0.029), whereas the risk of stroke (no CAD: 5.1%, low SS: 3.3%, high SS: 6.7%; P = 0.79) and MI (no CAD: 1.5%, low SS: 1.1%, high SS: 4.0%; P = 0.54) was similar across the three groups. Patients with high SS received less complete revascularization, as indicated by a higher rSS (21.2 ± 12.0 vs. 4.0 ± 4.4, P < 0.001), compared with patients with low SS. The highest rSS tertile (>14) was associated with higher rates of the primary endpoint at 1 year (no CAD: 12.5%, low rSS: 16.5%, high rSS: 26.3%; P = 0.043). CONCLUSIONS The severity of CAD appears to be associated with impaired clinical outcomes at 1 year after TAVI. Patients with SS >22 receive less complete revascularization and have a higher risk of cardiovascular death, stroke, or MI than patients without CAD or with low SS.
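
The stratification used above (no CAD, SS 0-22, SS >22) amounts to a simple categorisation of the SYNTAX score followed by per-group event rates. A small pandas sketch on made-up data (only the cut-offs follow the abstract; everything else is simulated):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 445
syntax = np.where(rng.random(n) < 158 / 445, 0.0, rng.gamma(2.0, 8.0, n))  # 0 = no CAD
event_1y = rng.random(n) < 0.18      # composite of CV death, stroke or MI (simulated)

df = pd.DataFrame({"syntax_score": syntax, "event_1y": event_1y})
# Categorise as in the registry: no CAD (SS = 0), low SS (0 < SS <= 22), high SS (> 22)
df["cad_group"] = pd.cut(df["syntax_score"], bins=[-0.1, 0, 22, np.inf],
                         labels=["no CAD", "low SS (0-22)", "high SS (>22)"])

# 1-year primary-endpoint rate per CAD-severity group
print(df.groupby("cad_group", observed=True)["event_1y"].mean().round(3))
```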

Relevance: 20.00%

Abstract:

OBJECTIVE To compare EEG power spectra and LORETA-computed intracortical activity between Alzheimer's disease (AD) patients and healthy controls, and to correlate the results with cognitive performance in the AD group. METHODS Nineteen-channel resting EEG was recorded in 21 mild to moderate AD patients and in 23 controls. Power spectra and intracortical LORETA tomography were computed in seven frequency bands and compared between groups. In the AD patients, the EEG results were correlated with cognitive performance (Mini Mental State Examination, MMSE). RESULTS AD patients showed increased power in the EEG delta and theta frequency bands, and decreased power in alpha2, beta1, beta2 and beta3. LORETA specified that increases and decreases of power affected different cortical areas while largely sparing prefrontal cortex. Delta power correlated negatively and alpha1 power positively with the AD patients' MMSE scores; LORETA tomography localized these correlations in left temporo-parietal cortex. CONCLUSIONS The non-invasive EEG method of LORETA localized pathological cortical activity in our mild to moderate AD patients in agreement with the literature, and yielded striking correlations between EEG delta and alpha1 activity and MMSE scores in left temporo-parietal cortex. SIGNIFICANCE The present data support the hypothesis of an asymmetrical progression of Alzheimer's disease.
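
A band-power computation of the kind underlying the spectra above can be sketched with a Welch power spectral density estimate integrated over fixed frequency bands. The sampling rate, signal and exact band limits below are assumptions for illustration (the study's seven-band definition and the LORETA source localisation step are not reproduced):

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)               # one minute of a single simulated EEG channel
rng = np.random.default_rng(3)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 10 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 5, t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)

# Illustrative band limits (Hz); the study analysed seven bands from delta to beta3
bands = {"delta": (1.5, 6.0), "theta": (6.5, 8.0), "alpha1": (8.5, 10.0), "alpha2": (10.5, 12.0),
         "beta1": (12.5, 18.0), "beta2": (18.5, 21.0), "beta3": (21.5, 30.0)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs <= hi)
    band_power = trapezoid(psd[mask], freqs[mask])   # integrate the PSD over the band
    print(f"{name}: {band_power:.2f}")
```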

Relevance: 20.00%

Abstract:

Biomass burning is a major source of greenhouse gases and influences regional to global climate. Pre-industrial fire-history records from black carbon, charcoal and other proxies provide baseline estimates of biomass burning at local to global scales spanning millennia, and are thus useful to examine the role of fire in the carbon cycle and climate system. Here we use the specific biomarker levoglucosan together with black carbon and ammonium concentrations from the North Greenland Eemian (NEEM) ice cores (77.49° N, 51.2° W; 2,480 m a.s.l.) over the past 2000 years to infer changes in boreal fire activity. Increases in boreal fire activity over the periods 1000–1300 CE and decreases during 700–900 CE coincide with high-latitude Northern Hemisphere temperature changes. Levoglucosan concentrations in the NEEM ice cores peak between 1500 and 1700 CE, and most levoglucosan spikes coincide with the most extensive central and northern Asian droughts of the past millennium. Many of these multi-annual droughts are caused by Asian monsoon failures, thus suggesting a connection between low- and high-latitude climate processes. North America is a primary source of biomass burning aerosols due to its relative proximity to the Greenland Ice Cap. During major fire events, however, isotopic analyses of dust, back trajectories and links with levoglucosan peaks and regional drought reconstructions suggest that Siberia is also an important source of pyrogenic aerosols to Greenland.

Relevance: 20.00%

Abstract:

Changes in fire occurrence during recent decades in the southern Swiss Alps make knowledge of fire history essential for understanding the future evolution of ecosystem composition and functioning. In this context, palaeoecology provides useful insights into processes operating at decadal-to-millennial time scales, such as the response of plant communities to intensified fire disturbance during periods of cultural change. We provide a high-resolution macroscopic charcoal and pollen series from Guèr, a well-dated peat sequence at mid-elevation (832 m a.s.l.) in southern Switzerland, where the presence of local settlements has been documented since the late Bronze Age and the Iron Age. Quantitative fire reconstruction shows that fire activity sharply increased from the Neolithic period (1–3 episodes/1000 years) to the late Bronze and Iron Ages (7–9 episodes/1000 years), leading to extensive clearance of the former mixed deciduous forest (Alnus glutinosa, Betula, deciduous Quercus). The increase in anthropogenic pollen indicators (e.g. Cerealia-type, Plantago lanceolata) together with macroscopic charcoal suggests anthropogenic rather than climatic forcing as the main cause of the observed vegetation shift. Fire and controlled burning were used extensively during late Roman times and the early Middle Ages to promote the introduction and establishment of chestnut (Castanea sativa) stands, which provided an important wood and food supply. Fire occurrence declined markedly (from 9 to 5–6 episodes/1000 years) during the late Middle Ages because of fire suppression, biomass removal by the human population, and landscape fragmentation. Land abandonment during recent decades has allowed the forest to partly re-expand (mainly Alnus glutinosa, Betula) and fire frequency to increase.
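
Fire-episode frequencies like those quoted above (episodes per 1000 years) are generally obtained by separating a charcoal record into a slowly varying background and peaks exceeding a threshold, then counting peaks through time. The sketch below is a strongly simplified version of that logic on synthetic data, not the authors' charcoal-analysis workflow; a real analysis would also merge adjacent peak samples into single episodes and use a locally defined threshold.

```python
import numpy as np

rng = np.random.default_rng(4)
ages = np.arange(0, 10000, 20)                         # sample ages (cal. years BP), 20-yr resolution
char = rng.gamma(1.5, 1.0, ages.size)                  # synthetic charcoal accumulation rates
char[rng.choice(ages.size, 40, replace=False)] += 10   # inject some fire-related peaks

# Background = running median; peaks = samples exceeding background by a fixed margin
half_window = 25                                       # samples, ~500 years each side
background = np.array([np.median(char[max(0, i - half_window):i + half_window + 1])
                       for i in range(char.size)])
is_peak = (char - background) > 5.0

# Fire frequency: number of peak samples per 1000-year bin
bins = np.arange(0, 10001, 1000)
freq, _ = np.histogram(ages[is_peak], bins=bins)
for start, f in zip(bins[:-1], freq):
    print(f"{start}-{start + 1000} cal. BP: {f} episodes")
```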

Relevance: 20.00%

Abstract:

The aim of this study was to determine whether severity assessment tools (general severity-of-illness and community-acquired pneumonia-specific scores) can be used to guide decisions for patients admitted to the intensive care unit (ICU) due to pandemic influenza A pneumonia. A prospective, observational, multicentre study included 265 patients with a mean age of 42 (±16.1) years and an ICU mortality of 31.7%. On admission to the ICU, the mean pneumonia severity index (PSI) score was 103.2 ± 43.2 points, the mean CURB-65 score was 1.7 ± 1.1 points and the mean PIRO-CAP score was 3.2 ± 1.5 points. None of the scores had good predictive ability: the area under the ROC curve was 0.72 (95% CI, 0.65-0.78) for PSI, 0.67 (95% CI, 0.59-0.74) for CURB-65, and 0.64 (95% CI, 0.56-0.71) for PIRO-CAP. The PSI score (OR, 1.022 (1.009-1.034), p 0.001) was independently associated with ICU mortality; however, none of the three scores, when used at ICU admission, was able to reliably detect a low-risk group of patients. Low risk for mortality was identified in 27.5% of patients using PIRO-CAP, but in more than 40% when using PSI (classes I-III) or CURB-65 (<2); observed mortality in these low-risk groups was 13.7%, 13.5% and 19.4%, respectively. Pneumonia-specific scores undervalued severity and should not be used as instruments to guide decisions in the ICU.
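
For reference, the CURB-65 score mentioned above assigns one point each for confusion, urea >7 mmol/L, respiratory rate ≥30/min, low blood pressure (systolic <90 or diastolic ≤60 mmHg) and age ≥65 years; patients scoring <2 fall into the low-risk stratum referred to in the abstract. A minimal implementation:

```python
def curb65(confusion: bool, urea_mmol_l: float, resp_rate: int,
           sbp: int, dbp: int, age: int) -> int:
    """CURB-65 severity score for community-acquired pneumonia (0-5).
    One point each for: Confusion, Urea > 7 mmol/L, Respiratory rate >= 30/min,
    Blood pressure (SBP < 90 or DBP <= 60 mmHg), age >= 65 years."""
    return sum([
        confusion,
        urea_mmol_l > 7,
        resp_rate >= 30,
        sbp < 90 or dbp <= 60,
        age >= 65,
    ])

# Example: a hypothetical patient in the low-risk (< 2) stratum
print(curb65(confusion=False, urea_mmol_l=5.2, resp_rate=22, sbp=118, dbp=72, age=42))  # -> 0
```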

Relevance: 20.00%

Abstract:

In a cohort study among 2751 members (71.5% female) of the German and Swiss RLS patient organizations, changes in restless legs syndrome (RLS) severity over time were assessed and the impact on quality of life, sleep quality and depressive symptoms was analysed. A standard set of scales (the IRLS severity scale, the SF-36, the Pittsburgh Sleep Quality Index and the Center for Epidemiologic Studies Depression Scale) in mailed questionnaires was used repeatedly to assess RLS severity and health status over time, and a 7-day diary was used once to assess short-term variations. A clinically relevant change in RLS severity was defined as a change of at least 5 points on the IRLS scale. During 36 months of follow-up, minimal improvement in RLS severity between assessments was observed. Men consistently reported higher severity scores. RLS severity increased with age, reaching a plateau in the age group 45-54 years. Over 3 years, 60.2% of the participants had no relevant (±5 points) change in RLS severity. RLS worsening was significantly related to an increase in depressive symptoms and a decrease in sleep quality and quality of life. Short-term variation showed distinctive circadian patterns, with rhythm magnitudes strongly related to RLS severity. The majority of participants had a stable course of severe RLS over three years. An increase in RLS severity was accompanied by a small to moderate negative influence, and a decrease by a small positive influence, on quality of life, depressive symptoms and sleep quality.
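
The ±5-point criterion for a clinically relevant change on the IRLS scale translates directly into a simple classification of paired assessments; a small illustrative sketch (the example scores are hypothetical, not cohort data):

```python
def classify_irls_change(baseline: int, follow_up: int, threshold: int = 5) -> str:
    """Classify the change between two IRLS assessments (each scored 0-40).
    A change of at least `threshold` points is considered clinically relevant."""
    delta = follow_up - baseline
    if delta <= -threshold:
        return "improved"
    if delta >= threshold:
        return "worsened"
    return "no relevant change"

# Hypothetical participants: (baseline score, 36-month follow-up score)
for pair in [(28, 21), (25, 27), (18, 30)]:
    print(pair, classify_irls_change(*pair))
```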