18 results for 25-34
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Objective: Arterial lactate, base excess (BE), lactate clearance, and the Sequential Organ Failure Assessment (SOFA) score have been shown to correlate with outcome in severely injured patients. The goal of the present study was to separately assess their predictive value in patients suffering from traumatic brain injury (TBI) as opposed to patients suffering from injuries not related to the brain. Materials and methods: A total of 724 adult trauma patients with an Injury Severity Score (ISS) ≥ 16 were grouped into patients without TBI (non-TBI), patients with isolated TBI (isolated TBI), and patients with a combination of TBI and non-TBI injuries (combined injuries). The predictive value of the above parameters was then analyzed using both uni- and multivariate analyses. Results: The mean age of the patients was 39 years (77% males), with a mean ISS of 32 (range 16–75). Mortality ranged from 14% (non-TBI) to 24% (combined injuries). Admission and serial lactate/BE values were higher in non-survivors of all groups (all p < 0.01), but not in patients with isolated TBI. Admission SOFA scores were highest in non-survivors of all groups (p = 0.023); patients who subsequently developed sepsis also showed elevated SOFA scores (p < 0.01), except those with isolated TBI. In this group, SOFA score was the only parameter that differed significantly between survivors and non-survivors. Receiver operating characteristic (ROC) analysis revealed lactate to be the best overall predictor of increased mortality and subsequent septic complications, irrespective of the leading injury. Conclusion: Lactate showed the best performance in predicting sepsis or death in all trauma patients except those with isolated TBI, and the differences were greatest in patients with substantial bleeding. Following isolated TBI, SOFA score was the only parameter that could differentiate survivors from non-survivors on admission, although it, too, was not an independent predictor of death in multivariate analysis.
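For readers unfamiliar with the methods named above, the sketch below shows, on purely synthetic data, how ROC analysis and a multivariable model can compare admission markers such as lactate and SOFA score as mortality predictors. Only the cohort size and approximate mortality are taken from the abstract; all variable names, distributions and values are illustrative assumptions, not the study's data.

```python
# Hypothetical sketch (synthetic data) of ROC-based predictor comparison.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 724                                   # cohort size from the abstract
died = rng.binomial(1, 0.2, n)            # ~20% mortality, between the reported 14% and 24%

# Synthetic predictors: non-survivors get higher lactate and SOFA on average.
lactate = rng.normal(2.0 + 2.5 * died, 1.5, n).clip(min=0.3)   # mmol/L, invented
sofa = rng.normal(5.0 + 3.0 * died, 2.5, n).clip(min=0)        # points, invented

# Univariate discrimination: area under the ROC curve for each marker alone.
print("AUC lactate:", roc_auc_score(died, lactate))
print("AUC SOFA:   ", roc_auc_score(died, sofa))

# A multivariable model (here logistic regression) checks whether each marker
# remains an independent predictor once the other is accounted for.
X = np.column_stack([lactate, sofa])
model = LogisticRegression().fit(X, died)
print("Adjusted coefficients (lactate, SOFA):", model.coef_[0])
```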
Abstract:
Subseafloor environments preserved in Archean greenstone belts provide an analogue for investigating potential subsurface habitats on Mars. The c. 3.5-3.4 Ga pillow lava metabasalts of the mid-Archean Barberton greenstone belt, South Africa, have been argued to contain the earliest evidence for microbial subseafloor life. This includes candidate trace fossils in the form of titanite microtextures, and sulfur isotopic signatures of pyrite preserved in metabasaltic glass of the c. 3.472 Ga Hooggenoeg Formation. It has been contended that similar microtextures in altered martian basalts may represent potential extraterrestrial biosignatures of microbe-fluid-rock interaction. However, despite numerous studies describing these putative early traces of life, a detailed metamorphic characterization of the microtextures and their host alteration conditions in the ancient pillow lava metabasites is lacking. Here, we present a new nondestructive technique with which to study the in situ metamorphic alteration conditions associated with potential biosignatures in mafic-ultramafic rocks of the Hooggenoeg Formation. Our approach combines quantitative microscale compositional mapping by electron microprobe with inverse thermodynamic modeling to derive low-temperature chlorite crystallization conditions. We found that the titanite microtextures formed under subgreenschist- to greenschist-facies conditions. Two chlorite temperature groups were identified in the maps surrounding the titanite microtextures: one records peak metamorphic conditions at 315 ± 40°C (XFe3+(chlorite) = 25-34%), the other lower-temperature chlorite veins/microdomains at T = 210 ± 40°C (higher XFe3+(chlorite) = 40-45%). These results provide the first metamorphic constraints in textural context on the Barberton titanite microtextures and thereby improve our understanding of the local preservation conditions of these potential biosignatures. We suggest that this approach may prove to be an important tool in future studies to assess the biogenicity of these earliest candidate traces of life on Earth. Furthermore, we propose that this mapping approach could also be used to investigate altered mafic-ultramafic extraterrestrial samples containing candidate biosignatures.
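The study derives its temperatures by inverse thermodynamic modeling of electron-microprobe maps, which is too involved for a short example. As a deliberately simpler stand-in, the sketch below applies the empirical Cathelineau (1988) Al(IV) chlorite thermometer, a different technique from the one used in the paper, to two hypothetical compositions, just to show how chlorite chemistry maps to crystallization temperature; the Al(IV) values are invented for illustration.

```python
# Simplified, swapped-in technique: the linear Cathelineau (1988) chlorite
# thermometer, T(degC) = -61.92 + 321.98 * Al_IV, where Al_IV is tetrahedral Al
# per formula unit (14-oxygen basis). This is NOT the inverse thermodynamic
# modeling used in the study; it only illustrates the composition-temperature link.

def chlorite_temperature(al_iv: float) -> float:
    """Empirical Cathelineau (1988) chlorite geothermometer."""
    return -61.92 + 321.98 * al_iv

# Hypothetical compositions, loosely mimicking a higher-T matrix chlorite and a
# lower-T vein chlorite like the two groups reported above.
for label, al_iv in [("matrix chlorite", 1.15), ("vein chlorite", 0.85)]:
    print(f"{label}: Al_IV = {al_iv:.2f} -> T ~ {chlorite_temperature(al_iv):.0f} degC")
```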
Abstract:
Healthy soils are critical to agriculture, and both are essential to enabling food security. Soil-related challenges include using soils and other natural resources sustainably, combating land and soil degradation, avoiding further reduction of soil-related ecosystem services, and ensuring that all agricultural land is managed sustainably. Agricultural challenges include improving the quantity and quality of agricultural outputs to satisfy rising human needs, including in a 2 °C warmer world; maintaining diversity in agricultural systems while supporting those farms with the highest potential for closing existing yield gaps; and providing a livelihood for about 2.6 billion mostly poor land users. The greatest needs and potentials lie in small-scale farming, although there, as elsewhere, trade-offs must be negotiated within the nexus of water, energy, land and food, including the role of soil therein.
Rapid drop in the reproduction number during the Ebola outbreak in the Democratic Republic of Congo.
Abstract:
The Democratic Republic of Congo (DRC) experienced a confined rural outbreak of Ebola virus disease (EVD) with 69 reported cases from July to October 2014. Understanding the transmission dynamics during the outbreak can provide important information for anticipating and controlling future EVD epidemics. I fitted an EVD transmission model to previously published data of this outbreak and estimated the basic reproduction number R0 = 5.2 (95% CI [4.0-6.7]). The model suggests that the net reproduction number Rt fell below unity 28 days (95% CI [25-34] days) after the onset of symptoms in the index case. This study adds to previous epidemiological descriptions of the 2014 EVD outbreak in DRC, and is consistent with the notion that a rapid implementation of control interventions helped reduce further spread.
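A minimal sketch of the kind of model described, not the author's actual code: an SEIR model in which the transmission rate decays exponentially once control measures begin, a common choice in Ebola modelling, from which the net reproduction number Rt = β(t)/γ · S(t)/N can be tracked until it drops below one. Only R0 = 5.2 is taken from the abstract; the population size, incubation and infectious periods, and the control parameters τ and k are illustrative assumptions.

```python
# Hedged SEIR sketch with an exponentially decaying transmission rate.
import numpy as np
from scipy.integrate import odeint

N = 1000.0          # assumed population at risk (illustrative)
sigma = 1 / 9.3     # 1/incubation period, days^-1 (assumed)
gamma = 1 / 7.4     # 1/infectious period, days^-1 (assumed)
R0 = 5.2            # basic reproduction number from the abstract
beta0 = R0 * gamma  # baseline transmission rate
tau, k = 10.0, 0.1  # assumed onset and strength of control measures

def beta(t):
    """Transmission rate: constant, then exponential decay after control starts."""
    return beta0 if t < tau else beta0 * np.exp(-k * (t - tau))

def seir(y, t):
    S, E, I, R = y
    return [-beta(t) * S * I / N,
            beta(t) * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

t = np.linspace(0, 120, 1201)
S, E, I, R = odeint(seir, [N - 1, 0, 1, 0], t).T

# Net reproduction number R_t = beta(t)/gamma * S(t)/N; report when it drops below 1.
Rt = np.array([beta(ti) for ti in t]) / gamma * S / N
print("Rt < 1 from day", t[np.argmax(Rt < 1)])
```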
Abstract:
BACKGROUND Clinical trials yielded conflicting data about the benefit of adding systemic corticosteroids for treatment of community-acquired pneumonia. We assessed whether short-term corticosteroid treatment reduces time to clinical stability in patients admitted to hospital for community-acquired pneumonia. METHODS In this double-blind, multicentre, randomised, placebo-controlled trial, we recruited patients aged 18 years or older with community-acquired pneumonia from seven tertiary care hospitals in Switzerland within 24 h of presentation. Patients were randomly assigned (1:1 ratio) to receive either prednisone 50 mg daily for 7 days or placebo. The computer-generated randomisation was done with variable block sizes of four to six and stratified by study centre. The primary endpoint was time to clinical stability defined as time (days) until stable vital signs for at least 24 h, and analysed by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00973154. FINDINGS From Dec 1, 2009, to May 21, 2014, of 2911 patients assessed for eligibility, 785 patients were randomly assigned to either the prednisone group (n=392) or the placebo group (n=393). Median time to clinical stability was shorter in the prednisone group (3.0 days, IQR 2.5-3.4) than in the placebo group (4.4 days, 4.0-5.0; hazard ratio [HR] 1.33, 95% CI 1.15-1.50, p<0.0001). Pneumonia-associated complications until day 30 did not differ between groups (11 [3%] in the prednisone group and 22 [6%] in the placebo group; odds ratio [OR] 0.49 [95% CI 0.23-1.02]; p=0.056). The prednisone group had a higher incidence of in-hospital hyperglycaemia needing insulin treatment (76 [19%] vs 43 [11%]; OR 1.96, 95% CI 1.31-2.93, p=0.0010). Other adverse events compatible with corticosteroid use were rare and similar in both groups. INTERPRETATION Prednisone treatment for 7 days in patients with community-acquired pneumonia admitted to hospital shortens time to clinical stability without an increase in complications. This finding is relevant from a patient perspective and an important determinant of hospital costs and efficiency. FUNDING Swiss National Science Foundation, Viollier AG, Nora van Meeuwen Haefliger Stiftung, Julia und Gottfried Bangerter-Rhyner Stiftung.
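For illustration only, the sketch below fits a Cox proportional-hazards model to synthetic time-to-stability data, the type of analysis behind a reported hazard ratio like the trial's HR 1.33. It assumes the third-party lifelines package; the column names, simulated times and random seed are hypothetical, and only the trial size and the two median times are borrowed from the abstract as rough targets.

```python
# Hedged sketch (synthetic data, not the trial dataset) of a Cox PH hazard ratio.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 785                                    # trial size from the abstract
prednisone = rng.integers(0, 2, n)         # 1 = prednisone, 0 = placebo

# Synthetic times to clinical stability: shorter under prednisone
# (mean 3.0 vs 4.4 days, loosely matching the reported medians).
time = rng.exponential(np.where(prednisone == 1, 3.0, 4.4))
event = np.ones(n, dtype=int)              # assume every patient reached stability

df = pd.DataFrame({"time": time, "event": event, "prednisone": prednisone})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)                  # HR > 1 => stability is reached faster
```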
Abstract:
We investigated the effects of different dietary vitamin D regimens on selected blood parameters in laying hens. Supplementation with vitamin D3 alone was compared with a combination of vitamin D3 and its metabolite 25-hydroxycholecalciferol (25(OH)D3). Blood concentrations of total calcium, phosphate and 25(OH)D3 were determined. Four thousand one-day-old LSL chicks were split into two treatment groups and distributed among eight pens. The control group was given a commercial diet containing 2800 IU synthetic vitamin D3 in the starter feed and 2000 IU synthetic vitamin D3 in the pullet feed. The experimental group was fed the same commercial diet in which half the synthetic vitamin D3 content had been substituted with 25(OH)D3 (Hy·D®). At 18 weeks of age, pullets were transferred to the layer house. At the ages of 11, 18 and 34 weeks, between 120 and 160 blood samples were collected from each of the control and experimental groups. The experimental group had higher levels of 25(OH)D3 than the control group at all three ages. Serum calcium levels did not differ between the treatment groups at any age. With the onset of laying, calcium levels rose significantly: whereas serum calcium concentration at 18 weeks was 3 mmol/L in both treatment groups, it increased to 8.32 mmol/L in the control group and to 8.66 mmol/L in the experimental group by week 34. At weeks 11 and 34, phosphate was significantly lower in the experimental group. In conclusion, Hy·D® significantly affected serum phosphate and 25(OH)D3 levels. No effects of 25(OH)D3 supplementation on performance, shell quality or keel bone fractures were found.
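As a minimal illustration of the group comparisons reported above (the abstract does not specify the study's actual statistical methods), the sketch below runs a two-sample t-test on simulated week-34 serum phosphate values; group sizes, means and spreads are invented, not the study's data.

```python
# Hedged sketch: two-group comparison on synthetic serum phosphate values.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
# Illustrative serum phosphate (mmol/L) for ~140 birds per group at week 34;
# the experimental (25(OH)D3-fed) group is simulated slightly lower.
control = rng.normal(1.60, 0.20, 140)
experimental = rng.normal(1.45, 0.20, 140)

t_stat, p_value = ttest_ind(control, experimental)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p => significant difference
```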