Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because their assessment in the field is labor- and cost-intensive, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set for monitoring the effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratories’ research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally, or even more, sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure that are indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
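The community mean traits discussed above are typically computed as community-weighted means (CWM): per-species trait values from a database, weighted by local abundance. A minimal sketch of that calculation; the function and all numbers below are hypothetical illustrations, not data from the study:

```python
def community_weighted_mean(abundances, trait_values):
    """Community-weighted mean (CWM) of one trait.

    abundances: per-species abundance measures (e.g. percent cover)
    trait_values: matching per-species trait values (e.g. SLA from a database)
    """
    total = sum(abundances)
    weights = [a / total for a in abundances]  # relative abundances
    return sum(w * t for w, t in zip(weights, trait_values))

# Hypothetical 3-species plot: cover values and database SLA values (mm^2/mg)
cover = [50, 30, 20]
sla = [20.0, 15.0, 30.0]
cwm_sla = community_weighted_mean(cover, sla)  # 0.5*20 + 0.3*15 + 0.2*30 = 20.5
```

Doing this once per plot and trait yields the trait-by-plot matrix that can then be related to land-use predictors with mixed models.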
Abstract:
The ancient southern highlands on Mars (~3.5 Gyr old) contain > 600 regions that display spectral evidence in the infrared for the presence of chloride-bearing materials. Many of these locations were previously reported to display polygonal cracking patterns. We studied more than 80 of the chloride-bearing terrains using high-resolution (0.25-0.5 m/pixel) images, as well as near-infrared spectral data, to characterize the surface textures and the associated cracking patterns and mineralogies. Our study indicates that ~75% of the studied locations display polygonal cracks that resemble desiccation cracks, while some resemble salt expansion/thrust polygons. Furthermore, we detect, spectrally, the presence of smectites in association with ~30% of the studied fractured terrains. We note that smectites are a special class of swelling clay minerals that can induce the formation of large desiccation cracks. As such, we suggest that the cracking patterns are indicative of the presence of smectite phyllosilicates even in the absence of spectral confirmation. Our results suggest that many chloride-bearing terrains have a lacustrine origin and a geologic setting similar to playas on Earth. Such locations would have contained ephemeral lakes that may have undergone repeated cycles of desiccation and recharging by a near-surface fluctuating water table in order to account for the salt-phyllosilicate associations. These results have notable implications for the ancient hydrology of Mars. We propose that the morphologies and sizes of the polygonal cracks can be used as paleoenvironmental, as well as lithological, indicators that could be helpful in planning future missions.
Abstract:
Decadal-to-century scale trends for a range of marine environmental variables in the upper mesopelagic layer (UML, 100–600 m) are investigated using results from seven Earth System Models forced by a high greenhouse gas emission scenario. The models as a class reproduce the observation-based distribution of oxygen (O2) and carbon dioxide (CO2), although major mismatches between observation-based and simulated values remain for individual models. By year 2100 all models project an increase in SST of between 2 °C and 3 °C, and a decrease in the pH and in the saturation state of water with respect to calcium carbonate minerals in the UML. A decrease in the total ocean inventory of dissolved oxygen by 2% to 4% is projected across the range of models. Projected O2 changes in the UML show a complex pattern with both increasing and decreasing trends, reflecting the subtle balance of competing factors such as circulation, production, remineralization, and temperature changes. Projected changes in the total volume of hypoxic and suboxic waters remain relatively small in all models. A widespread increase of CO2 in the UML is projected. The median of the CO2 distribution between 100 and 600 m shifts from 0.1–0.2 mol m−3 in year 1990 to 0.2–0.4 mol m−3 in year 2100, primarily as a result of the invasion of anthropogenic carbon from the atmosphere. The co-occurrence of changes in a range of environmental variables indicates the need to further investigate their synergistic impacts on marine ecosystems and Earth System feedbacks.
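The hypoxic/suboxic volume diagnostics mentioned above reduce to thresholding a gridded O2 field and summing the volumes of the cells below the threshold. A minimal sketch under assumed values; the threshold and grid numbers are illustrative, not the models' actual criteria:

```python
# Assumed (illustrative) hypoxia threshold in mol m^-3; published studies use
# various thresholds, so treat this as a free parameter.
HYPOXIC_O2 = 0.08

def hypoxic_volume_fraction(o2, volumes, threshold=HYPOXIC_O2):
    """Fraction of total water volume with O2 below `threshold`.

    o2: per-grid-cell O2 concentrations (mol m^-3)
    volumes: matching grid-cell volumes (any consistent unit)
    """
    hypoxic = sum(v for c, v in zip(o2, volumes) if c < threshold)
    return hypoxic / sum(volumes)

# Hypothetical 4-cell column
o2 = [0.05, 0.10, 0.30, 0.02]   # mol m^-3
vol = [1.0, 2.0, 3.0, 1.0]      # volume units
frac = hypoxic_volume_fraction(o2, vol)  # (1.0 + 1.0) / 7.0 ~= 0.286
```

Repeating this per model and per year gives the time series of hypoxic volume whose projected changes the abstract describes as relatively small.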
Abstract:
BACKGROUND It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS In a retrospective cohort study, standardized information on the occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort (and these factors combined in the HOARSI), as well as on management factors, was collected at intervals of 1.3-5 years. RESULTS Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%), with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing: 25% (P = .006, OR = 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE Occasional coughing and mucous nasal discharge may be associated with an increased risk of developing RAO.
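The odds ratios, sensitivities, and specificities reported above all derive from 2x2 tables of early-sign status versus later RAO status. A sketch of the standard calculations; the counts below are hypothetical and are not the study's data:

```python
def two_by_two_metrics(tp, fp, fn, tn):
    """OR, sensitivity, specificity from a 2x2 table.

    Here "exposure" is a mild respiratory sign and "outcome" is later RAO:
    tp: with sign, developed RAO      fp: with sign, did not
    fn: without sign, developed RAO   tn: without sign, did not
    """
    odds_ratio = (tp * tn) / (fp * fn)
    sensitivity = tp / (tp + fn)  # share of RAO cases that showed the sign
    specificity = tn / (tn + fp)  # share of non-cases that did not show it
    return odds_ratio, sensitivity, specificity

# Hypothetical cohort of 60 horses
or_, sens, spec = two_by_two_metrics(tp=15, fp=10, fn=5, tn=30)
# OR = (15*30)/(10*5) = 9.0, sensitivity = 15/20 = 0.75, specificity = 30/40 = 0.75
```

In practice confidence intervals and P-values would accompany these point estimates, as in the abstract.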
Abstract:
Changes in fire occurrence during the last decades in the southern Swiss Alps make knowledge of fire history essential for understanding the future evolution of ecosystem composition and functioning. In this context, palaeoecology provides useful insights into processes operating at decadal-to-millennial time scales, such as the response of plant communities to intensified fire disturbance during periods of cultural change. We provide a high-resolution macroscopic charcoal and pollen series from Guèr, a well-dated peat sequence at mid-elevation (832 m a.s.l.) in southern Switzerland, where the presence of local settlements has been documented since the late Bronze Age and the Iron Age. Quantitative fire reconstruction shows that fire activity sharply increased from the Neolithic period (1–3 episodes/1000 years) to the late Bronze and Iron Ages (7–9 episodes/1000 years), leading to extensive clearance of the former mixed deciduous forest (Alnus glutinosa, Betula, deciduous Quercus). The increase in anthropogenic pollen indicators (e.g. Cerealia-type, Plantago lanceolata) together with macroscopic charcoal suggests anthropogenic rather than climatic forcing as the main cause of the observed vegetation shift. Fire and controlled burning were used extensively during late Roman times and the early Middle Ages to promote the introduction and establishment of chestnut (Castanea sativa) stands, which provided an important wood and food supply. Fire occurrence declined markedly (from 9 to 5–6 episodes/1000 years) during the late Middle Ages because of fire suppression, biomass removal by the human population, and landscape fragmentation. Land abandonment during the last decades has allowed forests to partly re-expand (mainly Alnus glutinosa, Betula) and fire frequency to increase.
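Fire frequencies expressed as episodes per 1000 years are typically obtained by counting charcoal-inferred fire episodes within a moving time window. A simplified sketch under assumptions: the ages and window are illustrative, and real charcoal analyses use dedicated peak-detection software rather than this bare count:

```python
def fire_frequency(episode_ages, window=1000):
    """Fire-episode frequency (episodes per `window` years).

    episode_ages: ages (e.g. cal. years BP) of inferred fire episodes.
    Returns, for each episode, the number of episodes falling within
    a window of `window` years centred on that episode's age.
    """
    freqs = []
    for age in episode_ages:
        n = sum(1 for a in episode_ages if abs(a - age) <= window / 2)
        freqs.append(n)
    return freqs

# Hypothetical episode ages: a cluster around 100-300 BP, one isolated at 1500 BP
ages = [100, 200, 300, 1500]
freqs = fire_frequency(ages)  # [3, 3, 3, 1] episodes/1000 years
```

Comparing such frequencies between archaeological periods gives increases and declines like those described in the abstract.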
Abstract:
BACKGROUND: The quality of surgical performance depends on the technical skills of the surgical team as well as on non-technical skills, including teamwork. The present study evaluated the impact of familiarity among members of the surgical team on morbidity in patients undergoing elective open abdominal surgery. METHODS: A retrospective analysis was performed to compare the surgical outcomes of patients who underwent major abdominal operations between the first month (period I) and the last month (period II) of a 6-month period of continuous teamwork (stable dyads of one senior and one junior surgeon formed every 6 months). Of 117 patients, 59 and 58 underwent operations during period I and period II, respectively, between January 2010 and June 2012. Team performance was assessed via questionnaire by specialized work psychologists; in addition, intraoperative sound levels were measured. RESULTS: The incidence of overall complications was significantly higher in period I than in period II (54.2 vs. 34.5 %; P = 0.041). Postoperative complications of grade <3 were diagnosed significantly more frequently in patients who had operations during period I (39.0 vs. 15.5 %; P = 0.007), whereas no between-group differences in grade ≥3 complications were found (15.3 vs. 19.0 %; P = 0.807). Concentration scores of senior surgeons were significantly higher in period II than in period I (P = 0.033). Sound levels during the middle third of the operations were significantly higher in period I (median above baseline 8.85 dB [range 4.5-11.3 dB] vs. 7.17 dB [5.24-9.43 dB]; P < 0.001). CONCLUSIONS: Team familiarity improves team performance and reduces morbidity in patients undergoing abdominal surgery.
Abstract:
BACKGROUND: We evaluated Swiss slaughterhouse data for integration in a national syndromic surveillance system for the early detection of emerging diseases in production animals. We analysed meat inspection data for cattle, pigs and small ruminants slaughtered between 2007 and 2012 (including emergency slaughters of sick/injured animals); investigating patterns in the number of animals slaughtered and condemned; the reasons invoked for whole-carcass condemnations; reporting biases and regional effects. RESULTS: Whole-carcass condemnation rates were fairly uniform (1-2‰) over time and between the different types of production animals. Condemnation rates were much higher and less uniform following emergency slaughters. The number of condemnations peaked in December for both cattle and pigs, a time when lower-quality animals are sent to slaughter because hay and feed are limited, and when certain diseases are more prevalent. Each type of production animal was associated with a different profile of condemnation reasons. The most commonly reported reason was "severe lesions" for cattle, "abscesses" for pigs and "pronounced weight loss" for small ruminants. These reasons could constitute valuable syndromic indicators, as they are non-specific clinical manifestations of a large range of animal diseases (as well as potential indicators of animal welfare). Differences were detected in the rate of carcass condemnation between cantons and between large and small slaughterhouses. A large percentage (>60% for all three animal categories) of operating slaughterhouses never reported a condemnation between 2007 and 2012, a potential indicator of widespread non-reporting bias in our database. CONCLUSIONS: The current system offers simultaneous coverage of cattle, pigs and small ruminants for the whole of Switzerland, and traceability of each condemnation to its farm of origin.
The number of condemnations was significantly linked to the number of slaughters, meaning that the former should always be offset by the latter in analyses. Because this denominator is only communicated at the end of the month, condemnations can currently only be monitored on a monthly basis. Coupled with the lack of timeliness (a 30-60 day delay between condemnation and notification), this limits the use of the data for early detection.
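Offsetting condemnation counts by slaughter numbers amounts to monitoring a rate rather than a raw count, here expressed per mille to match the 1-2‰ figure above. A minimal sketch with hypothetical monthly counts (the numbers are illustrative, not from the Swiss data):

```python
def condemnation_rate_per_mille(condemned, slaughtered):
    """Whole-carcass condemnation rate, per 1000 animals slaughtered."""
    return 1000.0 * condemned / slaughtered

# Hypothetical (condemned, slaughtered) pairs for three consecutive months
monthly = [(120, 80000), (95, 70000), (180, 90000)]
rates = [condemnation_rate_per_mille(c, s) for c, s in monthly]
# e.g. 120 condemnations out of 80000 slaughters -> 1.5 per mille
```

Because the slaughter denominator arrives only at month's end, this rate can, as noted above, only be updated monthly under the current reporting system.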