36 results for Increasing and decreasing expected prices
Abstract:
Plants generally respond to herbivore attack by increasing resistance and decreasing growth. This prioritization is achieved through the regulation of phytohormonal signaling networks. However, it remains unknown how this prioritization affects resistance against non-target herbivores. In this study, we identify WRKY70 as a specific herbivore-induced, mitogen-activated protein kinase-regulated rice transcription factor that physically interacts with W-box motifs and prioritizes defence over growth by positively regulating jasmonic acid (JA) and negatively regulating gibberellin (GA) biosynthesis upon attack by the chewing herbivore Chilo suppressalis. WRKY70-dependent JA biosynthesis is required for proteinase inhibitor activation and resistance against C. suppressalis. In contrast, WRKY70 induction increases plant susceptibility to the rice brown planthopper Nilaparvata lugens. Experiments with GA-deficient rice lines identify WRKY70-dependent GA signaling as the causal factor in N. lugens susceptibility. Our study shows that prioritizing defence over growth leads to a significant resistance trade-off with important implications for the evolution and agricultural exploitation of plant immunity.
Abstract:
The north-eastern escarpment of Madagascar contains the island’s last remaining large-scale humid forest massifs surrounded by diverse small-scale agricultural mosaics. Deforestation is high, mainly caused by shifting cultivation practiced by local land users to produce upland rice for subsistence. Today, large protected areas restrict land users’ access to forests to collect wood and other forest products. Moreover, they are no longer able to expand their cultivated land, which leads to shorter shifting cultivation cycles and decreasing plot sizes for irrigated rice and cash crop cultivation. Cash crop production of clove and vanilla is exposed to risks such as extreme inter-annual price fluctuations, pests and cyclones. In the absence of work opportunities, agricultural extension services and micro-finance schemes, people are stuck in a poverty trap. New development strategies are needed to mitigate the trade-offs between forest conservation and human well-being. As landscape composition and livelihood strategies vary across the region, these strategies need to be spatially differentiated to avoid implementing generic solutions that do not fit the local context. However, to date, little is known about the spatial patterns of shifting cultivation and other land use systems at the regional level. This is mainly due to the high spatial and temporal dynamics inherent to shifting cultivation, which make it difficult to monitor this land use system with remote sensing methods. Furthermore, knowledge about land users’ livelihood strategies and the risks and opportunities they face stems from very few local case studies. To overcome this challenge, we first used remote sensing data and a landscape mosaic approach to delineate the main landscape types at the regional level. Second, we developed a land user typology based on socio-ecological data from household surveys in 45 villages spread throughout the region. Combining the land user typology with the landscape mosaic map allowed us to reveal spatial patterns of the interaction between landscapes and people and to better understand the trade-offs between forest conservation and local well-being. While shifting cultivation systems are being transformed into more intensive permanent agricultural systems in many countries around the globe, Madagascar seems to be an exception to this trend. Linking land cover information to human-environmental interactions over large areas is crucial for designing policies and informing decision making for a more sustainable development of this resource-rich but poverty-prone context.
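The combination step described here (overlaying the household-derived land user typology on the landscape mosaic classes) can be pictured as a simple cross-tabulation. The short sketch below is a hypothetical illustration only; the type labels, mosaic labels and records are invented and are not the authors' actual categories or data.

```python
# Hypothetical sketch: cross-tabulating a land user typology (from household surveys)
# against landscape mosaic classes (from remote sensing). All labels and records are
# invented for illustration only.
import pandas as pd

households = pd.DataFrame({
    "village": ["V01", "V01", "V02", "V03", "V03", "V04"],
    "land_user_type": [
        "subsistence_rice", "clove_cash_crop", "subsistence_rice",
        "vanilla_cash_crop", "mixed", "subsistence_rice",
    ],
    "landscape_mosaic": [
        "forest_shifting_cultivation_mosaic", "agricultural_mosaic",
        "forest_shifting_cultivation_mosaic", "agricultural_mosaic",
        "agricultural_mosaic", "forest_shifting_cultivation_mosaic",
    ],
})

# Summarize the people-landscape interaction as a contingency table.
print(pd.crosstab(households["land_user_type"], households["landscape_mosaic"]))
```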
Abstract:
OBJECTIVE To assess whether exposure to high altitude induces cognitive dysfunction in young healthy European children and adolescents during acute, short-term exposure to an altitude of 3450 m and in an age-matched European population permanently living at this altitude. STUDY DESIGN We tested executive function (inhibition, shifting, and working memory), memory (verbal, short-term visuospatial, and verbal episodic memory), and speed processing ability in: (1) 48 healthy nonacclimatized European children and adolescents, 24 hours after arrival at high altitude and 3 months after return to low altitude; (2) 21 matched European subjects permanently living at high altitude; and (3) a matched control group tested twice at low altitude. RESULTS Short-term hypoxia significantly impaired all but 2 (visuospatial memory and processing speed) of the neuropsychological abilities that were tested. These impairments were even more severe in the children permanently living at high altitude. Three months after return to low altitude, the neuropsychological performances significantly improved and were comparable with those observed in the control group tested only at low altitude. CONCLUSIONS Acute short-term exposure to an altitude at which major tourist destinations are located induces marked executive and memory deficits in healthy children. These deficits are equally marked or more severe in children permanently living at high altitude and are expected to impair their learning abilities.
Abstract:
The ATLS program by the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Students Course Manual, 9th Edition, American College of Surgeons 2012).
Class I: blood loss up to 750 ml (up to 15% of blood volume); pulse rate <100 bpm; systolic blood pressure normal; pulse pressure normal or ↑; respiratory rate 14–20; urine output >30 ml/h; CNS/mental status slightly anxious; initial fluid replacement crystalloid.
Class II: blood loss 750–1500 ml (15–30%); pulse rate 100–120 bpm; systolic blood pressure normal; pulse pressure decreased; respiratory rate 20–30; urine output 20–30 ml/h; CNS/mental status mildly anxious; initial fluid replacement crystalloid.
Class III: blood loss 1500–2000 ml (30–40%); pulse rate 120–140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate 30–40; urine output 5–15 ml/h; CNS/mental status anxious, confused; initial fluid replacement crystalloid and blood.
Class IV: blood loss >2000 ml (>40%); pulse rate >140 bpm; systolic blood pressure decreased; pulse pressure decreased; respiratory rate >35; urine output negligible; CNS/mental status confused, lethargic; initial fluid replacement crystalloid and blood.

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min−1 postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% as Class III and Class IV respectively. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of “clinical estimation of blood loss” and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Normal (no haemorrhage): vitals normal; response to fluid bolus (1000 ml) not applicable; no estimated blood loss.
Class I (mild): vitals normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, whereas hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also mention the exceptions and pitfalls at a later stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the “typical” pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS students course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: None declared. Member of the Swiss national ATLS core faculty.
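As an aside on how literally the thresholds of Table 1 read, the following minimal Python sketch (an illustration added here, not part of ATLS material; the function name and its output format are assumptions) reports which class each individual Table 1 parameter would suggest for a given patient. The point discussed above is precisely that these per-parameter suggestions often disagree, so a single overall class rarely falls out of the vital signs alone.

```python
# Minimal sketch: transcribe the pulse rate, respiratory rate and urine output ranges of
# Table 1 and report the ATLS class (1-4) suggested by each parameter separately.
# Illustration only -- not a clinical tool; real patients often fall outside these ranges.

def atls_class_per_parameter(pulse_bpm, resp_rate_per_min, urine_ml_per_h):
    """Return the ATLS class suggested by each individual Table 1 parameter."""
    # Pulse rate (bpm): <100 | 100-120 | 120-140 | >140
    if pulse_bpm < 100:
        pulse = 1
    elif pulse_bpm <= 120:
        pulse = 2
    elif pulse_bpm <= 140:
        pulse = 3
    else:
        pulse = 4

    # Respiratory rate (per min): 14-20 | 20-30 | 30-40 | >35
    if resp_rate_per_min <= 20:
        resp = 1
    elif resp_rate_per_min <= 30:
        resp = 2
    elif resp_rate_per_min <= 40:
        resp = 3
    else:
        resp = 4

    # Urine output (ml/h): >30 | 20-30 | 5-15 | negligible
    if urine_ml_per_h > 30:
        urine = 1
    elif urine_ml_per_h >= 20:
        urine = 2
    elif urine_ml_per_h >= 5:
        urine = 3
    else:
        urine = 4

    return {"pulse_rate": pulse, "respiratory_rate": resp, "urine_output": urine}


if __name__ == "__main__":
    # A tachycardic patient with preserved urine output: the parameters disagree,
    # which is exactly why allocating one overall class is rarely straightforward.
    print(atls_class_per_parameter(pulse_bpm=125, resp_rate_per_min=22, urine_ml_per_h=40))
    # -> {'pulse_rate': 3, 'respiratory_rate': 2, 'urine_output': 1}
```

In effect, the retrospective TARN evaluation cited above asked at scale whether these per-parameter classes move together, and found that they often do not.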
Abstract:
A deeper understanding of past vegetation dynamics is required to better assess future vegetation responses to global warming in the Alps. Lake sediments from Lac de Bretaye, a small subalpine lake in the Northern Swiss Alps (1780 m a.s.l.), were analysed to reconstruct past vegetation dynamics for the entire Holocene, using pollen, macrofossil and charcoal analyses as main proxies. The results show that timberline reached the lake’s catchment area at around 10,300 cal. BP, supporting the hypothesis of a delayed postglacial afforestation in the Northern Alps. At the same time, thermophilous trees such as Ulmus, Tilia and Acer established in the lowlands and expanded to the altitude of the lake, forming distinctive boreo-nemoral forests with Betula, Pinus cembra and Larix decidua. From about 5000 to 3500 cal. BP, thermophilous trees declined, mainly because of increasing human land use, the mass expansion of Picea abies and severe anthropogenic fire activity. From the Bronze Age onwards (c. 4200–2800 cal. BP), grazing indicators and high values for charcoal concentration and influx attest to an intensifying human impact, fostering the expansion of Alnus viridis and Picea abies. Hence, biodiversity in alpine meadows increased, whereas forest diversity declined, as can be seen in other regional records. We argue that the anticipated climate change and decreasing human impact in the Alps today will not only lead to an upward movement of timberline with consequent loss of area for grasslands, but also to a disruption of Picea abies forests, which may allow the re-expansion of thermophilous tree species.
Abstract:
A continuous record of atmospheric lead since 12,370 carbon-14 years before the present (14C yr BP) is preserved in a Swiss peat bog. Enhanced fluxes caused by climate changes reached their maxima 10,590 14C yr BP (Younger Dryas) and 8230 14C yr BP. Soil erosion caused by forest clearing and agricultural tillage increased lead deposition after 5320 14C yr BP. Increasing lead/scandium and decreasing lead-206/lead-207 beginning 3000 14C yr BP indicate the beginning of lead pollution from mining and smelting, and anthropogenic sources have dominated lead emissions ever since. The greatest lead flux (15.7 milligrams per square meter per year in A.D. 1979) was 1570 times the natural, background value (0.01 milligram per square meter per year from 8030 to 5320 14C yr BP).
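For orientation, the quoted enrichment factor follows directly from the two fluxes reported in the abstract:

$$\frac{15.7\ \mathrm{mg\,m^{-2}\,yr^{-1}}}{0.01\ \mathrm{mg\,m^{-2}\,yr^{-1}}} = 1570$$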