35 results for "allocation of prizes"
Abstract:
Prospective memory (ProM) is the ability to remember and carry out a planned intention in the future. ProM performance can be improved by instructing participants to prioritize the ProM task over the ongoing task. However, this importance-driven improvement is typically restricted to situations in which the overlap between the processing requirements of the ProM task and the ongoing task is low. In such cases, additional processing resources are allocated to the ProM task and, consequently, a cost emerges for the ongoing task. The aim of the present study was to investigate this relationship further. Participants were asked to respond to either semantic or perceptual ProM cues, which were embedded in a complex ongoing short-term memory task. We manipulated absolute rather than relative importance by emphasizing the importance of the ProM task to half of the participants (i.e., without instructing them to prioritize it over the ongoing task). The results revealed that importance boosted ProM performance independent of the processing overlap between the ProM task and the ongoing task. Moreover, no additional cost was associated with absolute importance. These results challenge the view that importance always enhances the allocation of resources to the ProM task.
Abstract:
• Symbioses between plant roots and mycorrhizal fungi are thought to enhance plant uptake of nutrients through a favourable exchange for photosynthates. Ectomycorrhizal fungi are considered to play this vital role for trees in nitrogen (N)-limited boreal forests.
• We followed symbiotic carbon (C)–N exchange in a large-scale boreal pine forest experiment by tracing 13CO2 absorbed through tree photosynthesis and 15N injected into a soil layer in which ectomycorrhizal fungi dominate the microbial community.
• We detected little 15N in tree canopies, but high levels in soil microbes and in mycorrhizal root tips, illustrating effective soil N immobilization, especially in late summer, when tree belowground C allocation was high. Additions of N fertilizer to the soil before labelling shifted the incorporation of 15N from soil microbes and root tips to tree foliage.
• These results were tested in a model of C–N exchange between trees and mycorrhizal fungi, suggesting that ectomycorrhizal fungi transfer small fractions of absorbed N to trees under N-limited conditions, but larger fractions if more N is available. We suggest that greater allocation of C from trees to ectomycorrhizal fungi increases N retention in soil mycelium, driving boreal forests towards more severe N limitation at low N supply.
Abstract:
The goal of the current investigation was to compare two monitoring processes (judgments of learning [JOLs] and confidence judgments [CJs]) and their corresponding control processes (allocation of study time and selection of answers to maximize accuracy, respectively) in 5- to 7-year-old children (N=101). Children learned the meaning of Japanese characters and provided JOLs after a study phase and CJs after a memory test. They were given the opportunity to control their learning in self-paced study phases, and to control their accuracy by placing correct answers into a treasure chest and incorrect answers into a trash can. All three age groups gave significantly higher CJs for correct than for incorrect answers, with no age-related differences in the size of this effect, suggesting robust metacognitive monitoring skills in children as young as 5. Furthermore, a link between JOLs and study time was found in the 6- and 7-year-olds, such that children spent more time studying items with low JOLs than items with high JOLs. Also, 6- and 7-year-olds, but not 5-year-olds, spent more time studying difficult items than easier items. Moreover, age-related improvements were found in children's use of CJs to guide their selection of answers: although children as young as 5 placed their most confident answers in the treasure chest and their least confident answers in the trash can, this pattern was more robust in older children. Overall, the results support the view that some metacognitive judgments may be acted upon with greater ease than others among young children.
Abstract:
Recent studies have demonstrated that the improved prognosis derived from resection of gliomas largely depends on the extent and quality of the resection, making maximum but safe resection the ultimate goal. At the same time, technical innovations and refined neurosurgical methods have rapidly improved efficacy and safety. Because gliomas derive from intrinsic brain cells, they often cannot be visually distinguished from the surrounding brain tissue during surgery, and various technologies have recently been introduced to delineate the full extent of their solid compartment. However, radical resection of infiltrative glioma puts neurological function at risk, with potentially detrimental consequences for patients' survival and quality of life. The allocation of neurological functions within the brain varies between patients and may undergo additional changes in the presence of a tumour (brain plasticity), making intra-operative localisation of eloquent areas mandatory for the preservation of essential brain functions. Combining methods that visually distinguish tumour tissue with methods that detect tissues responsible for critical functions now enables resection of tumours in brain regions that were previously considered off-limits, benefiting patients through more radical resection while simultaneously lowering the risk of neurological deficits. Here we review recent and expected developments in microsurgery for glioma and their respective benefits.
Abstract:
Children typically hold very optimistic views of their own skills, but so far only a few studies have investigated possible correlates of the ability to predict performance accurately. Therefore, this study examined the role of individual differences in performance estimation accuracy as a global metacognitive index for different monitoring and control skills (item-level judgments of learning [JOLs] and confidence judgments [CJs]), metacognitive control processes (allocation of study time and control of answers), and executive functions (cognitive flexibility, inhibition, working memory) in 6-year-olds (N=93). The three groups of underestimators, realists, and overestimators differed significantly in their monitoring and control abilities: the underestimators outperformed the overestimators by showing a higher discrimination in CJs between correct and incorrect recognition. Also, the underestimators scored higher on the adequate control of incorrectly recognized items. Regarding the interplay of monitoring and control processes, underestimators spent more time studying items with low JOLs and relied more systematically on their monitoring when controlling their recognition compared to overestimators. At the same time, the three groups did not differ significantly from each other in their executive functions. Overall, results indicate that differences in performance estimation accuracy are systematically related to other global and item-level metacognitive monitoring and control abilities in children as young as six years of age, while no meaningful association between performance estimation accuracy and executive functions was found.
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource-management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means for specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources that ensures the performance requirements of all their applications are met, and in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, infrastructure providers are interested in optimally provisioning virtual resources onto the available physical infrastructure so that their operational costs are minimized while the performance of tenants' applications is maximized. Motivated by the complexities associated with managing and scaling distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machine (VM)-bound application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces, and show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it, and present a resource management system based on a genetic algorithm that allocates virtual resources while optimizing multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
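To make the kind of SLA-driven scaling rule described above concrete, here is a minimal sketch of a threshold-based VM-scaling decision. It is an illustration under assumed names and thresholds (scale_decision, sla_target_ms, the 0.5 headroom factor, the VM limits), not the algorithms evaluated in the thesis.

```python
# Minimal sketch of a threshold-based, SLA-driven VM-scaling rule.
# All names and thresholds are hypothetical illustrations.

def scale_decision(avg_response_ms: float,
                   sla_target_ms: float,
                   current_vms: int,
                   min_vms: int = 1,
                   max_vms: int = 20) -> int:
    """Return the VM count for the next control interval."""
    if avg_response_ms > sla_target_ms and current_vms < max_vms:
        return current_vms + 1          # SLA at risk: scale out
    if avg_response_ms < 0.5 * sla_target_ms and current_vms > min_vms:
        return current_vms - 1          # ample headroom: scale in
    return current_vms                  # within bounds: hold steady

# Example control loop over monitored average response times (ms).
vms = 2
for observed in [120.0, 260.0, 310.0, 180.0, 90.0, 60.0]:
    vms = scale_decision(observed, sla_target_ms=250.0, current_vms=vms)
    print(f"observed={observed:6.1f} ms -> VMs={vms}")
```

A production controller would additionally smooth the monitored signal and enforce cool-down periods between scaling actions to avoid oscillation.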
Abstract:
The ATLS program by the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is among the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

Parameter | Class I | Class II | Class III | Class IV
Blood loss (ml) | Up to 750 | 750–1500 | 1500–2000 | >2000
Blood loss (% blood volume) | Up to 15% | 15–30% | 30–40% | >40%
Pulse rate (bpm) | <100 | 100–120 | 120–140 | >140
Systolic blood pressure | Normal | Normal | Decreased | Decreased
Pulse pressure | Normal or increased | Decreased | Decreased | Decreased
Respiratory rate | 14–20 | 20–30 | 30–40 | >35
Urine output (ml/h) | >30 | 20–30 | 5–15 | Negligible
CNS/mental status | Slightly anxious | Mildly anxious | Anxious, confused | Confused, lethargic
Initial fluid replacement | Crystalloid | Crystalloid | Crystalloid and blood | Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily occur in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% as Class III and Class IV, respectively. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Grade | Vitals | Response to fluid bolus (1000 ml) | Estimated blood loss (ml)
Normal (no haemorrhage) | Normal | NA | None
Class I (mild) | Normal | Yes, no further fluid required | Up to 750
Class II (moderate) | HR >100 with SBP >90 mmHg | Yes, no further fluid required | 750–1500
Class III (severe) | SBP <90 mmHg | Requires repeated fluid boluses | 1500–2000
Class IV (moribund) | SBP <90 mmHg or imminent arrest | Declining SBP despite fluid boluses | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiological concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the operating room. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well (advanced age, athletes, pregnancy, medications and pacemakers) and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g., base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiological response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: None declared. Member of the Swiss national ATLS core faculty.
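For illustration only, the blood-loss bands of Table 1 can be written as a simple lookup. The following minimal sketch encodes the band boundaries alone (real assessment weighs several vital signs together, and the discussion above explains why the bands often fail validation); it is not a clinical tool, and the function name is hypothetical.

```python
# Minimal sketch of the blood-loss bands of ATLS Table 1 as a lookup.
# Illustrative only: actual ATLS classification also considers heart
# rate, blood pressure, respiratory rate, urine output and mental status.

def atls_class_by_blood_loss(pct_blood_volume: float) -> str:
    """Map estimated blood loss (% of blood volume) to an ATLS class."""
    if pct_blood_volume <= 15:
        return "Class I"
    if pct_blood_volume <= 30:
        return "Class II"
    if pct_blood_volume <= 40:
        return "Class III"
    return "Class IV"

print(atls_class_by_blood_loss(12))   # Class I
print(atls_class_by_blood_loss(35))   # Class III
```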
Abstract:
Repetitive transcranial magnetic stimulation (rTMS) applied over the right posterior parietal cortex (PPC) in healthy participants has been shown to trigger a significant rightward shift in the spatial allocation of visual attention, temporarily mimicking spatial deficits observed in neglect. In contrast, rTMS applied over the left PPC triggers a weaker or null attentional shift. However, large interindividual differences in responses to rTMS have been reported. Studies measuring changes in brain activation suggest that the effects of rTMS may depend on both interhemispheric and intrahemispheric interactions between cortical loci controlling visual attention. Here, we investigated whether variability in the structural organization of human white matter pathways subserving visual attention, as assessed by diffusion magnetic resonance imaging and tractography, could explain interindividual differences in the effects of rTMS. Most participants showed a rightward shift in the allocation of spatial attention after rTMS over the right intraparietal sulcus (IPS), but the size of this effect varied largely across participants. Conversely, rTMS over the left IPS resulted in strikingly opposed individual responses, with some participants responding with rightward and some with leftward attentional shifts. We demonstrate that microstructural and macrostructural variability within the corpus callosum, consistent with differential effects on cross-hemispheric interactions, predicts both the extent and the direction of the response to rTMS. Together, our findings suggest that the corpus callosum may have a dual inhibitory and excitatory function in maintaining the interhemispheric dynamics that underlie the allocation of spatial attention. SIGNIFICANCE STATEMENT: The posterior parietal cortex (PPC) controls allocation of attention across left versus right visual fields. Damage to this area results in neglect, characterized by a lack of spatial awareness of the side of space contralateral to the brain injury. Transcranial magnetic stimulation over the PPC is used to study cognitive mechanisms of spatial attention and to examine the potential of this technique to treat neglect. However, large individual differences in behavioral responses to stimulation have been reported. We demonstrate that variability in the structural organization of the corpus callosum accounts for these differences. Our findings suggest a novel dual mechanism of corpus callosum function in spatial attention and have broader implications for the use of stimulation in neglect rehabilitation.
Abstract:
Investigations focused on the effects of light on the allocation of root-borne macronutrients (calcium, magnesium and potassium) and micronutrients (iron, manganese, zinc and copper) in roots, shoots and harvested grains of wheat (Triticum aestivum L.). Plants were exposed to low (100 μmol photons m⁻² s⁻¹) or high (380 μmol photons m⁻² s⁻¹) light. High light stimulated both root and shoot growth. While the total contents per plant of some nutrients were markedly higher (calcium and potassium) or lower (copper) under high light, no major differences were observed for other nutrients. The distribution of nutrients and their further redistribution within the shoot were influenced by light intensity in an element-specific manner. Nutrients were selectively directed to the leaves of the main shoot (low light) or to the tillers (high light). The quality of the harvested grains was also affected by light intensity.
Abstract:
An impairment of the spatial deployment of visual attention during exploration of static (i.e., motionless) stimuli is a common finding after an acute right-hemispheric stroke. However, less is known about how these deficits (a) are modulated by naturalistic motion (i.e., motion without directional, specific spatial features) and (b) evolve in the subacute/chronic post-stroke phase. In the present study, we investigated free visual exploration in three patient groups with subacute/chronic right-hemispheric stroke and in healthy subjects. The first group included patients with left visual neglect and a left visual field defect (VFD), the second patients with a left VFD but no neglect, and the third patients without neglect or VFD. Eye movements were measured in all participants while they freely explored a traffic scene without (static condition) and with (dynamic condition) naturalistic motion, i.e., cars moving from the right or left. In the static condition, all patient groups showed a deployment of visual exploration similar to that of healthy subjects (as measured by cumulative fixation duration), suggesting that recovery processes had taken place, with normal spatial allocation of attention. However, the more demanding dynamic condition with moving cars elicited different re-distribution patterns of visual attention, quite similar to those typically observed in acute stroke. Neglect patients with VFD showed a significant decrease of visual exploration in the contralesional space, whereas patients with VFD but no neglect showed a significant increase of visual exploration in the contralesional space. No differences from healthy subjects were found in patients without neglect or VFD. These results suggest that naturalistic motion, without directional, specific spatial features, may critically influence the spatial distribution of visual attention in subacute/chronic stroke patients.
Abstract:
The Long Term Evolution (LTE) cellular technology is expected to extend the capacity and improve the performance of current 3G cellular networks. Among the key mechanisms in LTE responsible for traffic management is the packet scheduler, which handles the allocation of resources to active flows in both the frequency and time dimension. This paper investigates, for various scheduling schemes, how they affect the inter-cell interference characteristics and how the interference in turn affects user performance. A special focus in the analysis is on the impact of flow-level dynamics resulting from random user behaviour. For this we use a hybrid analytical/simulation approach which enables fast evaluation of flow-level performance measures. Most interestingly, our findings show that the scheduling policy significantly affects the inter-cell interference pattern, but that the scheduler-specific pattern has little impact on flow-level performance.
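As a rough illustration of the allocation task such a scheduler performs, here is a minimal sketch of a proportional-fair-style assignment of frequency resource blocks to flows within a single scheduling interval; the function name, rates and block counts are hypothetical and do not reproduce the schemes analysed in the paper.

```python
# Minimal sketch of proportional-fair-style resource-block allocation:
# each block goes to the flow with the highest ratio of instantaneous
# achievable rate to smoothed average throughput. Simplified to a single
# interval; a real scheduler also updates the averages every interval.

def allocate_blocks(inst_rate, avg_tput, n_blocks):
    """inst_rate[f][b]: rate of flow f on block b; avg_tput[f]: its
    smoothed average throughput. Returns {block: chosen flow}."""
    allocation = {}
    for b in range(n_blocks):
        allocation[b] = max(range(len(inst_rate)),
                            key=lambda f: inst_rate[f][b] / avg_tput[f])
    return allocation

rates = [[1.2, 0.8, 2.0],   # flow 0 on blocks 0..2
         [0.9, 1.5, 1.1]]   # flow 1 on blocks 0..2
print(allocate_blocks(rates, avg_tput=[1.0, 1.0], n_blocks=3))
# -> {0: 0, 1: 1, 2: 0}
```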
Abstract:
1. Egg yolks contain carotenoids that protect biological molecules against free-radical damage and promote maturation of the immune system. Availability of carotenoids to birds is often limited. Trade-offs can thus arise in the allocation of carotenoids to different physiological functions, and mothers may influence the immunocompetence of nestlings by modulating the transfer of carotenoids to the yolk.
2. In the great tit Parus major, we experimentally manipulated the dietary supply of carotenoids to mothers, and partially cross-fostered hatchlings to investigate the effect of an increased availability of carotenoids during egg laying on the immunocompetence of nestlings.
3. In addition, we infested half of the nests with hen fleas Ceratophyllus gallinae to investigate the relationship between carotenoid availability, resistance to ectoparasites and immunocompetence.
4. We found that the procedure of cross-fostering can reduce the immune response of nestlings, but this effect can be compensated by maternally transferred carotenoids. Cross-fostered nestlings of carotenoid-supplemented females showed a similar immune response to non-cross-fostered nestlings, while cross-fostered nestlings of control females mounted a weaker cell-mediated immune response. This suggests that yolk carotenoids may help nestlings to cope with stress, for example that generated by cross-fostering, and/or may enhance nestling competitiveness.
5. There was no statistically significant interaction between the parasite and carotenoid treatments, as would be expected if carotenoids helped nestlings to fight parasites. Under parasite pressure, however, lighter nestlings raised a lower immune response, while the immune response was only weakly correlated with body mass in uninfested nests.
Abstract:
Clonality is frequently positively correlated with plant invasiveness, but which aspects of clonality make some clonal species more invasive than others is not known. Due to their spreading growth form, clonal plants are likely to experience spatial heterogeneity in nutrient availability. Plasticity in allocation of biomass to clonal growth organs and roots may allow these plants to forage for high-nutrient patches. We investigated whether this foraging response is stronger in species that have become invasive than in species that have not. We used six confamilial pairs of native European clonal plant species differing in invasion success in the USA. We grew all species in large pots under homogeneous or heterogeneous nutrient conditions in a greenhouse, and compared their nutrient-foraging response and performance. Neither invasive nor non-invasive species showed significant foraging responses to heterogeneity in clonal growth organ biomass or in aboveground biomass of clonal offspring. Invasive species had, however, a greater positive foraging response in terms of root and belowground biomass than non-invasive species. Invasive species also produced more total biomass. Our results suggest that the ability for strong root foraging is among the characteristics promoting invasiveness in clonal plants.
Abstract:
From a life-course perspective, this study examines the intergenerational mobility of men and women in the 1929–31, 1939–41 and 1949–51 birth cohorts. To what extent has the expansion of the public sector opened up opportunities for mobility? To what extent has the public sector, as a special structure distinct from the private sector, expanded its function as a "mobility channel"? Do the institutional rules of recruitment and labour allocation characteristic of the public sector modify this function? The empirical analyses draw on longitudinal data from the Life History Study (Lebensverlaufsprojekt) at the Max-Planck-Institut für Bildungsforschung in Berlin. Increasing inequality by social origin and education accounts for a large part of the chances of intergenerational mobility. The expansion of state employment meant that, across the cohort sequence, it was above all those career entrants who were able to enter the public sector who experienced upward mobility. Upward moves missed at career entry can hardly be made up later, and public-sector employment does not make this possible either. There are no sector-specific differences in the probability of intergenerational upward mobility over the career course. Because acquired status is protected, state employees face a considerably lower risk of downward mobility than employees in the private sector. The state sector has expanded its function as a channel of upward mobility for career entrants and guarantees its long-term employees their attained status. The public sector is thus a further structural principle of social inequality.
Abstract:
In several regions of the world, climate change is expected to have severe impacts on agricultural systems. Changes in land management are one way to adapt to future climatic conditions, including land-use changes and local adjustments of agricultural practices. In previous studies, options for adaptation have mostly been explored by testing alternative scenarios. Systematic explorations of land management possibilities using optimization approaches were so far mainly restricted to studies of land and resource management under constant climatic conditions. In this study, we bridge this gap and exploit the benefits of multi-objective regional optimization for identifying optimum land management adaptations to climate change. We design a multi-objective optimization routine that integrates a generic crop model and considers two climate scenarios for 2050 in a meso-scale catchment on the Swiss Central Plateau with already limited water resources. The results indicate that adaptation will be necessary in the study area to cope with a decrease in productivity by 0–10 %, an increase in soil loss by 25–35 %, and an increase in N-leaching by 30–45 %. Adaptation options identified here exhibit conflicts between productivity and environmental goals, but compromises are possible. Necessary management changes include (i) adjustments of crop shares, i.e. increasing the proportion of early harvested winter cereals at the expense of irrigated spring crops, (ii) widespread use of reduced tillage, (iii) allocation of irrigated areas to soils with low water-retention capacity at lower elevations, and (iv) conversion of some pre-alpine grasslands to croplands.
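To make the multi-objective framing concrete, the sketch below scores a two-objective choice between management options with a simple weighted sum; the options and scores are hypothetical illustrations, not the study's optimization routine, which couples a generic crop model to a regional multi-objective search.

```python
# Minimal weighted-sum sketch of a two-objective land-management choice
# (productivity vs. soil loss). All options and scores are hypothetical.

options = {
    "status quo":        {"productivity": 1.00, "soil_loss": 1.00},
    "reduced tillage":   {"productivity": 0.97, "soil_loss": 0.70},
    "shift crop shares": {"productivity": 0.93, "soil_loss": 0.55},
}

def best_option(w_prod: float) -> str:
    """Maximize w*productivity - (1 - w)*soil_loss for weight w."""
    return max(options, key=lambda o: w_prod * options[o]["productivity"]
                                      - (1 - w_prod) * options[o]["soil_loss"])

# Sweeping the weight traces out compromise solutions along the trade-off.
for w in (0.2, 0.5, 0.8):
    print(f"w={w}: {best_option(w)}")
```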