980 results for Task Complexity
Abstract:
This CEPS Task Force Report focuses on how to improve water efficiency in Europe, notably in public supply, households, agriculture, energy and manufacturing as well as across sectors. It presents a number of recommendations on how to make better use of economic policy instruments to sustainably manage the EU’s water resources. Published in the run-up to the European Commission’s “Blueprint to Safeguard Europe’s Waters”, the report contributes to the policy deliberations in two ways. First, by assessing the viability of economic policy instruments, it addresses a major shortcoming that has so far prevented the 2000 EU Water Framework Directive (WFD) from becoming fully effective in practice: the lack of appropriate, coherent and effective instruments in (some) member states. Second, as the Task Force report is the result of an interactive process involving a variety of stakeholders, it is able to point to the key differences in interpreting and applying WFD principles that have led to a lack of policy coherence across the EU and to offer some pragmatic advice on moving forward.
Abstract:
Drawing on discussions within a CEPS Task Force on the revised EU emissions trading system, this report provides a comprehensive assessment of the pros and cons of the various measures put forward by different stakeholders to address the level and stability of the price of carbon in the EU. It argues that the European Commission, the member states, the European Parliament and other stakeholders need to give serious consideration to introducing some kind of 'dynamic' adjustment provision to address the relatively inelastic supply of allowances. The report also suggests that there is a need to improve communication of market-sensitive information, for example by leaving the management of the ETS to a specialised body.
Abstract:
With the rapid development in technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It therefore seems important to develop and extend the understanding of complexity so that industry in general, and in this case the construction industry, can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement in order to assess its influence upon the accuracy of the quantity surveying profession in UK new build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty in defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.
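To make the size-versus-complexity confound concrete, here is a minimal sketch of the kind of analysis the abstract implies: correlating a complexity score with estimating error while partialling out project size. The data and variable names below are entirely hypothetical, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data for 20 projects: a size proxy, a complexity score
# that tracks size, and an estimating error that depends on both.
n = 20
size = rng.lognormal(mean=2.0, sigma=0.6, size=n)
complexity = 0.5 * size + rng.normal(0, 1, n)
estimating_error = 0.3 * complexity + 0.4 * size + rng.normal(0, 1, n)

def partial_corr(x, y, control):
    """Correlation of x and y after regressing out a control variable."""
    def residuals(v):
        A = np.column_stack([np.ones_like(control), control])
        beta, *_ = np.linalg.lstsq(A, v, rcond=None)
        return v - A @ beta
    rx, ry = residuals(x), residuals(y)
    return np.corrcoef(rx, ry)[0, 1]

raw = np.corrcoef(complexity, estimating_error)[0, 1]
controlled = partial_corr(complexity, estimating_error, size)
print(f"raw correlation:      {raw:.2f}")
print(f"size-controlled:      {controlled:.2f}")
```

With size regressed out, the apparent complexity-accuracy relationship typically weakens, which is the kind of extraneous-variable effect the findings point to.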
Abstract:
The Iowa gambling task (IGT) is one of the most influential behavioral paradigms in reward-related decision making and has been, most notably, associated with ventromedial prefrontal cortex function. However, performance in the IGT relies on a complex set of cognitive subprocesses, in particular integrating information about the outcome of choices into a continuously updated decision strategy under ambiguous conditions. The complexity of the task has made it difficult for neuroimaging studies to disentangle the underlying neurocognitive processes. In this study, we used functional magnetic resonance imaging in combination with a novel adaptation of the task, which allowed us to examine separately the activation associated with the moment of decision and that associated with the evaluation of decision outcomes. Importantly, using whole-brain regression analyses with individual performance, in combination with the choice/outcome history of individual subjects, we aimed to identify the neural overlap between areas that are involved in the evaluation of outcomes and in the progressive discrimination of the relative value of available choice options, thus mapping the two fundamental cognitive processes that lead to adaptive decision making. We show that activation in right ventromedial and dorsolateral prefrontal cortex was predictive of adaptive performance, in both discriminating disadvantageous from advantageous decisions and confirming negative decision outcomes. We propose that these two prefrontal areas mediate shifting away from disadvantageous choices through their sensitivity to accumulating negative outcomes. These findings provide functional evidence of the underlying processes by which these prefrontal subregions drive adaptive choice in the task, namely through contingency-sensitive outcome evaluation.
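As an illustration of the outcome-integration process the abstract describes, the following sketch simulates deck choice in the IGT with a simple delta-rule learner. The payoff parameters follow a common textbook parameterisation of the four-deck task and the learning rule is illustrative; neither is taken from the study itself.

```python
import math
import random

# Illustrative payoff scheme loosely following the canonical IGT:
# decks A/B pay large immediate rewards but have negative expected
# value; decks C/D pay small rewards with positive expected value.
DECKS = {
    "A": lambda: 100 - (250 if random.random() < 0.5 else 0),
    "B": lambda: 100 - (1250 if random.random() < 0.1 else 0),
    "C": lambda: 50 - (50 if random.random() < 0.5 else 0),
    "D": lambda: 50 - (250 if random.random() < 0.1 else 0),
}

def simulate(trials=100, alpha=0.1, beta=0.02):
    """Delta-rule learner that folds each outcome into a deck value."""
    values = {d: 0.0 for d in DECKS}
    choices = []
    for _ in range(trials):
        # Softmax choice over the current value estimates.
        weights = {d: math.exp(beta * v) for d, v in values.items()}
        r, cum = random.random() * sum(weights.values()), 0.0
        for deck, w in weights.items():
            cum += w
            if r <= cum:
                break
        outcome = DECKS[deck]()
        # Shift the chosen deck's value toward the observed outcome.
        values[deck] += alpha * (outcome - values[deck])
        choices.append(deck)
    return choices, values

choices, values = simulate()
good = sum(c in ("C", "D") for c in choices[-20:])
print(f"advantageous picks in last 20 trials: {good}/20")
```

The gradual drift toward decks C and D is the behavioral signature of the accumulating-outcome sensitivity that the study localises to prefrontal subregions.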
Abstract:
Three experiments investigated the effectiveness of presenting procedural information through different media and their combinations. Experiment 1 examined the effectiveness of text, line drawings, text and line drawings, video, and video stills for learning a first aid task. The results showed an advantage of the text-and-line-drawings and video presentations over the other three conditions for both bandaging performance and answering questions about the task. Experiment 2 showed that the beneficial effect of the combination of text and pictures could not be accounted for simply in terms of a dual coding explanation. Rather, the effectiveness of the media and their combinations was influenced by the extent to which they conveyed action information. Finally, Experiment 3 showed no evidence of a contiguity effect: text and pictures were as effective when presented together on the same screen as when they were presented separately.
Abstract:
Two experiments examined the learning of a set of Greek pronunciation rules through explicit and implicit modes of rule presentation. Experiment 1 compared the effectiveness of implicit and explicit modes of presentation in two modalities, visual and auditory. Subjects in the explicit or rule group were presented with the rule set, and those in the implicit or natural group were shown a set of Greek words, composed of letters from the rule set, linked to their pronunciations. Subjects learned the Greek words to criterion and were then given a series of tests which aimed to tap different types of knowledge. The results showed an advantage of explicit study of the rules. In addition, an interaction was found between mode of presentation and modality. Explicit instruction was more effective in the visual than in the auditory modality, whereas there was no modality effect for implicit instruction. Experiment 2 examined a possible reason for the advantage of the rule groups by comparing different combinations of explicit and implicit presentation in the study and learning phases. The results suggested that explicit presentation of the rules is only beneficial when it is followed by practice at applying them.
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to try to separate out effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than implicit rule abstraction per se.
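The contrast between rule abstraction and similarity-based selection can be made concrete with a small simulation. The hidden rule below ("contains the digit 3") is a placeholder, since the abstract does not restate McGeorge and Burton's actual rule, and the overlap metric is likewise illustrative.

```python
import random

def hidden_rule(s):
    # Placeholder rule for illustration only; not the study's actual rule.
    return "3" in s

def make_string(conform):
    while True:
        s = "".join(random.choice("0123456789") for _ in range(4))
        if hidden_rule(s) == conform:
            return s

def similarity(s, studied):
    # Crude similarity: best position-wise digit overlap with any studied string.
    return max(sum(a == b for a, b in zip(s, t)) for t in studied)

# Learning phase: every studied string conforms to the hidden rule.
studied = [make_string(True) for _ in range(30)]

# Forced-choice test: one conforming vs one non-conforming string per pair.
sim_like = 0
pairs = [(make_string(True), make_string(False)) for _ in range(200)]
for conf_s, nonconf_s in pairs:
    # A similarity-based chooser picks whichever string overlaps more
    # with the studied set, with no knowledge of the rule at all.
    if similarity(conf_s, studied) >= similarity(nonconf_s, studied):
        sim_like += 1

print(f"similarity chooser picks the conforming string: {sim_like}/200")
```

Because all studied strings conform to the rule, the similarity chooser also favours conforming test strings well above chance, which is exactly the confound Experiments 2 and 3 were designed to separate.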
Abstract:
Frequent pattern discovery in structured data is receiving increasing attention in many application areas of science. However, the computational complexity and the large amount of data to be explored often make sequential algorithms unsuitable. In this context, high-performance distributed computing becomes a very interesting and promising approach. In this paper we present a parallel formulation of the frequent subgraph mining problem to discover interesting patterns in molecular compounds. The application is characterized by a highly irregular tree-structured computation. No estimate is available for task workloads, which follow a power-law distribution over a wide range. The proposed approach allows dynamic resource aggregation and provides fault and latency tolerance. These features make the distributed application suitable for multi-domain heterogeneous environments, such as computational Grids. The distributed application has been evaluated on the well-known National Cancer Institute's HIV-screening dataset.
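As a rough illustration of why dynamic scheduling matters for such irregular workloads, the sketch below (not the authors' Grid implementation) farms out tasks with power-law-distributed costs one at a time, so idle workers keep pulling new work instead of waiting behind a large static partition.

```python
import multiprocessing as mp
import random
import time

def process_task(workload):
    """Stand-in for expanding one subgraph-mining task."""
    time.sleep(workload / 1000.0)  # simulate highly irregular work
    return workload

if __name__ == "__main__":
    random.seed(0)
    # Power-law-distributed workloads: most tasks are tiny, a few are
    # huge, so a static split would leave most workers idle at the end.
    tasks = [int(random.paretovariate(1.5)) for _ in range(200)]

    with mp.Pool(processes=4) as pool:
        start = time.time()
        # chunksize=1 hands tasks out one at a time: a worker that
        # finishes a small task immediately pulls the next one.
        results = list(pool.imap_unordered(process_task, tasks, chunksize=1))
        print(f"dynamic scheduling: {time.time() - start:.2f}s, "
              f"{len(results)} tasks")
```

The same principle, pulling work on demand rather than assigning it up front, is what lets the paper's approach tolerate unknown workloads and heterogeneous resources.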