53 results for SCHEDULING OF GRID TASKS


Relevance:

100.00%

Publisher:

Abstract:

This paper presents an investigation into learners’ and teachers’ perceptions of and criteria for task difficulty. Ten second language learners performed four oral narrative tasks and were retrospectively interviewed about which tasks they perceived as difficult, what factors affected this difficulty and how they identified and defined this task difficulty. Ten EFL/ESOL teachers were given the same tasks and asked to consider the difficulty of the tasks for their learners, and were invited to discuss the factors they believed contributed to this difficulty. Qualitative analysis of the data revealed that, although there were some differences between the two groups’ perceptions of task difficulty, there was substantial similarity between them in terms of the criteria they considered in identifying and defining task difficulty. The findings of this study lend support to the tenets of a cognitive approach to task-based language learning, and demonstrate which aspects of two models of task difficulty reflect the teachers’ and learners’ perceptions and perspectives.

Relevance:

100.00%

Publisher:

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; and finance experts are interested in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes poses a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and to the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
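
As a purely illustrative sketch that is not taken from the chapter, the Python fragment below shows the data-parallel idea in miniature: the dataset is split into partitions, item frequencies (the first step of frequent-pattern mining) are counted on each partition in parallel, and the partial counts are then merged. The function name count_partition and the toy transaction data are assumptions introduced for the example.

# Illustrative sketch only (not from the chapter): data-parallel frequency counting,
# the first step of frequent-itemset mining, scaled out by partitioning the data.
from collections import Counter
from multiprocessing import Pool

transactions = [
    ["bread", "milk"],
    ["bread", "butter", "milk"],
    ["butter", "milk"],
    ["bread", "butter"],
]

def count_partition(partition):
    # Count item occurrences in one data partition (the 'map' step).
    counts = Counter()
    for transaction in partition:
        counts.update(transaction)
    return counts

if __name__ == "__main__":
    # Split the dataset into partitions that could live on different Grid or Cloud nodes.
    partitions = [transactions[:2], transactions[2:]]
    with Pool(processes=2) as pool:
        partial_counts = pool.map(count_partition, partitions)
    # Merge the partial results (the 'reduce' step) into global support counts.
    print(sum(partial_counts, Counter()))

The same split-count-merge pattern carries over directly to a distributed setting, where each partition is processed on a different Grid or Cloud node.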

Relevance:

100.00%

Publisher:

Abstract:

The very first numerical models, developed more than 20 years ago, were drastic simplifications of the real atmosphere and were mostly restricted to describing adiabatic processes. For predictions of a day or two of the mid-tropospheric flow these models often gave reasonable results, but the results deteriorated quickly when the prediction was extended further in time. The prediction of the surface flow was unsatisfactory even for short ranges. It was evident that both the energy-generating processes and the dissipative processes have to be included in numerical models in order to predict the weather patterns in the lower part of the atmosphere and to predict the atmosphere in general beyond a day or two. Present-day computers make it possible to attack the weather forecasting problem in a more comprehensive and complete way, and substantial efforts have been made, during the last decade in particular, to incorporate the non-adiabatic processes in numerical prediction models. The physics of radiative transfer, condensation of moisture, turbulent transfer of heat, momentum and moisture, and the dissipation of kinetic energy are the most important processes associated with the formation of energy sources and sinks in the atmosphere, and these have to be incorporated in numerical prediction models extended over more than a few days. The mechanisms of these processes are mainly related to small-scale disturbances in space and time, or even to molecular processes. It is therefore one of the basic characteristics of numerical models that these small-scale disturbances cannot be included in an explicit way. The first reason for this is the discretization of the model's atmosphere by a finite-difference grid or the use of a Galerkin or spectral function representation. The second reason why we cannot explicitly introduce these processes into a numerical model is that some physical processes necessary to describe them (such as local buoyancy) are a priori eliminated by the constraint of hydrostatic adjustment. Even if this physical constraint can be relaxed by making the models non-hydrostatic, the scale problem is virtually impossible to solve, and for the foreseeable future we have to try to incorporate the ensemble, or gross, effect of these physical processes on the large-scale synoptic flow. The formulation of this ensemble effect in terms of grid-scale variables (the parameters of the large-scale flow) is called 'parameterization'. For short-range prediction of the synoptic flow at middle and high latitudes, very simple parameterization has proven to be rather successful.
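
As a purely illustrative example that is not drawn from the paper, the fragment below shows what a very simple parameterization looks like in practice: the bulk-aerodynamic formula H = rho * c_p * C_H * |U| * (T_s - T_a) expresses the ensemble effect of unresolved turbulent heat transfer at the surface entirely in terms of grid-scale variables. The transfer coefficient C_H and the numerical values are assumed, tuned constants.

# Illustrative sketch only: a bulk-aerodynamic parameterization of the surface
# sensible heat flux, H = rho * c_p * C_H * |U| * (T_s - T_a).
# All inputs are grid-scale variables; the turbulent eddies themselves are not resolved.
RHO = 1.2      # air density (kg m^-3)
CP = 1004.0    # specific heat of dry air at constant pressure (J kg^-1 K^-1)
C_H = 1.2e-3   # dimensionless bulk transfer coefficient (an assumed, tuned value)

def sensible_heat_flux(wind_speed, t_surface, t_air):
    # Ensemble effect of unresolved turbulence, expressed via grid-scale wind and temperatures.
    return RHO * CP * C_H * wind_speed * (t_surface - t_air)

# Example: 10 m/s grid-scale wind, surface 2 K warmer than the lowest model level.
print(sensible_heat_flux(10.0, 288.0, 286.0))  # about 28.9 W m^-2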

Relevance:

100.00%

Publisher:

Abstract:

Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.
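
As a hypothetical illustration, and not the EERM's actual method, the sketch below shows one elementary way revenue can be maximised across competing Service Level Agreements: a greedy allocator that admits SLAs in order of revenue per CPU unit until capacity runs out. The SLA class, the allocate function and the example offers are all assumptions introduced for the example.

# Hypothetical illustration (not the EERM's algorithm): a greedy allocator that fills
# limited CPU capacity with the SLAs offering the highest revenue per CPU unit.
from dataclasses import dataclass

@dataclass
class SLA:
    name: str
    cpus_required: int
    revenue: float  # payment received if the SLA is admitted and fulfilled

def allocate(slas, cpu_capacity):
    # Admit SLAs in descending order of revenue per CPU until capacity is exhausted.
    admitted, remaining = [], cpu_capacity
    for sla in sorted(slas, key=lambda s: s.revenue / s.cpus_required, reverse=True):
        if sla.cpus_required <= remaining:
            admitted.append(sla)
            remaining -= sla.cpus_required
    return admitted

offers = [SLA("batch-render", 8, 40.0), SLA("web-tier", 2, 15.0), SLA("analytics", 4, 18.0)]
print([s.name for s in allocate(offers, 10)])  # ['web-tier', 'batch-render']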

Relevance:

100.00%

Publisher:

Abstract:

In the present research, we conducted 4 studies designed to examine the hypothesis that perceived competence moderates the relation between performance-approach and performance-avoidance goals. Each study yielded supportive data, indicating that the correlation between the 2 goals is lower when perceived competence is high. This pattern was observed at the between- and within-subject level of analysis, with correlational and experimental methods and using both standard and novel achievement goal assessments, multiple operationalizations of perceived competence, and several different types of focal tasks. The findings from this research contribute to the achievement goal literature on theoretical, applied, and methodological fronts and highlight the importance of and need for additional empirical work in this area.

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, issues related to the financial viability of development have become increasingly important to the English planning system. As part of a wider shift towards the compartmentalisation of planning tasks, expert consultants are required to quantify, in an attempt to rationalise, planning decisions in terms of economic 'viability'. Often with a particular focus on planning obligations, the results of development viability modelling have emerged as a key part of the evidence base used in site-specific negotiations and in planning policy formation. Focussing on the role of clients and other stakeholders, this paper investigates how development viability is tested in practice. It draws together literature on the role of calculative practices in policy formation, on client feedback and influence in real estate appraisals, and on stakeholder engagement and consultation in planning, in order to critically evaluate the role of clients and other interest groups in influencing the production and use of development viability appraisal models. The paper draws upon semi-structured interviews with the main producers of development viability appraisals and concludes that, whilst appraisals have the potential to be biased by client and stakeholder interests, there are important controlling influences on potential opportunistic behaviour. One such control is local authorities' weak understanding of development viability appraisal techniques, which limits their capacity to question the outputs of appraisal models. However, this weakness is also a cause for concern, given that viability is now a central feature of the town planning system.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the current article is to support the investigation of linguistic relativity in second language acquisition and to sketch the methodological and theoretical prerequisites for developing the domain into a full research program. We identify and discuss three theoretical-methodological components that we believe are needed to succeed in this enterprise. First, we highlight the importance of using nonverbal methods to study linguistic relativity effects in second language (L2) speakers. The use of nonverbal tasks is necessary in order to avoid the circularity that arises when inferences about nonverbal behavior are made on the basis of verbal evidence alone. Second, we identify and delineate the likely cognitive mechanisms underpinning cognitive restructuring in L2 speakers by introducing the theoretical framework of associative learning. By doing so, we demonstrate that the extent and nature of cognitive restructuring in L2 speakers is essentially a function of variation in individual learners' trajectories. Third, we offer an in-depth discussion of the factors (e.g., L2 proficiency and L2 use) that characterize those trajectories, anchoring them to the framework of associative learning and reinterpreting their relative strength in predicting L2 speaker cognition.