843 results for SCHEDULING OF GRID TASKS


Relevance:

100.00%

Publisher:

Abstract:

Ice cloud representation in general circulation models remains a challenging task, due to the lack of accurate observations and the complexity of microphysical processes. In this article, we evaluate the ice water content (IWC) and ice cloud fraction statistical distributions from the numerical weather prediction models of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the UK Met Office, exploiting the synergy between the CloudSat radar and the CALIPSO lidar. Using the last three weeks of July 2006, we analyse the global ice cloud occurrence as a function of temperature and latitude and show that the models capture the main geographical and temperature-dependent distributions, but overestimate ice cloud occurrence in the Tropics in the temperature range from −60 °C to −20 °C and in the Antarctic for temperatures higher than −20 °C, while underestimating it at very low temperatures. A global statistical comparison of the occurrence of grid-box mean IWC at different temperatures shows that both the mean and the range of IWC increase with increasing temperature. Globally, the models capture most of the IWC variability in the temperature range between −60 °C and −5 °C, and also reproduce the observed latitudinal dependencies in the IWC distribution due to different meteorological regimes. Two versions of the ECMWF model are assessed. The recent operational version, with a diagnostic representation of precipitating snow and mixed-phase ice cloud, fails to represent the IWC distribution in the −20 °C to 0 °C range, but a new version with prognostic variables for liquid water, ice and snow is much closer to the observed distribution. The comparison of models and observations provides a much-needed analysis of the vertical distribution of IWC across the globe, highlighting the ability of the models to reproduce much of the observed variability as well as the deficiencies where further improvements are required.

Background: Extreme fear of contamination within obsessive compulsive disorder (OCD) is traditionally conceptualised as a physical phenomenon. More recent research has supported the notion of 'mental' contamination, in which people feel contaminated in the absence of physical contact. The current research sought to determine whether feelings of contact and mental contamination could be induced within a non-clinical sample, whether the impact of mental and contact contamination was comparable in terms of associated feelings and behaviour, and whether OCD-relevant psychopathology was related to the impact of the tasks. Methods: Undergraduate students (n=60) completed OCD-relevant measures and were randomly assigned to either a contact contamination condition (CC: moving a bucket of fake vomit) or a mental contamination condition (MC: thinking about a bucket of vomit). Results: Both manipulations induced feelings of contamination. Participants in the contact condition had significantly greater urges to wash than those in the mental condition. Neutralising behaviour did not differ across conditions. Conclusions: Feelings of contamination can be induced in the absence of physical contact and, for those in the MC group, some aspects of OCD-relevant psychopathology were related to the impact of the manipulation. These findings have implications for the understanding and treatment of contamination-related fears in OCD.

This article reports on a detailed empirical study of the way narrative task design influences the oral performance of second-language (L2) learners. Building on previous research findings, two dimensions of narrative design were chosen for investigation: narrative complexity and inherent narrative structure. Narrative complexity refers to the presence of simultaneous storylines; in this case, we compared single-story narratives with dual-story narratives. Inherent narrative structure refers to the order of events in a narrative; we compared narratives where this was fixed to others where the events could be reordered without loss of coherence. Additionally, we explored the influence of learning context on performance by gathering data from two comparable groups of participants: 60 learners in a foreign language context in Teheran and 40 in an L2 context in London. All participants recounted two of four narratives from cartoon picture prompts, giving a between-subjects design for narrative complexity and a within-subjects design for inherent narrative structure. The results show clearly that for both groups, L2 performance was affected by the design of the task: Syntactic complexity was supported by narrative storyline complexity and grammatical accuracy was supported by an inherently fixed narrative structure. We reason that the task of recounting simultaneous events leads learners into attempting more hypotactic language, such as subordinate clauses introduced by, for example, while, although, or at the same time as. We reason also that a tight narrative structure allows learners to achieve greater accuracy in the L2 (within minutes of performing less accurately on a loosely structured narrative) because the tight ordering of events releases attentional resources that would otherwise be spent on finding connections between the pictures.
The learning context was shown to have no effect on either accuracy or fluency, but an unexpectedly clear effect on syntactic complexity and lexical diversity. The learners in London seem to have benefited from being in the target language environment by developing not more accurate grammar but a more diverse resource of English words and syntactic choices. In a companion article (Foster & Tavakoli, 2009) we compare their performance with native-speaker baseline data and show that, in terms of nativelike selection of vocabulary and phrasing, the learners in London are closing in on native-speaker norms. The study provides empirical evidence that L2 performance is affected by task design in predictable ways. It also shows that living within the target language environment, and presumably using the L2 in a host of everyday tasks outside the classroom, confers a distinct lexical advantage, not a grammatical one.

This study contributes to a central debate within contemporary generative second language (L2) theorizing: the extent to which adult learners are (un)able to acquire new functional features that result in an L2 grammar that is mentally structured like the native target (see White, 2003). The adult acquisition of L2 nominal phi-features is explored, with focus on the syntactic and semantic reflexes in the related domain of adjective placement in two experimental groups: English-speaking intermediate (n = 21) and advanced (n = 24) learners of Spanish, as compared to a native-speaker control group (n = 15). Results show that, on some of the tasks, the intermediate L2 learners appear to have acquired the syntactic properties of the Spanish determiner phrase but, on other tasks, show some delay with the semantic reflexes of prenominal and postnominal adjectives. Crucially, however, our data demonstrate full convergence by all advanced learners and thus provide evidence contrary to the predictions of representational deficit accounts (e.g., Hawkins & Chan, 1997; Hawkins & Franceschina, 2004; Hawkins & Hattori, 2006).

We first propose a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using the principal component analysis (PCA) technique, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e., subjects' average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency towards risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over- (under-) weighting of small (large) probabilities predicted in prospect theory (PT); and gender differences, i.e., males being consistently less risk averse than females, but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to an increase in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that "although in many domains, paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms" (p. 635).
Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking, that is, in all treatments females prefer safer lotteries compared to males. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, by contrast, we find that the effect of incorporating losses into the outcomes is less clear.
At the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that, compared to gains-only treatments, sensitivity is lower in the mixed-lottery treatments (SL and LL). In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories like PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust framework, compatible with present and even more complex future descriptions of human attitudes towards risk.
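As a rough illustration of how PCA can separate the two dimensions the task is designed to capture, the following sketch generates hypothetical choice data (the sample size, panel count, and noise levels are all invented, not taken from the study) and extracts principal components with NumPy:

```python
import numpy as np

# Hypothetical data: rows = subjects, columns = lottery panels ordered by
# increasing risk premium; entries = index of the chosen lottery (higher = riskier).
rng = np.random.default_rng(0)
base = rng.normal(2.5, 0.8, size=(100, 1))    # per-subject average risk taking
slope = rng.normal(0.3, 0.1, size=(100, 1))   # per-subject sensitivity to the risk premium
panels = np.arange(4)
choices = base + slope * panels + rng.normal(0, 0.2, size=(100, 4))

# PCA on the centred choice matrix via an eigendecomposition of the covariance.
X = choices - choices.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]             # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = X @ eigvecs                          # component scores per subject
explained = eigvals / eigvals.sum()
```

With data of this shape, the first component tracks each subject's average chosen risk level, and the remaining variance reflects how choices shift as the risk premium grows.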

It is increasingly important to know when energy is used in the home, at work and on the move. Issues of time and timing have not featured strongly in energy policy analysis and modelling, much of which has focused on estimating and reducing total average annual demand per capita. If smarter ways of balancing supply and demand are to take hold, and if we are to make better use of decarbonised forms of supply, it is essential to understand and intervene in patterns of societal synchronisation. This calls for detailed knowledge of when, and on what occasions, many people engage in the same activities at the same time, of how such patterns are changing, and of how they might be shaped. In addition, the impact of smart meters and controls partly depends on whether there is, in fact, scope for shifting the timing of what people do, and for changing the rhythm of the day. Is the scheduling of daily life an arena that policy can influence, and if so, how? The DEMAND Centre has been linking time use, energy consumption and travel diary data as a means of addressing these questions, and in this working paper we present some of the issues and results arising from that exercise.
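One simple way to quantify societal synchronisation from linked time-use diaries is the share of the population performing the modal activity in each time slot. The sketch below uses entirely invented diary data (the activity codes and imposed peaks are assumptions, not DEMAND Centre results):

```python
import numpy as np

# Hypothetical time-use diaries: rows = people, columns = 24 hourly slots,
# entries = activity code (0 = sleep, 1 = work, 2 = cooking, 3 = other).
rng = np.random.default_rng(1)
n_people, n_slots = 200, 24
diaries = rng.integers(0, 4, size=(n_people, n_slots))
diaries[:, 0:6] = 0    # impose shared overnight sleep hours
diaries[:, 18] = 2     # impose a shared evening cooking peak

def synchronisation_index(diaries):
    """Share of the population performing the modal activity in each slot."""
    n_people = diaries.shape[0]
    index = []
    for slot in diaries.T:                 # iterate over time slots
        counts = np.bincount(slot)         # how many people do each activity
        index.append(counts.max() / n_people)
    return np.array(index)

sync = synchronisation_index(diaries)
```

Slots where everyone does the same thing score 1.0; slots with no coordination sit near 1/(number of activities), so peaks in this index mark the synchronised periods that matter for balancing supply and demand.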

Event-related desynchronization (ERD) of the electroencephalogram (EEG) from the motor cortex is associated with execution, observation, and mental imagery of motor tasks. Generation of ERD by motor imagery (MI) has been widely used for brain-computer interfaces (BCIs) linked to neuroprosthetics and other motor assistance devices. Control of MI-based BCIs can be acquired through neurofeedback training to reliably induce MI-associated ERD. To develop more effective training conditions, we investigated the effect of static and dynamic visual representations of target movements (a picture of forearms or a video clip of hand grasping movements) during BCI training. After 4 consecutive training days, the group that performed MI while viewing the video showed significant improvement in generating MI-associated ERD compared with the group that viewed the static image. This result suggests that passively observing the target movement during MI improves the associated mental imagery and enhances MI-based BCI skills.

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
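The structure of such a benchmark-driven model can be sketched in a few lines: measure each work type (compute and halo exchange) at a handful of sizes, then interpolate to predict any deployment scenario. All the timing numbers below are invented placeholders, not measurements from the paper's Cray XE6:

```python
import numpy as np

# Hypothetical benchmark results: wall time (s) per timestep for the
# loop-based array updates, at a few local problem sizes (points per core).
compute_sizes = np.array([1e4, 5e4, 1e5, 5e5])
compute_times = np.array([0.8e-3, 4.1e-3, 8.5e-3, 44.0e-3])

# Hypothetical halo-exchange benchmarks: time (s) per message size (bytes).
halo_bytes = np.array([1e3, 1e4, 1e5, 1e6])
halo_times = np.array([5e-6, 12e-6, 80e-6, 700e-6])

def predict_step_time(local_points, halo_size_bytes):
    """Interpolate between benchmark results to predict one timestep's cost
    as the sum of the compute model and the halo-exchange model."""
    t_compute = np.interp(local_points, compute_sizes, compute_times)
    t_halo = np.interp(halo_size_bytes, halo_bytes, halo_times)
    return t_compute + t_halo

# Compare two decompositions of a 1e6-point global grid: 10 cores vs 20 cores
# (halo sizes here are likewise assumed, for illustration only).
t10 = predict_step_time(1e6 / 10, 4e4)
t20 = predict_step_time(1e6 / 20, 2e4)
```

Evaluating the model for many candidate decompositions and core mappings is cheap, which is what lets it replace trial-and-error runs on the real machine.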

Possible impairments of memory in end-stage renal disease (ESRD) were investigated in two experiments. In Experiment 1, in which stimulus words were presented visually, participants were tested on conceptual or perceptual memory tasks, with retrieval being either explicit or implicit. Compared with healthy controls, ESRD patients were impaired when memory required conceptual but not when it required perceptual processing, regardless of whether retrieval was explicit or implicit. An impairment of conceptual implicit memory (priming) in the ESRD group represented a previously unreported deficit compared to healthy aging. There were no significant differences between pre- and immediate post-dialysis memory performance in ESRD patients on any of the tasks. In Experiment 2, in which presentation was auditory, patients again performed worse than controls on an explicit conceptual memory task. We conclude that the type of processing required by the task (conceptual vs. perceptual) is more important than the type of retrieval (explicit vs. implicit) in memory failures in ESRD patients, perhaps because temporal brain regions are more susceptible to the effects of the illness than are posterior regions.

Background: High levels of parental anxiety are associated with poor treatment outcomes for children with anxiety disorders. Associated parental cognitions and behaviours have been implicated as impediments to successful treatment. We examined the association between parental responsibility beliefs, maternal anxiety and parenting behaviours in the context of childhood anxiety disorders. Methods: Anxious and non-anxious mothers of 7- to 12-year-old children with a current anxiety disorder reported their parental responsibility beliefs using a questionnaire measure. Parental behaviours towards their child during a stressor task were measured. Results: Parents with a current anxiety disorder reported a greater sense of responsibility for their child's actions and wellbeing than parents who scored within the normal range for anxiety. Furthermore, higher parental responsibility was associated with more intrusive and less warm behaviours in parent-child interactions, and there was an indirect effect of maternal anxiety on maternal intrusive behaviours via parental responsibility beliefs. Limitations: The sample was limited to a treatment-seeking, relatively high socio-economic population, and only mothers were included, so replication with more diverse groups is needed. The use of a range of stressor tasks would have allowed a more comprehensive assessment of parental behaviours. Conclusions: The findings suggest that parental anxiety disorder is associated with an elevated sense of parental responsibility and may promote parental behaviours likely to inhibit optimum child treatment outcomes. Parental responsibility beliefs may therefore be important to target in child anxiety treatments in the context of parental anxiety disorders.

Terrain following coordinates are widely used in operational models but the cut cell method has been proposed as an alternative that can more accurately represent atmospheric dynamics over steep orography. Because the type of grid is usually chosen during model implementation, it becomes necessary to use different models to compare the accuracy of different grids. In contrast, here a C-grid finite volume model enables a like-for-like comparison of terrain following and cut cell grids. A series of standard two-dimensional tests using idealised terrain are performed: tracer advection in a prescribed horizontal velocity field, a test starting from resting initial conditions, and orographically induced gravity waves described by nonhydrostatic dynamics. In addition, three new tests are formulated: a more challenging resting atmosphere case, and two new advection tests having a velocity field that is everywhere tangential to the terrain following coordinate surfaces. These new tests present a challenge on cut cell grids. The results of the advection tests demonstrate that accuracy depends primarily upon alignment of the flow with the grid rather than grid orthogonality. A resting atmosphere is well-maintained on all grids. In the gravity waves test, results on all grids are in good agreement with existing results from the literature, although terrain following velocity fields lead to errors on cut cell grids. Due to semi-implicit timestepping and an upwind-biased, explicit advection scheme, there are no timestep restrictions associated with small cut cells. We do not find the significant advantages of cut cells or smoothed coordinates that other authors find.

We assess the corticomuscular coherence (CMC) between the contralateral primary motor cortex and the hand muscles during a finger force-tracking task, and explore whether the pattern of finger coordination has an impact on the CMC level. Six healthy subjects (three men and three women) were recruited to perform force-tracking tasks comprising two finger patterns: a natural combination of the index and middle fingers, and an unnatural combination in which the two fingers simultaneously produce equal force. While the tasks were performed with the right index and middle fingers, MEG and sEMG signals were recorded from the left primary motor cortex (M1) and the right flexor digitorum superficialis (FDS), respectively, and the contralateral CMC was calculated to assess the neuromuscular interaction. The natural-combination (Common-IM) tasks induced only beta-band CMC, whereas the unnatural-combination (Uncommon-IM) tasks produced CMC in both the beta and low-gamma bands, and were associated with the most intensive contralateral CMC. Our study demonstrates that the pattern of finger coordination has a significant impact on the CMC between the contralateral M1 and the hand muscles, and that more corticomuscular interaction is necessary for unnaturally coordinated finger activities to regulate the neural drive to the hand muscles.
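The coherence measure used in studies like this is the magnitude-squared coherence between the two signals, evaluated per frequency band. A minimal sketch with synthetic stand-ins for the MEG and sEMG channels (a shared 20 Hz drive plus independent noise; all signal parameters are assumptions for illustration):

```python
import numpy as np
from scipy.signal import coherence

# Synthetic signals standing in for M1 MEG and FDS sEMG recordings:
# both contain a common 20 Hz (beta-band) drive plus independent noise.
fs = 1000.0                          # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)         # 30 s of data
rng = np.random.default_rng(2)
drive = np.sin(2 * np.pi * 20 * t)
meg = drive + 0.8 * rng.standard_normal(t.size)
emg = drive + 0.8 * rng.standard_normal(t.size)

# Magnitude-squared coherence; nperseg sets the frequency resolution.
freqs, cxy = coherence(meg, emg, fs=fs, nperseg=1024)

# Band masks for summarising CMC per band.
beta = (freqs >= 13) & (freqs <= 30)
gamma = (freqs >= 35) & (freqs <= 50)
```

Here the shared 20 Hz drive yields high coherence inside the beta band and near-zero coherence in the low-gamma band, mirroring how band-limited CMC differences between conditions are quantified.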

A lot sizing and scheduling problem prevalent in small market-driven foundries is studied. There are two related decision levels: (1) the furnace scheduling of metal alloy production, and (2) moulding machine planning, which specifies the type and size of production lots. A mixed integer programming (MIP) formulation of the problem is proposed, but it is impractical to solve in reasonable computing time for non-small instances. As a result, a faster relax-and-fix (RF) approach is developed that can also be used on a rolling-horizon basis where only immediate-term schedules are implemented. As well as a MIP method to solve the basic RF approach, three variants of a local search method are also developed and tested using instances based on the literature. Finally, foundry-based tests with a real order book resulted in a very substantial reduction of delivery delays and finished inventory, better use of capacity, and much faster schedule definition compared to the foundry's own practice. (c) 2006 Elsevier Ltd. All rights reserved.
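The relax-and-fix idea can be sketched on a deliberately simplified problem. The paper's model has two decision levels (furnace alloy scheduling and moulding-machine lots); none of that is reproduced here. This toy assumes a single item, no capacities, and treats the relaxed tail beyond the current window as cost-free (mimicking what an LP relaxation of the setup variables tends to do), so it only illustrates the fix-one-window-of-integer-variables-at-a-time structure:

```python
from itertools import product

# Toy single-item, uncapacitated lot-sizing instance (all numbers hypothetical).
demand = [20, 50, 10, 40, 30, 60]   # units demanded per period
setup_cost = 100.0                  # fixed cost whenever a lot is produced
holding_cost = 1.0                  # cost per unit held per period

def window_cost(setups, dem):
    """Cost of covering the window's demand under a binary setup pattern;
    each setup produces exactly enough to last until the next setup."""
    cost, inventory = 0.0, 0
    for t, d in enumerate(dem):
        if setups[t]:
            # produce enough to cover demand up to the next setup (or window end)
            horizon = next((u for u in range(t + 1, len(dem)) if setups[u]), len(dem))
            inventory += sum(dem[t:horizon])
            cost += setup_cost
        inventory -= d
        cost += holding_cost * inventory
    return cost

def relax_and_fix(demand, window=2):
    """Fix setup decisions one window at a time; periods beyond the current
    window are 'relaxed' (their setup costs dropped), so each window is
    optimised over its own integer variables only."""
    fixed = []
    for start in range(0, len(demand), window):
        dem = demand[start:start + window]
        # enumerate this window's integer setup patterns; the first period
        # needs a setup since no inventory carries in from earlier windows
        patterns = [p for p in product([True, False], repeat=len(dem)) if p[0]]
        best = min(patterns, key=lambda p: window_cost(p, dem))
        fixed.extend(best)
    return fixed
```

Each iteration solves a small integer subproblem instead of the full MIP, which is the source of the speed-up; the price is that early fixed decisions cannot be revisited.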

The InteGrade middleware is intended to exploit the idle time of computing resources in computer laboratories. In this work we investigate the performance of running parallel applications with inter-processor communication on the InteGrade grid. As costly communication on a grid can be prohibitive, we explore the so-called systolic, or wavefront, paradigm to design parallel algorithms in which no global communication is used. To evaluate the InteGrade middleware we considered three parallel algorithms that solve the matrix chain product problem, the 0-1 knapsack problem, and the local sequence alignment problem, respectively. We show that these three applications running under the InteGrade middleware and MPI take slightly more time than the same applications running on a cluster with only LAM-MPI support. The results can be considered promising, and the time difference between the two is not substantial. The overhead of the InteGrade middleware is acceptable in view of the benefits obtained in facilitating the use of grid computing. These benefits include job submission, checkpointing, security, job migration, etc. Copyright (C) 2009 John Wiley & Sons, Ltd.
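The wavefront idea can be illustrated on the 0-1 knapsack dynamic program: each cell of the DP table depends only on cells from strictly earlier anti-diagonals, so all cells on one diagonal can be computed concurrently using nearest-neighbour data only, with no global communication. Below is a serial sketch of that traversal order (the instance numbers are invented; this is not the paper's MPI implementation):

```python
def knapsack_wavefront(weights, values, capacity):
    """0-1 knapsack DP evaluated along anti-diagonals (wavefront order).
    dp[i][c] depends on dp[i-1][c] and dp[i-1][c-w], both of which lie on
    earlier diagonals, so cells on one diagonal are mutually independent."""
    n = len(weights)
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    # diagonal index = i + c; sweep diagonals in increasing order
    for diag in range(1, n + capacity + 1):
        for i in range(1, n + 1):
            c = diag - i
            if 0 <= c <= capacity:
                best = dp[i - 1][c]                      # skip item i
                if weights[i - 1] <= c:                  # or take item i
                    best = max(best, dp[i - 1][c - weights[i - 1]] + values[i - 1])
                dp[i][c] = best
    return dp[n][capacity]
```

In a systolic parallelisation, each processor owns a strip of rows and only exchanges boundary cells with its neighbour as the wavefront passes, which is exactly the communication pattern a high-latency grid tolerates well.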

In order to achieve high performance, we need an efficient scheduling of a parallel program onto the processors of a multiprocessor system that minimizes the entire execution time. This multiprocessor scheduling problem can be stated as finding a schedule for a general task graph to be executed on a multiprocessor system so that the schedule length is minimized [10]. This scheduling problem is known to be NP-hard. In multiprocessor task scheduling, we have a number of CPUs on which a number of tasks are to be scheduled so that the program's execution time is minimized. According to [10], the task scheduling problem is a key factor for a parallel multiprocessor system to gain better performance. A task can be partitioned into a group of subtasks and represented as a DAG (directed acyclic graph), so the problem can be stated as finding a schedule for a DAG to be executed in a parallel multiprocessor system so that the schedule length is minimized. This helps to reduce processing time and increase processor utilization. The aim of this thesis work is to check and compare the results obtained by the Bee Colony algorithm with the best known results already generated in the multiprocessor task scheduling domain.
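As context for the problem being solved (this is not the Bee Colony algorithm the thesis evaluates), a minimal greedy list scheduler shows what a candidate solution and its schedule length look like: tasks are taken in topological order and each is placed on the processor that allows the earliest start, ignoring communication costs. The task names and durations are invented:

```python
from collections import deque

def list_schedule(tasks, deps, num_procs):
    """Greedy list scheduling of a task DAG on identical processors.
    `tasks` maps task -> duration; `deps` maps task -> list of predecessors.
    Returns the makespan (schedule length)."""
    # Topological order via Kahn's algorithm.
    indeg = {t: 0 for t in tasks}
    succ = {t: [] for t in tasks}
    for child, parents in deps.items():
        for p in parents:
            succ[p].append(child)
            indeg[child] += 1
    queue = deque(t for t in tasks if indeg[t] == 0)
    order = []
    while queue:
        t = queue.popleft()
        order.append(t)
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                queue.append(s)

    # Place each task on the processor giving the earliest feasible start.
    proc_free = [0.0] * num_procs
    finish = {}
    for t in order:
        ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
        proc = min(range(num_procs), key=lambda i: max(proc_free[i], ready))
        start = max(proc_free[proc], ready)
        finish[t] = start + tasks[t]
        proc_free[proc] = finish[t]
    return max(finish.values())

# Hypothetical task graph: durations and precedence constraints.
durations = {"a": 2, "b": 3, "c": 2, "d": 4}
precedence = {"c": ["a"], "d": ["a", "b"]}
```

Metaheuristics such as Bee Colony search over priority orderings and assignments to beat the makespan that simple greedy rules like this one produce.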