953 results for task model


Relevance:

30.00%

Publisher:

Abstract:

We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data (SIMD) operations. The modified algorithm runs more than 50 times faster on the Cell processor's Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times compared with the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
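The scheduling scheme described above can be illustrated with a minimal sketch: independent air columns are placed on a task queue and consumed by a pool of worker threads, rather than being statically assigned by an MPI domain decomposition. The original code is Fortran/C; Python, the worker count, the grid size and the compute_column stub below are illustrative assumptions only.

from concurrent.futures import ThreadPoolExecutor

def compute_column(column_id):
    # Stand-in for the per-column radiation calculation.
    return column_id, sum(i * i for i in range(1000))

columns = range(96 * 73)                       # one task per grid column (illustrative grid size)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(compute_column, columns))

# In the paper's version, four such columns are additionally packed into one
# data structure and processed together with SIMD instructions.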

Relevance:

30.00%

Publisher:

Abstract:

In the present research, a 3 × 2 model of achievement goals is proposed and tested. The model is rooted in the definition and valence components of competence, and encompasses 6 goal constructs: task-approach, task-avoidance, self-approach, self-avoidance, other-approach, and other-avoidance. The results from 2 studies provided strong support for the proposed model, most notably the need to separate task-based and self-based goals. Studies 1 and 2 yielded data establishing the 3 × 2 structure of achievement goals, and Study 2 documented the antecedents and consequences of each of the goals in the 3 × 2 model. Terminological, conceptual, and applied issues pertaining to the 3 × 2 model are discussed.

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of the work described in this paper is to examine the extent to which the L2 developmental changes predicted by Kroll and Stewart's (1994) Revised Hierarchical Model (RHM) can be observed in word association response behaviour. The RHM attempts to account for the relative “strength of the links between words and concepts in each of the bilingual's languages” (Kroll, Van Hell, Tokowicz & Green, 2010, p. 373). It proposes that bilinguals with higher L2 proficiency rely less on L1 mediation, while less proficient L2 learners tend to rely on mediation and access L2 words by translating from L1 equivalents. In this paper, I present findings from a simple word association task. More proficient learners provided a greater proportion of collocational links, suggesting that they mediate less than less proficient learners do. The results provide tentative support for Kroll and Stewart's model.

Relevance:

30.00%

Publisher:

Abstract:

Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, and data arrives in streams, needs to be analyzed in real time, and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired, as soon and as frequently as possible. In this study we consider a potential model update as an investment decision, which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
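The investment view of adaptation can be sketched as a simple decision rule: update the model only when the expected benefit of the update outweighs its cost. The cost figures, gain estimate and monetisation below are hypothetical placeholders, not the parameters of the paper's framework.

def adaptation_return(expected_gain, gain_value_per_unit, update_cost):
    """Net return of one model update: monetised benefit minus cost."""
    return expected_gain * gain_value_per_unit - update_cost

def should_adapt(recent_error, error_after_update_estimate,
                 gain_value_per_unit=100.0, update_cost=10.0):
    expected_gain = recent_error - error_after_update_estimate   # e.g. expected drop in error rate
    return adaptation_return(expected_gain, gain_value_per_unit, update_cost) > 0

# The stream is processed in batches; the model is updated only when it pays off.
print(should_adapt(recent_error=0.30, error_after_update_estimate=0.10))  # True: 0.2 * 100 > 10
print(should_adapt(recent_error=0.30, error_after_update_estimate=0.28))  # False: 0.02 * 100 < 10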

Relevance:

30.00%

Publisher:

Abstract:

The detection of physiological (electromyographic) signals from the motor system is used in clinical practice to guide the therapist towards a more precise and accurate diagnosis of motor disorders. In this context, the decomposition of EMG (electromyographic) signals, which includes the identification and classification of the MUAPs (Motor Unit Action Potentials) in an EMG signal, is very important to help the therapist in the evaluation of motor disorders. EMG decomposition is a complex task because EMG features depend on the electrode type (needle or surface), its placement relative to the muscle, the contraction level and the health of the neuromuscular system. To date, most research on EMG decomposition has used EMG signals acquired with needle electrodes, due to their advantages in processing this type of signal; relatively little research has been conducted using surface EMG signals. This article therefore aims to contribute to clinical practice by presenting a technique that permits the decomposition of surface EMG signals using Hidden Markov Models, supported by differential evolution and spectral clustering techniques. The developed system presented coherent results in: (1) identification of the number of Motor Units active in the EMG signal; (2) presentation of the morphological patterns of the MUAPs in the EMG signal; (3) identification of the firing sequence of the Motor Units. The model proposed in this work is an advance in the research area of decomposition of surface EMG signals.
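The spectral clustering step mentioned above, which groups detected MUAP waveforms by motor unit, can be sketched as follows. This is a generic spectral-bisection illustration on synthetic waveforms, not the paper's HMM-based decomposition pipeline; the templates and parameters are invented for the example.

import numpy as np

def spectral_bisection(waveforms, sigma=1.0):
    """Split MUAP waveforms (rows) into two clusters via the Fiedler vector."""
    # Pairwise similarity from Euclidean distances between waveforms.
    d2 = ((waveforms[:, None, :] - waveforms[None, :, :]) ** 2).sum(-1)
    affinity = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(affinity, 0.0)
    # Unnormalised graph Laplacian L = D - W.
    laplacian = np.diag(affinity.sum(1)) - affinity
    # The sign of the second-smallest eigenvector (Fiedler vector) bisects the graph.
    _, vecs = np.linalg.eigh(laplacian)
    return (vecs[:, 1] >= 0).astype(int)

# Toy example: two noisy MUAP templates, five occurrences each.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
template_a = np.exp(-((t - 0.3) ** 2) / 0.002)
template_b = -0.8 * np.exp(-((t - 0.6) ** 2) / 0.004)
waveforms = np.vstack([template_a + 0.05 * rng.standard_normal(50) for _ in range(5)] +
                      [template_b + 0.05 * rng.standard_normal(50) for _ in range(5)])
print(spectral_bisection(waveforms))   # e.g. [0 0 0 0 0 1 1 1 1 1] (label order may flip)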

Relevance:

30.00%

Publisher:

Abstract:

Multi-model ensembles are frequently used to assess understanding of the response of ozone and methane lifetime to changes in emissions of ozone precursors such as NOx, VOCs (volatile organic compounds) and CO. When these ozone changes are used to calculate radiative forcing (RF) (and climate metrics such as the global warming potential (GWP) and global temperature-change potential (GTP)) there is a methodological choice, determined partly by the available computing resources, as to whether the mean ozone (and methane) concentration changes are input to the radiation code, or whether each model's ozone and methane changes are used as input, with the average RF computed from the individual model RFs. We use data from the Task Force on Hemispheric Transport of Air Pollution source–receptor global chemical transport model ensemble to assess the impact of this choice for emission changes in four regions (East Asia, Europe, North America and South Asia). We conclude that using the multi-model mean ozone and methane responses is accurate for calculating the mean RF, with differences up to 0.6% for CO, 0.7% for VOCs and 2% for NOx. Differences of up to 60% for NOx, 7% for VOCs and 3% for CO are introduced into the 20 year GWP. The differences for the 20 year GTP are smaller than for the GWP for NOx, and similar for the other species. However, estimates of the standard deviation calculated from the ensemble-mean input fields (where the standard deviation at each point on the model grid is added to or subtracted from the mean field) are almost always substantially larger, for the RF, GWP and GTP metrics, than the true standard deviation, and can be larger than the model range for short-lived ozone RF, and for the 20 and 100 year GWP and 100 year GTP. The order of averaging has most impact on the metrics for NOx, as the net values of these quantities are the residual of a sum of terms of opposing signs. For example, the standard deviation of the 20 year GWP is 2–3 times larger using the ensemble-mean fields than using the individual models to calculate the RF. This effect is largely due to the construction of the input ozone fields, which overestimates the true ensemble spread. Hence, while the average of multi-model fields is normally appropriate for calculating mean RF, GWP and GTP, it is not a reliable method for calculating the uncertainty in these fields, and in general overestimates the uncertainty.
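The overestimation of the ensemble spread can be illustrated with a toy calculation: a linearised RF computed per model has a much smaller standard deviation than the spread inferred from ensemble-mean plus/minus standard-deviation input fields, because the latter treats grid-point deviations as spatially coherent. The kernel and fields below are synthetic assumptions, not the ensemble data used in the study.

import numpy as np
rng = np.random.default_rng(1)

n_models, n_grid = 10, 100
weights = rng.uniform(0.5, 1.5, n_grid) / n_grid        # hypothetical linearised RF kernel

# Per-model ozone-change fields: a common pattern plus uncorrelated noise at each grid point.
base = rng.uniform(1.0, 2.0, n_grid)
fields = base + 0.5 * rng.standard_normal((n_models, n_grid))

# True ensemble spread: compute the RF per model, then take the standard deviation.
rf_per_model = fields @ weights
true_std = rf_per_model.std()

# Spread inferred from ensemble-mean +/- std input fields.
mean_f, std_f = fields.mean(0), fields.std(0)
inferred_std = ((mean_f + std_f) @ weights - (mean_f - std_f) @ weights) / 2

print(true_std, inferred_std)   # inferred_std is several times larger, because adding the
                                # grid-point std everywhere assumes the deviations are
                                # spatially coherent across the whole field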

Relevance:

30.00%

Publisher:

Abstract:

Background: In the UK, occupational therapy pre-discharge home visits are routinely carried out as a means of facilitating safe transfer from hospital to home. Whilst they are an integral part of practice, there is little evidence to demonstrate that they have a positive outcome on the discharge process. Current issues for patients are the speed of home visits and the lack of shared decision making in the process, resulting in less than 50% of the specialist equipment installed actually being used by patients on follow-up. To improve practice there is an urgent need to examine other ways of conducting home visits to facilitate safe discharge. We believe that Computerised 3D Interior Design Applications (CIDAs) could be a means to support more efficient, effective and collaborative practice. A previous study explored practitioners' perceptions of using CIDAs; however, it is important to ascertain older adults' views about the usability of the technology and to compare findings. This study explores the perceptions of community-dwelling older adults with regard to adopting and using CIDAs as an assistive tool for the home adaptations process. Methods: Ten community-dwelling older adults participated in individual interactive task-focused usability sessions with a customised CIDA, utilising the think-aloud protocol and individual semi-structured interviews. Template analysis was used to carry out both deductive and inductive analysis of the think-aloud and interview data. Initially, a deductive stance was adopted, using the three pre-determined high-level themes of the technology acceptance model (TAM): Perceived Usefulness (PU), Perceived Ease of Use (PEOU) and Actual Use (AU). Inductive template analysis was then carried out on the data within these themes, from which a number of sub-themes emerged. Results: Regarding PU, participants believed CIDAs served as a useful visual tool and saw clear potential to facilitate shared understanding and partnership in care delivery. For PEOU, participants were able to create 3D home environments; however, a number of usability issues must still be addressed. The AU theme revealed that the most likely usage scenario would be collaborative, involving both patient and practitioner, as many participants did not feel confident or see sufficient value in using the application autonomously. Conclusions: This research found that older adults perceived CIDAs as likely to serve as a valuable tool that facilitates and enhances patient/practitioner collaboration and empowerment. Older adults also suggested a redesign of the interface so that less sophisticated dexterity and motor functions are required. However, older adults were not confident, or did not see sufficient value, in using the application autonomously. Future research is needed to further customise the CIDA software, in line with the outcomes of this study, and to explore the potential of collaborative, patient/practitioner-based deployment of the application.

Relevance:

30.00%

Publisher:

Abstract:

The incorporation of new representations into the mental lexicon has raised numerous questions about the organisational principles that govern the process. A number of studies have argued that similarity between the new L3 items and existing representations in the L1 and L2 is the main incorporating force (Hall & Ecke, 2003; Herwig, 2001). Experimental evidence obtained through a primed picture-naming task with L1 Polish-L2 English learners of L3 Russian supports Hall and Ecke’s Parasitic Model of L3 vocabulary acquisition, displaying a significant main effect for both priming and proficiency. These results complement current models of vocabulary acquisition and lexical access in multilingual speakers.

Relevance:

30.00%

Publisher:

Abstract:

The impact of extreme sea ice initial conditions on modelled climate is analysed for a fully coupled atmosphere-ocean-sea ice general circulation model, the Hadley Centre climate model HadCM3. A control run is chosen as the reference experiment, with greenhouse gas concentrations fixed at preindustrial conditions. Sensitivity experiments show an almost complete recovery from a total removal or a strong increase of sea ice after four years. Thus, uncertainties in initial sea ice conditions seem to be unimportant for climate modelling on decadal or longer time scales. When the initial conditions of the ocean mixed layer were adjusted to ice-free conditions, a few substantial differences remained for more than 15 model years. But these differences are clearly smaller than the uncertainty of the HadCM3 run and of all the other 19 IPCC fourth assessment report climate model preindustrial runs. Improving the ability of climate models to simulate past sea ice variability remains an important task for producing reliable projections for the 21st century.

Relevance:

30.00%

Publisher:

Abstract:

In the present work, the effects of spatial constraints on the efficiency of task execution are investigated in systems underlain by geographical complex networks, where the probability of connection decreases with the distance between nodes. The investigation considers several configurations of the parameters defining the network connectivity, and the Barabási-Albert network model is also considered for comparison. The results show that the effect of connectivity is significant only for shorter tasks, that the locality of connection implied by the spatial constraints reduces efficiency, and that the addition of edges can improve the efficiency of the execution, although with increasing locality of the connections the improvement is small.
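The kind of geographical network studied here can be sketched by placing nodes at random and connecting them with a probability that decays with distance. The exponential decay form and the parameter values below are assumptions for illustration, not the paper's exact construction.

import numpy as np

def geographical_network(n=200, lam=0.1, p0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.random((n, 2))                          # node coordinates in the unit square
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    prob = p0 * np.exp(-d / lam)                      # connection probability decays with distance
    adj = rng.random((n, n)) < prob
    adj = np.triu(adj, 1)                             # keep each undirected edge once
    return pos, adj | adj.T

pos, adj = geographical_network()
print("nodes:", adj.shape[0], "edges:", adj.sum() // 2)
# Smaller lam -> stronger locality: edges concentrate between nearby nodes,
# which is the regime in which reduced execution efficiency is reported.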

Relevance:

30.00%

Publisher:

Abstract:

The specification of Quality of Service (QoS) constraints over a software design requires measures that ensure such requirements are met by the delivered product. Achieving this goal is non-trivial, as it involves, at least, identifying how QoS constraint specifications should be checked at runtime. In this paper we present an implementation of a Model Driven Architecture (MDA) based framework for the runtime monitoring of QoS properties. We incorporate the UML2 superstructure and the UML profile for Quality of Service to provide abstract descriptions of component-and-connector systems. We then define transformations that refine the UML2 models to conform with the Distributed Management Task Force (DMTF) Common Information Model (CIM) (Distributed Management Task Force Inc. 2006), a schema standard for the management and instrumentation of hardware and software. Finally, we provide a mapping from the CIM metamodel to a .NET-based metamodel for the implementation of the monitoring infrastructure, utilising various .NET features including the Windows Management Instrumentation (WMI) interface.
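Conceptually, the refinement step can be pictured as a small model-to-model transformation. The sketch below reduces it to plain dictionaries; the element names and fields are illustrative stand-ins, not the actual UML profile or CIM schema classes used in the paper.

# Conceptual sketch only: mapping a QoS-annotated component description to
# monitoring-record descriptions of the kind a runtime monitor could consume.
uml_component = {
    "name": "OrderService",
    "qos_constraints": [
        {"property": "responseTime", "operator": "<=", "value": 200, "unit": "ms"},
    ],
}

def invert(op):
    return {"<=": ">", "<": ">=", ">=": "<", ">": "<="}[op]

def refine_to_monitoring_records(component):
    """Map each QoS constraint on a component to a monitoring-record description."""
    records = []
    for c in component["qos_constraints"]:
        records.append({
            "ManagedElement": component["name"],
            "MetricName": c["property"],
            "Threshold": c["value"],
            "Unit": c["unit"],
            "ViolationWhen": f"value {invert(c['operator'])} {c['value']}",
        })
    return records

for record in refine_to_monitoring_records(uml_component):
    print(record)   # records like these would drive the runtime monitor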

Relevance:

30.00%

Publisher:

Abstract:

This project constructs a structural model of the United States economy. This task is tackled in two separate ways: first using econometric methods and then using a neural network, both with a structure that mimics the structure of the U.S. economy. The structural model tracks U.S. GDP rather well in a dynamic simulation, with an average error of just over 1 percent. The neural network also performed well, but suffered from some theoretical as well as implementation issues.

Relevance:

30.00%

Publisher:

Abstract:

The Predispatch (PD) model calculates a short-term generation policy for power systems. In this work a PD model is proposed that improves two modeling aspects generally neglected in the literature: voltage/reactive power constraints and ramp rate constraints for generating units. Reactive power constraints turn the PD into a non-linear problem, and the ramp rate constraints couple the problem dynamically in the time domain. Introducing such constraints makes the PD harder to solve. The dual decomposition/Lagrangian relaxation technique is used in the solution approach for handling the dynamic constraints. As a result the PD is decomposed into a series of independent Optimal Power Flow (OPF) subproblems, in which reactive power is represented in detail. The solutions of the independent OPF subproblems are coordinated by means of Lagrange multipliers, so that the dynamic constraints are iteratively satisfied. Comparisons between dispatch policies calculated with and without the representation of ramp rate constraints are performed using the IEEE 30-bus test system. The results point out the importance of representing such constraints in the generation dispatch policy. © 2004 IEEE.
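The dual decomposition idea can be sketched in miniature: the ramp constraint that couples time periods is relaxed with Lagrange multipliers, each period is dispatched independently, and the multipliers are updated by a subgradient step. The sketch uses two units, two periods and an upward ramp limit only; all numbers are made up, and the per-period "OPF" is reduced to a one-dimensional grid search rather than a full power flow with reactive constraints.

import numpy as np

a = np.array([0.02, 0.05]); b = np.array([10.0, 12.0])   # quadratic cost coefficients (hypothetical)
cap = np.array([150.0, 150.0])
demand = [120.0, 220.0]
ramp_up = 30.0                                            # max increase of unit 0 between periods
lam = np.zeros(len(demand))                               # multipliers for the relaxed ramp constraints

def dispatch_period(d, extra_cost_unit0):
    """Cheapest (p0, p1) meeting demand d, with an extra linear price on unit 0."""
    p0_grid = np.linspace(max(0.0, d - cap[1]), min(cap[0], d), 2001)
    p1_grid = d - p0_grid
    cost = (a[0] * p0_grid**2 + (b[0] + extra_cost_unit0) * p0_grid
            + a[1] * p1_grid**2 + b[1] * p1_grid)
    k = cost.argmin()
    return p0_grid[k], p1_grid[k]

for _ in range(200):
    # The relaxed cost of unit 0 in period t picks up the term (lam[t] - lam[t+1]).
    plan = [dispatch_period(demand[t],
                            lam[t] - (lam[t + 1] if t + 1 < len(demand) else 0.0))
            for t in range(len(demand))]
    p0 = [p[0] for p in plan]
    # Subgradient step on the relaxed ramp constraint p0[t] - p0[t-1] <= ramp_up.
    for t in range(1, len(demand)):
        lam[t] = max(0.0, lam[t] + 0.01 * (p0[t] - p0[t - 1] - ramp_up))

print([f"{p:.1f}" for p in p0], lam.round(3))   # unit 0's increase is pushed towards the ramp limit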

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: to translate and culturally adapt the Developing Nurses' Thinking model, used as a strategy for teaching clinical reasoning, into Brazilian Portuguese. METHOD: the translation and cultural adaptation were carried out through initial translation, synthesis of the translations, back-translation, evaluation by an expert committee, and a pre-test with 33 undergraduate nursing students. RESULTS: the initial translation, synthesis of the translations and back-translation stages were completed satisfactorily, requiring only minor adjustments. In the expert committee's evaluation of the translated version, all items reached agreement above 80% in the first evaluation round and in the pre-test with the students. The model proved adequate for its purpose. CONCLUSION: the use of the model is recommended as a complementary strategy for teaching diagnostic reasoning, aiming to train nurses who are more aware of the diagnostic task and of the importance of patient safety.

Relevance:

30.00%

Publisher:

Abstract:

Frequency spectrum is utilized inefficiently, and cognitive radio has been proposed to achieve fuller utilization of it. The central idea of cognitive radio is to allow the secondary user to use the spectrum concurrently with the primary user, under the constraint of minimum interference. However, designing a model with minimum interference is a challenging task. In this paper, a transmission model based on the cyclic generalized polynomial codes discussed in [2] and [15] is proposed for improving the utilization of the spectrum. The proposed model assures interference-free data transmission between the primary and secondary users. Furthermore, analytical results are presented to show that the proposed model utilizes the spectrum more efficiently than traditional models.
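For readers unfamiliar with the underlying machinery, the sketch below shows standard binary cyclic-code encoding (systematic encoding modulo a generator polynomial); the paper's cyclic generalized polynomial codes generalize this construction, and the (7,4) generator polynomial used here is only a textbook example.

def poly_mod_gf2(dividend, divisor):
    """Remainder of polynomial division over GF(2); lists of bits, lowest degree first."""
    rem = dividend[:]
    for i in range(len(dividend) - len(divisor), -1, -1):
        if rem[i + len(divisor) - 1]:
            for j, d in enumerate(divisor):
                rem[i + j] ^= d
    return rem[: len(divisor) - 1]

def encode_cyclic(msg, gen):
    """Systematic cyclic encoding: shift the message by deg(g) and prepend the remainder."""
    shifted = [0] * (len(gen) - 1) + msg           # msg(x) * x^deg(g)
    return poly_mod_gf2(shifted, gen) + msg        # the codeword is divisible by g(x)

g = [1, 1, 0, 1]                                   # g(x) = 1 + x + x^3, generator of the (7,4) cyclic Hamming code
codeword = encode_cyclic([1, 0, 1, 1], g)
print(codeword)                                    # [1, 0, 0, 1, 0, 1, 1]
print(poly_mod_gf2(codeword, g))                   # [0, 0, 0]: zero remainder, so a valid codeword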