952 results for Task analysis


Relevance:

30.00%

Publisher:

Abstract:

Issues of boundary maintenance are implicit in all studies of national identity. By definition, national communities consist of those who are included but surrounded (literally or metaphorically) by those who are excluded. Most extant research on national identity explores criteria for national membership largely in terms of official or public definitions described, for example, in citizenship and immigration laws or in texts of popular culture. We know much less about how ordinary people in various nations reason about these issues. An analysis of cross-national (N = 23) survey data from the 1995 International Social Science Program reveals a core pattern in most of the countries studied. Respondents were asked how important various criteria were in being 'truly' a member of a particular nation. Exploratory factor analysis shows that these items cluster in terms of two underlying dimensions. Ascriptive/objectivist criteria relating to birth, religion and residence can be distinguished from civic/voluntarist criteria relating to subjective feelings of membership and belief in core institutions. In most nations the ascriptive/objectivist dimension of national identity was more prominent than the subjective civic/voluntarist dimension. Taken overall, these findings suggest an unanticipated homogeneity in the ways that citizens around the world think about national identity. To the extent that these dimensions also mirror the well-known distinction between ethnic and civic national identification, they suggest that the former remains robust despite globalization, mass migration and cultural pluralism. Throughout the world official definitions of national identification have tended to shift towards a civic model. Yet citizens remain remarkably traditional in outlook. A task for future research is to investigate the macrosociological forces that produce both commonality and difference in the core patterns we have identified.
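
The two-dimensional clustering described above comes from an exploratory factor analysis of the survey items. As a purely illustrative sketch (not the authors' analysis script), a two-factor extraction over hypothetical item names might look as follows, assuming the item responses are coded numerically in a CSV file:

    # Minimal sketch of a two-factor exploratory analysis of "truly a member"
    # items. Item and file names are hypothetical; not the authors' analysis.
    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    items = ["born_in_country", "religion", "lived_most_of_life",   # ascriptive/objectivist
             "feels_member", "respects_institutions"]               # civic/voluntarist

    df = pd.read_csv("issp_1995_national_identity.csv")  # hypothetical file name

    fa = FactorAnalysis(n_components=2, rotation="varimax")
    fa.fit(df[items])

    # Loadings show which items cluster on which underlying dimension.
    loadings = pd.DataFrame(fa.components_.T, index=items,
                            columns=["factor_1", "factor_2"])
    print(loadings.round(2))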

Relevance:

30.00%

Publisher:

Abstract:

This study describes a preliminary examination of the viability and suitability of the physiologic technique electromagnetic articulography (EMA) in investigating lingual fatigue in myasthenia gravis (MG). A 52.9-year-old female diagnosed with MG at the age of 18 years, but who was in remission, participated in the study with a matched control subject. Changes in the duration, speed, and range of tongue-tip and tongue-back movements during repetition of /taka/ over two minutes were investigated. Results revealed that the MG subject did not exhibit significant changes in duration, maximum velocity, maximum acceleration, or the distance travelled by her tongue as measured by EMA over the task. The kinematic results were, in part, expected since the MG subject was in remission. The results, therefore, may not be representative of the majority of individuals with active MG. The examination of the current case did highlight, however, the potential advantages of EMA in providing detailed, objective information regarding lingual kinematics for future investigations of individuals with MG. It also showed that EMA may be sensitive in detecting subclinical kinematic features of fatigue in individuals who are in remission from MG. Finally, EMA led to the identification of possible physiologic factors underlying the CV transform effect, which was evident for the MG subject's syllable productions. In the past, the effect had been assumed to be a purely perceptual-based phenomenon.
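
As a purely illustrative sketch of how such kinematic measures (duration, peak velocity, peak acceleration, distance travelled) can be derived from an EMA position trace, assuming a sampled tongue-tip trajectory and a sampling rate that are not taken from the paper:

    # Sketch: kinematic measures from an EMA position trace.
    # Sampling rate and data file are assumptions, not the study's values.
    import numpy as np

    fs = 200.0                            # assumed EMA sampling rate in Hz
    pos = np.load("tongue_tip_xy.npy")    # hypothetical (n_samples, 2) trace in mm

    t = np.arange(len(pos)) / fs
    speed = np.linalg.norm(np.gradient(pos, axis=0), axis=1) * fs   # mm/s
    accel = np.gradient(speed) * fs                                 # mm/s^2

    print("movement duration (s):", t[-1])
    print("max velocity (mm/s):  ", speed.max())
    print("max acceleration:     ", np.abs(accel).max())
    print("distance travelled:   ", speed.sum() / fs)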

Relevance:

30.00%

Publisher:

Abstract:

In the picture-word interference task, naming responses are facilitated when a distractor word is orthographically and phonologically related to the depicted object as compared to an unrelated word. We used event-related functional magnetic resonance imaging (fMRI) to investigate the cerebral hemodynamic responses associated with this priming effect. Serial (or independent-stage) and interactive models of word production that explicitly account for picture-word interference effects assume that the locus of the effect is at the level of retrieving phonological codes, a role attributed recently to the left posterior superior temporal cortex (Wernicke's area). This assumption was tested by randomly presenting participants with trials from orthographically related and unrelated distractor conditions and acquiring image volumes coincident with the estimated peak hemodynamic response for each trial. Overt naming responses occurred in the absence of scanner noise, allowing reaction time data to be recorded. Analysis of this data confirmed the priming effect. Analysis of the fMRI data revealed blood oxygen level-dependent signal decreases in Wernicke's area and the right anterior temporal cortex, whereas signal increases were observed in the anterior cingulate, the right orbitomedial prefrontal, somatosensory, and inferior parietal cortices, and the occipital lobe. The results are interpreted as supporting the locus for the facilitation effect as assumed by both classes of theoretical model of word production. In addition, our results raise the possibilities that, counterintuitively, picture-word interference might be increased by the presentation of orthographically related distractors, due to competition introduced by activation of phonologically related word forms, and that this competition requires inhibitory processes to be resolved. The priming effect is therefore viewed as being sufficient to offset the increased interference. We conclude that information from functional imaging studies might be useful for constraining theoretical models of word production. (C) 2002 Elsevier Science (USA).

Relevance:

30.00%

Publisher:

Abstract:

The present paper reviews the findings of 30 years of verbal/manual dual task studies, the method most commonly used to assess lateralization of speech production in non-clinical samples. Meta-analysis of 64 results revealed that both the type of manual task used and the nature of practice that is given influence the size of the laterality effect. A meta-analysis of 36 results examining the effect size of sex differences in estimates of lateralization of speech production indicated that males appear to show slightly larger laterality effects than females. (C) 2002 Elsevier Science Ltd. All rights reserved.
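
As an illustration of the meta-analytic machinery involved, the following sketch computes a fixed-effect weighted mean effect size from per-study effect sizes and variances; the numbers are placeholders, not data from this review:

    # Sketch: fixed-effect meta-analytic mean of laterality effect sizes.
    studies = [            # (effect size d, sampling variance v); placeholder values
        (0.35, 0.04),
        (0.10, 0.02),
        (0.22, 0.05),
    ]

    weights = [1.0 / v for _, v in studies]                 # inverse-variance weighting
    mean_d = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5

    print(f"weighted mean effect size: {mean_d:.3f} (SE {se:.3f})")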

Relevance:

30.00%

Publisher:

Abstract:

Dust is a complex mixture of particles of organic and inorganic origin and different gases absorbed in aerosol droplets. In a poultry unit, dust includes dried faecal matter and urine, skin flakes, ammonia, carbon dioxide, pollens, feed and litter particles, feathers, grain mites, fungal spores, bacteria, viruses and their constituents. Dust particles vary in size, and differentiating between particle size fractions is important in health studies in order to quantify penetration within the respiratory system. A descriptive study was developed in order to assess exposure to particles in a poultry unit during different operations, namely routine examination and floor turn-over. Direct-reading equipment was used (Lighthouse, model 3016 IAQ). Particle measurement was performed for 5 different sizes (PM0.5, PM1.0, PM2.5, PM5.0, PM10). The chemical composition of the poultry litter was also determined by neutron activation analysis. Normally, the litter of poultry pavilions is turned over weekly, and it was during this operation that the highest exposure to particles was observed. In all the tasks considered, PM5.0 and PM10 were the size fractions with the highest concentration values, with PM10 showing the highest values and PM0.5 the lowest. The chemical element with the highest concentration was Mg (5.7E6 mg.kg-1), followed by K (1.5E4 mg.kg-1), Ca (4.8E3 mg.kg-1), Na (1.7E3 mg.kg-1), Fe (2.1E2 mg.kg-1) and Zn (4.2E1 mg.kg-1). This high presence of particles in the respirable range (<5–7 μm) means that poultry dust particles can penetrate into the gas exchange region of the lung. Larger particles (PM10) presented concentrations ranging from 5.3E5 to 3.0E6 mg/m3.
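
As a purely illustrative sketch of summarising such direct-reading measurements per task and size fraction (file and column names are assumptions, not the study's data):

    # Sketch: mean particle concentration per task and size fraction from
    # direct-reading measurements. File and column names are assumptions.
    import pandas as pd

    df = pd.read_csv("poultry_particle_readings.csv")   # hypothetical export
    sizes = ["PM0.5", "PM1.0", "PM2.5", "PM5.0", "PM10"]

    summary = df.groupby("task")[sizes].mean()
    print(summary)              # e.g. rows: routine examination, floor turn-over
    print(summary.idxmax())     # task with the highest mean for each size fraction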

Relevance:

30.00%

Publisher:

Abstract:

In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, both for the fixed and dynamic priority assignment schemes. We give emphasis to the worst-case response time analysis in non-preemptive contexts, which is fundamental for the communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves the worst-case response time of messages, overcoming the limitation of first-come-first-served (FCFS) PROFIBUS queue implementations.
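
For readers unfamiliar with worst-case response time analysis in non-preemptive contexts, the following sketch gives the textbook fixed-priority recurrence (first job in the busy period, no release jitter, deadlines equal to periods); it is not the paper's PROFIBUS-specific analysis:

    # Sketch: classical worst-case response-time recurrence for non-preemptive
    # fixed-priority scheduling (simplified, first-job-only, implicit deadlines).
    import math

    def response_time(tasks, i):
        """tasks: list of (C, T) pairs sorted by decreasing priority."""
        C_i, T_i = tasks[i]
        # Blocking: a lower-priority task that has already started cannot be preempted.
        B_i = max((C for C, _ in tasks[i + 1:]), default=0.0)
        w = B_i
        while True:
            # Higher-priority releases that can delay the start of task i.
            w_next = B_i + sum((math.floor(w / T_j) + 1) * C_j for C_j, T_j in tasks[:i])
            if w_next == w:
                return w + C_i          # queueing delay plus own, non-preempted, execution
            if w_next + C_i > T_i:
                return float("inf")     # the deadline (= period) cannot be guaranteed
            w = w_next

    tasks = [(1.0, 5.0), (2.0, 10.0), (3.0, 20.0)]   # (C, T), highest priority first
    print([response_time(tasks, i) for i in range(len(tasks))])   # [4.0, 6.0, 6.0]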

Relevance:

30.00%

Publisher:

Abstract:

LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single processor platform. However, little work has been done to illuminate its characteristics upon multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate the dynamics into LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, which leads to the deadline miss, on the number of tasks of certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task is dependent on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can go into the laxity dynamics towards a deadline miss. This way, to the author's best knowledge, we propose the first LLF multiprocessor schedulability test based on its own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate the schedulability performance of both the original and improved LLF tests in a quantitative manner.
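
As a minimal illustration of the quantity this analysis reasons about, the sketch below computes the laxity of active jobs and the LLF selection on m processors; it is not the schedulability test itself:

    # Sketch: laxity of an active job and LLF selection on m processors.
    # A job is (absolute deadline, remaining execution time).
    def laxity(job, t):
        deadline, remaining = job
        return (deadline - t) - remaining      # slack before the job must run continuously

    def llf_pick(jobs, t, m):
        """Return the (at most) m smallest-laxity jobs to run at time t."""
        return sorted(jobs, key=lambda job: laxity(job, t))[:m]

    jobs = [(12, 3), (9, 5), (20, 2)]
    print([laxity(j, t=4) for j in jobs])      # [5, 0, 14] -> the zero-laxity job must run now
    print(llf_pick(jobs, t=4, m=2))            # [(9, 5), (12, 3)]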

Relevance:

30.00%

Publisher:

Abstract:

In real-time systems, there are two distinct trends for scheduling task sets on unicore systems: non-preemptive and preemptive scheduling. Non-preemptive scheduling is obviously not subject to any preemption delay, but its schedulability may be quite poor, whereas fully preemptive scheduling is subject to preemption delay but benefits from higher flexibility in the scheduling decisions. The time delay introduced by task preemptions is a major source of pessimism in the analysis of the task Worst-Case Execution Time (WCET) in real-time systems. Preemptive scheduling policies that include non-preemptive regions are a hybrid solution between the non-preemptive and fully preemptive scheduling paradigms, which makes it possible to combine the benefits of both worlds. In this paper, we exploit the connection between the progression of a task through its operations and the knowledge of the preemption delays as a function of that progression. The pessimism in the preemption delay estimation is thus reduced in comparison to state-of-the-art methods, thanks to the increased information available to the analysis.
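
As a purely illustrative sketch of the underlying idea (charging preemption delay per progression point rather than assuming the global worst case everywhere), with made-up per-point costs that are not from the paper:

    # Sketch: bounding total preemption delay from per-progress-point costs.
    # delay_at[k] is the worst-case extra time (e.g. cache reload) if the task
    # is preempted after completing its k-th region; values are illustrative.
    delay_at = [40, 25, 60, 10]          # microseconds, made up

    def preemption_delay_bound(delay_at, max_preemptions):
        # Charge the most expensive progression points, up to the maximum
        # number of preemptions the task can suffer.
        return sum(sorted(delay_at, reverse=True)[:max_preemptions])

    print(preemption_delay_bound(delay_at, max_preemptions=2))   # 100
    print(max(delay_at) * 2)   # 120: the progression-oblivious bound is more pessimistic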

Relevance:

30.00%

Publisher:

Abstract:

The current industry trend is towards using Commercially available Off-The-Shelf (COTS) based multicores for developing real-time embedded systems, as opposed to the usage of custom-made hardware. In typical implementations of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on this shared channel, which results in an increase of the response time of the tasks. Analyzing this increased response time, considering the contention on the shared bus, is challenging on COTS-based systems mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time for a set of tasks. Although the required parameters to obtain the request profile can be obtained by static analysis, we provide an alternative method to obtain them experimentally by using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that our approach outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
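
As a purely illustrative sketch of how a measured request profile can inflate a task's worst-case execution time under a simple, pessimistic arbitration assumption (each request waits for one request from every other core); the model and parameter names are assumptions, not the paper's exact formulation:

    # Sketch: inflating a task's execution time with shared-bus contention.
    def bus_delay(requests, n_cores, per_request_latency):
        # Every request waits, at worst, for one request from each other core.
        return requests * (n_cores - 1) * per_request_latency

    wcet_in_isolation = 2_000          # microseconds, from measurement or static analysis
    requests = 350                     # bus requests, e.g. LLC misses read from PMCs
    inflated_wcet = wcet_in_isolation + bus_delay(requests, n_cores=4, per_request_latency=0.1)
    print(inflated_wcet)               # 2000 + 350 * 3 * 0.1 = 2105.0 us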

Relevance:

30.00%

Publisher:

Abstract:

LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single processor platform. However, its characteristics upon multiprocessor platforms have been little studied until now. Orthogonally, it has remained open how to efficiently schedule general task systems, including constrained deadline task systems, upon multiprocessors. Recent studies have introduced the zero laxity (ZL) policy, which assigns a higher priority to a task with zero laxity, as a promising scheduling approach for such systems (e.g., EDZL). Towards understanding the importance of laxity in multiprocessor scheduling, this paper investigates the characteristics of the ZL policy and presents the first ZL schedulability test for any work-conserving scheduling algorithm that employs this policy. It then investigates the characteristics of LLF scheduling, which also employs the ZL policy, and derives the first LLF-specific schedulability test on multiprocessors. It is shown that the proposed LLF test dominates the ZL test as well as the state-of-the-art EDZL test.
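
As a minimal illustration of the ZL policy combined with EDF (i.e., EDZL-style selection), with jobs given as (absolute deadline, remaining execution time); this is illustrative only, not the schedulability test derived in the paper:

    # Sketch: EDZL priority -- EDF order, but any job whose laxity has reached
    # zero is promoted to the highest priority.
    def edzl_pick(jobs, t, m):
        """jobs: list of (deadline, remaining). Returns the m jobs to run at t."""
        def key(job):
            deadline, remaining = job
            zero_laxity = (deadline - t) - remaining <= 0
            return (not zero_laxity, deadline)    # zero-laxity jobs first, then earliest deadline
        return sorted(jobs, key=key)[:m]

    jobs = [(10, 2), (12, 8), (20, 1)]
    print(edzl_pick(jobs, t=4, m=1))    # [(12, 8)]: the zero-laxity job overrides pure EDF order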

Relevance:

30.00%

Publisher:

Abstract:

Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted but they cannot migrate between processors. We propose an algorithm which can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are three times faster.

Relevance:

30.00%

Publisher:

Abstract:

Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted but they cannot migrate between processors. On each processor, tasks are scheduled according to rate-monotonic. We propose an algorithm that can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are √2/(√2−1) ≈ 3.41 times faster. No such guarantees are previously known for partitioned static-priority scheduling on uniform multiprocessors.
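
For reference, the stated speed factor simplifies to √2/(√2−1) = √2(√2+1)/((√2−1)(√2+1)) = 2+√2 ≈ 3.414.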

Relevance:

30.00%

Publisher:

Abstract:

A new algorithm is proposed for scheduling preemptible arbitrary-deadline sporadic task systems upon multiprocessor platforms, with interprocessor migration permitted. This algorithm is based on a task-splitting approach - while most tasks are entirely assigned to specific processors, a few tasks (fewer than the number of processors) may be split across two processors. This algorithm can be used for two distinct purposes: for actually scheduling specific sporadic task systems, and for feasibility analysis. Simulation-based evaluation indicates that this algorithm offers a significant improvement on the ability to schedule arbitrary-deadline sporadic task systems as compared to the contemporary state-of-the-art. With regard to feasibility analysis, the new algorithm is proved to offer superior performance guarantees in comparison to prior feasibility tests.
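
As a purely illustrative sketch of the task-splitting idea, simplified to utilizations of implicit-deadline tasks (the proposed algorithm handles arbitrary deadlines and is more involved): processors are filled with whole tasks and, when a task does not fit, its remainder is split onto the next processor, so fewer tasks than processors end up split.

    # Sketch: fill processors with whole tasks and split the remainder of a
    # task onto the next processor when it does not fit. Simplified model,
    # not the proposed algorithm.
    def split_assign(utilizations, m, capacity=1.0):
        assignment = [[] for _ in range(m)]       # per-processor list of (task, share)
        cpu, free = 0, capacity
        for task, u in enumerate(utilizations):
            while u > 0:
                share = min(u, free)
                assignment[cpu].append((task, share))
                u -= share
                free -= share
                if free == 0 and cpu + 1 < m:
                    cpu, free = cpu + 1, capacity
                elif u > 0:
                    return None                    # does not fit on m processors
        return assignment

    print(split_assign([0.6, 0.6, 0.5, 0.3], m=2))
    # task 1 is split: ~0.4 on processor 0 and ~0.2 on processor 1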

Relevance:

30.00%

Publisher:

Abstract:

Environment monitoring has an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not provide the necessary information to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks performed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability, instead of assessing personal exposures using continuous 8-hour time-weighted average measurements. Health effects related to exposure to particles have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies support the idea that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with the highest exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.

Relevance:

30.00%

Publisher:

Abstract:

Real-time scheduling usually considers worst-case values for the parameters of task (or message stream) sets, in order to provide safe schedulability tests for hard real-time systems. However, worst-case conditions introduce a level of pessimism that is often inadequate for a certain class of (soft) real-time systems. In this paper we provide an approach for computing the stochastic response time of tasks, where tasks have inter-arrival times described by discrete probability distribution functions rather than minimum inter-arrival time (MIT) values.
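
As a purely illustrative sketch of the kind of discrete-distribution manipulation such an analysis builds on, the following convolves a discrete inter-arrival distribution with itself; it is not the paper's method:

    # Sketch: convolution of two discrete distributions, the basic operation
    # behind analyses that replace minimum inter-arrival times with discrete
    # probability distributions.
    from collections import defaultdict

    def convolve(pa, pb):
        """pa, pb: dict value -> probability. Returns the distribution of the sum."""
        out = defaultdict(float)
        for a, p in pa.items():
            for b, q in pb.items():
                out[a + b] += p * q
        return dict(out)

    # inter-arrival time of a task: 10 ms with probability 0.7, 20 ms with probability 0.3
    inter_arrival = {10: 0.7, 20: 0.3}
    two_arrivals = convolve(inter_arrival, inter_arrival)
    print(two_arrivals)    # {20: 0.49, 30: 0.42, 40: 0.09}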