241 results for information processing
Abstract:
Pain changes postural activation of the trunk muscles. The cause of these changes is not known but one possibility relates to the information processing requirements and the stressful nature of pain. This study investigated this possibility by evaluating electromyographic activity (EMG) of the deep and superficial trunk muscles associated with voluntary rapid arm movement. Data were collected from control trials, trials during low back pain (LBP) elicited by injection of hypertonic saline into the back muscles, trials during a non-painful attention-demanding task, and during the same task that was also stressful. Pain did not change the reaction time (RT) of the movement, had variable effects on RT of the superficial trunk muscles, but consistently increased RT of the deepest abdominal muscle. The effect of the attention-demanding task was opposite: increased RT of the movement and the superficial trunk muscles but no effect on RT of the deep trunk muscles. Thus, activation of the deep trunk muscles occurred earlier relative to the movement. When the attention-demanding task was made stressful, the RT of the movement and superficial trunk muscles was unchanged but the RT of the deep trunk muscles was increased. Thus, the temporal relationship between deep trunk muscle activation and arm movement was restored. This means that although postural activation of the deep trunk muscles is not affected when central nervous system resources are limited, it is delayed when the individual is also under stress. However, a non-painful attention-demanding task does not replicate the effect of pain on postural control of the trunk muscles even when the task is stressful.
Abstract:
Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling constraints; in the context of this paper, these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to addressing the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and the adjacent private workflow. In this paper we focus on three aspects concerning support for the execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose to tightly couple the private workflow and its workflow view with state dependencies, whilst loosely coupling workflow views with control flow dependencies. We introduce a Petri-Net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view task. On the basis of these communication aspects we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and are independent of any particular technology. The concepts are demonstrated by a scenario run by two extended workflow management systems. (C) 2004 Elsevier B.V. All rights reserved.
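The tight state coupling between a private workflow and its view can be sketched as follows (a minimal Python illustration with hypothetical task names and a simplified three-state model, not the paper's Petri-Net formalism):

```python
# Simplified state dependency: a view task mirrors the aggregate state of the
# private tasks bound to it (tight coupling between private workflow and view).
# Task names and the aggregation rule are illustrative assumptions.

PRIVATE_TO_VIEW = {"check_stock": "prepare_order", "pack_goods": "prepare_order"}

def view_state(private_states, view_task):
    """Derive a view task's state from the states of its bound private tasks."""
    bound = [s for t, s in private_states.items()
             if PRIVATE_TO_VIEW.get(t) == view_task]
    if bound and all(s == "completed" for s in bound):
        return "completed"
    if any(s == "running" for s in bound):
        return "running"
    return "not_activated"

states = {"check_stock": "completed", "pack_goods": "running"}
print(view_state(states, "prepare_order"))  # running
```

Control flow dependencies between the views of different organisations would, by contrast, be exchanged as loosely coupled messages rather than derived synchronously as above.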
Abstract:
Human faces and bodies are both complex and interesting perceptual objects, and both convey important social information. Given these similarities between faces and bodies, we can ask how similar the visual processing mechanisms used to recognize them are. It has long been argued that faces are subject to dedicated and unique perceptual processes, but until recently, relatively little research has focused on how we perceive the human body. Some recent paradigms indicate that faces and bodies are processed differently; others show similarities in face and body perception. These similarities and differences depend on the type of perceptual task and the level of processing involved. Future research should take these issues into account.
Abstract:
To determine whether the visuospatial n-back working memory task is a reliable and valid measure of the cognitive processes believed to underlie intelligence, this study compared the reaction times and accuracy of performance of 70 participants with their performance on the Multidimensional Aptitude Battery (MAB). Testing was conducted over two sessions separated by 1 week. Participants completed the MAB during the second test session. Moderate test-retest reliability for percentage accuracy scores was found across the four levels of the n-back task, whilst reaction times were highly reliable. Furthermore, participants' performance on the MAB was negatively correlated with accuracy of performance at the easier levels of the n-back task and positively correlated with accuracy of performance at the harder task levels. These findings confirm previous research examining the cognitive basis of intelligence, and suggest that intelligence is the product of faster speed of information processing, as well as superior working memory capacity. (C) 2004 Elsevier Inc. All rights reserved.
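The reported relationships between n-back accuracy and MAB scores are plain correlation coefficients; a self-contained Pearson-r sketch (the per-participant scores below are fabricated for illustration, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (fabricated) scores, not the study's data:
mab = [95, 110, 102, 120, 88]               # aptitude battery score
acc_hard = [0.52, 0.61, 0.55, 0.70, 0.48]   # accuracy at a hard n-back level
print(round(pearson_r(mab, acc_hard), 2))   # positive, as the study reports
```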
Abstract:
Research has suggested that the integrity of semantic processing may be compromised in Parkinson's disease (PD), which may account for difficulties in complex sentence comprehension. In order to investigate the time course and integrity of semantic activation in PD, 20 patients with PD and 23 healthy controls performed a lexical decision task based on the multi-priming paradigm. Semantic priming effects were measured across stimulus onset asynchronies of 250 ms, 600 ms, and 1200 ms. Further, PD participants performed an auditory comprehension task. The results revealed significantly different patterns of semantic priming for the PD group at the 250-ms and 1200-ms SOAs. In addition, a delayed time course of semantic activation was evident for PD patients with poor comprehension of complex sentences. These results provide further support to suggest that both automatic and controlled aspects of semantic activation may be compromised in PD. Furthermore, the results also suggest that some sentence comprehension deficits in PD may be related to a reduction in information processing speed.
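The priming effect underlying such comparisons is simply the mean lexical-decision RT to unrelated targets minus the mean RT to related targets at each SOA; a minimal sketch with illustrative RT values (not the study's data):

```python
# Semantic priming effect per stimulus onset asynchrony (SOA):
# mean RT (unrelated prime) minus mean RT (related prime), in ms.
# All RT values below are illustrative, not the study's data.

rt_ms = {  # SOA -> (mean RT related, mean RT unrelated)
    250:  (612, 648),
    600:  (598, 641),
    1200: (605, 633),
}

priming = {soa: unrel - rel for soa, (rel, unrel) in rt_ms.items()}
print(priming)  # {250: 36, 600: 43, 1200: 28}
```

A reduced or delayed effect at the short SOA would point to impaired automatic activation, while group differences at the long SOA implicate controlled processing.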
Abstract:
Finding motifs that can elucidate rules governing peptide binding to medically important receptors is important for screening targets for drugs and vaccines. This paper focuses on elucidation of peptide binding to the I-A(g7) molecule of the non-obese diabetic (NOD) mouse - an animal model for insulin-dependent diabetes mellitus (IDDM). A number of motifs describing peptide binding to I-A(g7) have been proposed. These motifs result from independent experimental studies carried out on small data sets. Testing with multiple data sets showed that each of the motifs at best describes only a subset of the solution space, and these motifs therefore lack generalization ability. This study focuses on seeking a motif with higher generalization ability so that it can predict binders in all I-A(g7) data sets with high accuracy. A binding score matrix representing the peptide binding motif to I-A(g7) was derived using a genetic algorithm (GA). The evolved score matrix significantly outperformed previously reported
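A position-specific binding score matrix, and a GA-style fitness that rewards separating binders from non-binders, can be sketched as follows (the core length, the peptides, and the random scores are illustrative assumptions, not the paper's evolved matrix):

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
CORE_LEN = 9  # assumed binding-core length (illustrative)

def random_matrix(rng):
    """One score per (core position, residue) pair; the GA evolves these."""
    return [{aa: rng.uniform(-1, 1) for aa in AMINO_ACIDS}
            for _ in range(CORE_LEN)]

def score(matrix, peptide):
    """Score of the best CORE_LEN window of the peptide under the matrix."""
    return max(sum(matrix[i][peptide[s + i]] for i in range(CORE_LEN))
               for s in range(len(peptide) - CORE_LEN + 1))

def fitness(matrix, binders, nonbinders):
    """Margin by which binders outscore non-binders (the GA maximizes this)."""
    return (min(score(matrix, p) for p in binders)
            - max(score(matrix, p) for p in nonbinders))

rng = random.Random(42)
matrix = random_matrix(rng)
binders = ["GILGFVFTLV", "LLFGYPVYV"]       # made-up example peptides
nonbinders = ["AAAAAAAAA", "GGGGGGGGGG"]
print(fitness(matrix, binders, nonbinders))
```

In the actual study the matrix entries would be evolved against multiple experimental data sets rather than drawn at random, with fitness measured by prediction accuracy across all of them.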
Abstract:
We propose a scheme for quantum information processing based on donor electron spins in semiconductors, with an architecture complementary to the original Kane proposal. We show that a naive implementation of electron spin qubits provides only modest improvement over the Kane scheme; however, through the introduction of global gate control we are able to take full advantage of the fast electron evolution timescales. We estimate that the latent clock speed is 100-1000 times that of the nuclear spin quantum computer, with the ratio T_2/T_ops approaching the 10^6 level.
Abstract:
With growing success in experimental implementations it is critical to identify a gold standard for quantum information processing, a single measure of distance that can be used to compare and contrast different experiments. We enumerate a set of criteria that such a distance measure must satisfy to be both experimentally and theoretically meaningful. We then assess a wide range of possible measures against these criteria, before making a recommendation as to the best measures to use in characterizing quantum information processing.
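One standard candidate for such a distance measure is the trace distance D(rho, sigma) = (1/2) Tr|rho - sigma|; a brief numpy sketch (an illustrative example of the kind of measure under assessment, not necessarily the one the paper ultimately recommends):

```python
import numpy as np

def trace_distance(rho, sigma):
    """D(rho, sigma) = 0.5 * trace|rho - sigma|, via singular values."""
    return 0.5 * np.sum(np.linalg.svd(rho - sigma, compute_uv=False))

# Two single-qubit states: |0><0| and the maximally mixed state I/2.
ket0 = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(trace_distance(ket0, mixed))  # 0.5
```

Trace distance illustrates several of the criteria such a gold standard must satisfy: it is zero exactly for identical states, symmetric, and bounded, which makes results comparable across experiments.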
Abstract:
The notorious "dimensionality curse" is a well-known phenomenon for any multi-dimensional index attempting to scale up to high dimensions. One well-known approach to overcoming degradation in performance with respect to increasing dimensions is to reduce the dimensionality of the original dataset before constructing the index. However, identifying the correlation among the dimensions and effectively reducing them are challenging tasks. In this paper, we present an adaptive Multi-level Mahalanobis-based Dimensionality Reduction (MMDR) technique for high-dimensional indexing. Our MMDR technique has four notable features compared to existing methods. First, it discovers elliptical clusters for more effective dimensionality reduction by using only the low-dimensional subspaces. Second, data points in the different axis systems are indexed using a single B+-tree. Third, our technique is highly scalable in terms of data size and dimension. Finally, it is also dynamic and adaptive to insertions. An extensive performance study was conducted using both real and synthetic datasets, and the results show that our technique not only achieves higher precision, but also enables queries to be processed efficiently. Copyright Springer-Verlag 2005
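The Mahalanobis distance at the heart of MMDR accounts for the elliptical shape of a cluster, unlike plain Euclidean distance; a brief numpy illustration on synthetic data (not the paper's datasets or its full multi-level scheme):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of point x from a cluster (mean, covariance)."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(0)
# An elliptical cluster: variance 4 along axis 0, variance 1 along axis 1.
cluster = rng.multivariate_normal([0, 0], [[4, 0], [0, 1]], size=500)
mean, cov = cluster.mean(axis=0), np.cov(cluster.T)

# Two points at equal Euclidean distance from the centre: the one lying
# along the cluster's long (high-variance) axis is "closer" under Mahalanobis.
print(mahalanobis(np.array([2.0, 0.0]), mean, cov))  # ~1 (long axis)
print(mahalanobis(np.array([0.0, 2.0]), mean, cov))  # ~2 (short axis)
```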
Abstract:
Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning, and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime, and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to the specified constraints does not pose great difficulty, ensuring that the constraint set itself does not carry conflicts and redundancy is an interesting and challenging problem. In this paper, we provide a discussion of both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
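The kind of conflict a constraint set can carry is easy to illustrate on a toy case: ordering ("serial") constraints conflict exactly when they form a cycle. A minimal sketch (task names and the constraint encoding are hypothetical, and far simpler than the paper's framework):

```python
# Toy verification of a flexibility-constraint set (names hypothetical):
# each pair (a, b) means "a must precede b"; a cycle is a conflict.

def find_order_conflict(serial_constraints):
    """Return a task that lies on a precedence cycle, or None if conflict-free."""
    after = {}
    for a, b in serial_constraints:
        after.setdefault(a, set()).add(b)

    def reachable(start, target, seen):
        for nxt in after.get(start, ()):
            if nxt == target or (nxt not in seen
                                 and reachable(nxt, target, seen | {nxt})):
                return True
        return False

    for task in after:
        if reachable(task, task, {task}):
            return task
    return None

ok = [("approve", "ship"), ("pack", "ship")]
bad = ok + [("ship", "approve")]
print(find_order_conflict(ok))   # None
print(find_order_conflict(bad))  # a task on the cycle, e.g. "approve"
```

Static verification checks the constraint set once at design time as above; dynamic verification must additionally check each partially specified instance as it is completed at runtime.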