94 results for Textual information processing


Relevance: 80.00%

Abstract:

This multi-modal investigation aimed to refine analytic tools, including proton magnetic resonance spectroscopy (1H-MRS) and fatty acid gas chromatography-mass spectrometry (GC-MS) analysis, for use with adult and paediatric populations, to investigate potential biochemical underpinnings of cognition (Chapter 1). Essential fatty acids (EFAs) are vital for the normal development and function of neural cells. There is increasing evidence of behavioural impairments arising from dietary deprivation of EFAs and their long-chain fatty acid metabolites (Chapter 2). Paediatric liver disease was used as a deficiency model to examine the relationships between EFA status and cognitive outcomes. Age-appropriate Wechsler assessments measured Full-scale IQ (FSIQ) and Information Processing Speed (IPS) in clinical and healthy cohorts; GC-MS quantified surrogate markers of EFA status in erythrocyte membranes; and 1H-MRS quantified neurometabolite markers of neuronal viability and function in cortical tissue (Chapter 3). Post-transplant children with early-onset liver disease demonstrated specific deficits in IPS compared to age-matched acute liver failure transplant patients and sibling controls, suggesting that the time-course of the illness is a key factor (Chapter 4). No signs of EFA deficiency were observed in the clinical cohort, suggesting that EFA metabolism was not significantly impacted by liver disease. A strong, negative correlation was observed between omega-6 fatty acids and FSIQ, independent of disease diagnosis (Chapter 5). In a study of healthy adults, effect sizes for the relationship between 1H-MRS-detectable neurometabolites and cognition fell within the range of previous work, but were not statistically significant. Based on these findings, recommendations are made emphasising the need for hypothesis-driven enquiry and greater subtlety of data analysis (Chapter 6). Consistency of metabolite values between paediatric clinical cohorts and controls indicates normal neurodevelopment, but the lack of normative, age-matched data makes it difficult to assess the true strength of liver disease-associated metabolite changes (Chapter 7). Converging methods offer a challenging but promising and novel approach to exploring brain-behaviour relationships from micro- to macroscopic levels of analysis (Chapter 8).

Relevance: 80.00%

Abstract:

The nature of Discrete-Event Simulation (DES) and the use of DES in organisations are changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented showing that, despite these developments, usage of DES remains relatively low due to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to a greater achievement and appreciation of the full benefit of DES, and thus to greater usage. Firstly, in relation to using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered from the perspective of an information processing tool within the organisation. This allows outcomes to be considered under the three modes of organisational information use (sense making, knowledge creating and decision making), which relate to the theoretical areas of knowledge management, organisational learning and decision making respectively. The association of DES with these popular techniques could further increase its usage in business.
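
The event-driven mechanics underlying DES can be sketched in a few lines. The single-server queue below is an invented illustration, not a system from the paper; it uses a heap-ordered event list, the core data structure of most DES engines.

```python
import heapq
import random

def simulate_queue(arrival_rate=1.0, service_rate=1.5, horizon=1000.0, seed=1):
    """Minimal single-server discrete-event simulation.

    Events are (time, kind) tuples kept in a heap and processed in
    time order -- the core loop of any DES engine.  Returns the number
    of customers served before the horizon."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    in_system = 0   # customers waiting or in service
    served = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            in_system += 1
            if in_system == 1:  # server was idle: begin service at once
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            # schedule the next arrival (Poisson process)
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
        else:  # departure
            in_system -= 1
            served += 1
            if in_system > 0:   # start serving the next waiting customer
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
    return served
```

With the default rates the server utilisation is 2/3, so most of the roughly one thousand arrivals over the horizon are served.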

Relevance: 80.00%

Abstract:

Different forms of strategic flexibility allow for reactive adaptation to different changing environments and the proactive driving of change. It is therefore becoming increasingly important for decision makers not only to possess marketing capabilities, but also the capabilities for strategic flexibility in its various forms. However, our knowledge of the relationships between decision makers’ different ways of thinking and their capabilities for strategic flexibility is limited, and this limitation constrains both research and understanding. In this article we develop a theoretical cognitive content framework that postulates relationships between different ways of thinking about strategy and different information-processing demands. We then outline how the contrasting beliefs of decision makers may influence their capability to generate different hybrid forms of strategic flexibility at the cognitive level. Theoretically, the framework is embedded in resource-based theory, personal construct theory and schema theory. The implications for research and theory are discussed.

Relevance: 80.00%

Abstract:

Research on diversity in teams and organizations has revealed ambiguous results regarding the effects of group composition on workgroup performance. The categorization-elaboration model (van Knippenberg et al., 2004) accounts for this variety and proposes two different underlying processes. On the one hand, diversity may bring about intergroup bias, which leads to less group identification, which in turn is followed by more conflict and decreased workgroup performance. On the other hand, the information processing approach proposes positive effects of diversity because of a more elaborate processing of information, brought about by a wider pool and variety of perspectives in more diverse groups. We propose that the former process is contingent on individual team members' beliefs that diversity is good or bad for achieving the team's aims. We predict that the relationship between subjective diversity and identification is more positive in ethnically diverse project teams when group members hold pro-diversity beliefs. Results of two longitudinal studies involving postgraduate students working in project teams confirm this hypothesis. Analyses further reveal that group identification is positively related to students' desire to stay in their groups and to their information elaboration. Finally, we found evidence for the expected moderated mediation model, with indirect effects of subjective diversity on elaboration and the desire to stay mediated through group identification and moderated by diversity beliefs.

Relevance: 80.00%

Abstract:

Introductory accounts of artificial neural networks often rely for motivation on analogies with models of information processing in biological networks. One limitation of such an approach is that it offers little guidance on how to find optimal algorithms, or how to verify the correct performance of neural network systems. A central goal of this paper is to draw attention to a quite different viewpoint in which neural networks are seen as algorithms for statistical pattern recognition based on a principled, i.e. theoretically well-founded, framework. We illustrate the concept of a principled viewpoint by considering a specific issue concerned with the interpretation of the outputs of a trained network. Finally, we discuss the relevance of such an approach to the issue of the validation and verification of neural network systems.
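
The "principled" interpretation referred to above is that the outputs of a suitably trained network approximate posterior class probabilities. A minimal sketch of the usual mechanism is a softmax output layer (the logit values below are invented for illustration):

```python
import math

def softmax(logits):
    """Convert raw network outputs (logits) into values that can be read
    as posterior class probabilities: each is positive and they sum to one."""
    m = max(logits)                      # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits for three classes; the largest logit gets the
# largest posterior probability.
probs = softmax([2.0, 1.0, 0.1])
```

Because the outputs sum to one, they can be verified and validated against held-out data as probability estimates, which is the kind of check the statistical viewpoint makes possible.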

Relevance: 80.00%

Abstract:

In 2002, we published a paper [Brock, J., Brown, C., Boucher, J., Rippon, G., 2002. The temporal binding deficit hypothesis of autism. Development and Psychopathology 14 (2), 209-224] highlighting the parallels between the psychological model of 'central coherence' in information processing [Frith, U., 1989. Autism: Explaining the Enigma. Blackwell, Oxford] and the neuroscience model of neural integration or 'temporal binding'. We proposed that autism is associated with abnormalities of information integration that are caused by a reduction in the connectivity between specialised local neural networks in the brain and possible overconnectivity within the isolated individual neural assemblies. The current paper updates this model, providing a summary of theoretical and empirical advances in research implicating disordered connectivity in autism. This is in the context of changes in the approach to the core psychological deficits in autism, of greater emphasis on 'interactive specialisation' and the resultant stress on early and/or low-level deficits and their cascading effects on the developing brain [Johnson, M.H., Halit, H., Grice, S.J., Karmiloff-Smith, A., 2002. Neuroimaging of typical and atypical development: a perspective from multiple levels of analysis. Development and Psychopathology 14, 521-536]. We also highlight recent developments in the measurement and modelling of connectivity, particularly in the emerging ability to track the temporal dynamics of the brain using electroencephalography (EEG) and magnetoencephalography (MEG) and to investigate the signal characteristics of this activity. This advance could be particularly pertinent in testing an emerging model of effective connectivity based on the balance between excitatory and inhibitory cortical activity [Rubenstein, J.L., Merzenich, M.M., 2003. Model of autism: increased ratio of excitation/inhibition in key neural systems. Genes, Brain and Behavior 2, 255-267; Brown, C., Gruber, T., Rippon, G., Brock, J., Boucher, J., 2005. Gamma abnormalities during perception of illusory figures in autism. Cortex 41, 364-376]. Finally, we note that this convergence of research developments not only enables a greater understanding of autism but also has implications for prevention and remediation.

Relevance: 80.00%

Abstract:

The effect of having a fixed differential group delay (DGD) term in the coarse step method is a periodic pattern in the simulated statistics; this artefact can be avoided by inserting a varying DGD term at each integration step, drawn from a Gaussian distribution. Simulation results are given to illustrate the phenomenon and provide some evidence about its statistical nature.

Relevance: 80.00%

Abstract:

The visual evoked magnetic response (VEMR) was measured over the occipital cortex to pattern and flash stimuli in 86 normal subjects aged 15-86 years. The latency of the major positive component (outgoing magnetic field) to the pattern reversal stimulus (P100M) increased with age, particularly after 55 years, while the amplitude of the P100M decreased more gradually over the lifespan. By contrast, the latency of the major positive component to the flash stimulus (P2M) increased more slowly with age after about 50 years, while its amplitude may have decreased in only a proportion of the elderly subjects. The changes in the P100M with age may reflect senile changes in the eye and optic nerve, e.g. senile miosis, degenerative changes in the retina or geniculostriate deficits. The P2M may be more susceptible to senile changes in the visual cortex. The data suggest that the contrast channels of visual information processing deteriorate more rapidly with age than the luminance channels.

Relevance: 80.00%

Abstract:

Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial, but non-trivial task, especially when the system is nonlinear and multimodal. We propose a variational treatment of diffusion processes, which allows us to compute type II maximum likelihood estimates of the parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. We also show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
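
The paper's variational machinery is beyond a short sketch, but its key ingredient, type II maximum likelihood (gradient ascent on a marginal likelihood in which the latent variable has been integrated out), can be illustrated on a toy conjugate model. The model and its numbers below are assumptions for illustration, not the diffusion setting of the paper.

```python
import math

def log_marginal(alpha, xs):
    """Log marginal likelihood of the toy model x_i = z + eps_i, with the
    latent z ~ N(0, alpha) and noise eps_i ~ N(0, 1).  z is integrated out
    analytically; Sherman-Morrison handles the rank-one covariance
    alpha*11^T + I, giving a closed-form determinant and quadratic form."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    quad = sxx - (alpha / (1.0 + n * alpha)) * sx * sx
    return -0.5 * (quad + math.log(1.0 + n * alpha) + n * math.log(2.0 * math.pi))

def type2_ml(xs, alpha=0.5, lr=0.5, steps=2000, h=1e-5):
    """Type II maximum likelihood: simple gradient ascent on the marginal
    likelihood over the hyperparameter alpha (central-difference gradient)."""
    for _ in range(steps):
        grad = (log_marginal(alpha + h, xs) - log_marginal(alpha - h, xs)) / (2.0 * h)
        alpha = max(alpha + lr * grad, 1e-6)   # keep the variance positive
    return alpha
```

For the sample [2.1, 1.9, 2.0, 2.2, 1.8] the stationarity condition gives alpha = (sum x)^2/n^2 - 1/n = 3.8, and the gradient ascent converges to it; the paper applies the same principle with the variational free energy standing in for the exact marginal likelihood.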

Relevance: 80.00%

Abstract:

The assessment of the reliability of systems which learn from data is a key issue to investigate thoroughly before the actual application of information processing techniques to real-world problems. Over recent years, Gaussian processes and Bayesian neural networks have come to the fore, and in this thesis their generalisation capabilities are analysed from theoretical and empirical perspectives. Upper and lower bounds on the learning curve of Gaussian processes are investigated in order to estimate the amount of data required to guarantee a certain level of generalisation performance. In this thesis we analyse the effects on the bounds and the learning curve induced by the smoothness of stochastic processes described by four different covariance functions. We also explain the early, linearly-decreasing behaviour of the curves and we investigate the asymptotic behaviour of the upper bounds. The effects of the noise and of the characteristic lengthscale of the stochastic process on the tightness of the bounds are also discussed. The analysis is supported by several numerical simulations. The generalisation error of a Gaussian process is affected by the dimension of the input vector and may be decreased by input-variable reduction techniques. In conventional approaches to Gaussian process regression, the positive definite matrix estimating the distance between input points is often taken to be diagonal. In this thesis we show that a general distance matrix is able to estimate the effective dimensionality of the regression problem as well as to discover the linear transformation from the manifest variables to the hidden-feature space, with a significant reduction of the input dimension. Numerical simulations confirm the significant superiority of the general distance matrix with respect to the diagonal one. The thesis also presents an empirical investigation of the generalisation errors of neural networks trained by two Bayesian algorithms, the Markov Chain Monte Carlo method and the evidence framework; the neural networks have been trained on the task of labelling segmented outdoor images.
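
The general-distance-matrix idea can be made concrete with a small sketch. The parameterisation below is an assumption for illustration, not the thesis's exact formulation: a covariance k(x, x') = exp(-(x - x')^T M (x - x')) with a full matrix M can ignore whole directions of input space, which is the effective-dimensionality reduction described above.

```python
import math

def rbf_kernel(x1, x2, M):
    """Squared-exponential covariance with a general distance matrix M:
    k(x, x') = exp(-(x - x')^T M (x - x')).  A diagonal M recovers the
    conventional per-dimension lengthscales; a general M additionally
    encodes a linear transformation of the inputs."""
    d = [a - b for a, b in zip(x1, x2)]
    quad = sum(d[i] * M[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))
    return math.exp(-quad)

# A rank-one M = L^T L with L = [1, 1]: distance is measured only along
# the direction (1, 1), so the kernel is blind to movement along (1, -1)
# -- the effective input dimensionality drops from two to one.
M = [[1.0, 1.0], [1.0, 1.0]]
```

For example, the covariance between (0, 0) and (5, -5) stays at its maximum because the displacement lies entirely in the ignored direction, while (0, 0) versus (1, 1) is attenuated.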

Relevance: 80.00%

Abstract:

Prior research suggests management can employ cognitively demanding job attributes to promote employee creativity. However, it is not clear what specific type of cognitive demand is particularly important for creativity, what processes underpin the relationship between demanding job conditions and creativity, and what factors lead to employee perceptions of demanding job attributes. This research sets out to address these issues by examining: (i) problem-solving demand (PSD), a specific type of cognitive demand, and the processes that link PSD to creativity; and (ii) antecedents to PSD. Based on social cognitive theory, PSD was hypothesized to be positively related to creativity through the motivational mechanism of creative self-efficacy. However, the relationship between PSD and creative self-efficacy was hypothesized to be contingent on levels of intrinsic motivation. The social information processing perspective and the job crafting model were used to identify antecedents of PSD. Consequently, two social-contextual factors (supervisor developmental feedback and job autonomy) and one individual factor (proactive personality) were hypothesized to be precursors to PSD perceptions. The theorized model was tested with data obtained from a sample of 270 employees and their supervisors from three organisations in the People’s Republic of China. Regression results revealed that PSD was positively related to creativity, but this relationship was partially mediated by creative self-efficacy. Additionally, intrinsic motivation moderated the relationship between PSD and creative self-efficacy such that the relationship was stronger for individuals high rather than low in intrinsic motivation. The findings represent a productive first step in identifying a specific cognitive demand that is conducive to employee creativity. In addition, the findings contribute to the literature by identifying a psychological mechanism that may link cognitively demanding job attributes and creativity.
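
The moderation result can be illustrated with a deliberately simplified sketch: rather than the moderated-regression models estimated in the study, the invented ratings below simply compare the PSD-self-efficacy correlation in high- versus low-intrinsic-motivation subgroups.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Hypothetical ratings (invented, not the study's data): problem-solving
# demand and creative self-efficacy, split by the moderator.
psd = [1, 2, 3, 4, 5]
efficacy_high_motivation = [1.1, 2.0, 2.9, 4.2, 5.0]   # tight coupling
efficacy_low_motivation = [2.0, 1.0, 3.0, 2.0, 3.5]    # loose coupling
```

The stronger subgroup correlation for high intrinsic motivation is the qualitative pattern that a moderated regression formalises with a PSD x motivation interaction term.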

Relevance: 80.00%

Abstract:

Exporting is one of the main ways in which organizations internationalize. With a more turbulent, heterogeneous, sophisticated and less familiar export environment, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building from an export information processing framework, this research focuses on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been identified: instrumental, conceptual, legitimizing and manipulating. Results from the quantitative study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn was found to be positively related to export performance. Furthermore, results of the study show that only one aspect of export memory use significantly affects export performance: the extent of export memory use. This finding could mean that no particular type of export memory use is favoured, since the choice of the type of use is situation specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.

Relevance: 80.00%

Abstract:

The existence of different varieties of the acquired reading disorder termed "phonological dyslexia" is demonstrated in this thesis. The data are interpreted in terms of an information-processing model of normal reading which postulates autonomous routes for pronouncing lexical and non-lexical items and identifies a number of separable sub-processes within both lexical and non-lexical routes. A case study approach is used, and case reports are presented on ten patients who have particular difficulty in processing non-lexical stimuli following cerebral insult. Chapters 1 and 2 describe the theoretical background to the investigation. Cognitive models of reading are examined in Chapter 1, and the theoretical status of the current taxonomy of the acquired dyslexias is discussed in relation to the models. In Chapter 2 the symptoms associated with phonological dyslexia are discussed both in terms of the theoretical models and in terms of the consistency with which they are reported to occur in clinical studies. Published cases of phonological dyslexia are reviewed. Chapter 3 describes the tests administered and the analysis of error responses. The majority of tests require reading aloud of single lexical or non-lexical items and investigate the effect of different variables on reading performance. Chapter 4 contains the case reports. The final chapter summarises the different patterns of reading behaviour observed. The theoretical model predicts the selective impairment of subsystems within the phonological route, and the data provide evidence of such selective impairment. It is concluded that there are different varieties of phonological dyslexia corresponding to the different loci of impairment within the phonological route. It is also concluded that the data support the hypothesis that there are two lexical routes. A further subdivision of phonological dyslexia is made on the basis of selective impairment of the direct or lexical-semantic routes.
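
The dual-route architecture referred to above lends itself to a toy sketch. The mini-lexicon and grapheme-phoneme rules below are invented, and real pronunciations are simplified to plain ASCII strings.

```python
# Toy dual-route reading model: a lexical route that looks words up
# whole, and a non-lexical route that assembles a pronunciation letter
# by letter from grapheme-phoneme correspondence rules.
LEXICON = {"cat": "kat", "have": "hav", "yacht": "yot"}     # invented entries
GPC_RULES = {"a": "a", "c": "k", "h": "h", "t": "t", "v": "v", "y": "y"}

def read_aloud(item, nonlexical_intact=True):
    """Pronounce via the lexical route when the item is a known word;
    otherwise fall back on the non-lexical route.  Setting
    nonlexical_intact=False mimics phonological dyslexia: real words
    are still read, but non-words fail."""
    if item in LEXICON:                       # lexical route
        return LEXICON[item]
    if nonlexical_intact:                     # non-lexical route
        return "".join(GPC_RULES.get(letter, "?") for letter in item)
    return None                               # no intact route: reading fails
```

With the non-lexical route disabled, "yacht" (a lexical item with irregular spelling) is still read correctly while the non-word "vat" fails, which is the dissociation the patient data demonstrate.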

Relevance: 80.00%

Abstract:

Safety enforcement practitioners within Europe and marketers, designers or manufacturers of consumer products need to determine compliance with the legal test of "reasonable safety" for consumer goods, to reduce the "risks" of injury to the minimum. To enable freedom of movement of products, a method for safety appraisal is required for use as an "expert" system of hazard analysis by non-experts in safety testing of consumer goods for implementation consistently throughout Europe. Safety testing approaches and the concept of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contribution of risk assessment, hazard perception, and information processing. The model develops a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through use of a series of systematic checklists and matrices and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select participating Trading Standards Departments, to explore the contribution of potential subjective influences, to establish views regarding the usability and reliability of the model and any preferences for the risk assessment scoring system used. The outcome of the two stage hazard analysis and risk assessment process is considered to determine consistency in results of hazard analysis, final decisions regarding the safety of the sample product and to determine any correlation in the decisions made using the model and alternative scoring methods of risk assessment. The research also identifies a number of opportunities for future work, and indicates a number of areas where further work has already begun.
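
The "numerical methods for calculating a final product safety risk assessment score" can be given a hypothetical sketch. The product rule and the band thresholds below are assumptions for illustration, not the thesis's calibrated model.

```python
def risk_band(severity, likelihood):
    """Toy two-factor risk matrix: severity and likelihood each rated 1-5,
    combined multiplicatively into a score and an action band.  The
    thresholds are illustrative assumptions only."""
    if not (1 <= severity <= 5 and 1 <= likelihood <= 5):
        raise ValueError("ratings must be between 1 and 5")
    score = severity * likelihood
    if score >= 15:
        return score, "unacceptable"
    if score >= 6:
        return score, "review required"
    return score, "acceptable"
```

A checklist-driven appraisal of the kind described above would feed hazard-analysis ratings into such a matrix so that different assessors reach the same banded decision from the same ratings.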

Relevance: 80.00%

Abstract:

Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
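
The sensitivity/criterion distinction drawn above comes from signal detection theory and is straightforward to compute from a session's response counts. The sketch below uses the standard equal-variance Gaussian model (the counts are invented; statistics.NormalDist is in the Python standard library from 3.8):

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response criterion (c) from the response
    counts of a monitoring session, under the standard equal-variance
    Gaussian signal detection model."""
    z = NormalDist().inv_cdf           # inverse of the standard normal CDF
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    d_prime = z(hit_rate) - z(fa_rate)           # perceptual efficiency
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # strictness of responding
    return d_prime, criterion
```

A vigilance decrement of the "criterion shift" kind described above would appear over successive work periods as a rising criterion with an unchanged d'; a genuine sensitivity decrement would appear as a falling d'.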