939 results for INFORMATION PROCESSING
Abstract:
Based on a simple convexity lemma, we develop bounds for different types of Bayesian prediction errors for regression with Gaussian processes. The basic bounds are formulated for a fixed training set. Simpler expressions are obtained for sampling from an input distribution which equals the weight function of the covariance kernel, yielding asymptotically tight results. The results are compared with numerical experiments.
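The bounds above concern the Bayesian posterior of Gaussian process regression for a fixed training set. As a hedged illustration only (this is the standard GP posterior, not the paper's convexity-lemma derivation; the kernel choice and all names are mine), a minimal numpy sketch:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential (RBF) covariance kernel."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(X_train, y_train, X_test, noise=0.1, lengthscale=1.0):
    """Posterior mean and pointwise variance of GP regression
    for a fixed training set, via a Cholesky factorisation."""
    K = rbf_kernel(X_train, X_train, lengthscale) + noise ** 2 * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test, lengthscale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    # posterior variance = prior variance minus the explained part
    var = np.ones(len(X_test)) - (v ** 2).sum(0)
    return mean, var
```

The posterior variance is the quantity whose average over test inputs appears in Bayesian prediction-error bounds of this type; it never exceeds the prior variance.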
Abstract:
We discuss the application of TAP mean-field methods, known from the statistical mechanics of disordered systems, to Bayesian classification with Gaussian processes. In contrast to previous applications, no knowledge about the distribution of inputs is needed. Simulation results for the Sonar data set are given.
Abstract:
We apply methods of statistical mechanics to study the generalization performance of Support Vector Machines in large data spaces.
Abstract:
We employ the methods presented in the previous chapter for decoding corrupted codewords, encoded using sparse parity-check error-correcting codes. We show the similarity between the equations derived from the TAP approach and those obtained from belief propagation, and examine their performance as practical decoding methods.
Abstract:
We analyse Gallager codes by employing a simple mean-field approximation that distorts the model geometry while preserving important interactions between sites. The method naturally recovers the probability propagation decoding algorithm as the minimization of a proper free-energy. We find a thermodynamic phase transition that coincides with information-theoretic upper bounds, and explain the practical code performance in terms of the free-energy landscape.
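The probability (belief) propagation decoder recovered by this free-energy minimization can be sketched generically. The following is a textbook sum-product decoder for an arbitrary parity-check matrix, a hedged illustration rather than the authors' implementation (function names, the clipping constants and the toy code in the usage note are my own choices):

```python
import numpy as np

def bp_decode(H, llr, n_iter=20):
    """Sum-product belief propagation decoding of a binary linear code
    with parity-check matrix H, given channel log-likelihood ratios llr
    (llr > 0 favours bit 0). Returns a hard-decision bit vector."""
    m, n = H.shape
    msg_cv = np.zeros((m, n))  # check-to-variable messages (log domain)
    hard = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # variable-to-check: total belief minus the incoming edge message
        total = llr + msg_cv.sum(0)
        msg_vc = np.where(H != 0, total[None, :] - msg_cv, 0.0)
        # check-to-variable: tanh rule over the other edges of each check
        t = np.tanh(np.clip(msg_vc, -30, 30) / 2.0)
        for j in range(m):
            idx = np.flatnonzero(H[j])
            for i in idx:
                prod = np.prod(t[j, idx[idx != i]])
                msg_cv[j, i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        hard = (llr + msg_cv.sum(0) < 0).astype(int)
        if not np.any((H @ hard) % 2):  # all parity checks satisfied
            break
    return hard
```

On a (7,4) Hamming code with one channel-flipped bit, the decoder recovers the transmitted codeword within a few iterations; the breakdown of such convergence at high noise corresponds to the phase transition discussed above.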
Abstract:
We combine the replica approach from statistical physics with a variational approach to analyze learning curves analytically. We apply the method to Gaussian process regression. As a main result, we derive approximate relations between empirical error measures, the generalization error and the posterior variance.
Abstract:
The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.
Abstract:
We consider the problem of illusory or artefactual structure in the visualisation of high-dimensional structureless data. In particular we examine the role of the distance metric in the use of topographic mappings based on the statistical field of multidimensional scaling. We show that the use of a squared Euclidean metric (i.e. the SSTRESS measure) gives rise to an annular structure when the input data is drawn from a high-dimensional isotropic distribution, and we provide a theoretical justification for this observation.
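The annular artefact is driven by distance concentration: for isotropic high-dimensional data all pairwise distances are nearly equal, so any embedding that tries to preserve squared Euclidean distances must place the points roughly equidistantly, i.e. on a ring. A minimal numpy demonstration of the concentration effect (illustrative only, not the paper's theoretical justification; the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 1000
X = rng.standard_normal((n, d))  # high-dimensional isotropic, structureless data

# Pairwise squared Euclidean distances via the Gram matrix
sq_norms = (X ** 2).sum(axis=1)
D2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
off_diag = D2[~np.eye(n, dtype=bool)]

# The expected squared distance is 2*d, and the relative spread shrinks
# like O(1/sqrt(d)): all points are nearly equidistant from one another.
mean_d2 = off_diag.mean()
rel_spread = off_diag.std() / mean_d2
```

With d = 1000 the relative spread is only a few percent, which is why an SSTRESS-based mapping of such data produces a ring rather than the featureless blob one might expect.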
Abstract:
The Thouless-Anderson-Palmer (TAP) approach was originally developed for analysing the Sherrington-Kirkpatrick model in the study of spin glass models, and has since been employed mainly in the context of extensively connected systems, in which each dynamical variable interacts weakly with the others. Recently, we extended this method to handle general intensively connected systems, where each variable has only O(1) connections characterised by strong couplings. However, the new formulation differs considerably from existing analyses, and it is natural to ask whether it actually reproduces known results for systems of extensive connectivity. In this chapter, we apply our formulation of the TAP approach to an extensively connected system, the Hopfield associative memory model, showing that it produces results identical to those obtained by the conventional formulation.
Abstract:
This multi-modal investigation aimed to refine analytic tools, including proton magnetic resonance spectroscopy (1H-MRS) and fatty acid gas chromatography-mass spectrometry (GC-MS) analysis, for use with adult and paediatric populations, to investigate potential biochemical underpinnings of cognition (Chapter 1). Essential fatty acids (EFAs) are vital for the normal development and function of neural cells. There is increasing evidence of behavioural impairments arising from dietary deprivation of EFAs and their long-chain fatty acid metabolites (Chapter 2). Paediatric liver disease was used as a deficiency model to examine the relationships between EFA status and cognitive outcomes. Age-appropriate Wechsler assessments measured Full-scale IQ (FSIQ) and Information Processing Speed (IPS) in clinical and healthy cohorts; GC-MS quantified surrogate markers of EFA status in erythrocyte membranes; and 1H-MRS quantified neurometabolite markers of neuronal viability and function in cortical tissue (Chapter 3). Post-transplant children with early-onset liver disease demonstrated specific deficits in IPS compared to age-matched acute liver failure transplant patients and sibling controls, suggesting that the time-course of the illness is a key factor (Chapter 4). No signs of EFA deficiency were observed in the clinical cohort, suggesting that EFA metabolism was not significantly impacted by liver disease. A strong, negative correlation was observed between omega-6 fatty acids and FSIQ, independent of disease diagnosis (Chapter 5). In a study of healthy adults, effect sizes for the relationship between 1H-MRS-detectable neurometabolites and cognition fell within the range of previous work, but were not statistically significant. Based on these findings, recommendations are made emphasising the need for hypothesis-driven enquiry and greater subtlety of data analysis (Chapter 6).
Consistency of metabolite values between paediatric clinical cohorts and controls indicates normal neurodevelopment, but the lack of normative, age-matched data makes it difficult to assess the true strength of liver disease-associated metabolite changes (Chapter 7). Converging methods offer a challenging but promising and novel approach to exploring brain-behaviour relationships from micro- to macroscopic levels of analysis (Chapter 8).
Abstract:
The nature of Discrete-Event Simulation (DES) and the use of DES in organisations is changing. Two important developments are the use of Visual Interactive Modelling systems and the use of DES in Business Process Management (BPM) projects. Survey research is presented showing that, despite these developments, usage of DES remains relatively low due to a lack of knowledge of the benefits of the technique. This paper considers two factors that could lead to a greater achievement and appreciation of the full benefit of DES and thus to greater usage. Firstly, in relation to using DES to investigate social systems, a 'soft' approach, both in the process of undertaking a simulation project and in the interpretation of the findings, may generate more knowledge from the DES intervention and thus increase its benefit to businesses. Secondly, in order to assess the full range of outcomes of DES, the technique could be considered from the perspective of an information-processing tool within the organisation. This allows outcomes to be considered under the three modes of organisational information use: sense making, knowledge creating and decision making. These relate respectively to the theoretical areas of knowledge management, organisational learning and decision making. The association of DES with these popular techniques could further increase its usage in business.
Abstract:
Different forms of strategic flexibility allow for reactive adaptation to different changing environments and the proactive driving of change. It is therefore becoming increasingly important for decision makers to not only possess marketing capabilities, but also the capabilities for strategic flexibility in its various forms. However, our knowledge of the relationships between decision makers’ different ways of thinking and their capabilities for strategic flexibility is limited. This limitation is constraining research and understanding. In this article we develop a theoretical cognitive content framework that postulates relationships between different ways of thinking about strategy and different information-processing demands. We then outline how the contrasting beliefs of decision makers may influence their capabilities to generate different hybrid forms of strategic flexibility at the cognitive level. Theoretically, the framework is embedded in resource-based theory, personal construct theory and schema theory. The implications for research and theory are discussed.
Abstract:
Research on diversity in teams and organizations has revealed ambiguous results regarding the effects of group composition on workgroup performance. The categorization-elaboration model (van Knippenberg et al., 2004) accounts for this variety and proposes two different underlying processes. On the one hand, diversity may bring about intergroup bias, which leads to less group identification, which in turn is followed by more conflict and decreased workgroup performance. On the other hand, the information-processing approach proposes positive effects of diversity because of a more elaborate processing of information, brought about by a wider pool and variety of perspectives in more diverse groups. We propose that the former process is contingent on individual team members' beliefs that diversity is good or bad for achieving the team's aims. We predict that the relationship between subjective diversity and identification is more positive in ethnically diverse project teams when group members hold beliefs that are pro-diversity. Results of two longitudinal studies involving postgraduate students working in project teams confirm this hypothesis. Analyses further reveal that group identification is positively related to students' desire to stay in their groups and to their information elaboration. Finally, we found evidence for the expected moderated mediation model with indirect effects of subjective diversity on elaboration and the desire to stay, mediated through group identification, moderated by diversity beliefs.
Abstract:
Introductory accounts of artificial neural networks often rely for motivation on analogies with models of information processing in biological networks. One limitation of such an approach is that it offers little guidance on how to find optimal algorithms, or how to verify the correct performance of neural network systems. A central goal of this paper is to draw attention to a quite different viewpoint in which neural networks are seen as algorithms for statistical pattern recognition based on a principled, i.e. theoretically well-founded, framework. We illustrate the concept of a principled viewpoint by considering a specific issue concerned with the interpretation of the outputs of a trained network. Finally, we discuss the relevance of such an approach to the issue of the validation and verification of neural network systems.
Abstract:
In 2002, we published a paper [Brock, J., Brown, C., Boucher, J., Rippon, G., 2002. The temporal binding deficit hypothesis of autism. Development and Psychopathology 14(2), 209-224] highlighting the parallels between the psychological model of 'central coherence' in information processing [Frith, U., 1989. Autism: Explaining the Enigma. Blackwell, Oxford] and the neuroscience model of neural integration or 'temporal binding'. We proposed that autism is associated with abnormalities of information integration that are caused by a reduction in the connectivity between specialised local neural networks in the brain and possible overconnectivity within the isolated individual neural assemblies. The current paper updates this model, providing a summary of theoretical and empirical advances in research implicating disordered connectivity in autism. This is in the context of changes in the approach to the core psychological deficits in autism, of greater emphasis on 'interactive specialisation' and the resultant stress on early and/or low-level deficits and their cascading effects on the developing brain [Johnson, M.H., Halit, H., Grice, S.J., Karmiloff-Smith, A., 2002. Neuroimaging of typical and atypical development: a perspective from multiple levels of analysis. Development and Psychopathology 14, 521-536]. We also highlight recent developments in the measurement and modelling of connectivity, particularly in the emerging ability to track the temporal dynamics of the brain using electroencephalography (EEG) and magnetoencephalography (MEG) and to investigate the signal characteristics of this activity. This advance could be particularly pertinent in testing an emerging model of effective connectivity based on the balance between excitatory and inhibitory cortical activity [Rubenstein, J.L., Merzenich, M.M., 2003. Model of autism: increased ratio of excitation/inhibition in key neural systems.
Genes, Brain and Behavior 2, 255-267; Brown, C., Gruber, T., Rippon, G., Brock, J., Boucher, J., 2005. Gamma abnormalities during perception of illusory figures in autism. Cortex 41, 364-376]. Finally, we note that this convergence of research developments not only enables a greater understanding of autism but also has implications for prevention and remediation.