998 results for Quantum cognition
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
Abstract:
Much of the work currently occurring in the field of Quantum Interaction (QI) relies upon projective measurement. This is perhaps not optimal: cognitive states are not nearly as well behaved as standard quantum mechanical systems; they exhibit violations of repeatability, and the operators that we use to describe measurements do not appear to be naturally orthogonal in cognitive systems. Here we attempt to map the formalism of positive operator-valued measure (POVM) theory into the domain of semantic memory, showing how it might be used to construct Bell-type inequalities.
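As a minimal illustration of the POVM formalism invoked here (the state and operators below are invented for exposition, not taken from the paper): POVM elements need only be positive and sum to the identity, so they can represent non-orthogonal, non-repeatable measurements of a cognitive state.

```python
# Minimal sketch of a POVM (invented numbers, not from the paper):
# elements are positive semidefinite and sum to the identity, but are
# not orthogonal projectors, so the "measurement" is not repeatable.
import numpy as np

psi = np.array([0.6, 0.8])          # a unit cognitive state in 2-d

E1 = np.array([[0.7, 0.3],
               [0.3, 0.4]])         # not a projector: E1 @ E1 != E1
E2 = np.eye(2) - E1                 # completeness: E1 + E2 = I

for E in (E1, E2):                  # positivity of each element
    assert np.all(np.linalg.eigvalsh(E) >= -1e-12)

# Outcome probabilities p(i) = <psi|E_i|psi>; they sum to 1.
p1, p2 = psi @ E1 @ psi, psi @ E2 @ psi
print(p1, p2)                       # 0.796 0.204
```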
Abstract:
This article examines manual textual categorisation by human coders, with the hypothesis that the law of total probability may be violated for difficult categories. An empirical evaluation was conducted to compare a one-step categorisation task with a two-step categorisation task using crowdsourcing. It was found that the law of total probability was violated. Both quantum and classical probabilistic interpretations of this violation are presented. Further studies are required to resolve whether quantum models are more appropriate for this task.
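For reference, the classical constraint at stake is the law of total probability; in quantum models the same decomposition acquires an interference term, which is one standard way (assuming a pure state; this form is from the quantum cognition literature, not from this article) of accommodating the reported violation:

```latex
% Classical law of total probability relating the two-step and
% one-step categorisation tasks (A = first-step category decision):
p(B) = p(A)\,p(B \mid A) + p(\bar{A})\,p(B \mid \bar{A})

% Quantum probability adds an interference term, so the two-step
% decomposition need not reproduce the one-step judgment:
p(B) = p(A)\,p(B \mid A) + p(\bar{A})\,p(B \mid \bar{A})
     + 2\sqrt{p(A)\,p(B \mid A)\,p(\bar{A})\,p(B \mid \bar{A})}\,\cos\theta
```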
Abstract:
This article presents a study of how humans perceive and judge the relevance of documents. Humans are adept at making reasonably robust and quick decisions about what information is relevant to them, despite the ever-increasing complexity and volume of their surrounding information environment. The literature on document relevance has identified various dimensions of relevance (e.g., topicality, novelty); however, little is understood about how these dimensions may interact. We performed a crowdsourced study of how human subjects judge two relevance dimensions in relation to document snippets retrieved from an internet search engine. The order of the judgments was controlled. For those judgments exhibiting an order effect, a q-test was performed to determine whether the order effects can be explained by a quantum decision model based on incompatible decision perspectives. Some evidence of incompatibility was found, which suggests that incompatible decision perspectives are appropriate for explaining interacting dimensions of relevance in such instances.
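By way of illustration (the counts below are invented, not the study's data): a quantum model with incompatible question perspectives permits order effects on the marginals while still predicting the QQ equality, which is what a q-test examines.

```python
# Hedged sketch with invented counts (not the study's data). A q-test
# checks the QQ equality: p(yes,no) + p(no,yes) should be the same in
# both judgment orders, even when the marginals show an order effect.
order1 = {"yy": 320, "yn": 180, "ny": 90, "nn": 410}   # dim A then dim B
order2 = {"yy": 280, "yn": 140, "ny": 130, "nn": 450}  # dim B then dim A

p1 = {k: v / sum(order1.values()) for k, v in order1.items()}
p2 = {k: v / sum(order2.values()) for k, v in order2.items()}

# Order effect: the "yes" rate to the first-judged dimension differs.
print("first-yes:", p1["yy"] + p1["yn"], "vs", p2["yy"] + p2["yn"])

# QQ statistic: near zero under the quantum prediction.
print("QQ:", (p1["yn"] + p1["ny"]) - (p2["yn"] + p2["ny"]))
```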
Abstract:
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
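A minimal sketch of the framework's decision rule (invented correlations; the paper's twenty-four combinations are not reproduced here): once marginal selectivity holds, compositionality requires the CHSH quantity to satisfy |S| <= 2, since any joint distribution over the interpretation variables obeys that bound.

```python
# Hedged sketch (invented numbers, not the paper's data). E[i][j] is the
# correlation between the +1/-1-coded interpretations of the two
# constituent concepts under interpretation contexts i and j. Marginal
# selectivity is assumed to have been checked already.
E = [[0.70, 0.60],
     [0.60, -0.65]]

# Any joint probability distribution over the four +/-1 variables
# (a compositional semantics) satisfies |S| <= 2.
S = E[0][0] + E[0][1] + E[1][0] - E[1][1]
print("S =", S, "->", "non-compositional" if abs(S) > 2 else "compositional")
```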
Abstract:
It is well known that different arguments appeal to different people. We all process information in ways that are adapted to be consistent with our underlying ideologies. These ideologies can sometimes be framed in terms of particular axes or dimensions, which makes it possible to represent some aspects of an ideology as a region in the kind of vector space that is typical of many generalised quantum models. Such models can then be used to explain and predict, in broad strokes, whether a particular argument or proposal is likely to appeal to an individual with a particular ideology. The choice of suitable arguments to bring about desired actions is traditionally part of the art or science of rhetoric, and today's highly polarised society means that this skill is becoming more important than ever. This paper presents a basic model for understanding how different goals will appeal to people with different ideologies, and thus how different rhetorical positions can be adopted to promote the same desired outcome. As an example, we consider different narratives and hence actions with respect to the environment and climate change, an important but currently highly controversial topic.
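One way to read "ideology as a region in a vector space" concretely (a sketch with invented vectors, not the paper's model): score an argument by the squared projection of its representation onto the ideology's subspace, in the manner of generalised quantum models.

```python
# Hedged sketch (invented vectors): appeal of an argument to an ideology
# modelled as the squared projection of the argument vector onto the
# subspace representing that ideology.
import numpy as np

def appeal(argument, basis):
    """Squared norm of the projection of the unit vector `argument`
    onto the subspace spanned by the orthonormal rows of `basis`."""
    P = basis.T @ basis              # projector onto the subspace
    return float(argument @ P @ argument)

basis = np.array([[1.0, 0.0]])       # a 1-d "growth-first" ideology axis
argument = np.array([0.8, 0.6])      # a "green jobs" framing, say
print(appeal(argument, basis))       # 0.64: moderate appeal
```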
Abstract:
Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
Abstract:
New models of human cognition inspired by quantum theory could underpin information technologies that are better aligned with how we recall information.
Abstract:
This talk proceeds from the premise that IR should engage in a more substantial dialogue with cognitive science. After all, how users decide relevance, or how they choose terms to modify a query, are processes rooted in human cognition. Recently, there has been a growing literature applying quantum theory (QT) to model cognitive phenomena. This talk will survey recent research, in particular the modelling of interference effects in human decision making. One aspect of QT will be illustrated: how quantum entanglement can be used to model word associations in human memory. The implications of this will be briefly discussed in terms of a new approach for modelling concept combinations. Tentative links to human abductive reasoning will also be drawn. The basic theme behind this talk is that QT can potentially provide a new genre of information processing models (including search) more aligned with human cognition.
Abstract:
Quantum psychopathology rests on the so-called “quantum mind” hypothesis, which is controversial. In addition, this hypothesis focuses attention on quantum processes in the brain and how these may relate to psychopathological issues. This is very “low level”. As a consequence, it is challenging to form bridges to “higher level” problems related to psychopathology. By adopting the stance used in the quantum interaction community of researchers, this reply puts forward the idea that an idealistic approach may circumvent the controversy and open the way for addressing challenges at higher levels of psychopathology.
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article will explore concept combinations and will argue that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
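The second, classical test can be sketched as follows (invented counts, not the article's data): under separability the joint interpretation distribution factorises as an outer (tensor) product of the marginals, and a chi-square goodness-of-fit test compares the observed joint counts against that product model.

```python
# Hedged sketch (invented counts): chi-square test of a separability
# (product) model for a concept combination's joint interpretation data.
import numpy as np
from scipy.stats import chisquare

observed = np.array([[40, 10],       # joint counts over the two binary
                     [10, 40]])      # interpretation variables
n = observed.sum()

p_row = observed.sum(axis=1) / n     # marginal for the first concept
p_col = observed.sum(axis=0) / n     # marginal for the second concept
expected = np.outer(p_row, p_col) * n  # separable (product) prediction

# ddof=2: two marginal parameters were estimated from the data.
stat, pval = chisquare(observed.ravel(), expected.ravel(), ddof=2)
print(stat, pval)                    # small p-value: reject separability
```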
Abstract:
The aim of this thesis is to investigate the nature of quantum computation and the question of the quantum speed-up over classical computation by comparing two different quantum computational frameworks: the traditional quantum circuit model and the cluster-state quantum computer. After an introductory survey of the theoretical and epistemological questions concerning quantum computation, the first part of this thesis provides a presentation of cluster-state computation suitable for a philosophical audience. In spite of the computational equivalence between the two frameworks, their differences can be considered structural. Entanglement is shown to play a fundamental role in both quantum circuits and cluster-state computers; this supports, from a new perspective, the argument that entanglement can reasonably explain the quantum speed-up over classical computation. However, quantum circuits and cluster-state computers diverge with regard to one of the explanations of quantum computation that accords a central role to entanglement, namely the Everett interpretation. It is argued that, while cluster-state quantum computation does not show an Everettian failure in accounting for the computational processes, it threatens to render that interpretation non-explanatory. The analysis presented here should be integrated into a more general project that also covers further frameworks of quantum computation, e.g. topological quantum computation. What this work reveals, however, is that the speed-up question does not capture all that is at stake: both quantum circuits and cluster-state computers achieve the speed-up, but the challenges they pose go beyond that specific question. The existence of alternative, equivalent quantum computational models then suggests that the ultimate question should be moved from the speed-up to a sort of “representation theorem” for quantum computation, understood as the general goal of identifying the physical features underlying these alternative frameworks that allow them to be labelled “quantum computation”.
Abstract:
All four of the most important figures in the early twentieth-century development of quantum physics (Niels Bohr, Erwin Schroedinger, Werner Heisenberg and Wolfgang Pauli) had strong interests in the traditional mind-brain, or 'hard', problem. This paper reviews their approaches to this problem, showing the influence of Bohr's complementarity thesis, the significance of Schroedinger's small book 'What is Life?', the updated Platonism of Heisenberg and, perhaps most interesting of all, the interaction of Carl Jung and Wolfgang Pauli in the latter's search for a unification of mind and matter.
Abstract:
This is the second part of a review of the work of quantum physicists on the ‘hard part’ of the problem of mind. After an introduction which sets the scene, and a brief review of contemporary work on the neural correlates of consciousness (NCC), the work of four prominent modern investigators is examined: J.C. Eccles/Friedrich Beck; Henry Stapp; Stuart Hameroff/Roger Penrose; David Bohm. With the exception of David Bohm, all attempt to show where in the brain’s microstructure quantum effects could make themselves felt. It is reluctantly concluded that none have neurobiological plausibility. They are all instances, to paraphrase T.H. Huxley, of a beautiful hypothesis destroyed by ugly facts. David Bohm does not attempt to fit his new quantum physics to contemporary neurobiology but instead asks for a radical rethink of our conventional scientific paradigm. He suggests that we should look towards developing a ‘pan-experientialism’ or ‘dual-aspect monism’ in which consciousness goes ‘all the way down’, and that the ‘hard problem’ is not soluble within the framework of ideas provided by ‘classical’ natural science.