192 results for Penning traps, quantum electrodynamic, electron
Abstract:
This position paper provides an overview of work conducted and an outlook on future directions within the field of Information Retrieval (IR) that aims to develop novel models, methods and frameworks inspired by Quantum Theory (QT).
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’, or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
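For the second of the two methods mentioned above, a minimal sketch (with purely hypothetical counts) of a chi-square goodness-of-fit comparison between an observed joint distribution over the senses of a bi-ambiguous combination and the distribution expected under a separability assumption might look like this:

```python
# Hypothetical example: does the observed joint distribution over the senses of
# a bi-ambiguous concept combination fit a separable (independent) model?
import numpy as np
from scipy.stats import chisquare

# Hypothetical 2x2 table of counts: rows = sense of word A, cols = sense of word B.
observed = np.array([[40.0, 10.0],
                     [15.0, 35.0]])
n = observed.sum()

# Expected counts under the separability assumption: product of the marginals.
p_a = observed.sum(axis=1) / n          # marginal over word A's senses
p_b = observed.sum(axis=0) / n          # marginal over word B's senses
expected = np.outer(p_a, p_b) * n

# Goodness of fit of the separable model; ddof=2 leaves (2-1)*(2-1)=1 degree of
# freedom because two marginal proportions were estimated from the data.
stat, p_value = chisquare(observed.ravel(), expected.ravel(), ddof=2)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
# A small p-value indicates the joint distribution is poorly described by the
# separability assumption, i.e. the combination behaves non-separably.
```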
Abstract:
As computers approach the physical limits of the information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum-inspired, vector-based approach, which offers a contextually dependent mapping from subsymbolic to symbolic representations of information. If implemented computationally, this approach would provide an exceptionally high density of information storage without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gärdenfors’ Conceptual Space approach and Humphreys et al.’s matrix model of memory.
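As one way to make the contextually dependent mapping concrete, the following is a minimal sketch, under our own simplifying assumptions rather than the authors' implementation, of a matrix memory in the spirit of Humphreys et al.: bindings are stored as outer products of context and item vectors, so the same memory returns different items depending on the context used as a cue. The vectors and item names are purely illustrative.

```python
# Toy matrix memory: context-dependent retrieval from superimposed outer products.
import numpy as np

rng = np.random.default_rng(0)
dim = 256

def random_vector(d):
    """Random unit vector standing in for a distributed (subsymbolic) code."""
    v = rng.normal(size=d)
    return v / np.linalg.norm(v)

# Hypothetical distributed representations for two contexts and two items.
context_home, context_work = random_vector(dim), random_vector(dim)
item_dog, item_laptop = random_vector(dim), random_vector(dim)

# A single memory matrix stores both context-item bindings superimposed.
memory = np.outer(context_home, item_dog) + np.outer(context_work, item_laptop)

# Retrieval: probing the memory with a context vector yields a result closest
# to the item bound to that context, so the mapping is contextually dependent.
retrieved = context_home @ memory
print("similarity to 'dog'   :", retrieved @ item_dog)     # close to 1
print("similarity to 'laptop':", retrieved @ item_laptop)  # close to 0
```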
Abstract:
Compositionality is a frequently made assumption in linguistics, and yet many human subjects reveal highly non-compositional word associations when confronted with novel concept combinations. This article will show how a non-compositional account of concept combinations can be supplied by modelling them as interacting quantum systems.
Abstract:
An anatase TiO2 material with hierarchically structured spheres consisting of ultrathin nanosheets with 100% of the [001] facet exposed was employed to fabricate dye-sensitized solar cells (DSCs). Investigation of the electron transport and back reaction of the DSCs by electrochemical impedance spectroscopy showed that the spheres had a threefold lower electron recombination rate compared to conventional TiO2 nanoparticles. In contrast, the effective electron diffusion coefficient, Dn, was not sensitive to the variation of the TiO2 morphology: the TiO2 spheres showed the same Dn as the nanoparticles. The influence of TiCl4 post-treatment on the conduction band of the TiO2 spheres and on the kinetics of electron transport and back reactions was also investigated. It was found that the TiCl4 post-treatment caused a downward shift of the TiO2 conduction band edge by 30 meV. Meanwhile, a fourfold increase of the effective electron lifetime of the DSC was also observed after TiCl4 treatment. The synergistic effect of the variation of the TiO2 conduction band and the electron recombination determined the open-circuit voltage of the DSC. © 2012 Wang et al.
Abstract:
We utilise the well-developed quantum decision models known to the QI community to create a higher-order social decision-making model. A simple Agent Based Model (ABM) of a society of agents with changing attitudes towards a social issue is presented, where the private attitudes of individuals in the system are represented using a geometric structure inspired by quantum theory. We track the changing attitudes of the members of that society, and their resulting propensities to act, or not, in a given social context. A number of new issues surrounding this "scaling up" of quantum decision theories are discussed, as well as new directions and opportunities.
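A minimal sketch of the kind of agent described above, under our own simplifying assumptions rather than the paper's exact model: each agent's private attitude is a unit vector in a two-dimensional state space, the propensity to act is the squared projection onto an "act" basis state, and pairwise interaction nudges attitudes by small rotations. All names and parameter values are illustrative.

```python
# Toy quantum-inspired agent-based model of attitudes and propensities to act.
import numpy as np

rng = np.random.default_rng(1)

def rotation(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

ACT = np.array([1.0, 0.0])  # basis state for "acts on the issue"

class Agent:
    def __init__(self):
        # Private attitude: a random unit vector (superposition of act / not act).
        theta = rng.uniform(0, 2 * np.pi)
        self.state = np.array([np.cos(theta), np.sin(theta)])

    def propensity_to_act(self):
        # Squared amplitude of the projection onto the ACT state (Born-rule style).
        return float((self.state @ ACT) ** 2)

    def interact(self, other, strength=0.05):
        # Rotate own attitude slightly towards the other agent's attitude.
        z = self.state[0] * other.state[1] - self.state[1] * other.state[0]
        self.state = rotation(np.sign(z) * strength) @ self.state

society = [Agent() for _ in range(100)]
for _ in range(50):                      # 50 rounds of random pairwise interaction
    a, b = rng.choice(society, size=2, replace=False)
    a.interact(b)

print("mean propensity to act:", np.mean([ag.propensity_to_act() for ag in society]))
```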
Abstract:
This work is a theoretical investigation into the coupling of a single excited quantum emitter to the plasmon mode of a V groove waveguide. The V groove waveguide consists of a triangular channel milled in gold; the emitter is modeled as a dipole emitter and could represent a quantum dot, a nitrogen vacancy in diamond, or similar. In this work, the dependence of the emitter-to-plasmon-mode coupling efficiency on various geometrical parameters of the emitter-waveguide system is determined. Using the finite element method, the effect of the emitter position and orientation, groove angle, groove depth, and tip radius on the coupling efficiency is studied in detail. We demonstrate that all parameters, with the exception of groove depth, have a significant impact on the attainable coupling efficiency. Understanding the effect of these geometrical parameters on the coupling between emitters and the plasmonic mode of the waveguide is essential for the design and optimization of quantum dot–V groove devices.
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information in anticipation of future algorithmic and computational discoveries that could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
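For orientation only, here is a toy, purely classical simulation of the BB84 sifting step (no quantum channel, no eavesdropper, no authentication). It illustrates why an authenticated classical channel for comparing bases is part of any QKD protocol; it is not the security model described above.

```python
# Toy BB84 sifting: keep only the positions where Alice's and Bob's bases agree.
import secrets

N = 32
alice_bits  = [secrets.randbelow(2) for _ in range(N)]
alice_bases = [secrets.randbelow(2) for _ in range(N)]   # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(N)]

# Ideal channel, no eavesdropper: Bob's result matches Alice's bit whenever the
# bases match; otherwise his outcome is an independent random bit.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: bases are announced over the (authenticated) classical channel and
# mismatched positions are discarded from the raw key.
sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

print("raw key length:", N, "-> sifted key length:", len(sifted_alice))
print("keys agree:", sifted_alice == sifted_bob)   # True on an ideal, untampered channel
```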
Abstract:
Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
Abstract:
The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno’s sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier’s (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier, and of its probabilistic pendant, is needed. The proposed modification replaces the standard notion of probabilities with quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
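As a numeric illustration of the final point, the following toy construction (ours, not the paper's model) shows how representing "tall" and "not tall" by non-commuting projectors lets a borderline contradiction receive non-zero probability, whereas its classical probability is zero.

```python
# Toy quantum-probability account of a borderline contradiction.
import numpy as np

def projector(theta):
    """Rank-1 projector onto the unit vector (cos theta, sin theta)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

P_tall     = projector(0.0)              # "tall" subspace
P_not_tall = projector(np.pi / 3)        # incompatible (non-orthogonal) "not tall" subspace

# State of a borderline case, lying between the two subspaces.
psi = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6)])

# Sequential judgment: first "tall", then "not tall" (Lüders' rule).
after_tall = P_tall @ psi
p_tall_then_not = float(np.linalg.norm(P_not_tall @ after_tall) ** 2)

print("P(tall, then not tall) =", round(p_tall_then_not, 3))
# Classically this conjunction has probability 0; here it is non-zero because
# the two projectors do not commute, the interference-style effect invoked
# above for borderline contradictions.
```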
Abstract:
One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, advances in electron cryo-microscopy and high-resolution single particle analysis have developed to the point where they now provide a methodology for high-resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is obtaining a large enough representative dataset (>100,000 particles). Traditionally, particles have been picked manually, which is an extremely labour-intensive process. The problem is made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle picking software, which has been tested with both negatively stained and cryo-electron micrographs. The algorithm has been shown to be capable of selecting most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
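The abstract does not state which picking algorithm is used, so the following is only a generic sketch of one common approach to automatic particle picking: cross-correlating a noisy micrograph with a particle-sized template and taking well-separated correlation peaks as candidate particles. The data here are synthetic.

```python
# Generic template-matching particle picker on a synthetic noisy micrograph.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(42)

# Synthetic "micrograph": a noisy field with a few disc-shaped particles.
micrograph = rng.normal(0.0, 1.0, size=(200, 200))
yy, xx = np.mgrid[-6:7, -6:7]
disc = (xx**2 + yy**2 <= 36).astype(float)          # 13x13 disc template
for (r, c) in [(40, 50), (120, 80), (160, 150)]:
    micrograph[r-6:r+7, c-6:c+7] += 3.0 * disc      # particles well above the noise

# Correlate with a zero-mean template so flat background regions score near 0.
template = disc - disc.mean()
score = correlate2d(micrograph, template, mode="same")

# Greedy peak picking: repeatedly take the highest score and suppress a
# particle-sized neighbourhood around it.
picks = []
s = score.copy()
for _ in range(3):
    r, c = np.unravel_index(np.argmax(s), s.shape)
    picks.append((int(r), int(c)))
    s[max(r - 10, 0):r + 11, max(c - 10, 0):c + 11] = -np.inf

print("picked coordinates:", sorted(picks))
```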
Abstract:
There is a growing interest in the use of megavoltage cone-beam computed tomography (MV CBCT) data for radiotherapy treatment planning. To calculate accurate dose distributions, knowledge of the electron density (ED) of the tissues being irradiated is required. In the case of MV CBCT, it is necessary to determine a calibration relating CT number to ED, utilizing the photon beam produced for MV CBCT. A number of different parameters can affect this calibration. This study was undertaken on the Siemens MV CBCT system, MVision, to evaluate the effect of the following parameters on the reconstructed CT pixel value to ED calibration: the number of monitor units (MUs) used (5, 8, 15 and 60 MUs), the image reconstruction filter (head and neck, and pelvis), the reconstruction matrix size (256 by 256 and 512 by 512), and the addition of extra solid water surrounding the ED phantom. A Gammex electron density CT phantom containing EDs from 0.292 to 1.707 was imaged under each of these conditions. The linear relationship between MV CBCT pixel value and ED was demonstrated for all MU settings and over the range of EDs. Changes in MU number did not dramatically alter the MV CBCT ED calibration. The use of different reconstruction filters was found to affect the MV CBCT ED calibration, as was the addition of solid water surrounding the phantom. Dose distributions for treatment plans calculated on simulated image data from a 15 MU, head and neck reconstruction filter MV CBCT image showed only small, clinically insignificant differences whether the ED calibration curve was derived from the same image parameters or from a 15 MU pelvis reconstruction filter. Thus, the use of a single MV CBCT ED calibration curve is unlikely to result in any clinical differences. However, to ensure minimal uncertainties in dose reporting, MV CBCT ED calibration could be carried out using parameter-specific calibration measurements.
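A minimal sketch, with hypothetical numbers, of how a linear pixel-value-to-ED calibration like the one described above can be built and applied: phantom inserts of known ED are imaged, a straight line is fitted to the (pixel value, ED) pairs, and the fit converts patient pixel values to ED for dose calculation.

```python
# Linear CT-number-to-ED calibration from phantom measurements (hypothetical values).
import numpy as np

# Hypothetical mean MV CBCT pixel values measured in phantom inserts of known ED
# (the real phantom spans EDs from 0.292 to 1.707, as stated above).
known_ed     = np.array([0.292, 0.512, 0.950, 1.000, 1.280, 1.707])
pixel_values = np.array([310.0, 520.0, 935.0, 985.0, 1255.0, 1660.0])

# Linear calibration: ED = slope * pixel_value + intercept.
slope, intercept = np.polyfit(pixel_values, known_ed, deg=1)
print(f"ED ~ {slope:.5f} * pixel + {intercept:.3f}")

def pixel_to_ed(pixel):
    """Convert an MV CBCT pixel value to electron density via the fitted line."""
    return slope * pixel + intercept

print("ED at pixel value 800:", round(pixel_to_ed(800.0), 3))
```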