149 results for QUANTUM-CLASSICAL DYNAMICS
at Queensland University of Technology - ePrints Archive
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information in anticipation of future algorithmic and computational discoveries which could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
Abstract:
Quantum key distribution (QKD) promises secure key agreement by using quantum mechanical systems. We argue that QKD will be an important part of future cryptographic infrastructures. It can provide long-term confidentiality for encrypted information without reliance on computational assumptions. Although QKD still requires authentication to prevent man-in-the-middle attacks, it can make use of either information-theoretically secure symmetric key authentication or computationally secure public key authentication: even when using public key authentication, we argue that QKD still offers stronger security than classical key agreement.
Abstract:
In this third Quantum Interaction (QI) meeting it is time to examine our failures. One of the weakest elements of QI as a field arises in its continuing lack of models displaying proper evolutionary dynamics. This paper presents an overview of the modern generalised approach to the derivation of time evolution equations in physics, showing how the notion of symmetry is essential to the extraction of operators in quantum theory. The form that symmetry might take in non-physical models is explored, with a number of viable avenues identified.
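As a point of reference for the "generalised approach" mentioned above (a standard textbook step, not a claim about this paper's specific treatment): a continuous symmetry such as time translation is represented by a one-parameter unitary group, Stone's theorem supplies its self-adjoint generator H, and the evolution equation follows:

U(t) = e^{-iHt/\hbar}, \qquad i\hbar\,\frac{d}{dt}\,|\psi(t)\rangle = H\,|\psi(t)\rangle.

The open question the abstract raises is what, if anything, plays the role of such a symmetry and generator in non-physical (e.g. cognitive or semantic) models.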
Abstract:
This article introduces a "pseudo-classical" notion of modelling non-separability. This form of non-separability can be viewed as lying between separability and quantum-like non-separability. Non-separability is formalized in terms of the non-factorizability of the underlying joint probability distribution. A decision criterion for determining the non-factorizability of the joint distribution is related to determining the rank of a matrix, as well as another approach based on the chi-square goodness-of-fit test. This pseudo-classical notion of non-separability is discussed in terms of quantum games and concept combinations in human cognition.
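The factorizability criterion lends itself to a small numerical illustration. The sketch below (our own, with made-up probabilities; not code from the article) checks whether a 2x2 joint distribution P(A, B) has rank 1, i.e. factorizes as P(A)P(B), and also runs the chi-square independence test on hypothetical counts:

```python
# Illustrative sketch: a joint distribution factorizes as P(A)P(B) exactly when
# its matrix has rank 1; the chi-square test gives a statistical variant.
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical joint probabilities over two binary variables A and B
P = np.array([[0.30, 0.20],
              [0.10, 0.40]])

rank = np.linalg.matrix_rank(P)
print("factorizable (rank 1)?", rank == 1)

# Chi-square test of independence on assumed counts from N observations
N = 1000
chi2, p_value, dof, expected = chi2_contingency(P * N)
print("chi2 =", round(chi2, 2), "p =", round(p_value, 4))  # small p suggests non-factorizability
```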
Abstract:
The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
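To make the interference idea concrete, here is a minimal sketch (our illustration, not the authors' fitted model) in which "tall" and "not tall" are represented by non-commuting projectors, so a borderline state assigns nonzero probability to the sequential judgment "tall and not tall", something classical probability forbids:

```python
# Hedged sketch: non-commuting projectors on a 2D state space let a borderline
# state yield a nonzero probability for the "contradiction" x is tall and not tall.
import numpy as np

def projector(theta):
    """Rank-1 projector onto the unit vector at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

P_tall = projector(0.0)                  # projector for "tall"
P_not_tall = projector(np.pi / 3)        # assumed non-orthogonal "not tall"
psi = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6)])  # borderline state

# Sequential judgment "tall, then not tall": || P_not_tall P_tall |psi> ||^2
prob = np.linalg.norm(P_not_tall @ P_tall @ psi) ** 2
print("probability of the borderline contradiction:", round(prob, 3))
```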
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.
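For readers unfamiliar with the hardware primitive, the sketch below captures the standard one-time memory interface such constructions assume (an idealized classical mock-up for illustration, not the paper's quantum construction): it stores two secrets, reveals exactly the one the user chooses, and then destroys both.

```python
# Minimal mock-up of a one-time memory (OTM): two stored secrets, one read, then self-destruct.
class OneTimeMemory:
    def __init__(self, secret0: bytes, secret1: bytes):
        self._secrets = [secret0, secret1]
        self._used = False

    def read(self, choice: int) -> bytes:
        if self._used:
            raise RuntimeError("OTM already consumed")
        self._used = True
        secret = self._secrets[choice]
        self._secrets = None      # "self-destruct": both values are forgotten
        return secret

otm = OneTimeMemory(b"label for wire value 0", b"label for wire value 1")
print(otm.read(1))                # a second call to read() would raise
```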
Abstract:
We propose a simple and effective way to achieve secure quantum direct secret sharing. The proposed scheme uses the properties of fountain codes to realize the physical conditions necessary for applying the no-cloning principle to eavesdropping checks and authentication. In our scheme, to achieve a variety of security purposes, nonorthogonal state particles are inserted into the transmitted sequence carrying the secret shares in order to disorder it. However, the positions of the inserted nonorthogonal state particles are not announced directly, but are obtained from the degrees and positions of a sequence that are pre-shared between Alice and each Bob. Moreover, the parties can confirm whether there exists an eavesdropper without exchanging classical messages. Most importantly, the proposed scheme is shown to be secure against an adversary who knows neither the positions of the inserted nonorthogonal state particles nor the sequence constituted by the first particles from every EPR pair.
Abstract:
In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through the use of an analogy between states of a quantum system and terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of probabilities generated by Logical Imaging. By using Quantum Theory, it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to empirically validate this, the foundations for doing so are provided.
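As background, classical Logical Imaging in IR (in the spirit of Crestani and van Rijsbergen) transfers the probability mass of terms where the query is "false" to the most similar term where it is "true"; the toy sketch below illustrates that kinematics of probabilities. The term probabilities and similarity function are invented for illustration and are not the paper's quantum formalization.

```python
# Toy sketch of classical Logical Imaging: move each non-query term's probability
# mass to the most similar query term, then score by the imaged probability.
def image(prior, query_terms, similarity):
    posterior = {t: 0.0 for t in set(prior) | set(query_terms)}
    for t, p in prior.items():
        if t in query_terms:
            posterior[t] += p
        else:
            nearest = max(query_terms, key=lambda u: similarity(t, u))
            posterior[nearest] += p
    return posterior

prior = {"quantum": 0.4, "dynamics": 0.3, "retrieval": 0.3}      # assumed term probabilities
sim = lambda a, b: len(set(a) & set(b)) / len(set(a) | set(b))   # toy character-overlap similarity
print(image(prior, {"quantum", "retrieval"}, sim))
```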
Abstract:
The controlled growth of ultra-small Ge/Si quantum dot (QD) nuclei (≈1 nm) suitable for the synthesis of uniform nanopatterns with high surface coverage is simulated using atom-only and size-non-uniform cluster fluxes. It is found that seed nuclei of more uniform sizes are formed when clusters of non-uniform size are deposited. This counter-intuitive result is explained via adatom-nanocluster interactions on Si(100) surfaces. Our results are supported by experimental data on the geometric characteristics of QD patterns synthesized by nanocluster deposition. This is followed by a description of the role of plasmas as non-uniform cluster sources and their impact on surface dynamics. The technique challenges conventional growth modes and is promising for the deterministic synthesis of nanodot arrays.
Abstract:
A key concept in many Information Retrieval (IR) tasks, e.g. document indexing, query language modelling, and aspect and diversity retrieval, is the relevance measurement of topics, i.e. to what extent an information object (e.g. a document or a query) is about the topics. This paper investigates the interference in the relevance measurement of a topic caused by another topic. For example, consider that two user groups are required to judge whether a topic q is relevant to a document d, and q is presented together with another topic (referred to as a companion topic). If different companion topics are used for the different groups, interestingly, different relevance probabilities of q given d can be reached. In this paper, we present empirical results showing that the relevance of a topic to a document is greatly affected by the companion topic's relevance to the same document, and that the extent of the impact differs with respect to different companion topics. We further analyse the phenomenon from classical and quantum-like interference perspectives, and connect the phenomenon to non-reality and contextuality in quantum mechanics. We demonstrate that a quantum-like model fits the empirical data and could potentially be used to predict relevance when interference exists.
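A rough sketch of the quantum-like account (illustrative numbers only, not the paper's data): the classical law of total probability is supplemented by an interference term whose phase depends on the companion topic, so the predicted relevance probability of q given d shifts.

```python
# Hedged illustration: classical total probability vs. a quantum-like prediction
# with an interference cross term 2*sqrt(...)*cos(theta). All numbers are assumed.
import numpy as np

p_companion_rel = 0.6    # P(companion topic judged relevant to d)
p_q_given_rel = 0.7      # P(q relevant to d | companion relevant)
p_q_given_irr = 0.3      # P(q relevant to d | companion irrelevant)

classical = p_companion_rel * p_q_given_rel + (1 - p_companion_rel) * p_q_given_irr

theta = 2.0              # assumed phase between the two "paths"
interference = 2 * np.sqrt(p_companion_rel * p_q_given_rel
                           * (1 - p_companion_rel) * p_q_given_irr) * np.cos(theta)

print("classical prediction:   ", round(classical, 3))
print("quantum-like prediction:", round(classical + interference, 3))
```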
Abstract:
A crucial issue with hybrid quantum secret sharing schemes is the amount of data that is allocated to the participants: the smaller the amount of allocated data, the better the performance of a scheme. Moreover, quantum data is very hard and expensive to deal with, so it is desirable to use as little quantum data as possible. To achieve this goal, we first construct extended unitary operations by the tensor product of n (n ≥ 2) basic unitary operations, and then, by using those extended operations, we design two quantum secret sharing schemes. The resulting dual compressible hybrid quantum secret sharing schemes, in which classical data play a complementary role to quantum data, range from threshold to access structure. Compared with the existing hybrid quantum secret sharing schemes, our proposed schemes not only reduce the number of quantum participants, but also the number of particles and the size of classical shares. To be exact, the number of particles that are used to carry quantum data is reduced to 1, while the size of the classical secret shares is reduced to (l − 2)/(m − 1) in the ((m+1, n′)) threshold scheme and to (l − 2)/r2 (where r2 is the number of maximal unqualified sets) in the adversary-structure scheme. Consequently, our proposed schemes can greatly reduce the cost and difficulty of generating and storing EPR pairs and lower the risk of transmitting encoded particles.
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
Abstract:
In this paper we image the highly confined long-range plasmons of a nanoscale metal stripe waveguide using quantum emitters. Plasmons were excited using a highly focused 633 nm laser beam and a specially designed grating structure to provide stronger incoupling to the desired mode. A homogeneous thin layer of quantum dots was used to image the near-field intensity of the propagating plasmons on the waveguide. We observed that the photoluminescence is quenched when the QD-to-metal-surface distance is less than 10 nm. The optimised spacer layer thickness for the stripe waveguides was found to be around 20 nm. We believe that the findings of this paper will prove beneficial for the development of plasmonic devices utilising stripe waveguides.
Abstract:
Topological insulators (TIs) exhibit novel physics with great promise for new devices, but considerable challenges remain in identifying TIs with high structural stability and a large nontrivial band gap suitable for practical applications. Here we predict by first-principles calculations a two-dimensional (2D) TI, also known as a quantum spin Hall (QSH) insulator, in a tetragonal bismuth bilayer (TB-Bi) structure that is dynamically and thermally stable according to phonon calculations and finite-temperature molecular dynamics simulations. Density functional theory and tight-binding calculations reveal a band inversion among the Bi p orbitals driven by the strong intrinsic spin-orbit coupling, producing a large nontrivial band gap, which can be effectively tuned by moderate strains. The helical gapless edge states exhibit a linear dispersion with a high Fermi velocity comparable to that of graphene, and the QSH phase remains robust on a NaCl substrate. These remarkable properties place TB-Bi among the most promising 2D TIs for high-speed spintronic devices, and the present results provide insights into the intriguing QSH phenomenon in this new Bi structure and offer guidance for its implementation in potential applications.
Abstract:
This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two components were evaluated: whether Quantum Inspired models of Target Activation would provide a better framework than their classical psychological counterparts, and how robust these models are across the different sources of Free Association data. In previous work, a formal model of cued-target recall did not exist and, as such, Target Activation could not be assessed directly. Furthermore, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], and alternative sources of data have also become available. This allowed us to directly model target activation in cued-target recall with human cued-target recall pairs and to use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that may explain variations in performance for each of the models. Each of the activation models was used in the PIER3 memory model for each of the data sources and benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance: the first measured the divergence between the sets of results using the Kullback-Leibler (KL) divergence, while the second utilized a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects, namely the USF Free Association Norms and the University of Leuven (UL) Free Association Networks. The third was sourced from a new method put forward by Galea and Bruza (2015), in which pseudo Free Association Networks (Corpus Based Association Networks - CANs) are built using co-occurrence statistics on a large text corpus. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
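For concreteness, here is a minimal sketch of the KL-divergence benchmark described above, with hypothetical recall probabilities standing in for the USF pairs and for the models' outputs (none of these numbers come from the article):

```python
# Hedged sketch: comparing model-predicted recall distributions to a reference
# distribution using the Kullback-Leibler divergence; lower is closer.
import numpy as np
from scipy.special import rel_entr

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(rel_entr(p, q).sum())

usf_recall      = [0.50, 0.30, 0.15, 0.05]   # hypothetical human recall probabilities
quantum_model   = [0.48, 0.32, 0.14, 0.06]   # hypothetical quantum-inspired model output
classical_model = [0.35, 0.35, 0.20, 0.10]   # hypothetical classical model output

print("quantum model   KL:", round(kl_divergence(usf_recall, quantum_model), 4))
print("classical model KL:", round(kl_divergence(usf_recall, classical_model), 4))
```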