48 results for QUANTUM INFORMATION
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Positive-operator-valued measurements on a finite number N of identically prepared systems of arbitrary spin J are discussed. Pure states are characterized in terms of Bloch-like vectors restricted by a SU(2J+1) covariant constraint. This representation allows for a simple description of the equations to be fulfilled by optimal measurements. We explicitly find the minimal positive-operator-valued measurement for the N=2 case, derive a rigorous bound for N=3, and set up the analysis for arbitrary N.
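As a sketch of the representation this abstract refers to (our notation and normalization are assumptions, following the common generalized Gell-Mann convention rather than necessarily the paper's own): a spin-J system lives in d = 2J+1 dimensions, and a state can be written as

\[
\rho = \frac{1}{d}\,\mathbb{1} + \frac{1}{2}\,\vec{n}\cdot\vec{\lambda},
\]

with \(\vec{\lambda}\) the \(d^2-1\) generators of SU(d). Imposing purity, \(\rho^2=\rho\), yields the SU(d)-covariant constraints

\[
\vec{n}\cdot\vec{n}=\frac{2(d-1)}{d},
\qquad
d_{ijk}\,n_i n_j=\frac{2(d-2)}{d}\,n_k,
\]

where \(d_{ijk}\) are the symmetric structure constants of SU(d).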
Abstract:
In the quest to completely describe entanglement in the general case of a finite number of parties sharing a physical system with a finite-dimensional Hilbert space, an entanglement magnitude is introduced for its pure and mixed states: robustness. It corresponds to the minimal amount of mixing with locally prepared states that washes out all entanglement. It quantifies, in a sense, the endurance of entanglement against noise and jamming. Its properties are studied comprehensively. Analytical expressions for the robustness are given for pure states of two-party systems, and analytical bounds for mixed states of two-party systems. Specific results are obtained mainly for the qubit-qubit system (qubit denotes quantum bit). As by-products, local pseudomixtures are generalized, a lower bound for the relative volume of separable states is deduced, and arguments for considering convexity a necessary condition of any entanglement measure are put forward.
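A sketch of how this magnitude is commonly formalized (notation ours, following the later robustness-of-entanglement literature): given a state \(\rho\) and a separable state \(\rho_s\), the relative robustness is

\[
R(\rho\,\|\,\rho_s)=\min\Big\{s\ge 0:\ \frac{\rho+s\,\rho_s}{1+s}\ \text{is separable}\Big\},
\]

and the robustness of \(\rho\) is the minimum of \(R(\rho\,\|\,\rho_s)\) over all separable \(\rho_s\); it vanishes exactly on separable states.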
Abstract:
The decay of orthopositronium into three photons provides a physical realization of a pure state with three-party entanglement. Its quantum correlations are analyzed using recent results from quantum information theory, looking for the final state with the maximal amount of Greenberger-Horne-Zeilinger-like correlations. This state allows for a statistical dismissal of local realism stronger than that obtained using any entangled state of two spin-1/2 particles.
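For reference, the three-qubit state with maximal correlations of this type is the Greenberger-Horne-Zeilinger state,

\[
|\mathrm{GHZ}\rangle=\frac{1}{\sqrt{2}}\big(|000\rangle+|111\rangle\big);
\]

the analysis described above asks how close the three-photon final state can come to this benchmark (the explicit qubit form here is a standard convention, not taken from the paper).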
Abstract:
Quantum states can be used to encode the information contained in a direction, i.e., in a unit vector. We present the best encoding procedure when the quantum state is made up of N spins (qubits). We find that the quality of this optimal procedure, which we quantify in terms of the fidelity, depends solely on the dimension of the encoding space. We also investigate the use of spatial rotations on a quantum state, which provides a natural and less demanding encoding. In this case we prove that the fidelity is directly related to the largest zeros of the Legendre and Jacobi polynomials. We also discuss our results in terms of the information gain.
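Since the fidelity of the rotation-based encoding is tied to the largest zeros of Legendre polynomials, a short numerical sketch of how such zeros can be obtained may help; the code below only computes the zeros, and the paper's precise zero-to-fidelity mapping is not reproduced (the numpy usage is our own illustration):

import numpy as np
from numpy.polynomial import legendre

# Largest zero of the Legendre polynomial P_N for several N.
# legendre.legroots takes Legendre-series coefficients; the basis
# vector [0, ..., 0, 1] selects the polynomial P_N itself.
for N in (1, 2, 3, 5, 10, 20):
    coeffs = np.zeros(N + 1)
    coeffs[N] = 1.0
    largest = legendre.legroots(coeffs).max()
    print(f"N={N:2d}  largest zero of P_N = {largest:.6f}")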
Abstract:
Recently a new Bell inequality has been introduced by Collins et al. [Phys. Rev. Lett. 88, 040404 (2002)], which is strongly resistant to noise for maximally entangled states of two d-dimensional quantum systems. We prove that a larger violation, or equivalently a stronger resistance to noise, is found for a nonmaximally entangled state. It is shown that the resistance to noise is not a good measure of nonlocality, and we introduce some other possible measures. The nonmaximally entangled state turns out to be more robust also for these alternative measures. From these results it follows that two von Neumann measurements per party may not be optimal for detecting nonlocality. For d=3,4, we point out some connections between this inequality and distillability. Indeed, we demonstrate that any state violating it, with the optimal von Neumann settings, is distillable.
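For orientation, the d = 3 member of this family of inequalities (often called the CGLMP inequality) can be written in terms of joint-outcome probabilities as

\[
I_3=\big[P(A_1=B_1)+P(B_1=A_2+1)+P(A_2=B_2)+P(B_2=A_1)\big]
-\big[P(A_1=B_1-1)+P(B_1=A_2)+P(A_2=B_2-1)+P(B_2=A_1-1)\big]\le 2,
\]

where the equalities are understood modulo 3 and \(A_i, B_j\) label the outcomes of the two measurement settings on each side; this standard form is quoted from the literature around the cited Letter, so treat the exact sign conventions as an assumption rather than as given in this abstract.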
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
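A minimal numerical sketch of this majorization principle on a textbook Grover search (the instance size, marked item, and tolerance are our own choices, not the paper's): at each Grover step, the probability distribution over measurement outcomes majorizes the previous one.

import numpy as np

def grover_probabilities(n_qubits, marked, steps):
    """Probability distributions after each Grover iteration."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))    # uniform superposition
    dists = [np.abs(state) ** 2]
    for _ in range(steps):
        state[marked] *= -1.0                # oracle: flip marked amplitude
        state = 2.0 * state.mean() - state   # diffusion: inversion about the mean
        dists.append(np.abs(state) ** 2)
    return dists

def majorizes(p, q, tol=1e-12):
    """True if p majorizes q: partial sums of descending-sorted p dominate q's."""
    return np.all(np.cumsum(np.sort(p)[::-1]) >= np.cumsum(np.sort(q)[::-1]) - tol)

dists = grover_probabilities(n_qubits=4, marked=3, steps=3)
for t in range(1, len(dists)):
    print(f"step {t}: majorizes previous -> {majorizes(dists[t], dists[t - 1])}")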
Abstract:
We explore the possibility that low-energy effective field theory could be sufficient to understand the microscopic degrees of freedom underlying black hole entropy. We propose a qualitative physical picture in which black hole entropy refers to a space of quasicoherent states of infalling matter, together with its gravitational field. We stress that this scenario might provide a low-energy explanation of both the black hole entropy and the information puzzle.
Abstract:
We propose a criterion for the validity of semiclassical gravity (SCG) which is based on the stability of the solutions of SCG with respect to quantum metric fluctuations. We pay special attention to the two-point quantum correlation functions for the metric perturbations, which contain both intrinsic and induced fluctuations. These fluctuations can be described by the Einstein-Langevin equation obtained in the framework of stochastic gravity. Specifically, the Einstein-Langevin equation yields stochastic correlation functions for the metric perturbations which agree, to leading order in the large N limit, with the quantum correlation functions of the theory of gravity interacting with N matter fields. The homogeneous solutions of the Einstein-Langevin equation are equivalent to the solutions of the perturbed semiclassical equation, which describe the evolution of the expectation value of the quantum metric perturbations. The information on the intrinsic fluctuations, which are connected to the initial fluctuations of the metric perturbations, can also be retrieved entirely from the homogeneous solutions. However, the induced metric fluctuations proportional to the noise kernel can only be obtained from the Einstein-Langevin equation (the inhomogeneous term). These equations exhibit runaway solutions with exponential instabilities. A detailed discussion of different methods to deal with these instabilities is given. We illustrate our criterion by showing explicitly that flat space is stable, and that a description based on SCG is a valid approximation in that case.
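Schematically, the Einstein-Langevin equation invoked above takes the form (a sketch in the usual stochastic-gravity notation; the precise factors and index conventions are assumptions here):

\[
G_{ab}[g+h]=8\pi G\big(\langle \hat T_{ab}[g+h]\rangle+\xi_{ab}\big),
\qquad
\langle\xi_{ab}(x)\,\xi_{cd}(y)\rangle=N_{abcd}(x,y),
\]

where \(h_{ab}\) is the metric perturbation, \(\xi_{ab}\) is a Gaussian stochastic source with zero mean, and \(N_{abcd}\) is the noise kernel; setting \(\xi_{ab}=0\) recovers the homogeneous, perturbed semiclassical equation described in the abstract.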
Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions are rigorously studied in great detail on a straightforward basis using a considerable variety of quantum dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that both strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations have the advantage with respect to the energy range that can be covered in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies and the computational effort required to account for the Coriolis couplings, are analyzed in this paper.
Abstract:
A common belief is that further quantum corrections near the singularity of a large black hole should not substantially modify the semiclassical picture of black hole evaporation; in particular, the outgoing spectrum of radiation should be very close to the thermal spectrum predicted by Hawking. In this paper we explore a possible counterexample: in the context of dilaton gravity, we find that nonperturbative quantum corrections which are important in strong-coupling regions may completely alter the semiclassical picture, to the extent that the presumptive spacelike boundary becomes timelike, changing in this way the causal structure of the semiclassical geometry. As a result, only a small fraction of the total energy is radiated outside the fake event horizon; most of the energy comes in fact at later retarded times and there is no problem of information loss. This may constitute a general characteristic of quantum black holes, that is, quantum gravity might be such as to prevent the formation of global event horizons.
Abstract:
This paper analyzes the role of traders' priors (proper versus improper) in the implications of market transparency by comparing a pre-trade transparent market with an opaque market in a set-up based on Madhavan (1996). We show that prices may be more informative in the opaque market, regardless of how priors are modelled. In contrast, the comparisons of market liquidity and volatility across the two market structures are affected by the prior specification. Keywords: Market microstructure, Transparency, Prior information
Abstract:
Information sharing in oligopoly has been analyzed by assuming that firms behave as a sole economic agent. In this paper I assume that ownership and management are separated. Managers are allowed to falsely report their costs to owners and rivals. Under such circumstances, if owners want to achieve information sharing, they must use managerial contracts that implement truthful cost reporting by managers as a dominant strategy. I show that, contrary to the classical result, without the inclusion of message-dependent payments in managerial contracts there will be no information sharing. On the other hand, with the inclusion of such publicly observable payments and credible ex-ante commitment by owners not to modify them, there will be perfect information sharing without the need for third parties. Keywords: Information sharing, Delegation, Managerial contracts. JEL classification numbers: D21, D82, L13, L21
Abstract:
This paper studies the impact of instrumental voting on information demand and mass-media behaviour during electoral campaigns. If voters act instrumentally, then information demand should increase with the closeness of an election. Mass media are modelled as profit-maximizing firms that take into account information demand, the value of customers to advertisers, and the marginal cost of customers. Information supply should be larger in electoral constituencies where the contest is expected to be closer, where population density is higher, and where customers are on average more profitable for advertisers. The impact of electorate size is theoretically indeterminate. These conclusions are then tested, with supportive results, on data from the 1997 general election in Britain.
Abstract:
This paper examines competition in a spatial model of two-candidate elections, where one candidate enjoys a quality advantage over the other. The candidates care about winning and also have policy preferences. There is two-dimensional private information: candidate ideal points, as well as their tradeoffs between policy preferences and winning, are private information. The distribution of this two-dimensional type is common knowledge. The location of the median voter's ideal point is uncertain, with a distribution that is commonly known by both candidates. Pure-strategy equilibria always exist in this model. We characterize the effects of increased uncertainty about the median voter, of candidate policy preferences, and of changes in the distribution of private information. We prove that the distribution of candidate policies approaches the mixed equilibrium of Aragones and Palfrey (2002a) when both candidates' weights on policy preferences go to zero.