796 results for Hidden-variable Theories
Abstract:
Laudisa (Found. Phys. 38:1110-1132, 2008) claims that experimental research on the class of non-local hidden-variable theories introduced by Leggett is misguided, because these theories are irrelevant for the foundations of quantum mechanics. I show that Laudisa's arguments fail to establish the pessimistic conclusion he draws from them. In particular, it is not the case that Leggett-inspired research is based on a mistaken understanding of Bell's theorem, nor that previous no-hidden-variable theorems already exclude Leggett's models. Finally, I argue that the framework of Bohmian mechanics brings out the importance of Leggett tests, rather than proving their irrelevance, as Laudisa supposes.
Abstract:
We demonstrate a contradiction of quantum mechanics with local hidden variable theories for continuous quadrature phase amplitude (position and momentum) measurements. For any quantum state, this contradiction is lost for situations where the quadrature phase amplitude results are always macroscopically distinct. We show that for optical realizations of this experiment, where one uses homodyne detection techniques to perform the quadrature phase amplitude measurement, one has an amplification prior to detection, so that macroscopic fields are incident on photodiode detectors. The high efficiencies of such detectors may open a way for a loophole-free test of local hidden variable theories.
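For reference (standard definitions, not specific to the paper above): the quadrature phase amplitudes are the field observables X_θ = â e^{-iθ} + â† e^{iθ}, with X_0 and X_{π/2} playing the roles of position and momentum. In balanced homodyne detection the signal is mixed with a strong local oscillator of phase θ and the difference photocurrent is proportional to |α_LO| X_θ; the large local-oscillator amplitude |α_LO| is the amplification prior to detection referred to in the abstract.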
Abstract:
This paper introduces a probability model, the mixture of trees that can account for sparse, dynamically changing dependence relationships. We present a family of efficient algorithms that use EM and the Minimum Spanning Tree algorithm to find the ML and MAP mixture of trees for a variety of priors, including the Dirichlet and the MDL priors.
Abstract:
This paper introduces a probability model, the mixture of trees that can account for sparse, dynamically changing dependence relationships. We present a family of efficient algorithms that use EM and the Minimum Spanning Tree algorithm to find the ML and MAP mixture of trees for a variety of priors, including the Dirichlet and the MDL priors. We also show that the single tree classifier acts like an implicit feature selector, thus making the classification performance insensitive to irrelevant attributes. Experimental results demonstrate the excellent performance of the new model both in density estimation and in classification.
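As an illustration of the spanning-tree step mentioned in the two abstracts above, the sketch below fits a maximum-likelihood tree structure over discrete variables by running a maximum-weight spanning tree on pairwise mutual information (the Chow-Liu construction that sits inside the EM loop). Function and variable names (mutual_information, fit_tree, data) are illustrative, not taken from the paper.

```python
# Minimal sketch of the Chow-Liu step: the ML tree over discrete variables is
# the maximum-weight spanning tree on pairwise mutual information.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def mutual_information(x, y):
    """Empirical mutual information between two discrete data columns."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def fit_tree(data):
    """Return the edge list of the ML tree structure for data (n_samples x n_vars)."""
    n_vars = data.shape[1]
    weights = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            weights[i, j] = mutual_information(data[:, i], data[:, j])
    # scipy provides a *minimum* spanning tree, so negate the weights to get
    # the maximum-mutual-information tree.
    mst = minimum_spanning_tree(-weights)
    rows, cols = mst.nonzero()
    return list(zip(rows.tolist(), cols.tolist()))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.integers(0, 2, size=500)
    b = a ^ (rng.random(500) < 0.1).astype(int)   # b strongly depends on a
    c = rng.integers(0, 2, size=500)              # c is independent
    print(fit_tree(np.column_stack([a, b, c])))   # recovers an edge between a and b
```

In the mixture case, the same tree-fitting step would be run once per mixture component in the M-step, on counts weighted by the EM posterior responsibilities.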
Abstract:
In this paper, we present an analog of a Bell's inequality violation test for N qubits to be performed in a nuclear magnetic resonance (NMR) quantum computer. This can be used to simulate or predict the results for different Bell's inequality tests, with distinct configurations and a larger number of qubits. To demonstrate our scheme, we implemented a simulation of the violation of the Clauser, Horne, Shimony and Holt (CHSH) inequality using a two-qubit NMR system and compared the results to those of a photon experiment. The experimental results are well described both by quantum mechanics and by a local realistic hidden variables model (LRHVM) developed specifically for NMR, which is why we refer to this experiment as a simulation of Bell's inequality violation. Our result shows explicitly how the two theories can be compatible with each other due to the detection loophole. In the last part of this work, we discuss the possibility of testing some fundamental features of quantum mechanics using NMR with highly polarized spins, where a strong discrepancy between quantum mechanics and hidden variable models can be expected.
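For comparison, here is a minimal sketch of the textbook quantum prediction that such CHSH tests target: singlet-state correlations at the standard angles give |S| = 2√2, above the local hidden variable bound of 2. The angles and operators below are the usual textbook choices and are not meant to reproduce the NMR pulse sequences of the experiment.

```python
# Quantum-mechanical CHSH value for the two-qubit singlet state at the
# standard measurement angles, compared with the local hidden variable bound of 2.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin observable along angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| sigma_a (x) sigma_b |psi>."""
    op = np.kron(spin(a), spin(b))
    return np.real(psi.conj() @ op @ psi)

a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, -np.pi / 4
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2) > 2, the local hidden variable bound
```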
Abstract:
We show that quantum mechanics predicts a contradiction with local hidden variable theories for photon number measurements which have limited resolving power, to the point of imposing an uncertainty in the photon number result which is macroscopic in absolute terms. We show how this can be interpreted as a failure of a new premise, macroscopic local realism.
Abstract:
We review the field of quantum optical information from elementary considerations to quantum computation schemes. We illustrate our discussion with descriptions of experimental demonstrations of key communication and processing tasks from the last decade and also look forward to the key results likely in the next decade. We examine both discrete (single photon) type processing as well as those which employ continuous variable manipulations. The mathematical formalism is kept to the minimum needed to understand the key theoretical and experimental results.
Abstract:
Intracavity and external third order correlations in the damped nondegenerate parametric oscillator are calculated for quantum mechanics and stochastic electrodynamics (SED), a semiclassical theory. The two theories yield greatly different results, with the correlations of quantum mechanics being cubic in the system's nonlinear coupling constant and those of SED being linear in the same constant. In particular, differences between the two theories are present in at least a mesoscopic regime. They also exist when realistic damping is included. Such differences illustrate distinctions between quantum mechanics and a hidden variable theory for continuous variables.
Abstract:
We demonstrate that the self-similarity of some scale-free networks with respect to a simple degree-thresholding renormalization scheme finds a natural interpretation in the assumption that network nodes exist in hidden metric spaces. Clustering, i.e., cycles of length three, plays a crucial role in this framework as a topological reflection of the triangle inequality in the hidden geometry. We prove that a class of hidden variable models with underlying metric spaces are able to accurately reproduce the self-similarity properties that we measured in the real networks. Our findings indicate that hidden geometries underlying these real networks are a plausible explanation for their observed topologies and, in particular, for their self-similarity with respect to the degree-based renormalization.
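A minimal sketch of one model in this class, assuming nodes placed on a circle, each carrying a power-law hidden variable kappa (expected degree), and a connection probability that decays with metric distance (an S¹-type model); all parameter values (N, beta, gamma, mean degree) below are illustrative and not taken from the paper.

```python
# Hidden-metric-space network sketch: nodes on a circle with hidden degrees,
# connected with probability decreasing in the arc-length distance.
import numpy as np

rng = np.random.default_rng(1)

N = 1000
beta = 2.0          # clustering strength (beta > 1)
gamma = 2.5         # exponent of the hidden-degree distribution
mean_degree = 6.0

# Hidden variables: angular position and expected degree kappa
theta = rng.uniform(0, 2 * np.pi, N)
kappa = rng.pareto(gamma - 1, N) + 1.0            # power-law kappa >= 1
kappa *= mean_degree / kappa.mean()               # rescale to the target mean

R = N / (2 * np.pi)                               # circle radius
# Standard normalization so expected degrees approach kappa for large N
mu = beta * np.sin(np.pi / beta) / (2 * np.pi * mean_degree)

edges = []
for i in range(N):
    for j in range(i + 1, N):
        dtheta = np.pi - abs(np.pi - abs(theta[i] - theta[j]))
        d = R * dtheta                            # arc-length distance
        p = 1.0 / (1.0 + (d / (mu * kappa[i] * kappa[j])) ** beta)
        if rng.random() < p:
            edges.append((i, j))

degrees = np.zeros(N, dtype=int)
for i, j in edges:
    degrees[i] += 1
    degrees[j] += 1
print("mean degree:", degrees.mean())
```

With beta > 1 the triangle inequality of the underlying circle manifests itself as strong clustering, which is the topological signature discussed in the abstract.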
Abstract:
We show that Local Realistic Theories, defined as obeying Bell's locality condition, cannot satisfy the perfect anti-correlations without at the same time maximally violating rotational symmetry at the hidden variable level. We examine whether the rotational symmetry can be restored after the statistical average. We also comment on the question of whether such theories are necessarily deterministic at the hidden variable level. © 1999 Elsevier Science B.V. All rights reserved.
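For reference, Bell's locality condition invoked above is the usual factorizability requirement P(A,B|a,b,λ) = P(A|a,λ) P(B|b,λ) for hidden variable λ and local settings a, b, so that the correlation takes the form E(a,b) = ∫ dλ ρ(λ) Ā(a,λ) B̄(b,λ); perfect anti-correlation is the additional demand E(a,a) = -1 for every setting a.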
Abstract:
In this thesis the old philosophical question "does every event have a cause?" is examined in the light of quantum mechanics and probability theory. In physics as well as in the philosophy of science, the orthodox position holds that the physical world is indeterministic. At the fundamental level of physical reality, the quantum level, events would occur without causes, by chance, by 'irreducible' randomness. The most precise physical theorem leading to this conclusion is Bell's theorem. Here the premises of that theorem are reexamined. It is recalled that solutions to the theorem other than indeterminism are conceivable, some of which are known but neglected, such as 'superdeterminism'. It is further argued that other solutions compatible with determinism exist, notably by studying model physical systems. One of the general conclusions of this thesis is that the interpretation of Bell's theorem and of quantum mechanics depends crucially on the philosophical premises from which one starts. For example, within a Spinozist worldview, the quantum world may well be understood as deterministic. But it is argued that even a determinism far less radical than Spinoza's is not ruled out by the physical experiments. If this is true, the debate between determinism and indeterminism is not settled in the laboratory: it remains philosophical and open, contrary to what is often assumed. In the second part of this thesis a model for the interpretation of probability is proposed. A conceptual study of the notion of probability indicates that the hypothesis of determinism helps to better understand what a 'probabilistic system' is. It appears that determinism can answer certain questions for which indeterminism has no answer. For this reason we conclude that Laplace's conjecture, namely that probability theory presupposes an underlying deterministic reality, retains all its legitimacy. In this thesis the methods of both philosophy and physics are used. The two fields turn out to be solidly connected here, and to offer a vast potential for cross-fertilization, in both directions.
Abstract:
The phase diagram of an asymmetric N = 3 Ashkin-Teller model is obtained by a numerical analysis which combines Monte Carlo renormalization group and reweighting techniques. The present results reveal several differences from those obtained by mean-field calculations and a Hamiltonian approach. In particular, we found Ising critical exponents along a line where Goldschmidt has located the Kosterlitz-Thouless multicritical point. On the other hand, we did find nonuniversal exponents along another transition line. Symmetry breaking in this case is very similar to the N = 2 case, since the symmetries associated with only two of the Ising variables are broken. However, for large values of the coupling constant ratio XW = W/K, when the only broken symmetry is that of a hidden variable, we detected first-order phase transitions giving evidence supporting the existence of a multicritical point, as suggested by Goldschmidt, but in a different region of the phase diagram. © 2002 Elsevier Science B.V. All rights reserved.
Abstract:
According to Bell's theorem a large class of hidden-variable models obeying Bell's notion of local causality (LC) conflict with the predictions of quantum mechanics. Recently, a Bell-type theorem has been proven using a weaker notion of LC, yet assuming the existence of perfectly correlated event types. Here we present a similar Bell-type theorem without this latter assumption. The derived inequality differs from the Clauser-Horne inequality by some small correction terms, which render it less constraining.
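For reference, the standard Clauser-Horne inequality against which the derived inequality is compared can be written, in one common form, as -1 ≤ P(a,b) + P(a,b′) + P(a′,b) - P(a′,b′) - P(a) - P(b) ≤ 0, where P(x,y) are joint detection probabilities for settings x and y and P(a), P(b) are the corresponding single-detection probabilities; the inequality derived in the paper differs from this by small correction terms.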
Abstract:
It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise or corruption. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which allows for input noise, given that some model of the noise process exists. In the limit where this noise process is small and symmetric it is shown, using the Laplace approximation, that there is an additional term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network's weights using Markov Chain Monte Carlo methods, it is demonstrated that it is possible to infer the unbiased regression over the noiseless input.
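A minimal numerical sketch of the kind of correction described: for small, symmetric input noise of variance sigma_x^2, first-order propagation adds a term (df/dx)^2 * sigma_x^2 to the usual error bar. The toy prediction function and all numbers below are illustrative stand-ins, not the paper's network or its Laplace-approximation derivation.

```python
# Input-noise correction to a predictive error bar: first-order propagation
# of small symmetric input noise adds (df/dx)^2 * sigma_x^2 to the usual variance.
import numpy as np

def f(x):
    """Stand-in for a trained regression model's mean prediction."""
    return np.sin(2.0 * x) + 0.5 * x

def predictive_variance(x, sigma_model2, sigma_x2, eps=1e-5):
    """Usual error bar plus the first-order input-noise term."""
    grad = (f(x + eps) - f(x - eps)) / (2 * eps)   # df/dx by central differences
    return sigma_model2 + grad ** 2 * sigma_x2

# Compare against a direct Monte Carlo estimate of the same variance
rng = np.random.default_rng(0)
x0, sigma_model2, sigma_x2 = 0.3, 0.01, 0.05 ** 2
samples = f(x0 + rng.normal(0.0, np.sqrt(sigma_x2), 100_000)) \
          + rng.normal(0.0, np.sqrt(sigma_model2), 100_000)
print(predictive_variance(x0, sigma_model2, sigma_x2))   # analytic, first order
print(samples.var())                                     # Monte Carlo check
```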