979 results for quantum information theory
Abstract:
Intracavity and external third order correlations in the damped nondegenerate parametric oscillator are calculated for quantum mechanics and stochastic electrodynamics (SED), a semiclassical theory. The two theories yield greatly different results, with the correlations of quantum mechanics being cubic in the system's nonlinear coupling constant and those of SED being linear in the same constant. In particular, differences between the two theories are present in at least a mesoscopic regime. They also exist when realistic damping is included. Such differences illustrate distinctions between quantum mechanics and a hidden variable theory for continuous variables.
Abstract:
In 1966 the Brazilian physicist Klaus Tausk (b. 1927) circulated a preprint from the International Centre for Theoretical Physics in Trieste, Italy, criticizing Adriana Daneri, Angelo Loinger, and Giovanni Maria Prosperi's theory of 1962 on the measurement problem in quantum mechanics. A heated controversy ensued between two opposing camps within the orthodox interpretation of quantum theory, represented by Leon Rosenfeld and Eugene P. Wigner. The controversy went well beyond the strictly scientific issues, however, reflecting philosophical and political commitments within the context of the Cold War, the relationship between science in developed and Third World countries, the importance of social skills, and personal idiosyncrasies.
Abstract:
The main problem with current approaches to quantum computing is the difficulty of establishing and maintaining entanglement. A Topological Quantum Computer (TQC) aims to overcome this by using different physical processes that are topological in nature and which are less susceptible to disturbance by the environment. In a (2+1)-dimensional system, pseudoparticles called anyons have statistics that fall somewhere between bosons and fermions. The exchange of two anyons, an effect called braiding from knot theory, can occur in two different ways. The quantum states corresponding to the two elementary braids constitute a two-state system allowing the definition of a computational basis. Quantum gates can be built up from patterns of braids, and for quantum computing it is essential that the operator describing the braiding (the R-matrix) be unitary. The physics of anyonic systems is governed by quantum groups, in particular the quasi-triangular Hopf algebras obtained from finite groups by the application of the Drinfeld quantum double construction. Their representation theory has been described in detail by Gould and Tsohantjis, and in this review article we relate the work of Gould to TQC schemes, particularly that of Kauffman.
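As a point of reference (these are the standard braid-group conditions, not equations taken from the article above), the requirements on the braiding operator R mentioned in this abstract can be written as:

```latex
R R^{\dagger} = I, \qquad
(R \otimes I)(I \otimes R)(R \otimes I) = (I \otimes R)(R \otimes I)(I \otimes R)
```

Unitarity makes each braid a legitimate quantum gate, while the Yang-Baxter equation guarantees that topologically equivalent braid diagrams implement the same operator.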
Abstract:
Colonius suggests that, in using standard set theory as the language in which to express our computational-level theory of human memory, we would need to violate the axiom of foundation in order to express meaningful memory bindings in which a context is identical to an item in the list. We circumvent Colonius's objection by allowing that a list item may serve as a label for a context without being identical to that context. This debate serves to highlight the value of specifying memory operations in set theoretic notation, as it would have been difficult if not impossible to formulate such an objection at the algorithmic level.
Abstract:
We report the observation of the quantum effects of competing χ^(2) nonlinearities. We also report classical signatures of competition, namely, clamping of the second-harmonic power and production of nondegenerate frequencies in the visible. Theory is presented that describes the observations as resulting from competition between various χ^(2) up-conversion and down-conversion processes. We show that competition imposes hitherto unsuspected limits to both power generation and squeezing. The observed signatures are expected to be significant effects in practical systems.
Abstract:
From a general model of fiber optics, we investigate the physical limits of soliton-based terabaud communication systems. In particular we consider Raman and initial quantum noise effects which are often neglected in fiber communications. Simulations of the position diffusion in dark and bright solitons show that these effects become increasingly important at short pulse durations, even over kilometer-scale distances. We also obtain an approximate analytic theory in agreement with numerical simulations, which shows that the Raman effects exceed the Gordon-Haus jitter for sub-picosecond pulses. (C) 1997 Elsevier Science B.V.
Abstract:
Experimental data for E. coli debris size reduction during high-pressure homogenisation at 55 MPa are presented. A mathematical model based on grinding theory is developed to describe the data. The model is based on first-order breakage and compensation conditions. It does not require any assumption of a specified distribution for debris size and can be used given information on the initial size distribution of whole cells and the disruption efficiency during homogenisation. The number of homogeniser passes is incorporated into the model and used to describe the size reduction of non-induced stationary and induced E. coli cells during homogenisation. Regressing the results to the model equations gave an excellent fit to experimental data (> 98.7% of variance explained for both fermentations), confirming the model's potential for predicting size reduction during high-pressure homogenisation. This study provides a means to optimise both homogenisation and disc-stack centrifugation conditions for recombinant product recovery. (C) 1997 Elsevier Science Ltd.
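A minimal sketch of the kind of first-order size-reduction law described in this abstract, assuming mean debris size decays exponentially toward a limiting size with the number of homogeniser passes; the rate constant and size values here are illustrative, not the fitted values from the paper.

```python
import math

def mean_debris_size(n_passes, d0, d_inf, k):
    """First-order breakage sketch: mean debris size decays
    exponentially from the initial size d0 toward a limiting
    size d_inf as the number of homogeniser passes n grows."""
    return d_inf + (d0 - d_inf) * math.exp(-k * n_passes)

# Illustrative values (microns), not from the paper.
sizes = [mean_debris_size(n, d0=1.0, d_inf=0.3, k=0.8) for n in range(6)]
```

Each additional pass removes a fixed fraction of the remaining reducible size, which is what makes the model linear in the (log-transformed) data and easy to regress against pass number.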
Abstract:
Using the method of quantum trajectories we show that a known pure state can be optimally monitored through time when subject to a sequence of discrete measurements. By modifying the way that we extract information from the measurement apparatus we can minimize the average algorithmic information of the measurement record, without changing the unconditional evolution of the measured system. We define an optimal measurement scheme as one which has the lowest average algorithmic information allowed. We also show how it is possible to extract information about system operator averages from the measurement records and their probabilities. The optimal measurement scheme, in the limit of weak coupling, determines the statistics of the variance of the measured variable directly. We discuss the relevance of such measurements for recent experiments in quantum optics.
Abstract:
This paper offers a defense of backwards in time causation models in quantum mechanics. Particular attention is given to Cramer's transactional account, which is shown to have the threefold virtue of solving the Bell problem, explaining the complex conjugate aspect of the quantum mechanical formalism, and explaining various quantum mysteries such as Schrödinger's cat. The question is therefore asked, why has this model not received more attention from physicists and philosophers? One objection given by physicists in assessing Cramer's theory was that it is not testable. This paper seeks to answer this concern by utilizing an argument that backwards causation models entail a fork theory of causal direction. From the backwards causation model together with the fork theory one can deduce empirical predictions. Finally, the objection that this strategy is questionable because of its appeal to philosophy is deflected.
Abstract:
The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
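A minimal, self-contained sketch of the probit setup and of a sign restriction on a coefficient, assuming simulated data and a crude grid search in place of a proper optimiser or posterior sampler; nothing here comes from the mortgage application in the paper.

```python
import math
import random

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def log_lik(b0, b1, xs, ys):
    # Probit log-likelihood: P(y = 1 | x) = Phi(b0 + b1 * x).
    ll = 0.0
    for x, y in zip(xs, ys):
        p = min(max(norm_cdf(b0 + b1 * x), 1e-12), 1.0 - 1e-12)
        ll += math.log(p) if y else math.log(1.0 - p)
    return ll

# Simulated data; the "true" coefficients are illustrative only.
random.seed(0)
true_b0, true_b1 = -0.5, 1.2
xs = [random.gauss(0.0, 1.0) for _ in range(400)]
ys = [1 if random.random() < norm_cdf(true_b0 + true_b1 * x) else 0
      for x in xs]

grid = [i * 0.2 for i in range(-15, 16)]  # coarse grid from -3.0 to 3.0

# Unrestricted "ML" fit by grid search over (b0, b1).
ml = max(((b0, b1) for b0 in grid for b1 in grid),
         key=lambda b: log_lik(b[0], b[1], xs, ys))

# Sign-restricted fit: keep only b1 >= 0, mimicking a prior
# truncated to impose the expected sign of the coefficient.
restricted = max(((b0, b1) for b0 in grid for b1 in grid if b1 >= 0.0),
                 key=lambda b: log_lik(b[0], b[1], xs, ys))
```

The sign restriction only changes the answer when the unrestricted estimate takes the "wrong" sign, which is exactly the situation where the inequality-restricted prior discussed in the abstract has bite.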
Abstract:
After outlining some relevant background information about the NT crocodile farming industry and explaining the purpose of our survey of NT crocodile farmers conducted in the first half of 2005, this paper reports the results of the survey. The information received from the survey is supplemented by secondary data and by information from secondary sources. This report covers the location of respondents; the size of crocodile farms; farmers’ stated knowledge of and attitudes towards the NT Crocodile Management Plan; the involvement of farms in the harvesting of crocodile eggs and the costs involved; views of crocodile farmers about whether the NT Crocodile Management Plan encourages landholders to conserve crocodiles and their perceptions of the benefits to landholders; predicted production trends and trends in the number of farms operating in NT; economic characteristics of crocodile farms producing in NT including the economic advantages and disadvantages of crocodile farming in NT. Concluding comments provide, amongst other things, an overview of the structure of the crocodile farming industry in the NT gleaned from a consideration of data available from the NT Government’s Department of Business, Industry and Resource Development.
Abstract:
Classical dynamics is formulated as a Hamiltonian flow in phase space, while quantum mechanics is formulated as unitary dynamics in Hilbert space. These different formulations have made it difficult to directly compare quantum and classical nonlinear dynamics. Previous solutions have focused on computing quantities associated with a statistical ensemble such as variance or entropy. However, a more direct comparison would compare classical predictions to the quantum predictions for continuous simultaneous measurement of position and momentum of a single system. In this paper we give a theory of such measurement and show that chaotic behavior in classical systems can be reproduced by continuously measured quantum systems.
Abstract:
An important feature of some conceptual modelling grammars is the constructs they provide to allow database designers to show that real-world things may or may not possess a particular attribute or relationship. In the entity-relationship model, for example, the fact that a thing may not possess an attribute can be represented by using a special symbol to indicate that the attribute is optional. Similarly, the fact that a thing may or may not be involved in a relationship can be represented by showing the minimum cardinality of the relationship as zero. Whether these practices should be followed, however, is a contentious issue. An alternative approach is to eliminate optional attributes and relationships from conceptual schema diagrams by using subtypes that have only mandatory attributes and relationships. In this paper, we first present a theory that led us to predict that optional attributes and relationships should be used in conceptual schema diagrams only when users of the diagrams require a surface-level understanding of the domain being represented by the diagrams. When users require a deep-level understanding, however, optional attributes and relationships should not be used because they undermine users' abilities to grasp important domain semantics. We describe three experiments which we then undertook to test our predictions. The results of the experiments support our predictions.
Abstract:
We examine constraints on quantum operations imposed by relativistic causality. A bipartite superoperator is said to be localizable if it can be implemented by two parties (Alice and Bob) who share entanglement but do not communicate; it is causal if the superoperator does not convey information from Alice to Bob or from Bob to Alice. We characterize the general structure of causal complete-measurement superoperators, and exhibit examples that are causal but not localizable. We construct another class of causal bipartite superoperators that are not localizable by invoking bounds on the strength of correlations among the parts of a quantum system. A bipartite superoperator is said to be semilocalizable if it can be implemented with one-way quantum communication from Alice to Bob, and it is semicausal if it conveys no information from Bob to Alice. We show that all semicausal complete-measurement superoperators are semilocalizable, and we establish a general criterion for semicausality. In the multipartite case, we observe that a measurement superoperator that projects onto the eigenspaces of a stabilizer code is localizable.