979 results for quantum information theory


Relevance: 40.00%
Publisher:
Abstract:

Includes bibliography.

Relevance: 40.00%
Publisher:
Abstract:

Mode of access: Internet.

Relevance: 40.00%
Publisher:
Abstract:

Reproduced from typewritten copy.

Relevance: 40.00%
Publisher:
Abstract:

Issued in three parts.

Relevance: 40.00%
Publisher:
Abstract:

Operator quantum error correction is a recently developed theory that provides a generalized and unified framework for active error correction and passive error avoiding schemes. In this Letter, we describe these codes using the stabilizer formalism. This is achieved by adding a gauge group to stabilizer codes that defines an equivalence class between encoded states. Gauge transformations leave the encoded information unchanged; their effect is absorbed by virtual gauge qubits that do not carry useful information. We illustrate the construction by identifying a gauge symmetry in Shor's 9-qubit code that allows us to remove 3 of its 8 stabilizer generators, leading to a simpler decoding procedure and a wider class of logical operations without affecting its essential properties. This opens the path to possible improvements of the error threshold of fault-tolerant quantum computing.
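As a concrete illustration of the stabilizer formalism discussed in the abstract above, the following sketch lists the eight standard generators of Shor's 9-qubit code, verifies that they mutually commute, and shows how a single-qubit error is detected through its syndrome. The generator list and the even-overlap commutation rule for Pauli strings are standard textbook material; this is not code from the Letter itself.

```python
# Two Pauli strings commute iff they anticommute on an even number
# of qubits (i.e., differ at an even number of non-identity positions).

def anticommute_count(p, q):
    return sum(1 for a, b in zip(p, q)
               if a != 'I' and b != 'I' and a != b)

def commutes(p, q):
    return anticommute_count(p, q) % 2 == 0

# The 8 stabilizer generators of Shor's 9-qubit code:
# six Z-type pairs within each block of three, and two X-type
# operators spanning pairs of blocks.
stabilizers = [
    "ZZIIIIIII", "IZZIIIIII",
    "IIIZZIIII", "IIIIZZIII",
    "IIIIIIZZI", "IIIIIIIZZ",
    "XXXXXXIII", "IIIXXXXXX",
]

# All generators must commute pairwise for the group to be abelian.
assert all(commutes(p, q) for p in stabilizers for q in stabilizers)

# A single-qubit X error on qubit 2 anticommutes with the two Z-type
# generators that overlap it, producing a nonzero syndrome:
error = "IXIIIIIII"
syndrome = [0 if commutes(error, s) else 1 for s in stabilizers]
print(syndrome)  # -> [1, 1, 0, 0, 0, 0, 0, 0]
```

In the gauge (subsystem) picture described in the Letter, some of these generators can be reinterpreted as gauge operators rather than stabilizers, since their action only moves the state within an equivalence class of encoded states.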

Relevance: 40.00%
Publisher:
Abstract:

What is the minimal size quantum circuit required to exactly implement a specified n-qubit unitary operation, U, without the use of ancilla qubits? We show that a lower bound on the minimal size is provided by the length of the minimal geodesic between U and the identity, I, where length is defined by a suitable Finsler metric on the manifold SU(2^n). The geodesic curves on these manifolds have the striking property that once an initial position and velocity are set, the remainder of the geodesic is completely determined by a second-order differential equation known as the geodesic equation. This is in contrast with the usual case in circuit design, either classical or quantum, where being given part of an optimal circuit does not obviously assist in the design of the rest of the circuit. Geodesic analysis thus offers a potentially powerful approach to the problem of proving quantum circuit lower bounds. In this paper we construct several Finsler metrics whose minimal length geodesics provide lower bounds on quantum circuit size. For each Finsler metric we give a procedure to compute the corresponding geodesic equation. We also construct a large class of solutions to the geodesic equation, which we call Pauli geodesics, since they arise from isometries generated by the Pauli group. For any unitary U diagonal in the computational basis, we show that: (a) provided the minimal length geodesic is unique, it must be a Pauli geodesic; (b) finding the length of the minimal Pauli geodesic passing from I to U is equivalent to solving an exponential-size instance of the closest vector in a lattice problem (CVP); and (c) all but a doubly exponentially small fraction of such unitaries have minimal Pauli geodesics of exponential length.
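The metrics in the abstract above assign a cost to each tangent direction (Hamiltonian) on SU(2^n). A minimal sketch of one such penalty-style metric follows: it expands a two-qubit Hamiltonian in the Pauli basis and charges "hard" (weight-two) directions a penalty p. The penalty form, the value p = 10, and the two-qubit example are illustrative assumptions in the spirit of this line of work, not the paper's exact constructions.

```python
# Sketch: Pauli-basis expansion of a Hamiltonian and a penalty-style
# Finsler-type cost F(H) = sqrt(sum_easy c^2 + p^2 * sum_hard c^2),
# where "hard" terms act on more than one qubit.
import itertools
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)
paulis = {'I': I, 'X': X, 'Y': Y, 'Z': Z}

def pauli_coefficients(H):
    """Coefficients c_s in H = sum_s c_s (s1 (x) s2), s over Pauli strings."""
    coeffs = {}
    for s1, s2 in itertools.product('IXYZ', repeat=2):
        P = np.kron(paulis[s1], paulis[s2])
        c = np.trace(P.conj().T @ H).real / 4  # normalized HS inner product
        if abs(c) > 1e-12:
            coeffs[s1 + s2] = c
    return coeffs

def finsler_cost(H, penalty=10.0):
    easy = hard = 0.0
    for s, c in pauli_coefficients(H).items():
        weight = sum(ch != 'I' for ch in s)
        if weight <= 1:
            easy += c * c
        else:
            hard += c * c   # many-body directions are penalized
    return float(np.sqrt(easy + penalty**2 * hard))

# Example: an Ising-type Hamiltonian Z(x)Z + X(x)I.
H = np.kron(Z, Z) + np.kron(X, I)
print(finsler_cost(H))  # the two-body ZZ term dominates the cost
```

The length of a control path t -> H(t) would then be the integral of this cost along the path; geodesics minimize that length, and their length lower-bounds circuit size up to polynomial factors in constructions of this kind.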

Relevance: 40.00%
Publisher:
Abstract:

It has been suggested that, in order to maintain its relevance, critical research must develop a strong emphasis on empirical work rather than the conceptual emphasis that has typically characterized critical scholarship in management. A critical project of this nature is applicable in the information systems (IS) arena, which has a growing tradition of qualitative inquiry. Despite its relativist ontology, actor–network theory places a strong emphasis on empirical inquiry and this paper argues that actor–network theory, with its careful tracing and recording of heterogeneous networks, is well suited to the generation of detailed and contextual empirical knowledge about IS. The intention in this paper is to explore the relevance of IS research informed by actor–network theory in the pursuit of a broader critical research project as defined in earlier work.

Relevance: 40.00%
Publisher:
Abstract:

The introduction situates the ‘hard problem’ in its historical context and argues that the problem has two sides: the output side (the Kant-Eccles problem of the freedom of the Will) and the input side (the problem of qualia). The output side ultimately reduces to whether quantum mechanics can affect the operation of synapses. A discussion of the detailed molecular biology of synaptic transmission as presently understood suggests that such effects are unlikely. Instead, an evolutionary argument is presented which suggests that our conviction of free agency is an evolutionarily induced illusion and hence that the Kant-Eccles problem is itself illusory. This conclusion is supported by well-known neurophysiology. The input side, the problem of qualia, of subjectivity, is not so easily outflanked. After a brief review of the neurophysiological correlates of consciousness (NCC) and of the Penrose-Hameroff microtubular neuroquantology it is again concluded that the molecular neurobiology makes quantum wave-mechanics an unlikely explanation. Instead, recourse is made to an evolutionarily- and neurobiologically-informed panpsychism. The notion of an ‘emergent’ property is carefully distinguished from that of the more usual ‘system’ property used by most dual-aspect theorists (and the majority of neuroscientists) and used to support Llinas’ concept of an ‘oneiric’ consciousness continuously modified by sensory input. I conclude that a panpsychist theory, such as this, coupled with the non-classical understanding of matter flowing from quantum physics (both epistemological and scientific) may be the default and only solution to the problem posed by the presence of mind in a world of things.

Relevance: 40.00%
Publisher:
Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory being part of language processes rather better than either the encoding or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.

Relevance: 40.00%
Publisher:
Abstract:

This paper presents the design and results of a task-based user study, based on Information Foraging Theory, on a novel user interaction framework - uInteract - for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but also illustrated the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.