867 results for Measurement-based quantum computing
Abstract:
We introduce a model of computation based on read-only memory (ROM), which allows us to compare the space efficiency of reversible, error-free classical computation with that of reversible, error-free quantum computation. We show that a ROM-based quantum computer with one writable qubit is universal, whilst two writable bits are required for a universal classical ROM-based computer. We also comment on the time-efficiency advantages of quantum computation within this model.
Abstract:
Heavy-metal-based quantum dots (QDs) have been shown to behave as efficient sensitizers in QD-sensitized solar cells (QDSSCs), as attested by the numerous works and encouraging efficiencies reported so far. However, their intrinsic toxicity has emerged as a major obstacle to commercialization. Here, we examine the potential of environmentally friendly zinc copper indium sulfide (ZCIS) QDs for the fabrication of liquid-junction QDSSCs by means of photoelectrochemical measurements. A straightforward approach to directly adsorb ZCIS QDs on TiO2 from a colloidal dispersion is presented. Incident photon-to-current efficiency (IPCE) spectra of sensitized photoanodes show a marked dependence on adsorption time, with longer times leading to poorer performance. Cyclic voltammograms point to blockage of the channels of the mesoporous TiO2 film by agglomerated QDs as the main reason for the decrease in efficiency. Photoanodes were also subjected to the ZnS treatment, whose effects on electron recombination with the electrolyte are analyzed through electrochemical impedance spectroscopy and photopotential measurements. The results bring out the role of the ZnS coating as a barrier layer preventing electron leakage toward the electrolyte, as argued for other QD-sensitized systems. The beneficial effect of the ZnS coating is ultimately reflected in the power conversion efficiency of complete devices, which reaches values of 2%. More generally, these findings draw attention to the potential of this quaternary alloy, virtually unexplored as a light harvester for sensitized devices.
Abstract:
The field of linear optical quantum computation (LOQC) will soon need a repertoire of experimental milestones. We make progress in this direction by describing several experiments based on Grover's algorithm. These experiments range from a relatively simple implementation using only a single nonscalable controlled-NOT (CNOT) gate to the most complex, requiring two concatenated scalable CNOT gates, and thus form a useful set of early milestones for LOQC. We also give a complete description of basic LOQC using polarization-encoded qubits, making use of many simplifications to the original scheme of Knill, Laflamme, and Milburn [E. Knill, R. Laflamme, and G. J. Milburn, Nature (London) 409, 46 (2001)].
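The Grover-based milestones described in this abstract can be illustrated with a minimal classical statevector simulation of the algorithm itself (not, of course, of a linear-optical implementation). The qubit count and the marked item below are arbitrary illustrative choices:

```python
# A minimal classical simulation of Grover's search algorithm.
# This is a plain statevector sketch for illustration only; it does not
# model photons, CNOT gates, or any optical hardware.

def grover_search(n_qubits, marked, iterations):
    """Return the amplitude vector after the given number of Grover iterations."""
    n = 2 ** n_qubits
    # Uniform superposition, as produced by Hadamards on |0...0>.
    amps = [1 / n ** 0.5] * n
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amps[marked] = -amps[marked]
        # Diffusion operator: inversion about the mean amplitude.
        mean = sum(amps) / n
        amps = [2 * mean - a for a in amps]
    return amps

# For a 2-qubit search (N = 4), a single iteration suffices:
probs = [a * a for a in grover_search(2, marked=3, iterations=1)]
```

For N = 4 the single-iteration case places all probability on the marked item, which is why a small two-qubit demonstration already exercises the essential oracle-plus-diffusion structure.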
Abstract:
We present here a new approach to scalable quantum computing - a 'qubus computer' - which realizes qubit measurement and quantum gates through the interaction of qubits with a quantum communication bus mode. The qubits could be 'static' matter qubits or 'flying' optical qubits, but the scheme we focus on here is particularly suited to matter qubits. There is no requirement for direct interaction between the qubits. Universal two-qubit quantum gates may be effected by schemes which involve measurement of the bus mode, or by schemes where the bus disentangles automatically and no measurement is needed. In effect, the approach integrates qubit degrees of freedom for computation with quantum continuous variables for communication and interaction.
Abstract:
We investigate the use of nanocrystal quantum dots as a quantum bus element for preparing various quantum resources for use in photonic quantum technologies. Using the Stark-tuning property of nanocrystal quantum dots as well as the biexciton transition, we demonstrate a photonic controlled-NOT (CNOT) interaction between two logical photonic qubits comprising two cavity field modes each. We find the CNOT interaction to be a robust generator of photonic Bell states, even with relatively large biexciton losses. These results are discussed in light of the current state of the art of both microcavity fabrication and recent advances in nanocrystal quantum dot technology. Overall, we find that such a scheme should be feasible in the near future with appropriate refinements to both nanocrystal fabrication technology and microcavity design. Such a gate could serve as an active element in photonic-based quantum technologies.
Abstract:
The physical implementation of quantum information processing is one of the major challenges of current research. In the last few years, several theoretical proposals and experimental demonstrations on a small number of qubits have been carried out, but a quantum computing architecture that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is still lacking. In particular, a major ultimate objective is the construction of quantum simulators, yielding massively increased computational power in simulating quantum systems. Here we investigate promising routes towards the actual realization of a quantum computer based on spin systems. The first employs molecular nanomagnets with a doublet ground state to encode each qubit and exploits the wide chemical tunability of these systems to obtain the proper topology of inter-qubit interactions. Indeed, recent advances in coordination chemistry allow us to arrange these qubits in chains, with tailored interactions mediated by magnetic linkers. These act as switches of the effective qubit-qubit coupling, thus enabling the implementation of one- and two-qubit gates. Molecular qubits can be controlled either by uniform magnetic pulses or by local electric fields. We introduce two different schemes for quantum information processing with either global or local control of the inter-qubit interaction and demonstrate the high performance of these platforms by simulating the system's time evolution with state-of-the-art parameters. The second architecture we propose is based on a hybrid spin-photon qubit encoding, which exploits the best characteristics of both photons, whose mobility efficiently establishes long-range entanglement, and spin systems, which ensure long coherence times. The setup consists of spin ensembles coherently coupled to single photons within superconducting coplanar waveguide resonators. The tunability of the resonators' frequency is exploited as the only manipulation tool to implement a universal set of quantum gates, by bringing the photons into and out of resonance with the spin transition. The time evolution of the system subject to the pulse sequences used to implement complex quantum algorithms has been simulated by numerically integrating the master equation for the system density matrix, thus including the harmful effects of decoherence. Finally, a scheme to overcome the leakage of information due to inhomogeneous broadening of the spin ensemble is presented. Both proposed setups are based on state-of-the-art technological achievements. Through extensive numerical experiments we show that their performance is remarkably good, even for the implementation of the long gate sequences used to simulate interesting physical models. The systems examined here are therefore promising building blocks of future scalable architectures and can be used for proof-of-principle experiments in quantum information processing and quantum simulation.
Abstract:
Magnetoresistance measurements were performed on an n-type PbTe/PbEuTe quantum well, and weak antilocalization effects were observed. This indicates the presence of spin-orbit coupling, and we showed that the Rashba effect is the main mechanism responsible for it. Using the model developed by Iordanskii et al., we fitted the experimental curves and obtained the inelastic and spin-orbit scattering times. We could thus compare the zero-field spin-splitting energy predicted by the Rashba theory with that obtained from the analysis of the experimental curves. The final result confirms the theoretical prediction of a strong Rashba effect in IV-VI-based quantum wells.
Abstract:
A fundamental interaction for electrons is their hyperfine interaction (HFI) with nuclear spins. The HFI is well characterized in free atoms and molecules and is crucial for purposes ranging from the chemical identification of atoms to trapped-ion quantum computing. However, electron wave functions near atomic sites, and therefore the HFI, are often not accurately known in solids. Here we perform an all-electron calculation for conduction electrons in silicon and obtain reliable information on the HFI. We verify the outstanding quantum spin coherence in Si, which is critical for fault-tolerant solid-state quantum computing.
Abstract:
We study the transport properties of HgTe-based quantum wells containing simultaneously electrons and holes in a magnetic field B. At the charge neutrality point (CNP), with nearly equal electron and hole densities, the resistance is found to increase very strongly with B while the Hall resistivity turns to zero. This behavior results in a wide plateau in the Hall conductivity σxy ≈ 0 and in a minimum of the diagonal conductivity σxx at ν = νp − νn = 0, where νn and νp are the electron and hole Landau-level filling factors. We suggest that the transport at the CNP is determined by electron-hole "snake states" propagating along the ν = 0 lines. Our observations are qualitatively similar to the quantum Hall effect in graphene, as well as to transport in a random magnetic field with zero mean value.
Abstract:
We discuss quantum error correction for errors that occur at random times, as described by a conditional Poisson process. We show how a class of such errors, detected spontaneous emission, can be corrected by continuous closed-loop feedback.
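As a rough illustration of the error model in this abstract, the sketch below samples the random arrival times of detected-emission events as a homogeneous Poisson process, using exponentially distributed inter-arrival gaps. The rate and time horizon are arbitrary illustrative values, and the feedback correction itself is not modeled:

```python
import random

def emission_times(rate, t_max, rng):
    """Sample event times of a homogeneous Poisson process on [0, t_max].

    Inter-arrival times of a rate-`rate` Poisson process are
    exponentially distributed with mean 1/rate, so we accumulate
    exponential gaps until the horizon is exceeded.
    """
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t >= t_max:
            return times
        times.append(t)

# Hypothetical numbers: emission rate 2.0 per unit time over 10 time units,
# so roughly 20 error events are expected on average.
rng = random.Random(0)
jumps = emission_times(rate=2.0, t_max=10.0, rng=rng)
```

A conditional Poisson process, as in the abstract, would additionally make the rate depend on the system state; the homogeneous case above is the simplest special case.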
Abstract:
We theoretically study the Hilbert space structure of two neighboring P-donor electrons in silicon-based quantum computer architectures. To use electron spins as qubits, a crucial condition is the isolation of the electron spins from their environment, including the electronic orbital degrees of freedom. We provide detailed electronic structure calculations of both the single-donor electron wave function and the two-electron pair wave function. We adopt a molecular orbital method for the two-electron problem, forming a basis with the calculated single-donor electron orbitals. Our two-electron basis contains many singlet and triplet orbital excited states, in addition to the two simple ground-state singlet and triplet orbitals usually used in the Heitler-London approximation to describe the two-electron donor-pair wave function. We determine the excitation spectrum of the two-donor system and study its dependence on strain, lattice position, and interdonor separation. This allows us to determine how isolated the ground-state singlet and triplet orbitals are from the rest of the excited-state Hilbert space. In addition to calculating the energy spectrum, we also evaluate the exchange coupling between the two donor electrons and the double-occupancy probability that both electrons reside on the same P donor. These two quantities are very important for logical operations in solid-state quantum computing devices, as a large exchange coupling achieves faster gating times, while the magnitude of the double-occupancy probability can affect the error rate.
Abstract:
The phase estimation algorithm is so named because it allows an estimation of the eigenvalues associated with an operator. However, it has been proposed that the algorithm can also be used to generate eigenstates. Here we extend this proposal to small quantum systems, identifying the conditions under which the phase estimation algorithm can successfully generate eigenstates. We then propose an implementation scheme based on an ion-trap quantum computer. This scheme allows us to illustrate two simple examples, one in which the algorithm effectively generates eigenstates and one in which it does not.
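To illustrate why phase estimation yields eigenvalue estimates, the sketch below computes the textbook outcome distribution of the counting register for a known eigenphase. The eigenstate-generation aspect discussed in the abstract is not simulated, and the phase value and register size are arbitrary illustrative choices:

```python
import cmath

def qpe_distribution(phi, t):
    """Outcome probabilities of textbook phase estimation with t counting
    qubits, for an eigenstate with eigenphase phi in [0, 1).

    After the controlled powers of U and the inverse QFT, the amplitude of
    outcome y is (1/2^t) * sum_x exp(2*pi*i * x * (phi - y/2^t)).
    """
    n = 2 ** t
    probs = []
    for y in range(n):
        amp = sum(cmath.exp(2j * cmath.pi * x * (phi - y / n))
                  for x in range(n)) / n
        probs.append(abs(amp) ** 2)
    return probs

# When phi is exactly representable in t bits, the estimate is exact:
probs = qpe_distribution(phi=0.25, t=3)
best = max(range(len(probs)), key=probs.__getitem__)
# best / 2**t recovers phi = 2/8 = 0.25
```

When phi is not a multiple of 1/2^t, the distribution instead peaks near the closest representable value, which is the regime where the eigenstate-generation question becomes subtle.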
Abstract:
The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured from the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than their nonfractured counterparts; 2) TBS is complementary to data available by lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment. © 2014 American Society for Bone and Mineral Research.
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurements of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance, and over the last decades computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them. Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available tools' characteristics. The number of drugs handled varies widely, and 8 programs allow users to add their own drug models. 10 programs can compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: While two integrated programs top the ranked list, such complex tools may not fit all institutions, and each software tool must be evaluated against the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
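As an illustration of the a posteriori (Bayesian) dosage adjustment this benchmark evaluates, the sketch below performs a simple grid-based posterior update for drug clearance in a one-compartment IV-bolus model. All parameter values, the dose, and the measured level are hypothetical, and real TDM software uses far richer population pharmacokinetic models:

```python
import math

def posterior_clearance(dose, volume, t_obs, c_obs, cl_grid,
                        cl_prior_mean, cl_prior_sd, resid_sd):
    """Grid-based Bayesian update for clearance (one-compartment IV bolus).

    Model: C(t) = (dose / volume) * exp(-(CL / volume) * t), with a
    normally distributed residual error on the measured concentration
    and a normal prior on CL. Returns normalized posterior weights
    over cl_grid.
    """
    weights = []
    for cl in cl_grid:
        pred = dose / volume * math.exp(-cl / volume * t_obs)
        prior = math.exp(-0.5 * ((cl - cl_prior_mean) / cl_prior_sd) ** 2)
        like = math.exp(-0.5 * ((c_obs - pred) / resid_sd) ** 2)
        weights.append(prior * like)
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical scenario: 500 mg bolus, V = 30 L, level of 6 mg/L drawn at 6 h.
grid = [0.5 + 0.05 * i for i in range(200)]          # CL candidates, L/h
post = posterior_clearance(500, 30.0, 6.0, 6.0, grid, 4.0, 1.5, 0.5)
cl_map = grid[max(range(len(grid)), key=post.__getitem__)]
```

The posterior mode `cl_map` blends the population prior with the individual measurement; a dosing recommendation would then be derived from this individualized clearance estimate, which is the essence of the a posteriori adjustment the benchmarked tools automate.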