43 results for Quantum Simulation, Quantum Simulators, QED, Lattice Gauge Theory
Abstract:
Critical phenomena involve structural changes in the correlations among a system's constituents. Such changes can be reproduced and characterized in quantum simulators able to tackle medium-to-large systems. We demonstrate these concepts by engineering the ground state of a three-spin Ising ring using a pair of entangled photons. The effect of a simulated magnetic field, which leads to a critical modification of the correlations within the ring, is analysed by studying two- and three-spin entanglement. In particular, we connect the violation of a multipartite Bell inequality with the amount of tripartite entanglement in our ring.
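The three-spin transverse-field Ising ring described above is small enough to diagonalize directly. As an illustrative sketch (ours, not the paper's photonic implementation; all function and parameter names are assumptions), the following builds the Hamiltonian H = -J Σ σᶻᵢσᶻᵢ₊₁ - B Σ σˣᵢ and extracts its ground state numerically:

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def op(single, site, n=3):
    """Embed a single-site operator at `site` in an n-spin tensor product."""
    mats = [single if i == site else I2 for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ising_ring(J, B, n=3):
    """H = -J * sum_i sz_i sz_{i+1} - B * sum_i sx_i on a ring of n spins."""
    H = np.zeros((2**n, 2**n))
    for i in range(n):
        H -= J * op(sz, i, n) @ op(sz, (i + 1) % n, n)
        H -= B * op(sx, i, n)
    return H

def ground_state(J, B, n=3):
    """Lowest eigenvalue and eigenvector of the ring Hamiltonian."""
    vals, vecs = np.linalg.eigh(ising_ring(J, B, n))
    return vals[0], vecs[:, 0]
```

At B = 0 the ground energy of the ferromagnetic three-ring is -3J (all spins aligned); increasing B modifies the correlations within the ring, which is the regime the experiment probes.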
Abstract:
As the development of a viable quantum computer nears, existing widely used public-key cryptosystems, such as RSA, will no longer be secure. Significant effort is therefore being invested in post-quantum cryptography (PQC). Lattice-based cryptography (LBC) is one promising area of PQC, offering versatile, efficient, and high-performance security services. However, the vulnerability of these implementations to side-channel attacks (SCA) remains significantly understudied. Most, if not all, lattice-based cryptosystems require noise samples generated from a discrete Gaussian distribution, and a successful timing-analysis attack can break the whole cryptosystem, making the discrete Gaussian sampler the module most vulnerable to SCA. This research proposes countermeasures against timing-information leakage: FPGA-based designs of CDT-based discrete Gaussian samplers with constant response time, targeting encryption and signature scheme parameters. The proposed designs are compared against the state of the art and are shown to significantly outperform existing implementations. For encryption, the proposed sampler is nine times faster than the only other existing time-independent CDT sampler design. For signatures, the first time-independent CDT sampler in hardware is proposed.
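The timing leak targeted above arises because a naive CDT sampler stops searching the table as soon as the right interval is found, so its running time depends on the secret output. A minimal Python sketch of the constant-time idea, scanning the whole table every time so the comparison count is fixed, is shown below. This is our illustration, not the paper's FPGA design, and the table values are toy numbers, not a real parameter set:

```python
import secrets

# Toy cumulative distribution table (CDT) for a small discrete Gaussian.
# Entries are cumulative counts out of 1024; illustrative values only.
CDT = [489, 833, 992, 1019, 1023]

def sample_constant_time(cdt=CDT):
    """Sample |x| by scanning the WHOLE table, so the number of
    comparisons never depends on the value being sampled."""
    r = secrets.randbelow(1024)
    idx = 0
    for threshold in cdt:            # always exactly len(cdt) iterations
        idx += int(r >= threshold)   # branch-free accumulate
    sign = secrets.randbelow(2)
    # Note: a real sampler also corrects the bias this introduces at zero
    # (+0 and -0 coincide); omitted here for brevity.
    return idx if sign == 0 else -idx
```

The hardware designs in the paper achieve the same property with fixed-latency comparator trees; the software loop above only conveys the principle that every sample touches every table entry.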
Abstract:
A strain gauge instrumentation trial on a high-pressure die casting (HPDC) die was compared to a corresponding simulation model, built with Magmasoft® casting simulation software, at two strain gauge rosette locations. The strains were measured during the casting cycle, from which the von Mises stress was determined and then compared to the simulation model. The von Mises stress from the simulation model correlated well with the findings from the instrumentation trial, showing a difference of 5.5% (~10 MPa) for one strain gauge rosette located in an area of low stress gradient. The second rosette was in a region of steep stress gradient, which resulted in a difference of up to 40% (~40 MPa) between the simulation and instrumentation results. Factors not modelled by Magmasoft®, such as additional loading from the die closure force or the metal injection pressure, were seen to have very little influence on the stress in the die (less than 7%).
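The step from measured rosette strains to a von Mises stress can be sketched as follows. This is a generic illustration assuming a 0°/45°/90° rectangular rosette and plane-stress Hooke's law (the abstract does not state the rosette type, and the default material constants are generic tool-steel-like values, not the paper's):

```python
import math

def von_mises_from_rosette(e0, e45, e90, E=210e9, nu=0.3):
    """Plane-stress von Mises stress from a 0/45/90 rectangular strain rosette.

    e0, e45, e90 : measured strains (dimensionless) along the three gauges
    E, nu        : Young's modulus [Pa] and Poisson's ratio
    """
    ex, ey = e0, e90
    gxy = 2.0 * e45 - e0 - e90           # engineering shear strain
    c = E / (1.0 - nu**2)
    sx = c * (ex + nu * ey)              # Hooke's law, plane stress
    sy = c * (ey + nu * ex)
    txy = E / (2.0 * (1.0 + nu)) * gxy
    return math.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)
```

As a sanity check, a purely uniaxial strain state (ey = -ν·ex, no shear) returns exactly the applied uniaxial stress.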
Abstract:
Birefringence is one of the fascinating properties of the vacuum of quantum electrodynamics (QED) in strong electromagnetic fields. The scattering of linearly polarized incident probe photons into a perpendicularly polarized mode provides a distinct signature of the optical activity of the quantum vacuum and thus offers an excellent opportunity for a precision test of nonlinear QED. Precision tests require accurate predictions and thus a theoretical framework capable of taking the detailed experimental geometry into account. We derive analytical solutions for vacuum birefringence which include the spatio-temporal field structure of a strong optical pump laser field and an x-ray probe. We show that the angular distribution of the scattered photons depends strongly on the interaction geometry and find that scattering of the perpendicularly polarized photons out of the cone of the incident probe x-ray beam is the key to making the phenomenon experimentally accessible with the current generation of FEL/high-field laser facilities.
Abstract:
A scheme for enhanced quantum electrodynamics (QED) production of electron-positron-pair plasmas is proposed that uses two ultraintense lasers irradiating a thin solid foil from opposite sides. In the scheme, under a proper matching condition, in addition to the skin-depth emission of gamma-ray photons and Breit-Wheeler creation of pairs on each side of the foil, a large number of high-energy electrons and photons from one side can propagate through it and interact with the laser on the other side, leading to much enhanced gamma-ray emission and pair production. More importantly, the created pairs can be collected later and confined to the center by opposite laser radiation pressures when the foil becomes transparent, resulting in the formation of unprecedentedly overdense and high-energy pair plasmas. Two-dimensional QED particle-in-cell simulations show that electron-positron-pair plasmas with overcritical density 10²² cm⁻³ and a high energy of hundreds of MeV are obtained with 10 PW lasers at intensities of 10²³ W/cm², which are of key significance for laboratory astrophysics studies.
Abstract:
A method for correlated quantum electron-ion dynamics is combined with a method for electronic open boundaries to simulate in real time the heating, and eventual equilibration at an elevated vibrational energy, of a quantum ion under current flow in an atomic wire, together with the response of the current to the ionic heating. The method can also be used to extract inelastic current-voltage corrections under steady-state conditions. However, in its present form the open-boundary method contains an approximation that limits the resolution of current-voltage features. The results of the simulations are tested against analytical results from scattering theory. Directions for improving the method are summarized at the end.
Self-consistent non-Markovian theory of a quantum-state evolution for quantum-information processing
Abstract:
We study non-Markovian decoherence phenomena by employing the projection-operator formalism when a quantum system (a quantum bit or a register of quantum bits) is coupled to a reservoir. By projecting out the degrees of freedom of the reservoir, we derive a non-Markovian master equation for the system, which reduces to a Lindblad master equation in the Markovian limit, and obtain the operator-sum representation for the time evolution. It is found that the system decoheres more slowly in a non-Markovian reservoir than in a Markovian one because the quantum information of the system is memorized in the non-Markovian reservoir. We discuss the potential importance of non-Markovian reservoirs for quantum-information processing.
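In the Markovian limit mentioned above, the master equation takes the Lindblad form dρ/dt = -i[H, ρ] + γ(LρL† - ½{L†L, ρ}). As a minimal numerical sketch (ours, not the paper's projection-operator derivation), the following integrates it with Euler steps for a qubit undergoing amplitude damping, whose excited-state population decays as e^(-γt):

```python
import numpy as np

def lindblad_step(rho, H, L, gamma, dt):
    """One Euler step of the Lindblad master equation (Markovian limit):
       drho/dt = -i[H, rho] + gamma*(L rho L+ - 1/2 {L+ L, rho})."""
    comm = H @ rho - rho @ H
    LdL = L.conj().T @ L
    diss = L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return rho + dt * (-1j * comm + gamma * diss)

# Amplitude damping of a qubit prepared in the excited state |1><1|
H = np.zeros((2, 2), dtype=complex)            # trivial Hamiltonian for clarity
L = np.array([[0, 1], [0, 0]], dtype=complex)  # lowering operator |0><1|
rho = np.array([[0, 0], [0, 1]], dtype=complex)
gamma, dt = 1.0, 1e-3
for _ in range(1000):                          # evolve to t = 1/gamma
    rho = lindblad_step(rho, H, L, gamma, dt)
# rho[1, 1] is now close to exp(-1); the trace stays equal to 1
```

A non-Markovian reservoir would replace the constant rate γ with a memory kernel, which is what allows information to flow back and slow the decoherence described in the abstract.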
Abstract:
We suggest a theoretical scheme for the simulation of quantum random walks on a line using beam splitters, phase shifters, and photodetectors. Our model enables us to simulate a quantum random walk using only the wave nature of classical light fields. Furthermore, the proposed setup allows the analysis of the effects of decoherence. The transition from a pure mean-photon-number distribution to a classical one is studied by varying the decoherence parameters.
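For reference, the coined quantum walk on a line that such an optical setup simulates can be computed directly. This sketch is ours (the function name and the symmetric initial coin state are our choices): it applies a Hadamard coin followed by a coin-conditioned shift for a given number of steps and returns the position distribution, whose spread grows ballistically rather than diffusively:

```python
import numpy as np

def hadamard_walk(steps, pos0=0):
    """Discrete-time quantum walk on a line with a Hadamard coin.

    State: amplitudes psi[coin, position]; coin 0 steps left, coin 1 right.
    """
    size = 2 * steps + 1                       # positions -steps .. +steps
    psi = np.zeros((2, size), dtype=complex)
    psi[:, steps + pos0] = np.array([1, 1j]) / np.sqrt(2)  # symmetric coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        psi = H @ psi                          # coin toss at every site
        new = np.zeros_like(psi)
        new[0, :-1] = psi[0, 1:]               # coin-0 amplitudes shift left
        new[1, 1:] = psi[1, :-1]               # coin-1 amplitudes shift right
        psi = new
    return (np.abs(psi) ** 2).sum(axis=0)      # position distribution
```

Adding decoherence, as the proposal studies, amounts to randomizing or measuring the coin between steps, which drives this distribution toward the classical binomial one.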
Abstract:
The simulation of open quantum dynamics has recently allowed the direct investigation of the features of system-environment interaction and of their consequences on the evolution of a quantum system. Such interaction threatens the quantum properties of the system, spoiling them and causing the phenomenon of decoherence. Sometimes, however, a coherent exchange of information takes place between system and environment; memory effects arise and the dynamics of the system becomes non-Markovian. Here we report the experimental realisation of a non-Markovian process where system and environment are coupled through a simulated transverse Ising model. By engineering the evolution in a photonic quantum simulator, we demonstrate the role played by system-environment correlations in the emergence of memory effects.
Abstract:
Quantum annealing is a promising tool for solving optimization problems, similar in some ways to the traditional (classical) simulated annealing of Kirkpatrick et al. Simulated annealing takes advantage of thermal fluctuations in order to explore the optimization landscape of the problem at hand, whereas quantum annealing employs quantum fluctuations. Intriguingly, quantum annealing has been proved to be more effective than its classical counterpart in many applications. We illustrate the theory and the practical implementation of both classical and quantum annealing - highlighting the crucial differences between these two methods - by means of results recently obtained in experiments, in simple toy models, and in more challenging combinatorial optimization problems (namely, the random Ising model and the travelling salesman problem). The techniques used to implement quantum and classical annealing are either deterministic evolutions, for the simplest models, or Monte Carlo approaches, for harder optimization tasks. We discuss the pros and cons of these approaches and their possible connections to the landscape of the problem addressed.
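The classical, Metropolis-style simulated annealing contrasted above can be sketched in a few lines for an Ising energy function. This is an illustrative toy (the function names, single-spin-flip moves, and geometric cooling schedule are our choices, not the authors' implementation):

```python
import math
import random

def simulated_annealing(J, h, steps=20000, T0=2.0, T1=0.01, seed=1):
    """Classical simulated annealing for the Ising energy
       E(s) = -sum_{i<j} J[i][j] s_i s_j - sum_i h[i] s_i,  s_i in {-1, +1}."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice((-1, 1)) for _ in range(n)]

    def energy(s):
        e = -sum(h[i] * s[i] for i in range(n))
        e -= sum(J[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
        return e

    E = energy(s)
    for t in range(steps):
        T = T0 * (T1 / T0) ** (t / steps)      # geometric cooling schedule
        i = rng.randrange(n)
        s[i] = -s[i]                           # propose a single-spin flip
        Enew = energy(s)
        if Enew <= E or rng.random() < math.exp(-(Enew - E) / T):
            E = Enew                           # accept (Metropolis rule)
        else:
            s[i] = -s[i]                       # reject: undo the flip
    return s, E
```

Quantum annealing replaces the thermal acceptance rule with quantum fluctuations from a decreasing transverse field, typically simulated via path-integral Monte Carlo on a replicated version of the same energy function.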
Abstract:
The outcomes of educational assessments undoubtedly have real implications for students, teachers, schools and education in the widest sense. Assessment results are, for example, used to award qualifications that determine the future educational or vocational pathways of students. The results obtained by students in assessments are also used to gauge individual teacher quality, to hold schools to account for the standards achieved by their students, and to compare international education systems. Given the current high-stakes nature of educational assessment, it is imperative that the measurement practices involved have stable philosophical foundations. However, this paper casts doubt on the theoretical underpinnings of contemporary educational measurement models. Aspects of Wittgenstein’s later philosophy and Bohr’s philosophy of quantum theory are used to argue that a quantum-theoretical rather than a Newtonian model is appropriate for educational measurement, and the associated implications for the concept of validity are elucidated. Whilst it is acknowledged that the transition to a quantum-theoretical framework would not lead to the demise of educational assessment, it is argued that, where practical, current high-stakes assessments should be reformed to become as ‘low-stakes’ as possible. The paper also undermines some of the rhetoric in favour of high-stakes testing that tends to afflict education.
Abstract:
The strong mixing of many-electron basis states in excited atoms and ions with open f shells results in very large numbers of complex, chaotic eigenstates that cannot be computed to any degree of accuracy. Describing the processes which involve such states requires the use of a statistical theory. Electron capture into these “compound resonances” leads to electron-ion recombination rates that are orders of magnitude greater than those of direct, radiative recombination and cannot be described by standard theories of dielectronic recombination. Previous statistical theories considered this as a two-electron capture process which populates a pair of single-particle orbitals, followed by “spreading” of the two-electron states into chaotically mixed eigenstates. This method is similar to a configuration-average approach because it neglects potentially important effects of spectator electrons and conservation of total angular momentum. In this work we develop a statistical theory which considers electron capture into “doorway” states with definite angular momentum obtained by the configuration interaction method. We apply this approach to electron recombination with W²⁰⁺, considering 2×10⁶ doorway states. Despite strong effects from the spectator electrons, we find that the results of the earlier theories largely hold. Finally, we extract the fluorescence yield (the probability of photoemission and hence recombination) by comparison with experiment.
Abstract:
A key element in the architecture of a quantum-information processing network is a reliable physical interface between fields and qubits. We study a process of entanglement-transfer engineering, where two remote qubits respectively interact with an entangled two-mode continuous-variable (CV) field. We quantify the entanglement induced in the qubit state at the expense of the loss of entanglement in the CV system. We discuss the range of mixed entangled states which can be obtained with this setup. Furthermore, we suggest a protocol to determine the residual entangling power of the light fields, thus inferring the entanglement left in the field modes, which, after the interaction, are no longer in a Gaussian state. Two different setups are proposed: a cavity-QED system and an interface between superconducting qubits and field modes. We address in detail the practical difficulties inherent in these two proposals, showing that the latter is promising in many respects.