905 results for Quantum computational complexity
Abstract:
We present a microscopic model for calculating the AC conductivity of a finite length line junction made up of two counter- or co-propagating single mode quantum Hall edges with possibly different filling fractions. The effect of density-density interactions and a local tunneling conductance (sigma) between the two edges is considered. Assuming that sigma is independent of the frequency omega, we derive expressions for the AC conductivity as a function of omega, the length of the line junction and other parameters of the system. We reproduce the results of Sen and Agarwal (2008 Phys. Rev. B 78 085430) in the DC limit (omega -> 0), and generalize those results for an interacting system. As a function of omega, the AC conductivity shows significant oscillations if sigma is small; the oscillations become less prominent as sigma increases. A renormalization group analysis shows that the system may be in a metallic or an insulating phase depending on the strength of the interactions. We discuss the experimental implications of this for the behavior of the AC conductivity at low temperatures.
Abstract:
We report an efficient and fast solvothermal route to prepare highly crystalline, monodisperse InP quantum dots. This solvothermal route not only ensures an inert atmosphere, which is strictly required for the synthesis of phase-pure InP quantum dots, but also allows a reaction temperature as high as 430 degrees C, which is otherwise impossible to achieve using typical solution chemistry; the higher reaction temperature makes the reaction more facile. The method also offers judicious control over the size of the quantum dots and thus over the tuning of the bandgap.
Abstract:
The Standard Model of particle physics consists of quantum electrodynamics (QED) and the weak and strong nuclear interactions. QED is the basis for molecular properties, and thus it defines much of the world we see. The weak nuclear interaction is responsible for, among other things, the decays of nuclei, and in principle it should also have effects at the molecular scale. The strong nuclear interaction is hidden in interactions inside nuclei. From high-energy and atomic experiments it is known that the weak interaction does not conserve parity. Consequently, the weak interaction, and specifically the exchange of the Z^0 boson between a nucleon and an electron, induces small energy shifts of different sign for mirror-image molecules. This in turn makes one enantiomer of a molecule energetically more favorable than the other, and also shifts the spectral lines of a mirror-image pair of molecules in different directions, creating a splitting. Parity violation (PV) in molecules, however, has not been observed. The topic of this thesis is how the weak interaction affects certain molecular magnetic properties, namely certain parameters of nuclear magnetic resonance (NMR) and electron spin resonance (ESR) spectroscopies. The thesis consists of numerical estimates of NMR and ESR spectral parameters and investigations of how different aspects of quantum chemical computation affect them. PV contributions to the NMR shielding and spin-spin coupling constants are investigated from the computational point of view. All aspects of quantum chemical electronic structure computations are found to be very important, which makes accurate computations challenging. Effects of molecular geometry are also investigated using a model system of polysilylene chains. The PV contribution to the NMR shielding constant is found to saturate once the chain reaches a certain length, but the effects of local geometry can be large.
Rigorous vibrational averaging is also performed for a relatively small and rigid molecule. Vibrational corrections to the PV contribution are found to be only a couple of per cent. PV contributions to the ESR g-tensor are also evaluated using a series of molecules. Unfortunately, all the estimates are below the experimental limits, but PV in some of the heavier molecules comes close to present-day experimental resolution.
Abstract:
This thesis studies the intermolecular interactions in (i) boron-nitrogen based systems for hydrogen splitting and storage, (ii) endohedral complexes, A@C60, and (iii) aurophilic dimers. We first present an introduction to intermolecular interactions. The theoretical background is then described. The research results are summarized in the following sections. In the boron-nitrogen systems, the electrostatic interaction is found to be the leading contribution, as 'Coulomb Pays for Heitler and London' (CHL). For the endohedral complex, the intermolecular interaction is formulated by a one-center expansion of the Coulomb operator 1/r_ab. For the aurophilic attraction between two C2v monomers, a London-type formula was derived by fully accounting for the anisotropy and point-group symmetry of the monomers.
Abstract:
Quantum effects are often of key importance for the function of biological systems at the molecular level. Cellular respiration, where energy is extracted from the reduction of molecular oxygen to water, is no exception. In this work, the end station of the electron transport chain in mitochondria, cytochrome c oxidase, is investigated using quantum chemical methodology. Cytochrome c oxidase contains two haems, haem a and haem a3. Haem a3, with its copper companion, CuB, is involved in the final reduction of oxygen into water. This binuclear centre receives the necessary electrons from haem a. Haem a, in turn, receives its electrons from a copper ion pair in the vicinity, called CuA. Density functional theory (DFT) has been used to clarify the charge and spin distributions of haem a, as well as changes in these during redox activity. Upon reduction, the added electron is shown to be evenly distributed over the entire haem structure, important for the accommodation of the prosthetic group within the protein. At the same time, the spin distribution of the open-shell oxidised state is more localised to the central iron. The exact spin density distribution has been disputed in the literature, with different experiments indicating different distributions of the unpaired electron. The apparent contradiction is shown to be due to the false assumption of a unit amount of unpaired electron density; in fact, the oxidised state has about 1.3 unpaired electrons. The validity of the DFT results has been corroborated by wave function based coupled cluster calculations. Point charges, for use in classical force field based simulations, have been parameterised for the four metal centres, using a newly developed methodology. In the procedure, the subsystem for which point charges are to be obtained is surrounded by an outer region, with the purpose of stabilising the inner region, both electronically and structurally.
Finally, the possibility of vibrational promotion of the electron transfer step between haem a and a3 has been investigated. Calculating the full vibrational spectra, at the DFT level, of a combined model of the two haems revealed several normal modes that do shift electron density between the haems. The magnitude of the shift was found to be moderate at most. The proposed mechanism could have an assisting role in the electron transfer, which still seems to be dominated by electron tunnelling.
Abstract:
Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison to a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
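To make the synthetic likelihood idea concrete, the core computation can be sketched in a few lines: simulate summary statistics at a candidate parameter, fit a multivariate normal to them, and evaluate that normal at the observed summary. This is a generic illustration under an assumed toy model (data N(theta, 1), summaries mean and variance), not the paper's BSL implementation or its unbiased-estimator variant.

```python
import numpy as np

def synthetic_loglik(theta, s_obs, simulate, n_sim=200, rng=None):
    """Gaussian synthetic log-likelihood of an observed summary statistic.

    theta    : parameter value at which to evaluate
    s_obs    : observed summary statistic vector
    simulate : function(theta, rng) -> summary statistic vector
    """
    rng = np.random.default_rng(rng)
    # simulate n_sim summary statistics at theta
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])
    # moment-match a multivariate normal to the simulated summaries
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False)
    # evaluate the fitted normal density at the observed summary
    d = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    k = len(s_obs)
    return -0.5 * (k * np.log(2 * np.pi) + logdet
                   + d @ np.linalg.solve(Sigma, d))

# toy model (illustrative assumption): data are N(theta, 1),
# summaries are the sample mean and variance
def sim(theta, rng):
    x = rng.normal(theta, 1.0, size=100)
    return np.array([x.mean(), x.var()])

rng = np.random.default_rng(0)
s_obs = sim(1.0, rng)                               # "observed" data at theta = 1
ll_true = synthetic_loglik(1.0, s_obs, sim, rng=1)  # near the true parameter
ll_far  = synthetic_loglik(3.0, s_obs, sim, rng=1)  # far from it
```

In a BSL sampler this log-likelihood estimate would simply replace the exact log-likelihood inside an MCMC acceptance ratio, which is what connects it to the pseudo-marginal methods mentioned above.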
Abstract:
For the consumer, flavor is arguably the most important aspect of a good coffee. Coffee flavor is extremely complex and arises from numerous chemical, biological and physical influences of cultivar, coffee cherry maturity, geographical growing location, production, processing, roasting and cup preparation. Not surprisingly, there is a large volume of published research detailing the volatile and non-volatile compounds in coffee that are likely to play a role in coffee flavor. Further, much has been published on the sensory properties of coffee. Nevertheless, the link between flavor components and the sensory properties expressed in the complex matrix of coffee is yet to be fully understood. This paper provides an overview of the chemical components that are thought to be involved in the flavor and sensory quality of Arabica coffee.
Abstract:
This paper presents an effective feature representation method in the context of activity recognition. Efficient and effective feature representation plays a crucial role not only in activity recognition, but also in a wide range of applications such as motion analysis, tracking, and 3D scene understanding. In the context of activity recognition, local features are increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational requirements, their performance is still limited for real-world applications due to a lack of contextual information and models not being tailored to specific activities. We propose a new activity representation framework to address the shortcomings of the popular, but simple, bag-of-words approach. In our framework, first a multiple-instance SVM (mi-SVM) is used to identify positive features for each action category, and the k-means algorithm is used to generate a codebook. Then locality-constrained linear coding is used to encode the features into the generated codebook, followed by spatio-temporal pyramid pooling to convey the spatio-temporal statistics. Finally, an SVM is used to classify the videos. Experiments carried out on two popular datasets with varying complexity demonstrate significant performance improvement over the baseline bag-of-features method.
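As background for the pipeline described above, the codebook step that the framework builds on (k-means over local descriptors, then quantising each video's descriptors into a histogram) can be sketched as follows. This is the generic bag-of-words baseline on assumed random stand-in data, not the authors' mi-SVM or locality-constrained linear coding implementation.

```python
import numpy as np

def kmeans_codebook(features, k, iters=20, seed=0):
    """Build a k-word codebook from local descriptors via Lloyd's k-means."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # assign each descriptor to its nearest codeword
        d = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # move each codeword to the mean of its assigned descriptors
        for j in range(k):
            pts = features[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def bow_histogram(features, centers):
    """Quantise descriptors against the codebook and return a normalised
    bag-of-words histogram (one fixed-length vector per video)."""
    d = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    labels = d.argmin(1)
    h = np.bincount(labels, minlength=len(centers)).astype(float)
    return h / h.sum()

rng = np.random.default_rng(1)
descs = rng.normal(size=(300, 16))   # stand-in for local video descriptors
centers = kmeans_codebook(descs, k=8)
hist = bow_histogram(descs, centers)
```

The resulting fixed-length histograms are what a standard SVM classifies; the paper's contribution is to replace the hard quantisation and global pooling in this sketch with locality-constrained coding and spatio-temporal pyramid pooling.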
Abstract:
Large-scale chromosome rearrangements such as copy number variants (CNVs) and inversions encompass a considerable proportion of the genetic variation between human individuals. In a number of cases, they have been closely linked with various inheritable diseases. Single-nucleotide polymorphisms (SNPs) are another large part of the genetic variance between individuals. They are also typically abundant, and measuring them is straightforward and cheap. This thesis presents computational means of using SNPs to detect the presence of inversions and deletions, a particular variety of CNVs. Technically, the inversion-detection algorithm detects the suppressed recombination rate between inverted and non-inverted haplotype populations, whereas the deletion-detection algorithm uses the EM algorithm to estimate the haplotype frequencies of a window with and without a deletion haplotype. As a contribution to population biology, a coalescent simulator for simulating inversion polymorphisms has been developed. Coalescent simulation is a backward-in-time method of modelling population ancestry. Technically, the simulator also models multiple crossovers by using the Counting model as the chiasma interference model. Finally, this thesis includes an experimental section. The aforementioned methods were tested on synthetic data to evaluate their power and specificity. They were also applied to the HapMap Phase II and Phase III data sets, yielding a number of candidates for previously unknown inversions and deletions, and also correctly detecting such known rearrangements.
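The EM step mentioned above estimates haplotype frequencies from unphased genotype data. As a simplified stand-in for the thesis's window-based deletion model, the classic two-SNP version can be sketched: every genotype determines its two haplotypes uniquely except the double heterozygote, whose phase the E-step resolves in expectation. The toy data below are an illustrative assumption.

```python
import numpy as np

def em_haplotype_freqs(geno, iters=200):
    """EM estimate of two-SNP haplotype frequencies (00, 01, 10, 11) from
    unphased genotypes. geno is a sequence of (g1, g2) minor-allele counts
    (each 0/1/2); only double heterozygotes (1, 1) have ambiguous phase."""
    base = np.zeros(4)   # haplotype counts from phase-unambiguous individuals
    n_dh = 0             # number of double heterozygotes
    for g1, g2 in geno:
        if g1 == 1 and g2 == 1:
            n_dh += 1    # phase unknown: either 00/11 or 01/10
            continue
        a = [0] * (2 - g1) + [1] * g1   # the two alleles at SNP 1
        b = [0] * (2 - g2) + [1] * g2   # the two alleles at SNP 2
        base[2 * a[0] + b[0]] += 1      # pairing is unique when at most
        base[2 * a[1] + b[1]] += 1      # one locus is heterozygous
    p = np.full(4, 0.25)
    for _ in range(iters):
        # E-step: expected phase of double hets under current frequencies
        q = p[0] * p[3] / (p[0] * p[3] + p[1] * p[2])
        counts = base + n_dh * np.array([q, 1 - q, 1 - q, q])
        # M-step: renormalise expected haplotype counts to frequencies
        p = counts / counts.sum()
    return p

# individuals: five 00/00 homozygotes, five 11/11, ten ambiguous double hets
geno = [(0, 0)] * 5 + [(2, 2)] * 5 + [(1, 1)] * 10
p = em_haplotype_freqs(geno)
```

In the deletion-detection setting, the same machinery is run with and without an extra "deletion" haplotype in the model, and the resulting likelihoods are compared.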
Abstract:
For a dynamically disordered continuum it is found that the exact quantum mechanical mean square displacement ⟨x²(t)⟩ ∼ t³ for t → ∞. A Gaussian white-noise spectrum is assumed for the random potential. The result differs qualitatively from the diffusive behavior well known for the one-band lattice Hamiltonian, and is understandable in terms of the momentum cutoff inherent in the lattice, simulating a "momentum bath."
Abstract:
This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. The problems studied here belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis. Haplotype inference is a computational problem where the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. This problem is important as the direct measurement of haplotypes is difficult, whereas the genotypes are easier to quantify. Haplotypes are the key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem, referred to as HaploParser, HIT, and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point-mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population. Thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point-mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented to learn this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes. Therefore, it can be seen as a probabilistic model of recombinations and point-mutations. BACH (Bayesian Context-based Haplotyping) utilizes a context tree weighting algorithm to efficiently sum over all variable-length Markov chains to evaluate the posterior probability of a haplotype configuration. Algorithms are presented that find haplotype configurations with high posterior probability.
BACH is the most accurate method presented in this thesis and has comparable performance to the best available software for haplotype inference. Local alignment significance is a computational problem where one is interested in whether the local similarities in two sequences are due to the sequences being related or just due to chance. Similarity of sequences is measured by their best local alignment score, and from that a p-value is computed. This p-value is the probability of picking two sequences from the null model that have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike the previous methods, the presented framework is not affected by so-called edge effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
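To illustrate what a local alignment p-value means, the brute-force sampling approach that the thesis's analytical framework avoids can be sketched: compute the best local alignment score by dynamic programming, then estimate how often shuffled (null-model) sequences score at least as well. The scoring scheme and sequences below are illustrative assumptions, and this empirical estimator is exactly the kind of "troublesome sampling" the thesis replaces with a tight analytical upper bound.

```python
import numpy as np

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Best local alignment score of strings a and b (Smith-Waterman DP)."""
    H = np.zeros((len(a) + 1, len(b) + 1))
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            # local alignment: scores are clipped at zero
            H[i, j] = max(0, H[i - 1, j - 1] + s,
                          H[i - 1, j] + gap, H[i, j - 1] + gap)
    return H.max()

def empirical_pvalue(a, b, n_perm=100, seed=0):
    """Estimate P(null pair scores >= observed) by shuffling one sequence."""
    rng = np.random.default_rng(seed)
    obs = smith_waterman(a, b)
    hits = 0
    for _ in range(n_perm):
        shuf = list(b)
        rng.shuffle(shuf)            # composition-preserving null model
        if smith_waterman(a, ''.join(shuf)) >= obs:
            hits += 1
    return (1 + hits) / (1 + n_perm), obs   # add-one to avoid p = 0

# toy sequences sharing the substring ACGTACGT
a = "ACGTACGTGA"
b = "TTACGTACGTCC"
pval, score = empirical_pvalue(a, b)
```

A small p-value here suggests the shared substring is unlikely under the shuffling null; the thesis's framework bounds this quantity analytically, gaps included, without the permutation loop.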