965 results for Quantum computational complexity


Relevance: 30.00%

Abstract:

The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of fewer than 1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods that can computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. Prediction of residue contacts in proteins is an interesting problem whose solution may be useful for protein fold recognition and de novo design. Predicting these contacts requires studying the inter-residue distances, specific to each type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted motifs of secondary structure. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as leucine zippers that drive the dimerization of many transcription factors, or more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms the existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards understanding the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are the basis for the design of new strategies for tackling problems such as the prediction of protein structure and function.
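As a concrete illustration of the two network measures mentioned above, the following sketch builds a residue contact network from C-alpha coordinates and computes its characteristic path length and clustering coefficient. It is not the thesis code: the 8 Å cutoff, the use of networkx and the random coordinates are illustrative assumptions.

```python
# Illustrative sketch (not the thesis code): characteristic path length and
# clustering coefficient of a protein contact network built from C-alpha
# coordinates with a hypothetical 8 Angstrom distance cutoff.
import numpy as np
import networkx as nx

def contact_network(ca_coords, cutoff=8.0):
    """Build a residue contact graph from an (N, 3) array of coordinates."""
    n = len(ca_coords)
    dists = np.linalg.norm(ca_coords[:, None, :] - ca_coords[None, :, :], axis=-1)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if dists[i, j] <= cutoff:
                G.add_edge(i, j)
    return G

def small_world_metrics(G):
    """Return (characteristic path length, clustering coefficient)."""
    # Path length is only defined on a connected graph, so use the largest component.
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    return nx.average_shortest_path_length(giant), nx.average_clustering(G)

# Example with random coordinates standing in for a real structure:
coords = np.random.rand(120, 3) * 30.0
L, C = small_world_metrics(contact_network(coords))
print(f"characteristic path length L = {L:.2f}, clustering coefficient C = {C:.2f}")
```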
Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only approximately 20% of the annotated proteins in the Homo sapiens genome have been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists of assigning sequences to a specific group of functionally related sequences which have been grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity arise from multi-domain proteins, from proteins that share common domains but do not necessarily share the same function, and from the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated inheritance-based transfer procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of annotating multi-domain proteins and allows a fine-grained division of the whole set of proteomes used, which ensures cluster homogeneity in terms of sequence length. A high level of coverage of structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in current databases of molecular functions and structures.
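The clustering step described above can be pictured with the following hedged sketch: sequence pairs are linked only when a BLAST hit satisfies thresholds on sequence identity and on alignment coverage of both sequences, and clusters are then the connected components of that graph. The hit format and the threshold values are assumptions for illustration, not the constraints actually used in the study.

```python
# Hedged sketch of identity/coverage-constrained clustering. Thresholds and
# the hit record layout are illustrative, not the thesis' actual values.
from collections import namedtuple

Hit = namedtuple("Hit", "query subject identity aln_len qlen slen")

def passes(hit, min_id=40.0, min_cov=0.9):
    """Keep a pair only if identity and coverage of BOTH sequences pass."""
    cov_q = hit.aln_len / hit.qlen
    cov_s = hit.aln_len / hit.slen
    return hit.identity >= min_id and cov_q >= min_cov and cov_s >= min_cov

def cluster(hits, min_id=40.0, min_cov=0.9):
    """Union-find over accepted pairs; clusters are connected components."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for h in hits:
        find(h.query)                        # register both sequence ids
        find(h.subject)
        if passes(h, min_id, min_cov):
            union(h.query, h.subject)
    clusters = {}
    for seq in parent:
        clusters.setdefault(find(seq), set()).add(seq)
    return list(clusters.values())

hits = [Hit("A", "B", 62.0, 290, 300, 310), Hit("B", "C", 35.0, 150, 310, 400)]
print(cluster(hits))   # A and B cluster together; C stays alone
```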

Relevance: 30.00%

Abstract:

Some fundamental biological processes, such as embryonic development, have been preserved during evolution and are common to species belonging to different phylogenetic positions, yet remain largely unknown. Understanding the cell morphodynamics that lead to the formation of organized spatial distributions of cells, such as tissues and organs, can be achieved through the reconstruction of cell shapes and positions during the development of a live animal embryo. In this work we designed a chain of image processing methods to automatically segment and track cell nuclei and membranes during the development of a zebrafish embryo, which has been widely validated as a model organism for understanding vertebrate development, gene function and healing and repair mechanisms in vertebrates. The embryo is first labeled through the ubiquitous expression of fluorescent proteins targeted to cell nuclei and membranes, and temporal sequences of volumetric images are acquired with laser scanning microscopy. Cell positions are detected by processing the nuclei images either through the generalized form of the Hough transform or by identifying nuclei as local maxima after a smoothing preprocessing step. Membrane and nucleus shapes are reconstructed using PDE-based variational techniques such as Subjective Surfaces and the Chan-Vese method. Cell tracking is performed by combining the previously detected information on cell shape and position with biological regularization constraints. Our results are manually validated and reconstruct the formation of the zebrafish brain at the 7-8 somite stage, with all cells tracked starting from the late sphere stage, with less than 2% error over at least 6 hours. Our reconstruction opens the way to a systematic investigation of cellular behaviors, of the clonal origin and clonal complexity of brain organs, as well as the contribution of cell proliferation modes and cell movements to the formation of local patterns and morphogenetic fields.
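A minimal sketch of the "smoothing plus local maxima" nucleus detection route mentioned above is given below, assuming a hypothetical Gaussian width and intensity threshold; the generalized Hough transform alternative and the biological regularization used for tracking are not shown.

```python
# Minimal sketch of nucleus detection on a 3D fluorescence stack: Gaussian
# smoothing followed by local-maximum picking. Sigma and the threshold are
# hypothetical parameters, not the values used in the actual pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_nuclei(volume, sigma=2.0, min_intensity=0.2, neighborhood=5):
    """Return (z, y, x) coordinates of candidate nucleus centers."""
    smoothed = gaussian_filter(volume.astype(float), sigma=sigma)
    # A voxel is a candidate center if it equals the local maximum of its
    # neighborhood and is bright enough to stand out from the background.
    local_max = maximum_filter(smoothed, size=neighborhood) == smoothed
    candidates = local_max & (smoothed > min_intensity * smoothed.max())
    return np.argwhere(candidates)

# Toy example: two Gaussian blobs standing in for labeled nuclei.
vol = np.zeros((32, 64, 64))
vol[16, 20, 20] = vol[16, 40, 45] = 1.0
vol = gaussian_filter(vol, sigma=3.0)
print(detect_nuclei(vol))   # approximately [[16 20 20], [16 40 45]]
```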

Relevance: 30.00%

Abstract:

The interaction between proteins and inorganic surfaces is fascinating from both an applied and a theoretical point of view. It is an important aspect of many applications, including surgical implants and biosensors, and it is also an example of theoretical questions concerning the interface between hard and soft matter. What is certain is that knowledge of the mechanisms involved is required in order to understand, predict and optimize the interaction between proteins and surfaces. Recent progress in experimental research has made it possible to study direct peptide-metal binding, which has brought the investigation of the theoretical foundations further into the focus of current research. One way to explore the interaction between proteins and inorganic surfaces is through computer simulations. Although simulations of metal surfaces or proteins as individual systems have long been established, simulating a combination of the two brings new difficulties. Overcoming them requires a multiscale approach: while proteins, as biological systems, can be adequately described with classical molecular dynamics, describing the delocalized electrons of metallic systems requires a quantum mechanical formulation. The most important prerequisite of a multiscale scheme is consistency between the simulations at the different scales. In this work this is achieved by linking simulations at alternating scales. The thesis begins with a study of the thermodynamics of benzene hydration using classical molecular dynamics. The interaction between water and the [111] metal surfaces of gold and nickel is then modelled with a multiscale scheme. In a further step, the adsorption of benzene on metal surfaces in an aqueous environment is studied. Finally, the modelling is extended to include the amino acids alanine and phenylalanine. This opens up the possibility of treating realistic protein-metal systems in computer simulations and of predicting, on a theoretical basis, the interaction between peptides and surfaces for any kind of peptide and surface.

Relevance: 30.00%

Abstract:

Computer simulations have become an important tool in physics. Solid-state systems, in particular, have been investigated extensively with the help of modern computational methods. This thesis focuses on the simulation of hydrogen-bonded systems, using quantum chemical methods combined with molecular dynamics (MD) simulations. MD simulations are carried out to investigate the energetics and structure of a system under conditions that include physical parameters such as temperature and pressure. Ab initio quantum chemical methods have proven capable of predicting spectroscopic quantities. The combination of these two features still represents a methodological challenge. Furthermore, conventional MD simulations treat the nuclei as classical particles. Not only motional effects, but also the quantum nature of the nuclei, are expected to influence the properties of a molecular system. This work aims at a more realistic description of properties that are accessible via NMR experiments. With the help of the path integral formalism the quantum nature of the nuclei has been incorporated and its influence on the NMR parameters explored. The effect on both the NMR chemical shift and the Nuclear Quadrupole Coupling Constants (NQCC) is presented for intra- and intermolecular hydrogen bonds. The second part of this thesis presents the computation of electric field gradients within the Gaussian and Augmented Plane Waves (GAPW) framework, which allows for all-electron calculations in periodic systems. This recent development improves the accuracy of many calculations compared to the pseudopotential approximation, which treats the core electrons as part of an effective potential. In combination with MD simulations of water, the NMR longitudinal relaxation times for 17O and 2H have been obtained. The results show good agreement with experiment. Finally, an implementation of the stress tensor calculation in the quantum chemical program suite CP2K is presented. This enables MD simulations under constant pressure conditions, which is demonstrated with a series of liquid water simulations that shed light on the influence of the exchange-correlation functional on the density of the simulated liquid.
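One step implied above, obtaining a longitudinal relaxation time from an MD trajectory, can be illustrated as follows: in the extreme-narrowing limit the quadrupolar rate 1/T1 is proportional to the squared quadrupole coupling times the correlation time of the electric field gradient fluctuations, and that correlation time is the integral of their normalized autocorrelation function. The sketch below estimates such a correlation time from a synthetic time series; all numbers are invented, and the thesis' actual workflow may differ.

```python
# Hedged illustration: estimate a correlation time tau_c from the normalized
# autocorrelation of an EFG-like time series; in extreme narrowing 1/T1 is
# proportional to chi^2 * tau_c. Synthetic data, invented time step.
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation of the fluctuations of a 1D series (FFT-based)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    f = np.fft.rfft(x, 2 * n)                 # zero-pad to avoid circular wrap-around
    acf = np.fft.irfft(f * np.conj(f))[:n]
    return acf / acf[0]

def correlation_time(x, dt, max_lag=2000):
    """tau_c as the integral of the normalized ACF up to max_lag (trapezoidal rule)."""
    acf = autocorrelation(x)[:max_lag]
    return dt * (np.sum(acf) - 0.5 * acf[0])

# Synthetic "EFG component" with an exponential memory of about 0.5 ps,
# standing in for a quantity sampled along an MD trajectory:
dt, n = 0.005, 200_000                        # ps per frame, number of frames
rng = np.random.default_rng(0)
noise = rng.standard_normal(n)
alpha = np.exp(-dt / 0.5)
efg = np.empty(n)
efg[0] = noise[0]
for i in range(1, n):                         # AR(1) process mimicking EFG fluctuations
    efg[i] = alpha * efg[i - 1] + np.sqrt(1.0 - alpha**2) * noise[i]

print(f"estimated tau_c = {correlation_time(efg, dt):.3f} ps (generating process uses 0.5 ps)")
```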

Relevance: 30.00%

Abstract:

Biodiesel is a possible substitute for fossil fuels; for this reason a good understanding of the kinetics involved is important. Due to the complexity of the biodiesel mixture, a common practice is to use surrogate molecules to study its reactivity. This work presents the experimental and computational results obtained for the oxidation and pyrolysis of methane and methyl formate in a plug flow reactor. The work was divided into two parts: the first was the assembly of the experimental setup, while the second was a comparison between the experimental results and model predictions obtained with models available in the literature. Methane was studied first since a validated model was available, which made it possible to verify the reliability of the experimental results. After this first study, attention was focused on methyl formate. All the analyses were conducted at different temperatures and pressures and, for the oxidation, at different equivalence ratios. The results show that a good understanding of the kinetics has been reached, but further effort is needed to better evaluate kinetic parameters such as the activation energy. The results also show that the setup is suitable for studying oxidation and pyrolysis and, for this reason, it will be employed to study longer-chain esters with the aim of better understanding the kinetics of the molecules that make up the biodiesel mixture.
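As a back-of-the-envelope illustration of the quantities involved (not the kinetic models used in the work), the sketch below computes the conversion of an isothermal plug flow reactor with a single first-order consumption step, showing how strongly the assumed activation energy controls the temperature dependence; the Arrhenius parameters are placeholders.

```python
# Minimal sketch: isothermal plug flow reactor with first-order fuel
# consumption, X = 1 - exp(-k(T) * tau), k(T) = A * exp(-Ea / (R*T)).
# A and Ea are placeholder values, not fitted kinetic parameters.
import math

R = 8.314  # J mol^-1 K^-1

def rate_constant(T, A=1.0e10, Ea=180e3):
    """Arrhenius rate constant (1/s) with placeholder A and Ea."""
    return A * math.exp(-Ea / (R * T))

def pfr_conversion(T, tau):
    """Conversion of an isothermal PFR with first-order kinetics."""
    return 1.0 - math.exp(-rate_constant(T) * tau)

for T in (900, 1000, 1100):                            # K
    print(T, f"{pfr_conversion(T, tau=2.0):.3f}")      # 2 s residence time
```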

Relevance: 30.00%

Abstract:

Self-organising pervasive ecosystems of devices are set to become a major vehicle for delivering infrastructure and end-user services. The inherent complexity of such systems poses new challenges to those who want to master it by applying the principles of engineering. The recent growth in the number and distribution of devices with decent computational and communication abilities, which suddenly accelerated with the massive diffusion of smartphones and tablets, is delivering a world with a much higher density of devices in space. Also, communication technologies seem to be focusing on short-range device-to-device (P2P) interactions, with technologies such as Bluetooth and Near-Field Communication gaining greater adoption. Locality and situatedness become key to providing the best possible experience to users, and the classic model of a centralised, enormously powerful server gathering and processing data becomes less and less efficient as device density grows. Accomplishing complex global tasks without a centralised controller responsible for aggregating data, however, is challenging. In particular, there is a local-to-global issue that makes the application of engineering principles difficult: designing device-local programs that, through interaction, guarantee a certain global service level. In this thesis, we first analyse the state of the art in coordination systems, then motivate the work by describing the main issues of pre-existing tools and practices and identifying the improvements that would benefit the design of such complex software ecosystems. The contribution can be divided into three main branches. First, we introduce a novel simulation toolchain for pervasive ecosystems, designed to allow good expressiveness while retaining high performance. Second, we leverage existing coordination models and patterns in order to create new spatial structures. Third, we introduce a novel language, based on the existing "Field Calculus" and integrated with the aforementioned toolchain, designed to be usable for practical aggregate programming.
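The local-to-global issue can be made concrete with a toy example: every device runs the same local rule and a global structure emerges only from neighbour interactions. The sketch below computes a hop-count gradient field in plain Python; it illustrates the idea, and is not the Field Calculus-based language introduced in the thesis.

```python
# Toy local-to-global illustration: each device repeatedly applies the same
# local rule (hop-count gradient toward a source); the global distance field
# emerges from neighbour-to-neighbour interaction alone.
INF = float("inf")

def gradient(neighbours, source, rounds=20):
    """neighbours: dict node -> iterable of neighbouring nodes."""
    dist = {n: (0 if n == source else INF) for n in neighbours}
    for _ in range(rounds):                     # repeated local firing
        new = {}
        for n in neighbours:
            if n == source:
                new[n] = 0
            else:
                # each device only reads its own state and its neighbours' states
                best = min((dist[m] for m in neighbours[n]), default=INF)
                new[n] = best + 1 if best < INF else INF
        dist = new
    return dist

net = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(gradient(net, source="a"))   # {'a': 0, 'b': 1, 'c': 2, 'd': 3}
```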

Relevance: 30.00%

Abstract:

We have investigated the thermodynamics of sulfuric acid dimer hydration using ab initio quantum mechanical methods. For (H2SO4)2(H2O)n where n = 0−6, we employed high-level ab initio calculations to locate the most stable minima for each cluster size. The results presented herein yield a detailed understanding of the first deprotonation of sulfuric acid as a function of temperature for a system consisting of two sulfuric acid molecules and up to six waters. At 0 K, a cluster of two sulfuric acid molecules and one water remains undissociated. Addition of a second water begins the deprotonation of the first sulfuric acid leading to the di-ionic species (the bisulfate anion HSO4−, the hydronium cation H3O+, an undissociated sulfuric acid molecule, and a water). Upon the addition of a third water molecule, the second sulfuric acid molecule begins to dissociate. For the (H2SO4)2(H2O)3 cluster, the di-ionic cluster is a few kcal mol−1 more stable than the neutral cluster, which is just slightly more stable than the tetra-ionic cluster (two bisulfate anions, two hydronium cations, and one water). With four water molecules, the tetra-ionic cluster, (HSO4−)2(H3O+)2(H2O)2, becomes as favorable as the di-ionic cluster H2SO4(HSO4−)(H3O+)(H2O)3 at 0 K. Increasing the temperature favors the undissociated clusters, and at room temperature we predict that the di-ionic species is slightly more favorable than the neutral cluster once three waters have been added to the cluster. The tetra-ionic species competes with the di-ionic species once five waters have been added to the cluster. The thermodynamics of stepwise hydration of sulfuric acid dimer is similar to that of the monomer; it is favorable up to n = 4−5 at 298 K. A much more thermodynamically favorable pathway forming sulfuric acid dimer hydrates is through the combination of sulfuric acid monomer hydrates, but the low concentration of sulfuric acid relative to water vapor at ambient conditions limits that process.

Relevance: 30.00%

Abstract:

We have studied the structure and stability of (H3O+)(H2O)8 clusters using a combination of molecular dynamics sampling and high-level ab initio calculations. 20 distinct oxygen frameworks are found within 2 kcal/mol of the electronic or standard Gibbs free energy minimum. The impact of quantum zero-point vibrational corrections on the relative stability of these isomers is quite significant. The box-like isomers are favored in terms of electronic energy, but with the inclusion of zero-point vibrational corrections and entropic effects tree-like isomers are favored at higher temperatures. Under conditions from 0 to 298.15 K, the global minimum is predicted to be a tree-like structure with one dangling singly coordinated water molecule. Above 298.15 K, higher entropy tree-like isomers with two or more singly coordinated water molecules are favored. These assignments are generally consistent with experimental IR spectra of (H3O+)(H2O)8 obtained at 150 K.
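The temperature-driven crossover described above can be illustrated with Boltzmann statistics: relative populations follow exp(-ΔG/RT), so an isomer that is higher in electronic energy but entropically favored overtakes the low-energy one as the temperature rises. The free energies in the sketch below are invented for illustration and are not values from the paper.

```python
# Rough illustration of the entropy-driven crossover between isomer families.
# The enthalpy/entropy numbers are invented, not taken from the paper.
import math

R = 0.0019872  # kcal mol^-1 K^-1

def populations(delta_g, T):
    """delta_g: dict isomer -> relative free energy (kcal/mol) at temperature T."""
    weights = {k: math.exp(-g / (R * T)) for k, g in delta_g.items()}
    total = sum(weights.values())
    return {k: round(w / total, 3) for k, w in weights.items()}

# Hypothetical relative free energies G(T) = H - T*S for a box-like isomer
# (lower H, lower S) and a tree-like isomer (higher H, higher S):
def rel_g(T):
    return {"box": 0.0 - T * 0.000, "tree": 1.0 - T * 0.005}

for T in (50.0, 150.0, 298.15):
    print(T, populations(rel_g(T), T))   # tree-like isomer takes over at high T
```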

Relevance: 30.00%

Abstract:

In recent years, bio-conjugated nanostructured materials have emerged as a new class of materials for bio-sensing and medical diagnostics applications. In spite of these multi-directional applications, interfacing nanomaterials with bio-molecules has been a challenge, due to the limited knowledge of the underlying physics and chemistry of these interactions and to the complexity of biomolecules. The main objective of this dissertation is to provide such detailed knowledge on bioconjugated nanomaterials toward their application in designing the next generation of sensing devices. Specifically, we investigate the changes in the electronic properties of a boron nitride nanotube (BNNT) due to the adsorption of different bio-molecules, ranging from neutral (DNA/RNA nucleobases) to polar (amino acid) molecules. BNNT is a typical member of the III-V compound semiconductors, with a morphology similar to that of carbon nanotubes (CNTs) but with its own distinct properties. More specifically, the natural affinity of BNNTs toward living cells, with no apparent toxicity, motivates the application of BNNTs in drug delivery and cell therapy. Our results predict that the adsorption of DNA/RNA nucleobases on BNNTs leads to different degrees of modulation of the BNNT band gap, which can be exploited for distinguishing these nucleobases from each other. Interestingly, for the polar amino acid molecules, the nature of the interaction varies among Coulombic, van der Waals and covalent depending on the polarity of the individual molecules, each with a different binding strength and amount of charge transfer involved in the interaction. The strong binding of amino acid molecules on BNNTs explains the observed protein wrapping onto BNNTs without any linkers, unlike carbon nanotubes (CNTs). Additionally, the widely varying binding energies of different amino acid molecules toward BNNTs indicate the suitability of BNNTs for biosensing applications, as compared to metallic CNTs. The calculated I-V characteristics of these bioconjugated nanotubes predict notable changes in the conductivity of BNNTs due to the physisorption of DNA/RNA nucleobases. This is not the case with metallic CNTs, whose transport properties remain unaltered in their conjugated systems with the nucleobases. Collectively, bioconjugated BNNTs are found to be an excellent system for next-generation sensing devices.
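How an I-V curve follows from a transmission function can be sketched with the Landauer formula, which is the usual route to transport characteristics of this kind; the transmission function used below is a made-up placeholder whose gap width stands in for the nucleobase-dependent band-gap modulation, not a computed BNNT spectrum.

```python
# Landauer-picture sketch: I(V) = (2e/h) * Int T(E) [f(E-muL) - f(E-muR)] dE.
# The transmission T(E) below is a crude placeholder with a tunable gap.
import numpy as np

e = 1.602176634e-19    # C
h = 6.62607015e-34     # J s
kT = 0.025             # eV, room temperature

def fermi(E, mu):
    x = np.clip((E - mu) / kT, -60.0, 60.0)     # clip to avoid overflow warnings
    return 1.0 / (1.0 + np.exp(x))

def T_gap(E, gap=1.0):
    """Placeholder transmission: strongly suppressed inside a gap of the given width."""
    return np.where(np.abs(E) < gap / 2.0, 1e-4, 1.0)

def current(transmission, V, E):
    """Landauer current for a symmetric bias; energies on the grid E are in eV."""
    muL, muR = +V / 2.0, -V / 2.0
    integrand = transmission(E) * (fermi(E, muL) - fermi(E, muR))
    dE = E[1] - E[0]
    return (2.0 * e / h) * np.sum(integrand) * dE * e   # dE in eV -> Joules via factor e

E = np.linspace(-3.0, 3.0, 3001)
for V in (0.2, 0.8, 1.5):                        # volts
    print(f"V = {V:.1f} V  ->  I = {current(T_gap, V, E):.3e} A")
```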

Relevance: 30.00%

Abstract:

An invisibility cloak is a device that can hide a target by enclosing it from the incident radiation. This intriguing device has attracted a lot of attention since it was first implemented at a microwave frequency in 2006. However, the problems of existing cloak designs prevent them from being widely applied in practice. In this dissertation, we try to remove or alleviate three constraints on practical applications: lossy cloaking media, high implementation complexity, and the small size of hidden objects compared to the incident wavelength. To facilitate cloaking design and experimental characterization, several devices and relevant techniques for measuring the complex permittivity of dielectric materials at microwave frequencies are developed. In particular, a unique parallel-plate waveguide chamber has been set up to automatically map the electromagnetic (EM) field distribution for wave propagation through the resonator arrays and cloaking structures. The total scattering cross section of the cloaking structures was derived from the measured scattered field by using this apparatus. To overcome the adverse effects of lossy cloaking media, microwave cloaks composed of identical dielectric resonators made of low-loss ceramic materials are designed and implemented. The effective permeability dispersion is provided by tailoring the dielectric resonator filling fractions. The cloak performance has been verified by full-wave simulation of the true multi-resonator structures and by experimental measurements of the fabricated prototypes. With the aim of reducing the implementation complexity caused by the use of metamaterials for cloaking, we propose to design 2-D cylindrical cloaks and 3-D spherical cloaks using multi-layer coatings of ordinary dielectric materials (εr > 1). A genetic algorithm was employed to optimize the dielectric profiles of the cloaking shells to provide the minimum scattering cross sections of the cloaked targets. The designed cloaks can be easily scaled to various operating frequencies. The simulation results show that the multi-layer cylindrical cloak essentially outperforms the similarly sized metamaterial-based cloak designed with transformation-optics-based reduced parameters. For the designed spherical cloak, the simulated scattering pattern shows that the total scattering cross section is greatly reduced. In addition, the scattering in specific directions can be significantly reduced. It is shown that the cloaking efficiency for larger targets can be improved by employing lossy materials in the shell. Finally, we propose to hide a target inside a waveguide structure filled only with epsilon-near-zero materials, which are easy to implement in practice. The cloaking efficiency of this method, which was found to increase for larger targets, has been confirmed both theoretically and by simulations.

Relevance: 30.00%

Abstract:

A quantum simulator of U(1) lattice gauge theories can be implemented with superconducting circuits. This allows the investigation of confined and deconfined phases in quantum link models, and of valence bond solid and spin liquid phases in quantum dimer models. Fractionalized confining strings and the real-time dynamics of quantum phase transitions are accessible as well. Here we show how state-of-the-art superconducting technology allows us to simulate these phenomena in relatively small circuit lattices. By exploiting the strong non-linear couplings between quantized excitations emerging when superconducting qubits are coupled, we show how to engineer gauge invariant Hamiltonians, including ring-exchange and four-body Ising interactions. We demonstrate that, despite decoherence and disorder effects, minimal circuit instances allow us to investigate properties such as the dynamics of electric flux strings, signaling confinement in gauge invariant field theories. The experimental realization of these models in larger superconducting circuits could address open questions beyond current computational capability.
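The plaquette interactions mentioned above can be written down explicitly for a minimal four-qubit instance: a ring-exchange term σ1+σ2-σ3+σ4- + h.c. and a four-body Ising term σ1zσ2zσ3zσ4z, assembled from Kronecker products and diagonalized exactly. The couplings in the sketch are arbitrary illustrative numbers, not device parameters from the paper.

```python
# Toy numerical sketch of gauge-plaquette-style terms on four qubits:
# ring exchange s1+ s2- s3+ s4- + h.c. plus a four-body Ising term.
import numpy as np

I2 = np.eye(2)
sz = np.diag([1.0, -1.0])
sp = np.array([[0.0, 1.0], [0.0, 0.0]])   # sigma+
sm = sp.T                                  # sigma-

def op_chain(ops):
    """Kronecker product of single-qubit operators, one per site."""
    out = np.array([[1.0]])
    for o in ops:
        out = np.kron(out, o)
    return out

J, K = 1.0, 0.3        # ring-exchange and four-body Ising couplings (made up)
ring = op_chain([sp, sm, sp, sm])
H = -J * (ring + ring.conj().T) + K * op_chain([sz, sz, sz, sz])

evals = np.linalg.eigvalsh(H)              # exact diagonalization, 16x16
print("lowest eigenvalues:", np.round(evals[:4], 3))
```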

Relevance: 30.00%

Abstract:

The potential and adaptive flexibility of Population Dynamics P systems (PDP) for studying population dynamics suggests that they may be suitable for modelling complex fluvial ecosystems, characterized by a composition of dynamic habitats with many variables that interact simultaneously. Using as a model a reservoir occupied by the zebra mussel Dreissena polymorpha, we designed a computational model based on P systems to study the population dynamics of the larvae, in order to evaluate management actions to control or eradicate this invasive species. The population dynamics of this species was simulated under different scenarios, ranging from the absence of water flow change, to a weekly variation with different flow rates, to the actual hydrodynamic situation of an intermediate flow rate. Our results show that PDP models can be very useful tools for modelling complex, partially desynchronized processes that work in parallel. This allows the study of complex hydroecological processes such as the one presented, where reproductive cycles, temperature and water dynamics are involved in the desynchronization of the population dynamics both within areas and among them. The results obtained may be useful in the management of other reservoirs with similar hydrodynamic situations in which the presence of this invasive species has been documented.
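A drastically simplified stand-in for the scenario comparison described above (a toy weekly balance model, not a P system) is sketched below: larval reproduction switches on when the water is warm enough and losses grow with the outflow rate, so different flow-management scenarios lead to very different year-end larval loads. All rates and thresholds are invented.

```python
# Toy weekly larval balance under different flow scenarios. This is an
# illustration only; it is NOT the PDP model, and all rates are placeholders.
def simulate(weeks, temperature, flow_rate, r=0.6, spawn_T=18.0, base_mort=0.2):
    larvae = 1000.0
    history = []
    for w in range(weeks):
        births = r * larvae if temperature(w) >= spawn_T else 0.0
        washout = flow_rate(w) * larvae           # loss proportional to outflow
        larvae = max(larvae + births - base_mort * larvae - washout, 0.0)
        history.append(larvae)
    return history

season = lambda w: 12.0 + 10.0 * (1 if 10 <= w % 52 <= 30 else 0)  # warm spawning weeks
low_flow = lambda w: 0.05
high_flow = lambda w: 0.35

print("low flow,  week 52:", round(simulate(52, season, low_flow)[-1]))
print("high flow, week 52:", round(simulate(52, season, high_flow)[-1]))
```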

Relevance: 30.00%

Abstract:

Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 2 the MTT problem becomes NP-hard. As of now no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end an Elitist Genetic Algorithm is implemented to approximately solve the S > 2 MTT problem in an efficient manner. Its complexity is studied and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
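The flavour of the elitist genetic algorithm can be conveyed with a toy three-fence instance: each chromosome carries one permutation per later fence, the cost of a candidate track penalizes deviation from straight-line motion, and the best individuals are carried over unchanged between generations. This is an illustrative sketch, not the thesis implementation.

```python
# Toy elitist GA for a three-fence association problem (illustrative only).
import random

def track_cost(p1, p2, p3):
    # discrete second difference of the positions; zero for constant-velocity motion
    return sum(abs(a - 2.0 * b + c) for a, b, c in zip(p1, p2, p3))

def fitness(chrom, scans):
    s1, s2, s3 = scans
    perm2, perm3 = chrom
    return sum(track_cost(s1[i], s2[perm2[i]], s3[perm3[i]]) for i in range(len(s1)))

def crossover(a, b):
    # fence-level uniform crossover: each fence inherits a whole permutation
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(chrom):
    k = random.randrange(len(chrom))              # pick one fence's permutation
    perm = list(chrom[k])
    i, j = random.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]           # swap two assignments
    return tuple(tuple(perm) if f == k else chrom[f] for f in range(len(chrom)))

def elitist_ga(scans, pop_size=60, generations=150):
    n = len(scans[0])
    pop = [tuple(tuple(random.sample(range(n), n)) for _ in range(len(scans) - 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, scans))
        elite = pop[: max(1, pop_size // 10)]      # elitism: best survive unchanged
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=lambda c: fitness(c, scans))

# Three scans ("fences") of three objects in uniform motion, in scrambled order:
scan1 = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
scan2 = [(6.0, 0.5), (0.5, 6.0), (1.0, 1.0)]
scan3 = [(1.0, 7.0), (2.0, 2.0), (7.0, 1.0)]
best = elitist_ga([scan1, scan2, scan3])
print("best associations:", best, " cost:", fitness(best, [scan1, scan2, scan3]))
```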

Relevance: 30.00%

Abstract:

Over the last decade, Grid computing paved the way for a new level of large scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources that are part of several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large scale distributed system, inheriting and expanding the expertise and knowledge that had been obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, Grids and clouds together can be considered one of the most important advances in large scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and correct analysis and understanding of the system behavior are needed. Large scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one of them. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large scale distributed systems could be just a matter of perspective. It could be possible to understand the Grid or cloud behavior as a single entity, instead of as a set of resources. This abstraction could provide a different understanding of the system, describing large scale behavior and global events that probably would not be detected by analyzing each resource separately. In this work we define a theoretical framework that combines both ideas, multiple resources and single entity, to develop large scale distributed systems management techniques aimed at system performance optimization, increased dependability and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.
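A thin illustration of the "single entity" viewpoint is sketched below: per-resource metrics are collapsed into one global state vector that a manager reasons about, instead of tuning each resource separately. Metric names and thresholds are invented for the example.

```python
# Illustration of the single-entity abstraction: aggregate per-node metrics
# into a global state and decide on system-wide actions. Names and thresholds
# are invented placeholders.
from statistics import mean, pstdev

def global_state(resources):
    """resources: list of dicts with per-node 'cpu' and 'queue' metrics."""
    cpu = [r["cpu"] for r in resources]
    queue = [r["queue"] for r in resources]
    return {
        "cpu_mean": mean(cpu),
        "cpu_imbalance": pstdev(cpu),     # spread across nodes, not per-node detail
        "queued_jobs": sum(queue),
    }

def manage(state, cpu_high=0.85, queue_high=500):
    if state["cpu_mean"] > cpu_high or state["queued_jobs"] > queue_high:
        return "scale out"                # act on the system as a whole
    if state["cpu_imbalance"] > 0.25:
        return "rebalance load"
    return "steady"

nodes = [{"cpu": 0.92, "queue": 120}, {"cpu": 0.88, "queue": 260}, {"cpu": 0.95, "queue": 310}]
print(global_state(nodes), "->", manage(global_state(nodes)))
```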