61 results for Quantum algorithms in Universit


Relevance: 30.00%

Abstract:

We address the challenges of treating polarization and covalent interactions in docking by developing a hybrid quantum mechanical/molecular mechanical (QM/MM) scoring function based on the semiempirical self-consistent charge density functional tight-binding (SCC-DFTB) method and the CHARMM force field. To benchmark this scoring function within the EADock DSS docking algorithm, we created a publicly available dataset of high-quality X-ray structures of zinc metalloproteins (http://www.molecular-modelling.ch/resources.php). For zinc-bound ligands (226 complexes), the QM/MM scoring yielded a substantially improved success rate over the classical scoring function (77.0% vs 61.5%), while for allosteric ligands (55 complexes) the success rate remained constant (49.1%). The QM/MM scoring significantly improved the detection of correct zinc-binding geometries and raised the docking success rate by more than 20% for several important drug targets. The performance of both the classical and the QM/MM scoring functions compares favorably with that of AutoDock4, AutoDock4Zn, and AutoDock Vina.
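The hybrid scoring described above rests on the standard QM/MM energy partition, E_total = E_QM(inner region) + E_MM(outer region) + E_coupling. A minimal sketch of how such an additive score ranks docking poses, with purely hypothetical energy values rather than anything computed with SCC-DFTB or CHARMM:

```python
# Minimal sketch of an additive QM/MM scoring scheme, assuming the standard
# partition E_total = E_QM(inner) + E_MM(outer) + E_coupling. The energy
# values below are hypothetical placeholders, not output of the actual
# SCC-DFTB/CHARMM implementation benchmarked in the paper.

def qmmm_score(e_qm_inner, e_mm_outer, e_coupling):
    """Combine region energies into a single docking score (lower = better)."""
    return e_qm_inner + e_mm_outer + e_coupling

# Toy pose ranking: the pose with the lowest hybrid energy wins.
poses = {"pose_a": qmmm_score(-120.4, -310.2, -15.1),
         "pose_b": qmmm_score(-118.9, -312.0, -9.7)}
best = min(poses, key=poses.get)
```

In a real scoring function each term would itself be the result of a QM or force-field calculation; the point here is only the additive combination and the ranking step.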


Relevance: 20.00%

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from the data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data and can therefore provide an efficient means to model local anomalies, such as those that typically arise at an early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of 137Cs activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
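One simple way to picture a two-scale regression of this kind (a simplified stand-in, not the joint multi-scale SVR formulation of the paper) is to fit a large-scale SVR to the broad trend and a short-scale SVR to its residuals, using scikit-learn:

```python
# Sketch of two-scale regression in the spirit of multi-scale SVR: a
# large-scale model fits the broad trend, a short-scale model fits the
# residuals. The paper learns both scales jointly; this sequential residual
# fit is a simplified stand-in on synthetic 1-D data.
import numpy as np
from sklearn.svm import SVR

x = np.linspace(0, 10, 200).reshape(-1, 1)
# Signal with two spatial scales: a slow trend plus a fast local pattern.
y = np.sin(0.5 * x).ravel() + 0.3 * np.sin(8 * x).ravel()

large = SVR(kernel="rbf", gamma=0.1).fit(x, y)          # broad pattern
resid = y - large.predict(x)
short = SVR(kernel="rbf", gamma=50.0, C=10.0,
            epsilon=0.01).fit(x, resid)                 # local anomalies

y_hat = large.predict(x) + short.predict(x)             # mixture of scales
```

The short-scale component only helps where genuine short-range structure exists, which mirrors the caveat above that prior knowledge of such patterns is still required.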

Relevance: 20.00%

Abstract:

Quantum indeterminism is frequently invoked as a solution to the problem of how a disembodied soul might interact with the brain (as Descartes proposed), and is sometimes invoked in theories of libertarian free will even when they do not involve dualistic assumptions. Taking as an example the Eccles-Beck model of interaction between self (or soul) and brain at the level of synaptic exocytosis, I here evaluate the plausibility of these approaches. I conclude that Heisenbergian uncertainty is too small to affect synaptic function, and that amplification by chaos or by other means does not provide a solution to this problem. Furthermore, even if Heisenbergian effects did modify brain functioning, the changes would be swamped by those due to thermal noise. Cells and neural circuits have powerful noise-resistance mechanisms that provide adequate protection against thermal noise and must therefore be more than sufficient to buffer against Heisenbergian effects. Other forms of quantum indeterminism must be considered, because these can be much greater than Heisenbergian uncertainty, but they have not so far been shown to play a role in the brain.
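The energy-scale argument can be made concrete with a back-of-envelope calculation: thermal energy k_B·T at body temperature versus the kinetic energy implied by the Heisenberg uncertainty relation for an ion localized on a synaptic length scale. The choice of a Ca2+ ion and a 1 nm localization length below are illustrative assumptions, not values taken from the text:

```python
# Back-of-envelope comparison behind the argument: thermal energy kT at body
# temperature vs. the kinetic energy implied by Heisenberg uncertainty for a
# Ca2+ ion localized to ~1 nm. The specific mass and length scale are
# illustrative assumptions.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
K_B = 1.380649e-23       # Boltzmann constant, J/K
T_BODY = 310.0           # physiological temperature, K
M_CA = 6.64e-26          # approximate mass of a Ca2+ ion, kg
DX = 1e-9                # assumed localization length, m

e_thermal = K_B * T_BODY            # thermal energy scale, ~4.3e-21 J
dp = HBAR / (2 * DX)                # minimum momentum uncertainty
e_quantum = dp ** 2 / (2 * M_CA)    # corresponding kinetic energy, ~2e-26 J

ratio = e_thermal / e_quantum       # thermal noise dominates by ~10^5
```

Under these assumptions thermal energy exceeds the Heisenberg-scale kinetic energy by roughly five orders of magnitude, which is the quantitative core of the "swamped by thermal noise" claim.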

Relevance: 20.00%

Abstract:

Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee-based, large-margin, and posterior-probability-based. For each of them, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
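The posterior-probability family mentioned above can be sketched in a few lines: rank unlabeled pixels by the entropy of their predicted class probabilities and query the most uncertain ones. The classifier and synthetic data below are placeholders, not the heuristics actually benchmarked in the paper:

```python
# Minimal sketch of posterior-probability-based active learning (uncertainty
# sampling): rank unlabeled pixels by the entropy of their predicted class
# probabilities and query the most uncertain for labeling.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_lab = rng.normal(size=(40, 4))              # small labeled seed set
y_lab = (X_lab[:, 0] > 0).astype(int)
X_pool = rng.normal(size=(500, 4))            # unlabeled pixel pool

clf = LogisticRegression().fit(X_lab, y_lab)
proba = clf.predict_proba(X_pool)
# Shannon entropy of the predicted posterior; high entropy = uncertain pixel.
entropy = -np.sum(proba * np.log(proba + 1e-12), axis=1)

query = np.argsort(entropy)[-10:]             # 10 most uncertain pixels
```

In a full active-learning loop the queried pixels would be labeled by the user, added to the training set, and the model retrained before the next ranking round.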


Relevance: 20.00%

Abstract:

The dissertation investigates some relevant metaphysical issues arising in the context of spacetime theories. In particular, the inquiry focuses on general relativity and canonical quantum gravity. A formal definition of spacetime theory is proposed and, against this framework, an analysis of the notions of general covariance, symmetry and background independence is performed. It is argued that many conceptual issues in general relativity and canonical quantum gravity derive from putting excessive emphasis on general covariance as an ontological principle. An original metaphysical position grounded in scientific essentialism and causal realism (weak essentialism) is developed and defended. It is argued that, in the context of general relativity, weak essentialism supports spacetime substantivalism. It is also shown that weak essentialism escapes arguments from metaphysical underdetermination by positing a particular kind of causation, dubbed geometric. The proposed interpretive framework is then applied to Bohmian mechanics, pointing out that weak essentialism nicely fits into this theory. In the end, a possible Bohmian implementation of loop quantum gravity is considered, and such a Bohmian approach is interpreted in a geometric causal fashion. Under this interpretation, Bohmian loop quantum gravity straightforwardly commits us to an ontology of elementary extensions of space whose evolution is described by a non-local law. The causal mechanism underlying this evolution clarifies many conceptual issues related to the emergence of classical spacetime from the quantum regime. Although there is as yet no fully worked out physical theory of quantum gravity, it is argued that the proposed approach sets up a standard that proposals for a serious ontology in this field should meet.

Relevance: 20.00%

Abstract:

The paper presents an approach for mapping of precipitation data. The main goal is to perform spatial predictions and simulations of precipitation fields using geostatistical methods (ordinary kriging, kriging with external drift) as well as machine learning algorithms (neural networks). More practically, the objective is to reproduce simultaneously both the spatial patterns and the extreme values. This objective is best reached by models integrating geostatistics and machine learning algorithms. To demonstrate how such models work, two case studies have been considered: first, a 2-day accumulation of heavy precipitation and second, a 6-day accumulation of extreme orographic precipitation. The first example is used to compare the performance of two optimization algorithms (conjugate gradients and Levenberg-Marquardt) of a neural network for the reproduction of extreme values. Hybrid models, which combine geostatistical and machine learning algorithms, are also treated in this context. The second dataset is used to analyze the contribution of radar Doppler imagery when used as external drift or as input in the models (kriging with external drift and neural networks). Model assessment is carried out by comparing independent validation errors as well as analyzing data patterns.
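A hybrid model of the kind described can be sketched as follows, with a Gaussian process (a close relative of kriging) standing in for the geostatistical component and a small neural network modelling the residuals; the data are synthetic, not the precipitation measurements from the case studies:

```python
# Sketch of the hybrid idea: a smooth spatial model captures the trend and a
# machine-learning regressor models what remains. A Gaussian process plays
# the geostatistical role here; data are synthetic stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
xy = rng.uniform(0, 1, size=(150, 2))                    # station coordinates
z = np.sin(3 * xy[:, 0]) + 0.5 * rng.normal(size=150)    # precipitation-like field

# Geostatistical component: smooth spatial trend.
gp = GaussianProcessRegressor(kernel=RBF(0.3), alpha=0.25).fit(xy, z)
resid = z - gp.predict(xy)

# Machine-learning component: residual structure.
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                  random_state=0).fit(xy, resid)

z_hat = gp.predict(xy) + nn.predict(xy)                  # hybrid prediction
```

Kriging with external drift would add covariates such as the radar imagery mentioned above as extra inputs; here the two components are simply stacked to show the division of labour.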

Relevance: 20.00%

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the problems of analysing and modelling geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools.
To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessments of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
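As a minimal illustration of the basic workflow (an assumption-laden sketch, not the models used in the case studies), an SVM classifier can be trained on geo-feature vectors such as coordinates plus derived covariates:

```python
# Illustrative sketch of SVM classification on geo-feature vectors
# (coordinates plus covariates such as elevation-derived features).
# Features and labels are synthetic stand-ins for a digital-soil-mapping
# style problem, not data from the studies described above.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 7))                    # x, y + 5 derived geo-features
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)    # synthetic soil class

# Scaling matters for RBF kernels when features have mixed units.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X[:200], y[:200])
acc = model.score(X[200:], y[200:])              # held-out accuracy
```

The same pipeline shape extends to the higher-dimensional geo-feature spaces mentioned above by simply widening the feature matrix.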