998 results for Quantum algorithms


Relevance:

20.00%

Publisher:

Abstract:

Determining the complete spatio-temporal dynamics of a three-dimensional quantum system of N particles requires integrating the Schrödinger equation in 3N dimensions. The capacity of present-day computers allows this for at most 3 dimensions. In order to reduce the computation time needed to integrate the multidimensional Schrödinger equation, a series of approximations is usually made, such as the Born–Oppenheimer or the mean-field approximation. In general, the price paid for these approximations is the loss of quantum correlations (or entanglement). It is therefore necessary to develop numerical methods that make it possible to integrate and study the dynamics of mesoscopic systems (systems of between three and about ten particles) while taking into account, even if only approximately, the quantum correlations between particles. Recently, in the context of electron transport by tunnelling in semiconductor materials, X. Oriols developed a new method [Phys. Rev. Lett. 98, 066803 (2007)] for treating quantum correlations in mesoscopic systems. This new proposal is grounded in the de Broglie–Bohm formulation of quantum mechanics. We stress that the approach taken by X. Oriols, which we intend to follow here, is not pursued as an interpretive tool, but in order to obtain a numerical tool with which to integrate more efficiently the Schrödinger equation of few-particle quantum systems. Within the framework of this doctoral thesis project, we intend to extend the algorithms developed by X. Oriols to quantum systems composed of both fermions and bosons, and to apply these algorithms to various mesoscopic quantum systems in which quantum correlations play an important role. Specifically, the problems to be studied are the following: (i) photoionization of the helium and lithium atoms by an intense laser; (ii) study of the relation between X. Oriols' formulation and the Born–Oppenheimer approximation; (iii) study of quantum correlations in bi- and tripartite systems in the particles' configuration space using the de Broglie–Bohm formulation.
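The computational bottleneck described above can be made concrete with a standard reference scheme. The sketch below is a minimal 1D split-operator (Strang splitting) integrator for the time-dependent Schrödinger equation with ħ = m = 1; it is not the Bohmian conditional-wavefunction algorithm of Oriols [Phys. Rev. Lett. 98, 066803 (2007)], and the harmonic potential and grid parameters are illustrative choices. With M points per axis, the same grid-based scheme for N particles in 3D needs an M^(3N)-point grid, which is exactly the scaling the proposed trajectory-based algorithms aim to avoid.

```python
import numpy as np

# Minimal 1D split-operator integrator for the time-dependent Schrodinger
# equation (hbar = m = 1). Standard reference scheme, NOT the Bohmian
# conditional-wavefunction algorithm of Oriols; potential and grid sizes
# are illustrative.
M, L, dt, steps = 512, 20.0, 0.005, 1000
x = np.linspace(-L / 2, L / 2, M, endpoint=False)
dx = L / M
k = 2 * np.pi * np.fft.fftfreq(M, d=dx)        # angular wavenumbers
V = 0.5 * x**2                                  # harmonic potential (illustrative)
psi = np.exp(-(x - 1.0) ** 2).astype(complex)   # displaced Gaussian packet
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)   # normalize

half_V = np.exp(-0.5j * dt * V)                 # half step in the potential
full_K = np.exp(-0.5j * dt * k**2)              # full kinetic step, T = k^2 / 2

for _ in range(steps):                          # Strang splitting: V/2, T, V/2
    psi = half_V * psi
    psi = np.fft.ifft(full_K * np.fft.fft(psi))
    psi = half_V * psi
```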

Relevance:

20.00%

Publisher:

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Publisher:

Abstract:

In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently, Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree, then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
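For orientation, the uniqueness threshold the paper works up to can be illustrated with the standard tree recursion for the hard-core model. The sketch below is a toy illustration, not the Weitz or Sly machinery: it iterates the occupancy-ratio recursion R ↦ λ/(1+R)^(d-1) from the two extreme boundary conditions. Below the critical activity λ_c(d) = (d-1)^(d-1)/(d-2)^d the boundary influence decays (weak spatial mixing); above it a persistent gap remains.

```python
# Toy illustration of the uniqueness threshold for the hard-core model on
# the infinite d-regular tree. R is the occupied/unoccupied probability
# ratio at the root; one tree level maps R -> lam / (1 + R)**(d - 1).

def lam_c(d: int) -> float:
    """Critical activity (d-1)^(d-1) / (d-2)^d for uniqueness on the tree."""
    return (d - 1) ** (d - 1) / (d - 2) ** d

def boundary_gap(lam: float, d: int, depth: int = 1000) -> float:
    """Gap between the two extreme boundary conditions after `depth` levels."""
    r_free, r_occ = 0.0, 1e12           # all-free vs. (near) all-occupied leaves
    for _ in range(depth):
        r_free = lam / (1 + r_free) ** (d - 1)
        r_occ = lam / (1 + r_occ) ** (d - 1)
    return abs(r_free - r_occ)

d = 5                                    # lam_c(5) = 256/243 ~ 1.0535
print(boundary_gap(0.9 * lam_c(d), d))   # ~0: boundary forgotten (uniqueness)
print(boundary_gap(1.5 * lam_c(d), d))   # positive: boundary influence persists
```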

Relevance:

20.00%

Publisher:

Abstract:

The dissertation investigates some relevant metaphysical issues arising in the context of spacetime theories. In particular, the inquiry focuses on general relativity and canonical quantum gravity. A formal definition of spacetime theory is proposed and, against this framework, an analysis of the notions of general covariance, symmetry and background independence is performed. It is argued that many conceptual issues in general relativity and canonical quantum gravity derive from putting excessive emphasis on general covariance as an ontological principle. An original metaphysical position grounded in scientific essentialism and causal realism (weak essentialism) is developed and defended. It is argued that, in the context of general relativity, weak essentialism supports spacetime substantivalism. It is also shown that weak essentialism escapes arguments from metaphysical underdetermination by positing a particular kind of causation, dubbed geometric. The proposed interpretive framework is then applied to Bohmian mechanics, pointing out that weak essentialism nicely fits into this theory. In the end, a possible Bohmian implementation of loop quantum gravity is considered, and such a Bohmian approach is interpreted in a geometric causal fashion. Under this interpretation, Bohmian loop quantum gravity straightforwardly commits us to an ontology of elementary extensions of space whose evolution is described by a non-local law. The causal mechanism underlying this evolution clarifies many conceptual issues related to the emergence of classical spacetime from the quantum regime. Although there is as yet no fully worked out physical theory of quantum gravity, it is argued that the proposed approach sets up a standard that proposals for a serious ontology in this field should meet.

Relevance:

20.00%

Publisher:

Abstract:

The paper presents an approach for mapping of precipitation data. The main goal is to perform spatial predictions and simulations of precipitation fields using geostatistical methods (ordinary kriging, kriging with external drift) as well as machine learning algorithms (neural networks). More practically, the objective is to reproduce simultaneously both the spatial patterns and the extreme values. This objective is best reached by models integrating geostatistics and machine learning algorithms. To demonstrate how such models work, two case studies have been considered: first, a 2-day accumulation of heavy precipitation and second, a 6-day accumulation of extreme orographic precipitation. The first example is used to compare the performance of two optimization algorithms (conjugate gradients and Levenberg-Marquardt) of a neural network for the reproduction of extreme values. Hybrid models, which combine geostatistical and machine learning algorithms, are also treated in this context. The second dataset is used to analyze the contribution of radar Doppler imagery when used as external drift or as input in the models (kriging with external drift and neural networks). Model assessment is carried out by comparing independent validation errors as well as analyzing data patterns.
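As a rough illustration of the geostatistical half of such hybrid models, the sketch below implements ordinary kriging with an exponential covariance model on synthetic gauge data. The sill, range and data are placeholders rather than the paper's settings; kriging with external drift and the neural-network stages would build on the same linear system.

```python
import numpy as np

# Minimal ordinary-kriging sketch; the exponential covariance model and all
# parameters (sill, range, synthetic gauge data) are illustrative placeholders.

def exp_cov(h, sill=1.0, a=30.0):
    """Exponential covariance model C(h) = sill * exp(-h / a)."""
    return sill * np.exp(-h / a)

def ordinary_kriging(xy_obs, z_obs, xy_new):
    """Predict at one location xy_new from observations (xy_obs, z_obs)."""
    n = len(xy_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = exp_cov(d_obs)
    K[n, n] = 0.0                                  # Lagrange-multiplier block
    rhs = np.append(exp_cov(np.linalg.norm(xy_obs - xy_new, axis=-1)), 1.0)
    w = np.linalg.solve(K, rhs)                    # kriging weights + multiplier
    return w[:n] @ z_obs

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(30, 2))             # synthetic gauge locations
z = np.sin(xy[:, 0] / 20.0) + rng.normal(0, 0.1, size=30)
print(ordinary_kriging(xy, z, np.array([50.0, 50.0])))
```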

Relevance:

20.00%

Publisher:

Abstract:

This paper is concerned with the modeling and analysis of quantum dissipation phenomena in the Schrödinger picture. More precisely, we investigate in detail a dissipative, nonlinear Schrödinger equation that accounts for quantum Fokker–Planck effects, and show how it is drastically reduced to a simpler logarithmic equation via a nonlinear gauge transformation, in such a way that the physics underlying both problems remains unaltered. From a mathematical viewpoint, this allows for a more tractable analysis of the local well-posedness of the initial–boundary value problem. The simplification requires performing the polar (modulus–argument) decomposition of the wavefunction, which is rigorously attained (for the first time, to the best of our knowledge) under quite reasonable assumptions.
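Schematically, the two ingredients named above can be written as follows; the gauge functional Λ and the coefficient λ are illustrative placeholders, not the paper's exact dissipative terms.

```latex
% Polar (Madelung) decomposition and the shape of the reduction; the gauge
% functional \Lambda and the coefficient \lambda are placeholders.
\begin{align}
  \psi(x,t) &= \sqrt{n(x,t)}\, e^{iS(x,t)/\hbar},
    \qquad n = |\psi|^{2}, \quad S = \hbar \arg\psi, \\
  \Phi &= e^{\frac{i}{\hbar}\Lambda(n,S)}\,\psi
    \qquad \text{(nonlinear gauge transformation)}, \\
  i\hbar\,\partial_{t}\Phi &= -\frac{\hbar^{2}}{2m}\,\Delta\Phi
    + \lambda \ln\!\left(|\Phi|^{2}\right)\Phi
    \qquad \text{(logarithmic Schr\"odinger equation)}.
\end{align}
```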

Relevance:

20.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and time. Most machine learning algorithms are universal, adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report, some widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are typically generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns taking into account real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazard risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
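A minimal sketch of the SVM side of this programme is given below, using synthetic placeholder data: coordinates plus one extra geo-feature (here standing in for elevation) as inputs, and a measured variable as the regression target.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# SVM regression for spatial mapping in the spirit of the report; the data
# and hyperparameters below are synthetic, illustrative placeholders.
rng = np.random.default_rng(1)
X = rng.uniform(0, 100, size=(200, 3))        # (x, y, elevation) features
y = np.sin(X[:, 0] / 15) + 0.01 * X[:, 2] + rng.normal(0, 0.1, 200)

model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X, y)

# Predict along a transect of unsampled locations at fixed y and elevation.
grid = np.column_stack([np.linspace(0, 100, 5),
                        np.full(5, 50.0),
                        np.full(5, 300.0)])
print(model.predict(grid))
```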

Relevance:

20.00%

Publisher:

Abstract:

An iPad application serving as a repository of content related to the teaching of computer science courses.

Relevance:

20.00%

Publisher:

Abstract:

To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to a 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head-scatter fluence (>40%), which is not adequately modeled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared to conventional techniques and may offer the optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).

Relevance:

20.00%

Publisher:

Abstract:

A new and original reagent based on the use of highly fluorescent cadmium telluride (CdTe) quantum dots (QDs) in aqueous solution is proposed to detect weak fingermarks in blood on non-porous surfaces. To assess the efficiency of this approach, comparisons were performed with one of the most efficient blood reagents on non-porous surfaces, Acid Yellow 7 (AY7). To this end, four non-porous surfaces were studied: glass, transparent polypropylene, black polyethylene, and aluminium foil. To evaluate the sensitivity of both reagents, sets of depleted fingermarks were prepared using the same finger, initially soaked with blood, which was then applied successively to the same surface without recharging it with blood or latent secretions. The successive marks were then cut in halves, and the halves were treated separately with each reagent. The results showed that QDs were as efficient as AY7 on glass, polyethylene and polypropylene surfaces, and superior to AY7 on aluminium. The use of QDs in new, sensitive and highly efficient latent- and blood-mark detection techniques appears highly promising. Health and safety issues related to the use of cadmium are also discussed; it is suggested that applying QDs in aqueous solution (rather than as a dry dusting powder) considerably lowers the toxicity risks.