978 results for QUANTUM-MECHANICS
Abstract:
General Relativity (GR) is, along with quantum theory, one of the greatest scientific achievements of the 20th century. Despite their elegance and their agreement with experimental tests, the two theories appear to be utterly incompatible at the fundamental level. Black holes provide a perfect stage on which to point out these difficulties: classical GR fails to describe Nature at small radii, both because nothing prevents quantum mechanics from affecting the high-curvature zone and because classical GR becomes ill-defined at r = 0 anyway. Rovelli and Haggard have recently proposed a scenario in which a negative quantum pressure at the Planck scale stops and reverses the gravitational collapse, leading to an effective “bounce” and explosion and thus resolving the central singularity. This scenario, called Black Hole Fireworks, was proposed in a semiclassical framework. The purpose of this thesis is twofold: (i) to compute the bouncing time by means of a purely quantum computation based on Loop Quantum Gravity; and (ii) to extend the known theory to a more realistic scenario in which rotation is taken into account by means of the Newman-Janis Algorithm.
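For scale, a hedged note on the time scales at stake (our gloss from the fireworks literature, not a statement of this abstract): the bounce is expected to become visible from outside after a time that grows roughly quadratically in the mass, much shorter than the cubic Hawking evaporation time,

$$\tau_{\mathrm{bounce}} \sim \left(\frac{m}{m_P}\right)^{2} t_P \;\ll\; \tau_{\mathrm{Hawking}} \sim \left(\frac{m}{m_P}\right)^{3} t_P,$$

where $m_P$ and $t_P$ are the Planck mass and time; pinning down the coefficient of this scaling is precisely what a Loop Quantum Gravity computation of the bouncing time targets.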
Abstract:
An efficient mixed molecular dynamics/quantum mechanics model has been applied to the water cluster system. The use of the MP2 method and correlation-consistent basis sets, with appropriate correction for BSSE, allows for the accurate calculation of electronic and free energies for the formation of clusters of 2−10 water molecules. This approach reveals new low-energy conformers for (H2O)n, n = 7, 9, 10. The water heptamer conformers comprise five different structural motifs ranging from a three-dimensional prism to a quasi-planar book structure. A prism-like structure is favored energetically at low temperatures, but a chair-like structure is the global Gibbs free energy minimum above 200 K. The water nonamers exhibit less complexity, with all the low-energy structures shaped like a prism. The decamer has 30 conformers that are within 2 kcal/mol of the Gibbs free energy minimum structure at 298 K. These structures are categorized into four conformer classes, and a pentagonal prism is the most stable structure from 0 to 320 K. The results can be used as benchmark values for empirical water models and density functionals, and the method can be applied to larger water clusters.
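The temperature-dependent reordering of conformers reported here (prism at low temperature, chair above 200 K) is the generic outcome of comparing Gibbs free energies G(T) = H − T·S conformer by conformer. A minimal sketch with hypothetical enthalpies and entropies (placeholders, not values from this work) showing how such a crossover arises:

```python
# Hypothetical conformer crossover: a low-enthalpy "prism" is overtaken
# by a higher-entropy "chair" as temperature rises. Numbers are
# illustrative placeholders chosen to put the crossover near 200 K.

def gibbs(H: float, S: float, T: float) -> float:
    """G = H - T*S, with H in kcal/mol, S in kcal/(mol*K), T in K."""
    return H - T * S

prism = {"H": -10.0, "S": 0.010}   # lower enthalpy, lower entropy
chair = {"H": -9.0,  "S": 0.015}   # higher enthalpy, higher entropy

for T in (100, 150, 200, 250, 300):
    g_p = gibbs(prism["H"], prism["S"], T)
    g_c = gibbs(chair["H"], chair["S"], T)
    print(f"T={T:3d} K  G_prism={g_p:6.2f}  G_chair={g_c:6.2f}  "
          f"-> {'prism' if g_p < g_c else 'chair'}")
```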
Abstract:
…provide students with motivation for the study of quantum mechanics. That microscopic matter exists in quantized states can be demonstrated with modern versions of historic experiments: atomic line spectra (1), resonance potentials, and blackbody radiation. The resonance potentials of mercury were discovered by Franck and Hertz in 1914 (2). Their experiment consisted of bombarding atoms with electrons and detecting the kinetic energy loss of the scattered electrons (3). Prior to the Franck-Hertz experiment, spectroscopic work by Balmer and Rydberg had revealed that atoms emit radiation at discrete energies. The Franck-Hertz experiment showed directly that quantized energy levels in an atom are real, not just optical artifacts: an atom can be raised to excited states by inelastic collisions with electrons as well as lowered from excited states by emission of photons. The classic Franck-Hertz experiment is carried out with mercury (4-7). Here we present an experiment for the study of resonance potentials using neon.
Abstract:
A mixed molecular dynamics/quantum mechanics model has been applied to the ammonium/water clustering system. The use of the high-level MP2 method and correlation-consistent basis sets, such as aug-cc-pVDZ and aug-cc-pVTZ, lends confidence in the accuracy of the extrapolated energies. These calculations provide electronic and free energies for the formation of clusters of ammonium and 1−10 water molecules at two different temperatures. Structures and thermodynamic values are in good agreement with previous experimental and theoretical results. The concentration of these clusters in the troposphere was estimated using atmospheric abundances of ammonium and water. The results show the favorability of forming these clusters and the implications for ion-induced nucleation in the atmosphere.
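An estimate of this kind typically rests on the law of mass action: the population of a cluster relative to its precursor scales as exp(−ΔG/RT) times the water partial pressure. A minimal sketch, with placeholder numbers rather than the paper's values:

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # temperature, K

def stepwise_ratio(dG_step_kcal: float, p_water_atm: float) -> float:
    """Abundance ratio [NH4+(H2O)n] / [NH4+(H2O)n-1] for adding one
    water molecule, assuming ideal behavior and a 1 atm standard state:
    ratio = exp(-dG/RT) * p(H2O)."""
    return math.exp(-dG_step_kcal / (R * T)) * p_water_atm

# Placeholder inputs: dG_step = -3.0 kcal/mol for one water addition,
# water partial pressure ~0.02 atm (warm, humid air).
print(stepwise_ratio(-3.0, 0.02))   # ~3.2: the larger cluster is favored
```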
Abstract:
According to Bell's theorem, a large class of hidden-variable models obeying Bell's notion of local causality (LC) conflicts with the predictions of quantum mechanics. Recently, a Bell-type theorem has been proven using a weaker notion of LC, yet assuming the existence of perfectly correlated event types. Here we present a similar Bell-type theorem without this latter assumption. The derived inequality differs from the Clauser-Horne inequality by some small correction terms, which render it less constraining.
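For comparison, the Clauser-Horne inequality referred to here reads, in its standard probabilistic form,

$$-1 \;\le\; p(a_1, b_1) + p(a_1, b_2) + p(a_2, b_1) - p(a_2, b_2) - p(a_1) - p(b_1) \;\le\; 0,$$

where $p(a_i, b_j)$ are joint detection probabilities for measurement settings $i$ on one wing and $j$ on the other, and $p(a_1)$, $p(b_1)$ are the corresponding marginals; on this reading, the inequality derived in the paper adds small correction terms to these bounds.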
Abstract:
This tutorial review article is intended to provide general guidance to readers interested in learning about the methodologies used to obtain accurate electron density mappings in molecules and crystalline solids, from theory or from experiment, and to carry out a sensible interpretation of the results for chemical, biochemical or materials science applications. The review focuses mainly on X-ray diffraction techniques and the refinement of experimental models, in particular multipolar models. Neutron diffraction, which was widely used in the past to fix accurate positions of atoms, is now used for more specific purposes. The review illustrates three principal analyses of the experimental or theoretical electron density, based on quantum chemical, semi-empirical or empirical interpretation schemes: the quantum theory of atoms in molecules, the semi-classical evaluation of interaction energies, and Hirshfeld analysis. In particular, it is shown that a simple topological analysis based on a partition of the electron density cannot by itself reveal the whole nature of chemical bonding; more information, based on the pair density, is necessary. A connection between quantum mechanics and observable quantities is given in order to provide the physical grounds for explaining the observations and justifying the interpretations.
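As a textbook gloss on the topological analysis mentioned (our summary, not a claim of the review): the quantum theory of atoms in molecules locates critical points of the density, $\nabla\rho(\mathbf{r}_c) = 0$, and classifies them by the signature of the Hessian of $\rho$; a bond critical point is of type $(3,-1)$ (two negative and one positive eigenvalue), and the sign of the Laplacian $\nabla^2\rho(\mathbf{r}_c)$ is commonly used to distinguish shared (covalent) from closed-shell interactions. The review's point is that descriptors of this kind, built from $\rho$ alone, must be supplemented by pair-density information.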
Abstract:
We consider one-dimensional Schrödinger-type operators in a bounded interval with non-self-adjoint Robin-type boundary conditions. It is well known that such operators are generically conjugate to normal operators via a similarity transformation. Motivated by recent interest in quasi-Hermitian Hamiltonians in quantum mechanics, we study the properties of the transformations and similar operators in detail. In the case of parity and time reversal boundary conditions, we establish closed integral-type formulae for the similarity transformations, derive a non-local self-adjoint operator similar to the Schrödinger operator, and also find the associated “charge conjugation” operator, which plays the role of the fundamental symmetry in a Krein-space reformulation of the problem.
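For concreteness, a standard example of such parity- and time-reversal-symmetric Robin conditions (our illustration from the quasi-Hermitian literature, not necessarily the paper's exact model) is the free Hamiltonian on an interval $(0, d)$,

$$H\psi = -\psi'', \qquad \psi'(0) + i\alpha\,\psi(0) = 0, \qquad \psi'(d) + i\alpha\,\psi(d) = 0, \qquad \alpha \in \mathbb{R},$$

which is not self-adjoint for $\alpha \neq 0$ but commutes with the PT operation $(\mathcal{PT}\psi)(x) = \overline{\psi(d - x)}$.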
Abstract:
In any physicochemical process in liquids, the dynamical response of the solvent to solutes out of equilibrium plays a crucial role in the rates and products: the solvent molecules react to the changes in volume and electron density of the solutes to minimize the free energy of the solution, thus modulating the activation barriers and stabilizing (or destabilizing) intermediate states. In charge transfer (CT) processes in polar solvents, the response of the solvent always assists the formation of charge-separated states by stabilizing the energy of the localized charges. A deep understanding of the solvation mechanisms and time scales is therefore essential for a correct description of any photochemical process in the condensed phase and for designing molecular devices based on photosensitizers with CT excited states. In the last two decades, with the advent of ultrafast time-resolved spectroscopies, microscopic models describing the relevant case of polar solvation (where both the solvent and the solute molecules have a permanent electric dipole and the mutual interaction is mainly dipole−dipole) have progressed dramatically. Regardless of the details of each model, they all assume that the effects of the electrostatic fields of the solvent molecules on the internal electronic dynamics of the solute are perturbative, and that the solvent−solute coupling is mainly an electrostatic interaction between the constant permanent dipoles of the solute and the solvent molecules. This well-established picture has proven to quantitatively rationalize spectroscopic signatures of environmental and electric dynamics (time-resolved Stokes shifts, inhomogeneous broadening, etc.). However, recent computational and experimental studies, including ours, have shown that further improvement is required. Indeed, in recent years we have investigated several molecular complexes exhibiting photoexcited CT states, and we found that the current description of the formation and stabilization of CT states in an important group of molecules, the transition metal complexes, is inaccurate. In particular, we proved that the solvent molecules are not just spectators of intramolecular electron density redistribution but significantly modulate it. Our results call for the further development of quantum mechanical computational methods that treat the solute and (at least) the closest solvent molecules with a nonperturbative treatment of the effects of local electrostatics and direct solvent−solute interactions, in order to describe the dynamical changes of the solute excited states during the solvent response.
Abstract:
A ladder operator solution to the particle in a box problem of elementary quantum mechanics is presented, although the pedagogical use of this method for this problem is questioned.
Abstract:
The Pauli spin matrix representation of spin operators in quantum mechanics is explicitly demonstrated and illustrated in detail for one- and two-spin systems.
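For reference (standard definitions, not specific to this article), the Pauli matrices are

$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \qquad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},$$

with spin operators $S_k = (\hbar/2)\,\sigma_k$ satisfying $[S_x, S_y] = i\hbar S_z$ and cyclic permutations; two-spin operators act on the four-dimensional product space via Kronecker products, e.g. $S_{1z} S_{2z} = (\hbar^2/4)\,\sigma_z \otimes \sigma_z$.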
Abstract:
The twentieth century brought a new sensibility characterized by the discrediting of Cartesian rationality and the weakening of universal truths related to aesthetic values such as order, proportion and harmony. In the middle of the century, theorists such as Theodor Adorno, Rudolf Arnheim and Anton Ehrenzweig warned of the transformation under way in the artistic field. Contemporary aesthetics seemed to have a new goal: to deny the idea of art as an organized, finished and coherent structure. Order had lost its privileged position. Disorder, probability, arbitrariness, accidentality, randomness, chaos, fragmentation, indeterminacy... Gradually, new terms were coined by aesthetic criticism to explain what had been happening since the beginning of the century. The first essays on the matter sought to provide new interpretative models based on, among other arguments, the phenomenology of perception, the recent discoveries of quantum mechanics, the deeper layers of the psyche, and information theory. Overall, these were worthy attempts to give theoretical content to a situation as obvious as it was devoid of a founding charter. Finally, in 1962, Umberto Eco brought all these efforts together by proposing a single theoretical frame in his book Opera Aperta. From his point of view, all the aesthetic production of the twentieth century had one characteristic in common: its capacity to express multiplicity. For this reason, he considered that the nature of contemporary art was, above all, ambiguous. The aim of this research is to clarify the consequences of the incorporation of ambiguity into architectural theoretical discourse. We should start by making an accurate analysis of this concept. However, this task is quite difficult, because ambiguity does not allow itself to be clearly defined: its signifier is as imprecise as its signified. In addition, the negative connotations that ambiguity still carries outside the aesthetic field stigmatize the term and make its use problematic. Another problem with ambiguity is that the contemporary subject is able to locate it in all situations: besides distinguishing ambiguity in contemporary productions, the contemporary subject also finds it in works belonging to remote ages and styles. For that reason, it could be said that everything is ambiguous. And that is correct, because in some way ambiguity is present in any creation of the imperfect human being. However, as Eco, Arnheim and Ehrenzweig pointed out, there are two major differences between the current and past contexts, one affecting the subject and the other the object. First, it is the contemporary subject, and no other, who has acquired the ability to value and assimilate ambiguity. Secondly, ambiguity was an unexpected aesthetic result in former periods, whereas in the contemporary object it has been codified and is deliberately present. In any case, like Eco, we consider the term ambiguity appropriate for the contemporary aesthetic field. Any other term with a more specific meaning would only show partial and limited aspects of a situation that is complex and difficult to diagnose. Contrary to what might normally be expected, in this case ambiguity is the term that fits best precisely because of its lack of specificity. In fact, this lack of specificity is what allows a dynamic condition to be assigned to the idea of ambiguity that other terms would hardly make operative.
Thus, instead of trying to define the idea of ambiguity, we will analyze how it has evolved and what its consequences have been for the architectural discipline. Instead of trying to define what it is, we will examine what its presence has meant at each moment. We will treat ambiguity as a constant presence that has always been latent in architectural production but whose nature has been modified over time. Eco, in the mid-twentieth century, distinguished between classical ambiguity and contemporary ambiguity. Currently, half a century later, the challenge is to discern whether the idea of ambiguity has remained unchanged or has undergone a new transformation. What this research will demonstrate is that it is possible to detect a new transformation that has much to do with the cultural and aesthetic context of recent decades: the transition from modernism to postmodernism. This assumption leads us to establish two different levels of contemporary ambiguity, each related to one of these periods. The first level of ambiguity has been widely known for many years. Its main characteristics are a codified multiplicity, an interpretative freedom and an active subject who brings to conclusion an object that is incomplete or indefinite. This level of ambiguity is related to the idea of indeterminacy, a concept successfully introduced into contemporary aesthetic language. The second level of ambiguity has gone almost unnoticed by architectural criticism, although it has been identified and studied in other theoretical disciplines. Much of the work of Fredric Jameson and Jean-François Lyotard shows reasonable evidence that the aesthetic production of postmodernism has transcended modern ambiguity to reach a new level in which, despite the existence of multiplicity, the interpretative freedom and the active subject have been questioned and, at last, denied. In this period ambiguity seems to have reached a level at which it is no longer possible to obtain a conclusive and complete interpretation of the object, because the object has become an unreadable device. Postmodern production offers a kind of inaccessible multiplicity, and its nature is deeply contradictory. This hypothetical transformation of the idea of ambiguity has a striking analogy with the one shown in the poetic analysis made by William Empson in his Seven Types of Ambiguity (1930). Empson established different levels of ambiguity and classified them according to their poetic effect, in an arrangement with an ascending logic towards incoherence. At the seventh level, where ambiguity is highest, he located the contradiction between irreconcilable opposites. It could be said that contradiction, once it undermines the coherence of the object, was the best way contemporary aesthetics found to confirm the Hegelian judgment according to which art would ultimately reject its capacity to express truth. Much of the transformation of architecture throughout the last century is related to the active involvement of ambiguity in its theoretical discourse. In modern architecture, ambiguity is present after the fact, in the critical review carried out by theoreticians such as Colin Rowe, Manfredo Tafuri and Bruno Zevi. The publication of several studies on Mannerism in the forties and fifties rescued certain virtues of a historical style that had been undervalued because of its deviation from the Renaissance canon. Rowe, Tafuri and Zevi, among others, pointed out the similarities between Mannerism and certain qualities of modern architecture, both devoted to breaking with previous dogmas.
The recovery of Mannerism made it possible, for the first time, to join ambiguity and modernity in the same sentence. In postmodernism, on the other hand, ambiguity is present ex professo, playing a prominent role in the theoretical discourse of the period. The distance between its analytical identification and its operational use quickly disappeared because of structuralism, an analytical methodology that aspired to become a modus operandi. Under its influence, architecture began to be identified and studied as a language. Thus, the postmodern theoretical project distinguished between the components of architectural language and developed them separately. Consequently, there is not one but three projects related to postmodern contradiction: the semantic project, the syntactic project and the pragmatic project. Leading these projects are those prominent architects whose work manifested a special interest in exploring and developing the potential of the use of contradiction in architecture. Thus, it was Robert Venturi, Peter Eisenman and Rem Koolhaas who established the main features through which architecture developed the dialectics of ambiguity, at its last and most extreme level, as a theoretical project in each component of architectural language. Robert Venturi developed a new interpretation of architecture based on its semantic component, Peter Eisenman did the same with its syntactic component, and Rem Koolhaas with its pragmatic component. With this approach, this research aims to establish a new reflection on the architectural transformation from modernity to postmodernity. It can also serve to shed light on certain still-unnoticed aspects that have shaped the architectural heritage of recent decades, the consequence of a fruitful relationship between architecture and ambiguity and its provocative consummation in a contradictio in terminis. This research focuses its attention primarily on the repercussions of the incorporation of ambiguity, in the form of contradiction, into postmodern architectural discourse, through each of its three theoretical projects. It is therefore structured around a main chapter entitled Dialéctica de la ambigüedad como proyecto teórico postmoderno, which is broken down into three chapters entitled Proyecto semántico. Robert Venturi; Proyecto sintáctico. Peter Eisenman; and Proyecto pragmático. Rem Koolhaas. The central chapter is complemented by two others placed at the beginning. The first, entitled Dialéctica de la ambigüedad contemporánea. Una aproximación, carries out a chronological analysis of the evolution of the idea of ambiguity in twentieth-century aesthetic theory, without yet entering into architectural questions. The second, entitled Dialéctica de la ambigüedad como crítica del proyecto moderno, examines the gradual incorporation of ambiguity into the critical review of modernity, which would prove vitally important in enabling its later operative introduction into postmodernity. A final chapter, placed at the end of the text, proposes a series of Proyecciones which, in light of what has been analyzed in the preceding chapters, attempt a rereading of the current architectural context and its possible evolution, considering at all times that reflection on ambiguity still allows new discursive horizons to be glimpsed. Each double page of the thesis synthesizes the tripartite structure of the central chapter and, broadly speaking, the main methodological tool used in the research.
In this way, the threefold semantic, syntactic and pragmatic dimension with which the postmodern theoretical project has been identified is reproduced here in a specific arrangement of images, footnotes and main body of text. The images accompanying the main text are placed in the left-hand column. Their distribution follows aesthetic and compositional criteria, qualifying their semantic condition as far as possible. Next, to their right, come the footnotes. They are arranged in a column, each note placed at the same height as its corresponding reference mark in the main text. Their regulated distribution, their value as notation and their possible equation with a deep structure allude to their syntactic condition. Finally, the main body of the text completely occupies the right half of each double page. Conceived as a continuous narrative with hardly any interruptions, its role in satisfying the discursive demands posed by a doctoral investigation corresponds to its pragmatic condition.
Abstract:
The alanine helix provides a model system for studying the energetics of interaction between water and the helical peptide group, a possible major factor in the energetics of protein folding. Helix formation is enthalpy-driven (−1.0 kcal/mol per residue). Experimental transfer data (vapor phase to aqueous) for amides give the enthalpy of interaction of the amide group with water as ≈−11.5 kcal/mol. The enthalpy of the helical peptide hydrogen bond, computed for the gas phase by quantum mechanics, is −4.9 kcal/mol. These numbers give an enthalpy deficit for helix formation of −7.6 kcal/mol. To study this problem, we calculate the electrostatic solvation free energy (ESF) of the peptide groups in the helical and β-strand conformations, using the DelPhi program and the PARSE parameter set. Experimental data show that the ESF values of amides are almost entirely enthalpic. Two key results are: in the β-strand conformation, the ESF value of an interior alanine peptide group is −7.9 kcal/mol, substantially less than that of N-methylacetamide (−12.2 kcal/mol); and the helical peptide group is solvated with an ESF of −2.5 kcal/mol. These results reduce the enthalpy deficit to −1.5 kcal/mol, and desolvation of peptide groups through partial burial in the random coil may account for the remainder. Mutant peptides in the helical conformation show ESF differences among nonpolar amino acids that are comparable to observed helix propensity differences, but the ESF differences in the random coil conformation still must be subtracted.
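The enthalpy bookkeeping in this abstract can be followed with a few lines of arithmetic. A minimal sketch using only the numbers quoted above; the decomposition is our reading of the argument, not the authors' code:

```python
# All values in kcal/mol, taken from the abstract.
H_OBSERVED = -1.0       # measured helix-formation enthalpy per residue
H_BOND_GAS = -4.9       # helical peptide hydrogen bond, gas-phase QM
DH_AMIDE_WATER = -11.5  # amide-water interaction (vapor-to-water transfer)

# Naive balance: lose the amide-water interaction, gain the hydrogen bond.
naive_prediction = H_BOND_GAS - DH_AMIDE_WATER   # +6.6 (endothermic)
naive_deficit = naive_prediction - H_OBSERVED    # 7.6 unexplained

# Refined balance with the computed ESF values (treated as enthalpic).
ESF_STRAND = -7.9   # interior alanine peptide group, beta-strand
ESF_HELIX = -2.5    # helical peptide group
desolvation_cost = ESF_HELIX - ESF_STRAND            # +5.4
refined_prediction = H_BOND_GAS + desolvation_cost   # +0.5
remaining_deficit = refined_prediction - H_OBSERVED  # 1.5

print(f"naive deficit:     {naive_deficit:+.1f} kcal/mol")
print(f"remaining deficit: {remaining_deficit:+.1f} kcal/mol")
```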
Abstract:
Quantum mechanics associates to some symplectic manifolds M a quantum model Q(M), which is a Hilbert space. The space Q(M) is the quantum mechanical analogue of the classical phase space M. We discuss here relations between the volume of M and the dimension of the vector space Q(M). Analogues for convex polyhedra are considered.
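The heuristic behind such relations (our gloss, not the paper's statement) is the semiclassical rule that each quantum state occupies a cell of volume $(2\pi\hbar)^n$ in phase space, so that for a compact symplectic manifold of dimension $2n$,

$$\dim Q(M) \;\approx\; \frac{\operatorname{vol}(M)}{(2\pi\hbar)^{n}},$$

with the exact statement depending on the quantization scheme and on curvature corrections of Riemann-Roch type.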
Abstract:
We describe a procedure for the generation of chemically accurate computer-simulation models to study chemical reactions in the condensed phase. The process involves (i) the use of a coupled semiempirical quantum and classical molecular mechanics method to represent solutes and solvent, respectively; (ii) the optimization of semiempirical quantum mechanics (QM) parameters to produce a computationally efficient and chemically accurate QM model; (iii) the calibration of a quantum/classical microsolvation model using ab initio quantum theory; and (iv) the use of statistical mechanical principles and methods to simulate, on massively parallel computers, the thermodynamic properties of chemical reactions in aqueous solution. The utility of this process is demonstrated by the calculation of the enthalpy of reaction in vacuum and free energy change in aqueous solution for a proton transfer involving methanol, methoxide, imidazole, and imidazolium, which are functional groups involved with proton transfers in many biochemical systems. An optimized semiempirical QM model is produced, which results in the calculation of heats of formation of the above chemical species to within 1.0 kcal/mol (1 kcal = 4.18 kJ) of experimental values. The use of the calibrated QM and microsolvation QM/MM (molecular mechanics) models for the simulation of a proton transfer in aqueous solution gives a calculated free energy that is within 1.0 kcal/mol (12.2 calculated vs. 12.8 experimental) of a value estimated from experimental pKa values of the reacting species.
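The closing comparison converts experimental pKa values into a free energy via ΔG = RT ln(10) ΔpKa. A minimal sketch of that conversion; the pKa inputs below are illustrative placeholders, not values from the paper:

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)
T = 298.15    # temperature, K

def proton_transfer_dG(pKa_donor: float, pKa_conj_acid_of_acceptor: float) -> float:
    """Estimate dG (kcal/mol) for HA + B -> A- + BH+ from pKa values:
    dG = RT * ln(10) * (pKa(HA) - pKa(BH+))."""
    return R * T * math.log(10) * (pKa_donor - pKa_conj_acid_of_acceptor)

# Hypothetical inputs for a methanol -> imidazole proton transfer.
print(f"{proton_transfer_dG(15.5, 7.0):.1f} kcal/mol")  # ~11.6
```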