972 results for Symmetric Quantum-mechanics
Abstract:
A ladder operator solution to the particle in a box problem of elementary quantum mechanics is presented, although the pedagogical use of this method for this problem is questioned.
Abstract:
The Pauli spin matrix representation of spin operators in quantum mechanics is explicitly demonstrated and illustrated in detail for one- and two-spin systems.
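The one- and two-spin constructions this abstract refers to can be sketched numerically. The following is a minimal illustration (not the paper's own code) of the Pauli matrices, one of their commutation relations, and the tensor-product construction used for two-spin systems:

```python
import numpy as np

# Pauli spin matrices (units with hbar = 1; the spin operators are S_i = sigma_i / 2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Commutation relation [sigma_x, sigma_y] = 2i sigma_z
comm = sx @ sy - sy @ sx
assert np.allclose(comm, 2j * sz)

# Two-spin system: each operator acts on the 4-dimensional product space
sz1 = np.kron(sz, np.eye(2))   # sigma_z acting on spin 1
sz2 = np.kron(np.eye(2), sz)   # sigma_z acting on spin 2

# |up, down> = |0> (x) |1> has sigma_z eigenvalue +1 on spin 1 and -1 on spin 2
up_down = np.kron([1, 0], [0, 1]).astype(complex)
assert np.allclose(sz1 @ up_down, up_down)
assert np.allclose(sz2 @ up_down, -up_down)
```

The same `np.kron` pattern extends any single-spin operator to an n-spin product space.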
Abstract:
The twentieth century brought a new sensibility, characterized by the discrediting of Cartesian rationality and the weakening of universal truths linked to aesthetic values such as order, proportion and harmony. In the middle of the century, theorists such as Theodor Adorno, Rudolf Arnheim and Anton Ehrenzweig warned of the transformation under way in the artistic field. Contemporary aesthetics seemed to have a new goal: to deny the idea of art as an organized, finished and coherent structure. Order had lost its privileged position. Disorder, probability, arbitrariness, accidentality, randomness, chaos, fragmentation, indeterminacy... Gradually, new terms were coined by aesthetic criticism to explain what had been happening since the beginning of the century. The first essays on the matter sought to provide new interpretative models based on, among other arguments, the phenomenology of perception, the recent discoveries of quantum mechanics, the deeper layers of the psyche or information theory. Overall, these were worthy attempts to give theoretical content to a situation as obvious as it was devoid of a founding charter. Finally, in 1962, Umberto Eco brought all these efforts together by proposing a single theoretical framework in his book Opera Aperta. In his view, all the aesthetic production of the twentieth century had one characteristic in common: its capacity to express multiplicity. For this reason, he considered that the nature of contemporary art was, above all, ambiguous. The aim of this research is to clarify the consequences of the incorporation of ambiguity into architectural theoretical discourse. We should start with an accurate analysis of this concept. However, this task is quite difficult, because ambiguity does not allow itself to be clearly defined. The concept has the disadvantage that its signifier is as imprecise as its signified.
In addition, the negative connotations that ambiguity still carries outside the aesthetic field stigmatize the term and make its use problematic. Another problem with ambiguity is that the contemporary subject is able to locate it in all situations: besides distinguishing ambiguity in contemporary productions, the subject also finds it in works belonging to remote ages and styles. For that reason, it could be said that everything is ambiguous. And that is correct, because in some way ambiguity is present in any creation of the imperfect human being. However, as Eco, Arnheim and Ehrenzweig pointed out, there are two major differences between the current and past contexts, one affecting the subject and the other the object. First, it is the contemporary subject, and no other, who has acquired the ability to value and assimilate ambiguity. Second, ambiguity was an unexpected aesthetic result in former periods, whereas in the contemporary object it has been codified and is deliberately present. In any case, like Eco, we consider the term ambiguity appropriate for referring to the contemporary aesthetic field. Any other term with a more specific meaning would show only partial and limited aspects of a situation that is complex and difficult to diagnose. Contrary to what might normally be expected, in this case ambiguity is the term that fits best, precisely because of its particular lack of specificity. In fact, this lack of specificity is what allows us to assign a dynamic condition to the idea of ambiguity that other terms could hardly sustain. Thus, instead of trying to define the idea of ambiguity, we will analyze how it has evolved and what its consequences have been for the architectural discipline. Instead of trying to define what it is, we will examine what its presence has meant at each moment. We will deal with ambiguity as a constant presence, always latent in architectural production, whose nature has nevertheless been modified over time.
Eco, in the mid-twentieth century, distinguished between classical ambiguity and contemporary ambiguity. Now, half a century later, the challenge is to discern whether the idea of ambiguity has remained unchanged or has undergone a new transformation. What this research will demonstrate is that it is possible to detect a new transformation, one that has much to do with the cultural and aesthetic context of recent decades: the transition from modernism to postmodernism. This assumption leads us to establish two different levels of contemporary ambiguity, each related to one of these periods. The first level of ambiguity has been widely known for many years. Its main characteristics are a codified multiplicity, an interpretative freedom and an active subject who brings to completion an object that is incomplete or indefinite. This level of ambiguity is related to the idea of indeterminacy, a concept successfully introduced into contemporary aesthetic language. The second level of ambiguity has gone almost unnoticed by architectural criticism, although it has been identified and studied in other theoretical disciplines. Much of the work of Fredric Jameson and Jean-François Lyotard offers reasonable evidence that the aesthetic production of postmodernism has transcended modern ambiguity to reach a new level in which, despite the existence of multiplicity, the interpretative freedom and the active subject have been questioned and ultimately denied. In this period, ambiguity seems to have reached a stage at which it is no longer possible to obtain a conclusive and complete interpretation of the object, because the object has become an unreadable device. Postmodern production offers a kind of inaccessible multiplicity, and its nature is deeply contradictory. This hypothetical transformation of the idea of ambiguity has an outstanding analogy in the poetic analysis made by William Empson in his Seven Types of Ambiguity, published in 1930.
Empson established different levels of ambiguity and classified them according to their poetic effect, in an arrangement that ascended toward incoherence. In the seventh level, where ambiguity is greatest, he located the contradiction between irreconcilable opposites. It could be said that contradiction, once it undermines the coherence of the object, was the best way contemporary aesthetics found to confirm the Hegelian judgment according to which art would ultimately renounce its capacity to express truth. Much of the transformation of architecture over the last century is related to the active involvement of ambiguity in its theoretical discourse. In modern architecture, ambiguity is present only in retrospect, in the critical review carried out by theoreticians such as Colin Rowe, Manfredo Tafuri and Bruno Zevi. The publication of several studies on Mannerism in the forties and fifties rescued certain virtues of a historical style that had been undervalued for its deviation from the Renaissance canon. Rowe, Tafuri and Zevi, among others, pointed out the similarities between Mannerism and certain qualities of modern architecture, both devoted to breaking with previous dogmas. The recovery of Mannerism allowed ambiguity and modernity to be joined, for the first time, in the same sentence. In postmodernism, on the other hand, ambiguity is present ex professo, playing a prominent role in the theoretical discourse of the period. The distance between its analytical identification and its operational use quickly disappeared under the influence of structuralism, an analytical methodology with the aspiration of becoming a modus operandi. Under that influence, architecture began to be identified and studied as a language. Thus the postmodern theoretical project distinguished between the components of architectural language and developed them separately. Consequently, there is not one but three projects related to postmodern contradiction: a semantic project, a syntactic project and a pragmatic project.
Leading these projects are prominent architects whose work showed a special interest in exploring and developing the potential of contradiction in architecture. Thus it was Robert Venturi, Peter Eisenman and Rem Koolhaas who established the main features through which architecture developed the dialectics of ambiguity, at its last and most extreme level, as a theoretical project in each component of architectural language. Robert Venturi developed a new interpretation of architecture based on its semantic component, Peter Eisenman did the same with its syntactic component, and Rem Koolhaas with its pragmatic component. With this approach, the research aims to establish a new reflection on the architectural transformation from modernity to postmodernity. It may also shed light on certain still-unnoticed aspects that have shaped the architectural heritage of recent decades, the consequence of a fruitful relationship between architecture and ambiguity and its provocative consummation in a contradictio in terminis. This research focuses fundamentally on the repercussions of incorporating ambiguity, in the form of contradiction, into postmodern architectural discourse, through each of its three theoretical projects. It is therefore structured around a main chapter entitled Dialectics of Ambiguity as a Postmodern Theoretical Project, which is divided into three chapters: Semantic Project. Robert Venturi; Syntactic Project. Peter Eisenman; and Pragmatic Project. Rem Koolhaas. The central chapter is complemented by two others placed at the beginning. The first, entitled Dialectics of Contemporary Ambiguity. An Approach, offers a chronological analysis of the evolution of the idea of ambiguity in twentieth-century aesthetic theory, without yet entering into architectural questions.
The second, entitled Dialectics of Ambiguity as a Critique of the Modern Project, examines the gradual incorporation of ambiguity into the critical review of modernity, which would prove vital in enabling its later operative introduction in postmodernity. A final chapter, placed at the end of the text, proposes a series of Projections which, in light of the preceding chapters, attempt a rereading of the current architectural context and its possible evolution, considering at all times that reflection on ambiguity still allows new discursive horizons to be glimpsed. Each double page of the thesis synthesizes the tripartite structure of the central chapter and, broadly speaking, the main methodological tool used in the research. Thus the threefold semantic, syntactic and pragmatic character with which the postmodern theoretical project has been identified is reproduced here in a specific arrangement of images, footnotes and main text. The left-hand column holds the images accompanying the main text; their arrangement follows aesthetic and compositional criteria, qualifying their semantic condition as far as possible. To their right are the footnotes, set in a column, each note placed at the same height as its call in the main text; their regulated distribution, their value as notation and their possible equation with a deep structure allude to their syntactic condition. Finally, the main body of the text fills the entire right half of each double page; conceived as a continuous narrative with scarcely any interruptions, its role in meeting the discursive demands of a doctoral investigation corresponds to its pragmatic condition.
Abstract:
The alanine helix provides a model system for studying the energetics of interaction between water and the helical peptide group, a possible major factor in the energetics of protein folding. Helix formation is enthalpy-driven (−1.0 kcal/mol per residue). Experimental transfer data (vapor phase to aqueous) for amides give the enthalpy of interaction with water of the amide group as ≈−11.5 kcal/mol. The enthalpy of the helical peptide hydrogen bond, computed for the gas phase by quantum mechanics, is −4.9 kcal/mol. These numbers give an enthalpy deficit for helix formation of −7.6 kcal/mol. To study this problem, we calculate the electrostatic solvation free energy (ESF) of the peptide groups in the helical and β-strand conformations, using the DelPhi program and the PARSE parameter set. Experimental data show that the ESF values of amides are almost entirely enthalpic. Two key results are: in the β-strand conformation, the ESF value of an interior alanine peptide group is −7.9 kcal/mol, substantially less than that of N-methylacetamide (−12.2 kcal/mol); and the helical peptide group is solvated, with an ESF of −2.5 kcal/mol. These results reduce the enthalpy deficit to −1.5 kcal/mol, and desolvation of peptide groups through partial burial in the random coil may account for the remainder. Mutant peptides in the helical conformation show ESF differences among nonpolar amino acids that are comparable to observed helix propensity differences, but the ESF differences in the random coil conformation still must be subtracted.
Abstract:
Quantum mechanics associates to certain symplectic manifolds M a quantum model Q(M), which is a Hilbert space. The space Q(M) is the quantum mechanical analogue of the classical phase space M. We discuss here relations between the volume of M and the dimension of the vector space Q(M). Analogues for convex polyhedra are considered.
Abstract:
We describe a procedure for the generation of chemically accurate computer-simulation models to study chemical reactions in the condensed phase. The process involves (i) the use of a coupled semiempirical quantum and classical molecular mechanics method to represent solutes and solvent, respectively; (ii) the optimization of semiempirical quantum mechanics (QM) parameters to produce a computationally efficient and chemically accurate QM model; (iii) the calibration of a quantum/classical microsolvation model using ab initio quantum theory; and (iv) the use of statistical mechanical principles and methods to simulate, on massively parallel computers, the thermodynamic properties of chemical reactions in aqueous solution. The utility of this process is demonstrated by the calculation of the enthalpy of reaction in vacuum and free energy change in aqueous solution for a proton transfer involving methanol, methoxide, imidazole, and imidazolium, which are functional groups involved with proton transfers in many biochemical systems. An optimized semiempirical QM model is produced, which results in the calculation of heats of formation of the above chemical species to within 1.0 kcal/mol (1 kcal = 4.18 kJ) of experimental values. The use of the calibrated QM and microsolvation QM/MM (molecular mechanics) models for the simulation of a proton transfer in aqueous solution gives a calculated free energy that is within 1.0 kcal/mol (12.2 calculated vs. 12.8 experimental) of a value estimated from experimental pKa values of the reacting species.
Abstract:
An experiment is proposed in which it would be possible to produce quantum entanglement of photon beams of different frequencies, moving in the same direction, controlled by means of an external magnetic field. In this experiment, the interaction between the magnetic field and the photons is mediated by electrons, which interact both with the photons and with the external magnetic field. A theory describing the physical processes was developed. We derive information and Schmidt measures of entanglement for a general two-qubit system and the residual measure for a general three-qubit system. Using the information obtained from the analysis of two- and three-quasi-photon systems, we calculate entanglement measures. We created a program for numerical calculation in these cases, with which we plotted the dependence of the entanglement measures for beams of two and three photons. The results obtained allow us to see how the entanglement measure depends on the parameters characterizing the system in question. For example, if both photon polarizations coincide, no entanglement occurs; entanglement occurs only if the photon polarizations are opposite.
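The polarization dependence described in this abstract (a product state when both polarizations coincide, entanglement when they are opposite) can be sketched for a pure two-qubit state. This is an illustrative sketch, not the program mentioned in the abstract; the function names and the use of Schmidt coefficients via a singular value decomposition are our assumptions:

```python
import numpy as np

def schmidt_coefficients(state):
    """Schmidt coefficients of a pure two-qubit state, given as a 4-vector:
    the singular values of its 2x2 coefficient matrix."""
    c = np.asarray(state, dtype=complex).reshape(2, 2)
    return np.linalg.svd(c, compute_uv=False)

def entanglement_entropy(state):
    """Von Neumann entropy of the one-qubit reduced state, in bits
    (0 for a product state, 1 for a maximally entangled state)."""
    lam = schmidt_coefficients(state) ** 2
    lam = lam[lam > 1e-12]          # drop numerically zero weights
    return float(-np.sum(lam * np.log2(lam)))

# Coinciding polarizations -> product state |0>|0> -> no entanglement
product = np.kron([1, 0], [1, 0])
# Opposite polarizations in superposition -> Bell state (|01> + |10>)/sqrt(2)
bell = np.array([0, 1, 1, 0]) / np.sqrt(2)

assert entanglement_entropy(product) < 1e-9
assert abs(entanglement_entropy(bell) - 1.0) < 1e-9
```

The same reshape-and-SVD trick gives the Schmidt decomposition across any bipartition of a pure state.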
Abstract:
"Issued April 15, 1948."
Abstract:
"CODEN: XNBSAV."
Abstract:
Includes indexes.
Abstract:
v. 1. Molecular quantum mechanics and molecular electronic spectroscopy: early workers.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
A central feature in the Hilbert space formulation of classical mechanics is the quantisation of classical Liouville densities, leading to what may be termed Groenewold operators. We investigate the spectra of the Groenewold operators that correspond to Gaussian and to certain uniform Liouville densities. We show that when the classical coordinate-momentum uncertainty product falls below Heisenberg's limit, the Groenewold operators in the Gaussian case develop negative eigenvalues and eigenvalues larger than 1. In the uniform case, however, negative eigenvalues are shown to persist for arbitrarily large values of the classical uncertainty product.
Abstract:
Measuring the polarization of a single photon typically results in its destruction. We propose, demonstrate, and completely characterize a quantum nondemolition (QND) scheme for realizing such a measurement nondestructively. This scheme uses only linear optics and photodetection of ancillary modes to induce a strong nonlinearity at the single-photon level, nondeterministically. We vary this QND measurement continuously into the weak regime and use it to perform a nondestructive test of complementarity in quantum mechanics. Our scheme realizes the most advanced general measurement of a qubit to date: it is nondestructive, can be made in any basis, and with arbitrary strength.
Abstract:
The Einstein-Podolsky-Rosen paradox and quantum entanglement are at the heart of quantum mechanics. Here we show that single-pass traveling-wave second-harmonic generation can be used to demonstrate both entanglement and the paradox with continuous variables that are analogous to the position and momentum of the original proposal.