891 results for Path integral approach
Abstract:
The asymptotic safety scenario allows a consistent theory of quantized gravity to be defined within the framework of quantum field theory. The central conjecture of this scenario is the existence of a non-Gaussian fixed point of the theory's renormalization group flow, which makes it possible to formulate renormalization conditions that render the theory fully predictive. Investigations of this possibility use an exact functional renormalization group equation as the primary non-perturbative tool. This equation implements Wilsonian renormalization group transformations and can be shown to be a reformulation of the functional integral approach to quantum field theory.

As its main result, this thesis develops an algebraic algorithm for systematically constructing the renormalization group flow of gauge theories as well as gravity in arbitrary expansion schemes. In particular, it uses off-diagonal heat kernel techniques to handle efficiently the non-minimal differential operators which appear due to gauge symmetries. The central virtue of the algorithm is that no additional simplifications need to be employed, opening the possibility of more systematic investigations of the emergence of non-perturbative phenomena. As a by-product, several novel results on the heat kernel expansion of the Laplace operator acting on general gauge bundles are obtained.

The constructed algorithm is used to re-derive the renormalization group flow of gravity in the Einstein-Hilbert truncation, showing the manifest background independence of the results. The well-studied Einstein-Hilbert case is further advanced by taking into account the effect of a running ghost field renormalization on the gravitational coupling constants. A detailed numerical analysis reveals a further stabilization of the non-Gaussian fixed point.

Finally, the proposed algorithm is applied to higher derivative gravity including all curvature squared interactions. This improves on existing computations by taking the independent running of the Euler topological term into account. Known perturbative results are reproduced in this case from the renormalization group equation, while in addition a unique non-Gaussian fixed point is identified.
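For orientation, the exact functional renormalization group equation referred to here is usually written in the form of the Wetterich equation; the expression below is quoted as standard background in generic notation, not as a formula taken from the thesis:

\[
\partial_t \Gamma_k \;=\; \frac{1}{2}\,\mathrm{STr}\!\left[\left(\Gamma_k^{(2)} + \mathcal{R}_k\right)^{-1}\partial_t \mathcal{R}_k\right],
\qquad t = \ln k,
\]

where \Gamma_k^{(2)} denotes the second functional derivative of the effective average action and \mathcal{R}_k is an infrared regulator implementing the Wilsonian coarse graining.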
Abstract:
In this work, simulations of liquids at the molecular level were carried out using different multiscale techniques. These allow an effective description of the liquid that requires less computer time and can therefore capture phenomena on longer time and length scales.

A key ingredient is a simplified ("coarse-grained") model, which is obtained in a systematic procedure from simulations of the detailed model. In this procedure, selected properties of the detailed model (e.g. pair correlation function, pressure, etc.) are reproduced.

Algorithms were studied that allow a simultaneous coupling of the detailed and the simplified model ("Adaptive Resolution Scheme", AdResS). Here, the detailed model is used in a predefined subvolume of the liquid (e.g. near a surface), while the rest is described with the simplified model.

For this purpose, a method ("thermodynamic force") was developed to enable the coupling even when the models are in different thermodynamic states. In addition, a novel coupling algorithm (H-AdResS) was described, which formulates the coupling via a Hamiltonian. In this algorithm, a correction analogous to the thermodynamic force can be achieved with less computational effort.

As an application of these basic techniques, path integral molecular dynamics (MD) simulations of water were studied. With this method it is possible to include quantum mechanical effects of the nuclei (delocalization, zero-point energy) in the simulation. First, a multiscale technique ("force matching") was used to extract an effective interaction from a detailed simulation based on density functional theory. The path integral MD simulation improves the description of the intramolecular structure compared with experimental data. The model is also suitable for simultaneous coupling within a single simulation, in which a water molecule (described by 48 point particles in the path integral MD model) is coupled to a simplified model (one point particle). In this way, a water-vacuum interface could be simulated with only the surface described by the path integral model and the rest by the simplified model.
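As a point of reference, the force-based adaptive resolution coupling mentioned above is commonly written with a smooth weighting function w that switches between the atomistic (AA) and coarse-grained (CG) force fields; the formula below is the standard textbook form, with symbols chosen here rather than taken from the thesis:

\[
\mathbf{F}_{\alpha\beta} \;=\; w(X_\alpha)\,w(X_\beta)\,\mathbf{F}^{\mathrm{AA}}_{\alpha\beta}
\;+\;\bigl[1 - w(X_\alpha)\,w(X_\beta)\bigr]\,\mathbf{F}^{\mathrm{CG}}_{\alpha\beta},
\]

where X_\alpha is the position of molecule \alpha and w equals 1 in the atomistic subvolume, 0 in the coarse-grained region, and interpolates smoothly in between; H-AdResS instead interpolates the atomistic and coarse-grained potential energies with such a weighting function, which is what makes a Hamiltonian formulation possible.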
Abstract:
This thesis addresses the study of functional integrals in quantum mechanics, both as a reworking of the time evolution operator and by directly constructing a sum over paths. Ambiguities due to the discretization of the action are also highlighted, corresponding to the operator ordering problems of the canonical formulation. It is further described how a possible choice of the discretization of the functional integral can be obtained using the Weyl ordering of the Hamiltonian operator, exploiting the relation between the Weyl-ordered Hamiltonian and the midpoint prescription to be used in the discretization of the classical action. In particular, we study the case of a non-relativistic particle interacting with a scalar potential, a vector potential (magnetic field) and a tensor potential (metric).
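For illustration, the correspondence between Weyl ordering and the midpoint prescription can be summarized by the standard phase-space discretization below (generic background, not a formula extracted from the thesis):

\[
\langle x_f|\,e^{-\frac{i}{\hbar}\hat H T}\,|x_i\rangle
=\lim_{N\to\infty}\int\prod_{k=1}^{N-1}dx_k\prod_{k=1}^{N}\frac{dp_k}{2\pi\hbar}\,
\exp\!\left\{\frac{i}{\hbar}\sum_{k=1}^{N}\left[p_k\,(x_k-x_{k-1})-\epsilon\,H_W\!\left(p_k,\tfrac{x_k+x_{k-1}}{2}\right)\right]\right\},
\]

with \epsilon = T/N, x_0 = x_i, x_N = x_f, and H_W the Weyl symbol of the Hamiltonian evaluated at the midpoint of each time slice.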
Abstract:
We consider a large quantum system with spins 1/2 whose dynamics is driven entirely by measurements of the total spin of spin pairs. This gives rise to a dissipative coupling to the environment. When one averages over the measurement results, the corresponding real-time path integral does not suffer from a sign problem. Using an efficient cluster algorithm, we study the real-time evolution from an initial antiferromagnetic state of the two-dimensional Heisenberg model, which is driven to a disordered phase, not by a Hamiltonian, but by sporadic measurements or by continuous Lindblad evolution.
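For context, the continuous Lindblad evolution mentioned above has the standard form shown below; taking the jump operators L_i to be built from the measured pair-spin observables (and H = 0 for purely measurement-driven dynamics) is an assumption of this sketch, not a detail quoted from the paper:

\[
\dot\rho \;=\; -\frac{i}{\hbar}\,[H,\rho] \;+\; \sum_i \gamma_i\!\left(L_i\,\rho\,L_i^\dagger-\tfrac{1}{2}\bigl\{L_i^\dagger L_i,\rho\bigr\}\right).
\]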
Abstract:
Nowadays there is great expectation regarding the introduction of new tools and methods for software product development which, in the very near future, will allow an engineering approach to the software production process. The new methodologies that are just emerging imply an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the system construction process is very low and is focused on the last phases of the software life cycle, so that the cost reduction obtained is insignificant and, more importantly, the quality of the resulting software products is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented conforms to the CASE development cycle model, which consists of the analysis, design and testing phases, its field of application being information systems. First, the basic principles on which the CASE methodology rests are established. Second, since the methodology starts by fixing the objectives of the company requesting an information system, techniques are employed for gathering and validating the information, which at the same time provide an easy communication language between end users and developers. These same techniques also specify all the system requirements completely, consistently and unambiguously. Likewise, a set of techniques and algorithms is presented to obtain, from the system requirements specification, an automated logical design of both the Process Model and the Data Model, each validated against the previous requirements specification. Finally, formal procedures are defined that indicate the set of activities to be carried out in the construction process and how to perform them, thereby achieving integrity and completeness across the different stages of the development process.
Abstract:
Accurate quantum mechanical simulations of the primary charge transfer in photosynthetic reaction centers are reported. The process is modeled by three coupled electronic states corresponding to the photoexcited chlorophyll special pair (donor), the reduced bacteriopheophytin (acceptor), and the reduced accessory chlorophyll (bridge) that interact with a dissipative medium of protein and solvent degrees of freedom. The time evolution of the excited special pair is followed over 17 ps by using a fully quantum mechanical path integral scheme. We find that a free energy of the reduced accessory chlorophyll state approximately 400 cm⁻¹ lower than that of the excited special pair state yields state populations in agreement with experimental results on wild-type and modified reaction centers. For this energetic configuration, electron transfer is a two-step process.
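Schematically, a donor-bridge-acceptor model of the kind described here can be written as three electronic states bilinearly coupled to a harmonic bath; the notation and the form of the coupling below are illustrative choices, not parameters taken from the paper:

\[
H=\sum_{a\in\{D,B,A\}}\varepsilon_a\,|a\rangle\langle a|
+\sum_{a\neq b}V_{ab}\,|a\rangle\langle b|
+\sum_j\left(\frac{p_j^{2}}{2m_j}+\frac{1}{2}m_j\omega_j^{2}x_j^{2}\right)
+\sum_{a}|a\rangle\langle a|\sum_j c_{aj}\,x_j ,
\]

where the bath oscillators and couplings c_{aj} encode the protein and solvent degrees of freedom through a spectral density.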
Abstract:
In this work we review the basic principles of the theory of the relativistic bosonic string through the study of the action functionals of Nambu-Goto and Polyakov and the techniques required for their canonical, light-cone, and path-integral quantisation. For this purpose, we briefly review the main properties of the gauge symmetries and conformal field theory involved in the techniques studied.
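The two action functionals named above take their conventional forms, reproduced here for reference in standard notation (string tension T, worldsheet metric h_{ab}, embedding fields X^\mu):

\[
S_{\mathrm{NG}}=-T\int d^{2}\sigma\,\sqrt{-\det\!\left(\partial_a X^{\mu}\,\partial_b X_{\mu}\right)},
\qquad
S_{\mathrm{P}}=-\frac{T}{2}\int d^{2}\sigma\,\sqrt{-h}\,h^{ab}\,\partial_a X^{\mu}\,\partial_b X_{\mu},
\]

and the classical equivalence of the two, obtained by eliminating h_{ab} through its equation of motion, underlies the canonical, light-cone and path-integral quantisations discussed in the text.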
Abstract:
Many hearing problems go unnoticed by parents and teachers. This harms the child's learning, particularly in the school environment. For this reason, hearing screening programs can be used to detect and subsequently diagnose schoolchildren, so that the impact of possible hearing sequelae on the child's school performance can be prevented or minimized. Today there are programs that allow better monitoring of populations that need preventive and curative care, and hearing is a very important aspect that can be assessed when these programs are put into practice. The National Program for the Reorientation of Professional Training in Health (Pró-Saúde), which aimed to reorient professional training, had the objective of integrating teaching and service and promoting primary care through an integral approach to the health-disease process. External settings can be used by university students and teachers to put into practice actions that promote the humanization and comprehensiveness of health care, through the articulation of preventive and curative, individual and collective health actions and services. The school is considered one of the environments in which this work can be carried out. The School Health Program (PSE) opens up the school environment with the purpose of contributing to the integral education of students in the public basic education network through prevention, health promotion and health care actions. This is a retrospective cross-sectional study whose main objective was to characterize the audiological profile of students from a public school in the municipality of Bauru, SP, relying on the integration of health and education professionals in the school environment, on the basis of the programs mentioned above. The hearing screening was carried out using the following procedures: immittance testing, visual inspection of the external auditory canal, distortion product otoacoustic emissions and pure tone audiometry. Of the 652 students, the great majority (97.1%) of the participants, aged between 10 and 18 years, had normal hearing. In 2.9% of this population some temporary hearing alteration was found, with the exception of a single participant, who had a sensorineural hearing loss. Although we found many children and adolescents with normal hearing, what most underlines the importance of this work is the need for hearing screening in school environments and, above all, for follow-up in this age group, since studies on it are scarce. Although the few hearing alterations found were temporary, it is precisely these that interfere with good school performance, among other factors.
Abstract:
When applying multivariate analysis techniques in information systems and social science disciplines, such as management information systems (MIS) and marketing, the assumption that the empirical data originate from a single homogeneous population is often unrealistic. When applying a causal modeling approach, such as partial least squares (PLS) path modeling, segmentation is a key issue in coping with the problem of heterogeneity in estimated cause-and-effect relationships. This chapter presents a new PLS path modeling approach which classifies units on the basis of the heterogeneity of the estimates in the inner model. If unobserved heterogeneity significantly affects the estimated path model relationships at the aggregate data level, the methodology allows homogeneous groups of observations to be created that exhibit distinctive path model estimates. The approach thus provides differentiated analytical outcomes that permit more precise interpretations of each segment formed. An application to a large data set from an American customer satisfaction index (ACSI) example substantiates the methodology’s effectiveness in evaluating PLS path modeling results.
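Purely as an illustration of the underlying idea of response-based segmentation (this is not the chapter's procedure, and the one-path inner model, the synthetic data and the two-segment setup below are assumptions of this sketch), one can alternate between estimating a segment-specific path coefficient and reassigning each observation to the segment whose coefficient fits it best:

# Illustration only: cluster-wise regression on a single inner-model path
# (e.g. satisfaction -> loyalty). NOT the chapter's method; all names and the
# synthetic two-segment data are assumptions made for this demo.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic latent scores with two hidden segments of different path strength.
n = 400
x = rng.normal(size=n)                        # exogenous latent variable scores
true_seg = rng.integers(0, 2, size=n)         # unobserved heterogeneity
beta_true = np.where(true_seg == 0, 0.9, 0.2)
y = beta_true * x + 0.3 * rng.normal(size=n)  # endogenous latent variable scores

# Start from the pooled slope, perturbed to break symmetry, then alternate
# between reassigning units and re-estimating segment-specific slopes.
pooled = x @ y / (x @ x)
betas = np.array([pooled - 0.3, pooled + 0.3])
for _ in range(20):
    resid = np.abs(y[:, None] - x[:, None] * betas[None, :])
    labels = resid.argmin(axis=1)             # each unit joins its best-fitting segment
    for k in range(2):
        m = labels == k
        if m.any():
            betas[k] = x[m] @ y[m] / (x[m] @ x[m])

print("segment-specific path coefficients:", np.round(betas, 2))

On such data the two recovered slopes end up close to the distinct segment-level coefficients, which is the kind of differentiated outcome the segmentation approach aims at.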
Abstract:
In this study, we investigate the problem of reconstructing a stationary temperature field from given temperature and heat flux on a part of the boundary of a semi-infinite region containing an inclusion. This situation can be modelled as a Cauchy problem for the Laplace operator, and it is an ill-posed problem in the sense of Hadamard. We propose and investigate a Landweber-Fridman type iterative method, which preserves the (stationary) heat operator, for the stable reconstruction of the temperature field on the boundary of the inclusion. In each iteration step, mixed boundary value problems for the Laplace operator are solved in the semi-infinite region. Well-posedness of these problems is investigated and convergence of the procedure is discussed. For the numerical implementation of these mixed problems an efficient boundary integral method is proposed, based on the indirect variant of the boundary integral approach. Using this approach, the mixed problems are reduced to integral equations over the (bounded) boundary of the inclusion. Numerical examples are included, showing that stable and accurate reconstructions of the temperature field on the boundary of the inclusion can be obtained even in the case of noisy data. These results are compared with those obtained with the alternating iterative method.
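For orientation, a Landweber type iteration for an ill-posed operator equation K\varphi = f has the generic form below; in the method described above each application of K and of its adjoint amounts to solving a mixed boundary value problem in the semi-infinite region, and the notation here is generic rather than quoted from the paper:

\[
\varphi_{k+1}=\varphi_k+\gamma\,K^{*}\!\left(f-K\varphi_k\right),
\qquad 0<\gamma<\frac{2}{\|K\|^{2}},
\]

with the iteration stopped by a discrepancy principle when the data are noisy.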
Abstract:
We find the probability distribution of the fluctuating parameters of a soliton propagating through a medium with additive noise. Our method is a modification of the instanton formalism (method of optimal fluctuation) based on a saddle-point approximation in the path integral. We first solve consistently a fundamental problem of soliton propagation within the framework of the noisy nonlinear Schrödinger equation. We then consider model modifications due to in-line (filtering, amplitude and phase modulation) control. It is examined how control elements change the error probability in optical soliton transmission. Even though weak noise is considered, we are interested here in probabilities of error-causing large fluctuations which are beyond perturbation theory. We describe in detail a new phenomenon of soliton collapse that occurs under the combined action of noise, filtering and amplitude modulation.
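Schematically (in generic notation, not the paper's), the optimal-fluctuation calculation starts from the noise-driven nonlinear Schrödinger equation and estimates the probability of a large parameter fluctuation from the noise action evaluated at its saddle point:

\[
i\,\partial_z\psi+\tfrac{1}{2}\,\partial_t^{2}\psi+|\psi|^{2}\psi=\eta(z,t),
\qquad
\mathcal{P}\;\sim\;\exp\!\left(-\frac{1}{D}\,\min_{\eta}\int dz\,dt\,|\eta|^{2}\right),
\]

where D is the noise intensity, the minimum is taken over noise realizations that produce the prescribed change of the soliton parameters, and convention-dependent numerical factors in the exponent are suppressed.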
Abstract:
We propose a modification of the nonlinear digital signal processing technique based on nonlinear inverse synthesis for systems with distributed Raman amplification. The proposed path-average approach offers a 3 dB performance gain, regardless of the signal power profile.
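As background (in generic notation, not the paper's), path averaging replaces the z-dependent nonlinearity of the amplified link by its mean, yielding an effective lossless model on which nonlinear inverse synthesis can operate:

\[
i\,\partial_z q+\tfrac{1}{2}\,\partial_t^{2}q+\gamma\,f(z)\,|q|^{2}q=0
\;\;\longrightarrow\;\;
i\,\partial_z q+\tfrac{1}{2}\,\partial_t^{2}q+\gamma_{\mathrm{eff}}\,|q|^{2}q=0,
\qquad
\gamma_{\mathrm{eff}}=\frac{\gamma}{L}\int_0^{L}f(z)\,dz,
\]

where f(z) is the normalized signal power profile set by the amplification scheme.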
Abstract:
We investigate the theoretical and numerical computation of rare transitions in simple geophysical turbulent models. We consider the barotropic quasi-geostrophic and two-dimensional Navier–Stokes equations in regimes where bistability between two coexisting large-scale attractors exists. By means of large deviations and instanton theory with the use of an Onsager–Machlup path integral formalism for the transition probability, we show how one can directly compute the most probable transition path between two coexisting attractors analytically in an equilibrium (Langevin) framework and numerically otherwise. We adapt a class of numerical optimization algorithms known as minimum action methods to simple geophysical turbulent models. We show that by numerically minimizing an appropriate action functional in a large deviation limit, one can predict the most likely transition path for a rare transition between two states. By considering examples where theoretical predictions can be made, we show that the minimum action method successfully predicts the most likely transition path. Finally, we discuss the application and extension of such numerical optimization schemes to the computation of rare transitions observed in direct numerical simulations and experiments and to other, more complex, turbulent systems.
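As a concrete toy illustration of the minimum action idea only (the double-well drift, the time horizon and the grid size below are assumptions of this sketch, not the quasi-geostrophic or Navier–Stokes setting of the paper), one can discretize the Freidlin-Wentzell action for a one-dimensional bistable Langevin model and minimize it over paths whose endpoints are pinned at the two attractors:

# Minimal minimum-action sketch for a 1D double-well Langevin model
# dx = b(x) dt + sqrt(2*eps) dW with b(x) = x - x**3 (attractors at x = -1, +1).
import numpy as np
from scipy.optimize import minimize

def b(x):
    # drift of the toy gradient system (double well V(x) = x**4/4 - x**2/2)
    return x - x**3

def action(interior, x_start, x_end, T, n_pts):
    # discretized Freidlin-Wentzell action S[x] = 1/2 * int |xdot - b(x)|^2 dt
    path = np.concatenate(([x_start], interior, [x_end]))
    dt = T / (n_pts - 1)
    xdot = np.diff(path) / dt
    xmid = 0.5 * (path[:-1] + path[1:])          # midpoint rule for the drift
    return 0.5 * dt * np.sum((xdot - b(xmid)) ** 2)

n_pts, T = 101, 20.0
x_start, x_end = -1.0, 1.0                        # the two coexisting attractors
init = np.linspace(x_start, x_end, n_pts)[1:-1]   # straight-line initial guess

res = minimize(action, init, args=(x_start, x_end, T, n_pts), method="L-BFGS-B")
instanton = np.concatenate(([x_start], res.x, [x_end]))
# With this normalization the minimum approaches 2*(V(0) - V(-1)) = 0.5 as T grows.
print("minimal action:", res.fun)

The minimizer climbs against the drift from x = -1 to the saddle at x = 0 and then relaxes along the drift to x = +1, with the downhill half contributing essentially no action.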
Abstract:
We investigate a class of simple models for Langevin dynamics of turbulent flows, including the one-layer quasi-geostrophic equation and the two-dimensional Euler equations. Starting from a path integral representation of the transition probability, we compute the most probable fluctuation paths from one attractor to any state within its basin of attraction. We prove that such fluctuation paths are the time reversed trajectories of the relaxation paths for a corresponding dual dynamics, which are also within the framework of quasi-geostrophic Langevin dynamics. Cases with or without detailed balance are studied. We discuss a specific example for which the stationary measure displays either a second order (continuous) or a first order (discontinuous) phase transition and a tricritical point. In situations where a first order phase transition is observed, the dynamics are bistable. Then, the transition paths between two coexisting attractors are instantons (fluctuation paths from an attractor to a saddle), which are related to the relaxation paths of the corresponding dual dynamics. For this example, we show how one can analytically determine the instantons and compute the transition probabilities for rare transitions between two attractors.
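The path integral representation referred to above is, to leading order in the noise amplitude, of Onsager-Machlup form; the convention below (additive noise of amplitude \sqrt{2\epsilon} and drift b) is chosen for this sketch:

\[
\mathbb{P}\bigl[x(\cdot)\bigr]\;\propto\;\exp\!\left(-\frac{1}{4\epsilon}\int_0^{T}\bigl\|\dot x-b(x)\bigr\|^{2}\,dt\right),
\]

so the most probable fluctuation path out of an attractor minimizes this action with fixed endpoints, while relaxation paths with \dot x = b(x) carry zero action; the time-reversal statement in the abstract relates these minimizers to the relaxation paths of the dual dynamics.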
Abstract:
In Marxist frameworks “distributive justice” depends on extracting value through a centralized state. Many new social movements—peer to peer economy, maker activism, community agriculture, queer ecology, etc.—take the opposite approach, keeping value in its unalienated form and allowing it to freely circulate from the bottom up. Unlike Marxism, there is no general theory for bottom-up, unalienated value circulation. This paper examines the concept of “generative justice” through an historical contrast between Marx’s writings and the indigenous cultures that he drew upon. Marx erroneously concluded that while indigenous cultures had unalienated forms of production, only centralized value extraction could allow the productivity needed for a high quality of life. To the contrary, indigenous cultures now provide a robust model for the “gift economy” that underpins open source technological production, agroecology, and restorative approaches to civil rights. Expanding Marx’s concept of unalienated labor value to include unalienated ecological (nonhuman) value, as well as the domain of freedom in speech, sexual orientation, spirituality and other forms of “expressive” value, we arrive at an historically informed perspective for generative justice.