927 results for Equivalence Proof
Abstract:
We critically examine some recent claims that certain field theories with and without boson kinetic energy terms are equivalent. We point out that the crucial element in these claims is the finiteness or otherwise of the boson wavefunction renormalisation constant. We show that when this constant is finite, the equivalence proof offered in the literature fails in a direct way. When the constant is divergent, the claimed equivalence is only a consequence of improper use of divergent quantities.
Abstract:
In this paper, the dynamic antiplane problem is investigated by means of the boundary integral equation method together with the techniques of the Green's fundamental solution and singularity analysis. The problem is reduced to solving a Cauchy singular integral equation in the Laplace transform space. This equation is rigorously proved to be equivalent to the dual integral equations obtained by Sih [Mechanics of Fracture, Vol. 4. Noordhoff, Leyden (1977)]. On this basis, the dynamic interaction between two parallel cracks is also investigated. Using a high-precision numerical method for the singular integral equation and numerical Laplace inversion, the dynamic stress intensity factors of several typical problems are calculated. The numerical results are shown to be consistent with those of Sih, indicating that the present method is successful and can be applied to more complicated problems. Copyright (C) 1996 Elsevier Science Ltd
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Starting from the generating functional for the Green functions (GF), constructed from the Lagrangian action in the Duffin-Kemmer-Petiau (DKP) theory (L-approach), we rigorously prove that the physical matrix elements of the S-matrix in the DKP and Klein-Gordon-Fock (KGF) theories coincide for spin-0 particles interacting with external and quantized Maxwell and Yang-Mills fields, and in the case of an external gravitational field (with or without torsion). For the proof we use the reduction formulas of Lehmann, Symanzik and Zimmermann (LSZ). We also prove that the many-photon and many-Yang-Mills-particle Green functions coincide in both theories. (C) 2000 Elsevier B.V. All rights reserved.
Abstract:
Bana et al. proposed the formal indistinguishability relation (FIR), an equivalence between two terms built from an abstract algebra. Later, Ene et al. extended it to cover active adversaries and random oracles. This notion provides a framework for verifying computational indistinguishability while still offering the simplicity and formality of symbolic methods. We are in the process of building an automated tool for checking FIR between two terms. First, we extend the work of Ene et al. further, by covering ordered sorts and simplifying the treatment of random oracles. Second, we investigate the possibility of combining algebras, since this makes the tool scalable and able to cover a wide class of cryptographic schemes. In particular, we show that the combined algebra is still computationally sound, as long as each component algebra is sound. Third, we design proving strategies and implement the tool. The strategies allow us to find a sequence of intermediate terms, each formally indistinguishable from the next, between two given terms; FIR between the two given terms is then guaranteed by the transitivity of FIR. Finally, we show applications of the work, e.g. to key exchanges and encryption schemes. In the future, the tool should be easily extensible to many more schemes. This work continues our previous research on the use of compilers to aid automated proofs for key exchange.
Abstract:
This paper presents a mechanically verified implementation of an algorithm for deciding the equivalence of Kleene algebra terms within the Coq proof assistant. The algorithm decides equivalence of two given regular expressions through an iterated process of testing the equivalence of their partial derivatives and does not require the construction of the corresponding automata. Recent theoretical and experimental research provides evidence that this method is, on average, more efficient than the classical methods based on automata. We present some performance tests, comparisons with similar approaches, and also introduce a generalization of the algorithm to decide the equivalence of terms of Kleene algebra with tests. The motivation for the work presented in this paper is to use the developed libraries as trusted frameworks for carrying out certified program verification.
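The partial-derivative method this abstract describes can be sketched in a few lines. The following is an illustrative Python sketch, not the certified Coq development: the tuple encoding of regular expressions, the function names, and the worklist formulation are our own choices. It iterates Antimirov partial derivatives of the two expressions and compares emptiness-of-word (nullability) at every reachable pair, never building the automata explicitly.

```python
# Regular expressions as nested tuples (illustrative encoding):
# ('0',)          empty language        ('1',)          empty word
# ('sym', a)      single symbol         ('alt', r, s)   union
# ('cat', r, s)   concatenation         ('star', r)     Kleene star

def nullable(r):
    """Does the language of r contain the empty word?"""
    tag = r[0]
    if tag in ('0', 'sym'):
        return False
    if tag in ('1', 'star'):
        return True
    if tag == 'alt':
        return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])      # 'cat'

def pderiv(r, a):
    """Antimirov partial derivatives of r w.r.t. symbol a (a set of regexes)."""
    tag = r[0]
    if tag in ('0', '1'):
        return set()
    if tag == 'sym':
        return {('1',)} if r[1] == a else set()
    if tag == 'alt':
        return pderiv(r[1], a) | pderiv(r[2], a)
    if tag == 'cat':
        d = {('cat', p, r[2]) for p in pderiv(r[1], a)}
        if nullable(r[1]):
            d |= pderiv(r[2], a)
        return d
    return {('cat', p, r) for p in pderiv(r[1], a)}  # 'star'

def equivalent(r, s, alphabet):
    """Decide L(r) == L(s) by iterated comparison of partial-derivative sets.

    Terminates because each regex has only finitely many partial
    derivatives, so only finitely many pairs of sets can be reached.
    """
    def null_set(S):
        return any(nullable(x) for x in S)

    def deriv_set(S, a):
        return frozenset(y for x in S for y in pderiv(x, a))

    todo, seen = [(frozenset([r]), frozenset([s]))], set()
    while todo:
        S, T = todo.pop()
        if (S, T) in seen:
            continue
        seen.add((S, T))
        if null_set(S) != null_set(T):
            return False        # a distinguishing word has been found
        for a in alphabet:
            todo.append((deriv_set(S, a), deriv_set(T, a)))
    return True

# Example: (a+b)* and (a*b*)* denote the same language over {a, b}.
r = ('star', ('alt', ('sym', 'a'), ('sym', 'b')))
s = ('star', ('cat', ('star', ('sym', 'a')), ('star', ('sym', 'b'))))
print(equivalent(r, s, 'ab'))   # -> True
```

The `seen` set is exactly the certificate the method produces: on success it is a bisimulation between the derivative sets of the two expressions.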
Abstract:
In this paper we investigate the classification of mappings up to K-equivalence. We give several results of this type. We study semialgebraic deformations up to semialgebraic C^0-K-equivalence and bi-Lipschitz K-equivalence. We give an algebraic criterion for bi-Lipschitz K-triviality in terms of the semi-integral closure (Theorem 3.5). We also give a new proof of a result of Nishimura: we show that two germs of smooth mappings f, g : R^n -> R^n, finitely determined with respect to K-equivalence, are C^0-K-equivalent if and only if they have the same degree in absolute value.
Abstract:
We prove the equivalence of many-gluon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. The proof is based on the functional-integral formulation of the statistical generating functional in a finite-temperature quantum field theory. As an illustration, we calculate the one-loop polarization operators in both theories and show that their expressions indeed coincide.
Abstract:
A strict proof of the equivalence of the Duffin-Kemmer-Petiau and Klein-Gordon-Fock theories is presented for physical S-matrix elements in the case of charged scalar particles minimally interacting with an external or quantized electromagnetic field. The Hamiltonian canonical approach to the Duffin-Kemmer-Petiau theory is first developed in both the component and the matrix form. The theory is then quantized through the construction of the generating functional for the Green's functions, and the physical matrix elements of the S-matrix are proved to be relativistic invariants. The equivalence of the two theories is then proved for the matrix elements of the scattered scalar particles, using the reduction formulas of Lehmann, Symanzik, and Zimmermann, and for the many-photon Green's functions.
Abstract:
Coinduction is a proof rule, the dual of induction. It allows reasoning about non-well-founded structures such as lazy lists or streams and is of particular use for reasoning about equivalences. A central difficulty in the automation of coinductive proof is the choice of a relation (called a bisimulation). We present an automation of coinductive theorem proving based on the idea of proof planning. Proof planning constructs the higher-level steps in a proof, using knowledge of the general structure of a family of proofs and exploiting this knowledge to control the proof search. Part of proof planning involves the use of failure information to modify the plan through a proof critic, which exploits the information gained from a failed proof attempt. Our approach was to develop a strategy that makes an initial simple guess at a bisimulation and then uses generalisation techniques, motivated by a critic, to refine this guess, so that a larger class of coinductive problems can be verified automatically. The implementation of this strategy has focused on the use of coinduction to prove the equivalence of programs in a small lazy functional language similar to Haskell. We have developed a proof plan for coinduction and an associated critic. These have been implemented in CoClam, an extended version of Clam, with encouraging results: the planner has been successfully tested on a number of theorems.
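To make the notion of a bisimulation concrete (this is only a toy illustration of the underlying proof principle, not the CoClam planner), consider streams presented as finite coalgebras: each state maps to its head and its tail state. Two states produce the same infinite stream exactly when some bisimulation relates them, and for finite state spaces the relation can be found by exploring reachable pairs. The state names below are invented for the example.

```python
def bisimilar(coalg, x, y):
    """Decide whether states x and y of a finite stream coalgebra are bisimilar.

    coalg maps each state to (head, tail_state).  Two states unfold to the
    same infinite stream iff no reachable pair of states disagrees on its
    head; on success, the visited set `seen` is itself a bisimulation.
    """
    seen, todo = set(), [(x, y)]
    while todo:
        p, q = todo.pop()
        if (p, q) in seen:
            continue                     # pair already in the candidate relation
        seen.add((p, q))
        hp, tp = coalg[p]
        hq, tq = coalg[q]
        if hp != hq:
            return False                 # heads differ: streams are distinct
        todo.append((tp, tq))            # tails must be related too
    return True

# 'ones' emits 1,1,1,... directly; 'blink' emits the same stream via two
# states; 'alt' emits 0,1,0,1,...  (all names are illustrative).
coalg = {
    'ones':   (1, 'ones'),
    'blink':  (1, 'blink2'),
    'blink2': (1, 'blink'),
    'alt':    (0, 'alt2'),
    'alt2':   (1, 'alt'),
}
print(bisimilar(coalg, 'ones', 'blink'))  # -> True
print(bisimilar(coalg, 'ones', 'alt'))    # -> False
```

The difficulty the abstract describes arises because, for programs rather than finite coalgebras, the right relation cannot be found by exhaustive search; the planner's critic refines an initial guess instead.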
Abstract:
Generalising arithmetic structures is seen as a key to developing algebraic understanding. Many adolescent students begin secondary school with a poor understanding of the structure of arithmetic. This paper presents a theory for a teaching/learning trajectory designed to build mathematical understanding and abstraction in the elementary school context. The particular focus is on the use of models and representations to construct an understanding of equivalence. The results of a longitudinal intervention study with five elementary schools, following 220 students as they progressed from Year 2 to Year 6, informed the development of this theory. Data were gathered from multiple sources including interviews, videos of classroom teaching, and pre- and post-tests. Data reduction resulted in the development of nine conjectures representing a growth in integration of models and representations. These conjectures formed the basis of the theory.
Abstract:
The purpose of this proof-of-concept study was to determine the relevance of direct measurements for monitoring the load applied on the osseointegrated fixation of transfemoral amputees during static load-bearing exercises. The objectives were (A) to introduce an apparatus using a three-dimensional load transducer, (B) to present a range of derived information relevant to clinicians, (C) to report on the outcomes of a pilot study and (D) to compare the measurements from the transducer with those from the current method using a weighing scale. One transfemoral amputee fitted with an osseointegrated implant was asked to apply 10 kg, 20 kg, 40 kg and 80 kg on the fixation, using self-monitoring with the weighing scale. The loading was directly measured with a portable kinetic system including a six-channel transducer, external interface circuitry and a laptop. As the prescribed load increased from 10 kg to 80 kg, the forces and moments applied on and around the antero-posterior axis increased 4-fold anteriorly and 14-fold medially, respectively. The forces and moments applied on and around the medio-lateral axis increased 9-fold laterally and 16-fold from anterior to posterior, respectively. The long axis of the fixation was overloaded and underloaded in 17% and 83% of the trials, respectively, by up to ±10%. This proof-of-concept study presents an apparatus that can be used by clinicians facing the challenge of improving basic knowledge of osseointegration, for the design of equipment for load-bearing exercises and for rehabilitation programs.