959 results for Constructive proof
Abstract:
A PhD dissertation presented as part of the requirements for the Degree of Doctor of Philosophy from the NOVA School of Business and Economics
Abstract:
I prove that, as long as we allow the marginal utility for money (lambda) to vary between purchases (similarly to the budget), the quasi-linear and the ordinal budget-constrained models rationalize the same data. However, we know that lambda is approximately constant. I provide a simple constructive proof of the necessary and sufficient condition for constant-lambda rationalization, which I argue should replace the Generalized Axiom of Revealed Preference in empirical studies of consumer behavior. 'Go Cardinals!' It is the minimal requirement of any scientific theory that it be consistent with the data it is trying to explain. In the case of (Hicksian) consumer theory it was revealed preference, introduced by Samuelson (1938, 1948), that provided an empirical test to satisfy this need. At that time most economic reasoning was done in terms of a competitive general equilibrium, a concept abstract enough that it can be built on ordinal preferences over baskets of goods, even if the extremely specialized ones of Arrow and Debreu. However, starting in the sixties, economics has moved beyond the 'invisible hand' explanation of how markets, even competitive ones, operate. A seemingly unavoidable step of this 'revolution' was that, ever since, most economic research has been carried out in a partial equilibrium context. Now, the partial equilibrium approach does not mean that the rest of the markets are ignored, but rather that they are held constant. In other words, there is a special commodity, call it money, that reflects the trade-offs of moving purchasing power across markets. As a result, the basic building block of consumer behavior in partial equilibrium is no longer the consumer's preferences over goods, but rather her valuation of them in terms of money. This new paradigm necessitates a new theory of revealed preference.
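For orientation, the two models being compared can be written in standard revealed-preference notation (the notation below is generic and is not taken from the dissertation): the data are a finite set of price-bundle observations $(p^t, x^t)$, $t = 1,\dots,T$.
\[
\text{ordinal budget-constrained model:}\quad x^t \in \arg\max_{x}\,\{\, u(x) \;:\; p^t \cdot x \le p^t \cdot x^t \,\},
\]
\[
\text{quasi-linear model:}\quad x^t \in \arg\max_{x}\,\{\, u(x) - \lambda^t\, p^t \cdot x \,\}.
\]
Constant-lambda rationalization requires a single $\lambda^t = \lambda$ for all observations; the abstract's claim is that when $\lambda^t$ is allowed to vary across purchases, the quasi-linear model and the ordinal budget-constrained model rationalize exactly the same data sets.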
Abstract:
In this paper we describe our system for automatically extracting "correct" programs from proofs using a development of the Curry-Howard process. Although program extraction has been developed by many authors, our system has a number of novel features designed to make it very easy to use and as close as possible to ordinary mathematical terminology and practice. These features include: 1. the use of Henkin's technique to reduce higher-order logic to many-sorted (first-order) logic; 2. the free use of new rules for induction subject to certain conditions; 3. the extensive use of previously programmed (total, recursive) functions; 4. the use of templates to make the reasoning much closer to normal mathematical proofs; and 5. a conceptual distinction between the computational type theory (for representing programs) and the logical type theory (for reasoning about programs). As an example of our system we give a constructive proof of the well-known theorem that every graph of even parity which is non-trivial, in the sense that it does not consist solely of isolated vertices, has a cycle. Given such a graph as input, the extracted program produces a cycle as promised.
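For illustration, the constructive content of this theorem can be sketched as follows (a minimal Python sketch of the standard argument, assuming a simple graph given as an adjacency dictionary; it is not the program extracted by the system described above): walk along edges without returning along the edge just used; since every non-isolated vertex has even degree at least 2, the walk can always continue, so some vertex must eventually repeat, and the portion of the walk between the two visits is the promised cycle.

    def find_cycle(adj):
        # adj: dict mapping each vertex to a list/set of its neighbours;
        # every vertex has even degree and at least one edge is present.
        start = next(v for v, nbrs in adj.items() if nbrs)   # any non-isolated vertex
        path, index = [start], {start: 0}
        prev, current = None, start
        while True:
            # Even degree >= 2 guarantees a neighbour other than the one we came from.
            nxt = next(u for u in adj[current] if u != prev)
            if nxt in index:
                return path[index[nxt]:] + [nxt]             # closed walk = cycle
            index[nxt] = len(path)
            path.append(nxt)
            prev, current = current, nxt

For example, find_cycle({1: [2, 3], 2: [1, 3], 3: [1, 2]}) returns the triangle [1, 2, 3, 1].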
Abstract:
In this paper we describe a new protocol that we call the Curry-Howard protocol between a theory and the programs extracted from it. This protocol leads to the expansion of the theory and the production of more powerful programs. The methodology we use for automatically extracting “correct” programs from proofs is a development of the well-known Curry-Howard process. Program extraction has been developed by many authors, but our presentation is ultimately aimed at a practical, usable system and has a number of novel features. These include 1. a very simple and natural mimicking of ordinary mathematical practice and likewise the use of established computer programs when we obtain programs from formal proofs, and 2. a conceptual distinction between programs on the one hand, and proofs of theorems that yield programs on the other. An implementation of our methodology is the Fred system. As an example of our protocol we describe a constructive proof of the well-known theorem that every graph of even parity can be decomposed into a list of disjoint cycles. Given such a graph as input, the extracted program produces a list of the (non-trivial) disjoint cycles as promised.
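The stronger decomposition statement has an equally direct constructive reading (again a hedged Python sketch of the standard argument, not the program produced by the Fred system): repeatedly extract one cycle by the walk used in the previous sketch and delete its edges; removing a cycle keeps every degree even, so the process can continue until no edges remain, and the collected cycles are edge-disjoint.

    def cycle_decomposition(adj):
        # adj: dict mapping each vertex to a *set* of its neighbours, every degree even.
        # Consumes adj destructively; returns a list of edge-disjoint cycles, each a
        # vertex list whose first and last entries coincide.
        def pop_cycle():
            # Same walk as in the previous sketch: even degrees let us always move on,
            # so some vertex repeats and closes a cycle.
            start = next(v for v, nbrs in adj.items() if nbrs)
            path, index, prev, current = [start], {start: 0}, None, start
            while True:
                nxt = next(u for u in adj[current] if u != prev)
                if nxt in index:
                    return path[index[nxt]:] + [nxt]
                index[nxt] = len(path)
                path.append(nxt)
                prev, current = current, nxt

        cycles = []
        while any(adj.values()):                      # some edge still unused
            cycle = pop_cycle()
            cycles.append(cycle)
            for a, b in zip(cycle, cycle[1:]):        # delete the cycle's edges;
                adj[a].discard(b)                     # all degrees stay even
                adj[b].discard(a)
        return cycles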
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving accessibility through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colour theorem; • our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the contents of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC.
In the second part of the manuscript many of these issues are analysed from the viewpoint of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user-driven.
Abstract:
We consider Sklyanin algebras $S$ with 3 generators, which are quadratic algebras over a field $\mathbb{K}$ with $3$ generators $x,y,z$ given by $3$ relations $pxy+qyx+rzz=0$, $pyz+qzy+rxx=0$ and $pzx+qxz+ryy=0$, where $p,q,r\in\mathbb{K}$. This class of algebras has enjoyed much attention. In particular, using tools from algebraic geometry, Feigin and Odesskii \cite{odf}, and Artin, Tate and Van den Bergh, showed that if at least two of the parameters $p$, $q$ and $r$ are non-zero and at least two of the three numbers $p^3$, $q^3$ and $r^3$ are distinct, then $S$ is Artin--Schelter regular. More specifically, $S$ is Koszul and has the same Hilbert series as the algebra of commutative polynomials in 3 indeterminates (PHS). It has become commonly accepted that it is impossible to achieve the same objective by purely algebraic and combinatorial means such as the Groebner basis technique. The main purpose of this paper is to trace the combinatorial meaning of the properties of Sklyanin algebras, such as Koszulity, PBW, PHS and Calabi-Yau, and to give a new constructive proof of the above facts due to Artin, Tate and Van den Bergh. Further, we study a wider class of Sklyanin algebras, namely the situation when all parameters of the relations may be different. We call them generalized Sklyanin algebras. We classify, up to isomorphism, all generalized Sklyanin algebras with the same Hilbert series as commutative polynomials in 3 variables. We show that generalized Sklyanin algebras in general position have a Golod–Shafarevich Hilbert series (with the exception of the case of the field with two elements).
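For reference, the Hilbert series referred to above as PHS is that of the commutative polynomial ring in three variables (a standard fact, stated here for orientation rather than quoted from the paper):
\[
H_{\mathbb{K}[x,y,z]}(t) \;=\; \frac{1}{(1-t)^{3}} \;=\; 1 + 3t + 6t^{2} + 10t^{3} + \cdots,
\qquad \dim_{\mathbb{K}} \mathbb{K}[x,y,z]_{n} = \binom{n+2}{2},
\]
so the Artin-Tate-Van den Bergh result says that, for parameters in the stated range, the Sklyanin algebra $S$ has exactly this series.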
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of the Brazilian Real currency (R$) were screened in the positive ion mode for authenticity, based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic banknotes, seized suspect counterfeit banknotes, and homemade counterfeits produced on inkjet and laserjet printers. The chemicals on the authentic banknotes' surface were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl) phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a chemical profile similar to that of the 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were separated by 44 m/z, which enabled their characterization as Surfynol® 4XX (S4XX, XX = 40, 65 and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing.
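The 44 m/z spacing corresponds to one ethylene-oxide repeat unit, the usual signature of an ethoxylated-surfactant homologue series (the arithmetic below is given for orientation and is not quoted from the paper):
\[
\mathrm{C_2H_4O}:\quad 2(12.000) + 4(1.008) + 15.995 \approx 44.03\ \mathrm{Da},
\]
so, for singly charged ions, successive Surfynol ethoxylation homologues appear 44 m/z apart.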
Abstract:
An $(n,d)$-expander is a graph $G=(V,E)$ such that for every $X \subseteq V$ with $|X| \le 2n-2$ we have $|\Gamma_G(X)| \ge (d+1)|X|$. A tree $T$ is small if it has at most $n$ vertices and maximum degree at most $d$. Friedman and Pippenger (1987) proved that any $(n,d)$-expander contains every small tree. However, their elegant proof does not seem to yield an efficient algorithm for obtaining the tree. In this paper, we give an alternative result that does admit a polynomial-time algorithm for finding the immersion of any small tree in subgraphs $G$ of $(N,D,\lambda)$-graphs $\Lambda$, as long as $G$ contains a positive fraction of the edges of $\Lambda$ and $\lambda/D$ is small enough. In several applications of the Friedman-Pippenger theorem, including the ones in the original paper of those authors, the $(n,d)$-expander $G$ is a subgraph of an $(N,D,\lambda)$-graph as above. Therefore, our result suffices to provide efficient algorithms for such previously non-constructive applications. As an example, we discuss a recent result of Alon, Krivelevich, and Sudakov (2007) concerning embedding nearly spanning bounded-degree trees, the proof of which makes use of the Friedman-Pippenger theorem. We shall also show a construction, inspired by Wigderson-Zuckerman expander graphs, for which any sufficiently dense subgraph contains all trees of sizes and maximum degrees achieving essentially optimal parameters. Our algorithmic approach is based on a reduction of the tree embedding problem to a certain on-line matching problem for bipartite graphs, solved by Aggarwal et al. (1996).
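To make the embedding task concrete, the following is a naive greedy sketch in Python (an illustration only, assuming a simple graph stored as adjacency sets; the paper's polynomial-time algorithm instead reduces the problem to an on-line bipartite matching problem precisely so as not to get stuck where this greedy version can): process the tree in BFS order and map each tree vertex to an unused graph neighbour of its parent's image.

    from collections import deque

    def greedy_tree_embedding(graph, tree, tree_root):
        # graph, tree: dicts mapping a vertex to the set of its neighbours.
        # Returns a dict tree_vertex -> graph_vertex such that tree edges map to
        # graph edges, or None if the greedy choice gets stuck.
        start = max(graph, key=lambda v: len(graph[v]))   # heuristic: highest degree
        embedding, used = {tree_root: start}, {start}
        queue = deque([tree_root])
        while queue:
            t = queue.popleft()
            image = embedding[t]
            for child in tree[t]:
                if child in embedding:                    # parent already placed
                    continue
                free = [u for u in graph[image] if u not in used]
                if not free:
                    return None                           # greedy failure
                embedding[child] = free[0]
                used.add(free[0])
                queue.append(child)
        return embedding

Expansion is what rules out the failure branch: when every small vertex set has many distinct neighbours, unused candidates remain available throughout the embedding.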
Abstract:
Let P be a linear partial differential operator with analytic coefficients. We assume that P is of the form "sum of squares", satisfying Hörmander's bracket condition. Let q be a characteristic point for P. We assume that q lies on a symplectic Poisson stratum of codimension two. General results of Okaji show that P is analytic hypoelliptic at q. Hence Okaji has established the validity of Treves' conjecture in the codimension-two case. Our goal here is to give a simple, self-contained proof of this fact.
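Here "sum of squares" and the bracket condition have their usual meaning (standard notation, not quoted from the paper):
\[
P \;=\; \sum_{j=1}^{k} X_{j}^{2},
\]
where the $X_j$ are real vector fields with real-analytic coefficients and the iterated commutators of the $X_j$ span the tangent space at every point (Hörmander's bracket condition). This condition gives $C^\infty$ hypoellipticity; the finer question addressed above is analytic hypoellipticity at the characteristic point q.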
Abstract:
The general flowshop scheduling problem is a production problem where a set of n jobs has to be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization in flowshop scheduling. The CGA was proposed recently as an alternative to traditional GA approaches, particularly for evaluating schemata directly. The population, initially formed only by schemata, evolves under recombination to a population of well-adapted structures (schemata instantiation). The CGA implemented is based on the classic NEH heuristic, and a local search heuristic is used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against some other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this innovative CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
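For reference, the makespan that the CGA minimizes can be computed for a given permutation by the standard completion-time recursion for permutation flowshops (a generic sketch, not the paper's CGA or its NEH-based heuristic):

    def makespan(perm, proc):
        # perm: job indices in processing order (identical on every machine).
        # proc: proc[j][i] = processing time of job j on machine i.
        m = len(proc[0])
        finish = [0.0] * m        # finish[i]: completion time of the last scheduled job on machine i
        for j in perm:
            for i in range(m):
                earliest = finish[i - 1] if i > 0 else 0.0   # job j must finish on machine i-1 first
                finish[i] = max(finish[i], earliest) + proc[j][i]
        return finish[-1]

For example, makespan([1, 0, 2], [[3, 2], [1, 4], [2, 2]]) evaluates the two-machine schedule job 1, job 0, job 2 and returns 9.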
Abstract:
Citrus sudden death (CSD) transmission was studied by graft-inoculation and under natural conditions. Young sweet orange trees on Rangpur rootstock were used as indicator plants. They were examined regularly for one or two characteristic markers of CSD: (i) the presence of a yellow-stained layer of thickened bark on the Rangpur rootstock, and (ii) infection with the CSD-associated marafivirus. Based on these two markers, transmission of CSD was obtained not only when budwood for graft-inoculation was taken from symptomatic sweet orange trees on Rangpur, but also when the budwood sources were asymptomatic sweet orange trees on Cleopatra mandarin, indicating that the latter trees are symptomless carriers of the CSD agent. For natural transmission, 80 young indicator plants were planted within a citrus plot severely affected by CSD. Individual insect-proof cages were built around 40 indicator plants, and the other 40 indicator plants remained uncaged. Only two of the 40 caged indicator plants were affected by CSD, whereas 17 uncaged indicator plants showed CSD symptoms and were infected with the marafivirus. An additional 12 uncaged indicator plants became severely affected by citrus variegated chlorosis and were removed. These results strongly suggest that under natural conditions CSD is transmitted by an aerial vector, such as an insect, and that the cages protected the trees against infection by the vector.
Abstract:
Trust is a vital feature for the Semantic Web: if users (humans and agents) are to use and integrate system answers, they must trust them. Thus, systems should be able to explain their actions, sources, and beliefs; this issue is the topic of the proof layer in the design of the Semantic Web. This paper presents the design and implementation of a system for proof explanation on the Semantic Web based on defeasible reasoning. The basis of this work is the DR-DEVICE system, which is extended to handle proofs. A critical aspect is the representation of proofs in an XML language, which is achieved by an extension of the RuleML language.
Abstract:
We prove that, once an algorithm of perfect simulation for a stationary and ergodic random field $F$ taking values in $S^{\mathbb{Z}^d}$, with $S$ a bounded subset of $\mathbb{R}^n$, is provided, the convergence in the mean ergodic theorem occurs exponentially fast for $F$. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.
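As a schematic reading only (the display below merely fixes notation for the mean ergodic theorem and is an assumption about the form of the result, not the paper's statement): for a bounded observable $f$ and boxes $\Lambda_n \uparrow \mathbb{Z}^d$, the ergodic averages converge,
\[
A_n f \;=\; \frac{1}{|\Lambda_n|} \sum_{x \in \Lambda_n} f(\tau_x F) \;\longrightarrow\; \mathbb{E}[f(F)],
\]
and "exponentially fast" refers to the speed at which $A_n f$ approaches $\mathbb{E}[f(F)]$, quantified in the paper in terms of the perfect-simulation algorithm assumed to be available.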
Abstract:
With the purpose of bringing together two topics, oral narrative and constructive memory, we assume that children, like adults, have a constructive memory. Researchers of constructive memory share with Piagetians the view that memory is an applied cognition. Under this perspective, understanding and coding into memory constitute a process considered similar to Piagetian assimilation: building an internal conceptual representation of the information (hence the term constructive memory). The objective of this study is to examine and illustrate, through examples drawn from research on oral narrative with 5-, 8- and 10-year-old children, the extent to which constructive memory is stimulated by the acquisition of structures of knowledge or "mental models" (schemes of stories and scenes, scripts), and whether children automatically employ them to process the information constructively during storage and to rebuild it at recovery. A sequence of five pictures from a textless book was transformed into a computerized program, and the pictures were presented to the children in this form. The story focuses on a misunderstanding between two characters over differing assessments of a key event. In data collection, the demands on memory were preserved, since the children narrated their stories when the images were no longer visible on the computer screen. Each narrative was produced as a monologue. The results show that this story can be told either at a descriptive level or at a more elaborate level, where intentions and beliefs are attributed to the characters. Although this study allows an assessment of the development of children's capabilities (both cognitive and linguistic) to narrate a story, there are certainly other issues that could be explored.