132 results for Predicate
Abstract:
This work investigates the syntactic, semantic, and pragmatic properties of nominal Split Topicalization (ST) constructions in Standard and non-Standard German. The topic phrase denotes a property, and the middle-field (MF) phrase either modifies this property or picks out a specific entity. Semantically, the topic phrase is analysed as a property-denoting expression which restricts the denotation of the verbal predicate, while the MF phrase is composed either via Specify or Restrict (cf. Chung and Ladusaw, 2003). Syntactically, the base position of the topic phrase is the (incorporating) verb, and the MF phrase is generated independently as the complement of the verb, containing an empty pronoun. Since predicates introduce abstract discourse referents, the topic phrase can be resumed via "pro" in the MF phrase.
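As a rough illustration of Restrict in the sense of Chung and Ladusaw (2003) (a sketch added here, not a formula from the work itself), the operation conjoins a property with one argument slot of the verbal predicate without saturating it, so the slot remains open for later existential closure or resumption:

    \mathrm{Restrict}\bigl(\lambda x\,\lambda e\,[\mathit{read}(e)\wedge\mathit{theme}(e)=x],\ \mathit{book}\bigr)\;=\;\lambda x\,\lambda e\,[\mathit{read}(e)\wedge\mathit{theme}(e)=x\wedge\mathit{book}(x)]

In the abstract's terms, the property-denoting topic phrase narrows the verbal predicate in this way, while the MF phrase is composed either by a further Restrict or by Specify, which supplies a specific entity for the open slot.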
Abstract:
This paper has three sections. In the first, I present and discuss Davidson's semantic account of adverbial sentences: the basic idea is that these sentences involve quantification over events, and I defend that view against rival approaches such as the theory of adverbs as predicate modifiers. In the second section I defend the claim that in English constructions following the scheme 'X did V by T-ing' we refer to one and the same action of X, what is sometimes called 'the Anscombe Thesis'. Again I discuss competing theories, only to conclude that the Anscombe Thesis is true. In the third section, however, it is shown that taking these two theses, Davidson's account and the Anscombe Thesis, as premises leads to a serious conflict. Alternative solutions are worked out and rejected. It is also argued that the only tenable solution depends on certain metaphysical assumptions. Finally, however, I cast doubt on this solution.
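For concreteness, the stock Davidsonian example (added here as an illustration, not drawn from the paper itself): an adverbial sentence is analysed as existential quantification over an event, with each adverbial contributing a conjunct, so that dropping an adverbial is a valid entailment:

    \exists e\,[\mathit{Butter}(e,\mathit{Jones},\mathit{the\ toast})\wedge\mathit{Slowly}(e)\wedge\mathit{In}(e,\mathit{the\ bathroom})]\;\models\;\exists e\,[\mathit{Butter}(e,\mathit{Jones},\mathit{the\ toast})]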
Abstract:
Fuzzy set theory and fuzzy logic are studied from a mathematical point of view. The main goal is to investigate common mathematical structures in various fuzzy logical inference systems and to establish a general mathematical basis for fuzzy logic when considered as a multi-valued logic. The study is composed of six distinct publications.

The first paper deals with Mattila's LPC+Ch Calculus. This fuzzy inference system is an attempt to introduce linguistic objects into mathematical logic without defining these objects mathematically. LPC+Ch Calculus is analysed from an algebraic point of view, and it is demonstrated that a suitable factorization of the set of well-formed formulae (in fact, the Lindenbaum algebra) leads to a structure called an ET-algebra, introduced at the beginning of the paper. On this basis, all the theorems presented by Mattila, and many others, can be proved in a simple way, as demonstrated in Lemmas 1 and 2 and Propositions 1-3. The conclusion critically discusses some other issues of LPC+Ch Calculus, especially that no formal semantics is given for it.

In the second paper, Sanchez's characterization of the solvability of the relational equation RoX=T, where R, X, T are fuzzy relations, X is the unknown, and o is the minimum-induced composition, is extended to compositions induced by more general products over a general value lattice. Moreover, the procedure also applies to systems of equations.

In the third publication, common features in various fuzzy logical systems are investigated. It turns out that adjoint couples and residuated lattices are very often present, though not always explicitly expressed. Some minor new results are also proved.

The fourth study concerns Novak's paper, in which Novak introduced first-order fuzzy logic and proved, among other things, the semantico-syntactical completeness of this logic. He also demonstrated that the algebra of his logic is a generalized residuated lattice. It is shown that the examination of Novak's logic can be reduced to the examination of locally finite MV-algebras.

In the fifth paper, a multi-valued sentential logic with truth values in an injective MV-algebra is introduced, and the axiomatizability of this logic is proved. The paper develops some ideas of Goguen and generalizes the results of Pavelka on the unit interval. Our proof of completeness is purely algebraic. A corollary of the Completeness Theorem is that fuzzy logic on the unit interval is semantically complete if, and only if, the algebra of truth values is a complete MV-algebra. The Compactness Theorem holds in our well-defined fuzzy sentential logic, while the Deduction Theorem and the Finiteness Theorem do not. Because of its generality and good behaviour, MV-valued logic can be regarded as a mathematical basis of fuzzy reasoning.

The last paper is a continuation of the fifth study. The semantics and syntax of fuzzy predicate logic with truth values in an injective MV-algebra are introduced, and a list of universally valid sentences is established. The system is proved to be semantically complete. The proof is based on an idea utilizing some elementary properties of injective MV-algebras and MV-homomorphisms, and is purely algebraic.
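As a small numerical sketch of the solvability question for RoX=T (added here for illustration; it follows the commonly stated finite, minimum-induced form of Sanchez's criterion, not the more general lattice-valued setting treated in the thesis), the candidate greatest solution is built with the Gödel implication and then checked by composition:

    import numpy as np

    def godel_imp(a, b):
        # Goedel implication: a -> b = 1 if a <= b, else b
        return np.where(a <= b, 1.0, b)

    def sup_min(R, X):
        # sup-min composition: (R o X)[i, k] = max_j min(R[i, j], X[j, k])
        return np.max(np.minimum(R[:, :, None], X[None, :, :]), axis=1)

    def greatest_candidate(R, T):
        # candidate greatest solution: X[j, k] = min_i (R[i, j] -> T[i, k])
        return np.min(godel_imp(R[:, :, None], T[:, None, :]), axis=0)

    def solve(R, T):
        # return the greatest solution of R o X = T if the equation is solvable, else None
        X = greatest_candidate(R, T)
        return X if np.allclose(sup_min(R, X), T) else None

    R = np.array([[0.9, 0.4], [0.2, 1.0]])
    T = sup_min(R, np.array([[0.3, 0.8], [0.6, 0.1]]))  # a solvable instance by construction
    print(solve(R, T))  # prints the greatest solution (not necessarily the X used above)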
Abstract:
Possibilistic Defeasible Logic Programming (P-DeLP) is a logic programming language which combines features from argumentation theory and logic programming, incorporating the treatment of possibilistic uncertainty at the object-language level. In spite of its expressive power, an important limitation in P-DeLP is that imprecise, fuzzy information cannot be expressed in the object language. One interesting alternative for solving this limitation is the use of PGL+, a possibilistic logic over Gödel logic extended with fuzzy constants. Fuzzy constants in PGL+ allow expressing disjunctive information about the unknown value of a variable, in the sense of a magnitude, modelled as a (unary) predicate. The aim of this article is twofold: firstly, we formalize DePGL+, a possibilistic defeasible logic programming language that extends P-DeLP through the use of PGL+ in order to incorporate fuzzy constants and a fuzzy unification mechanism for them. Secondly, we propose a way to handle conflicting arguments in the context of the extended framework.
Abstract:
Quantity and time, 2: Nominal aspect and the case marking of the Finnish predicate nominal
Abstract:
Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education.

In Finland the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS.

Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues:

Structured derivations is a logic-based approach to teaching mathematics, where formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation, at the same time as they become more confident with formalisms.

The Python programming language was originally designed with education in mind and has a simple syntax compared to many other popular languages. The aim of using it in instruction is to address algorithms and their implementation in a way that allows focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax.

Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
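As a minimal sketch of the flavour of invariant-based reasoning in executable form (added here as an illustration in Python; the thesis's own approach develops the invariants diagrammatically before the code is written), the loop invariant and postcondition are made explicit as checkable assertions:

    def sum_of(xs: list[int]) -> int:
        # Sum a list while checking an explicit loop invariant.
        total, i = 0, 0
        while i < len(xs):
            # Invariant: total equals the sum of xs[0..i-1]
            assert total == sum(xs[:i])
            total += xs[i]
            i += 1
        # Postcondition: total equals the sum of the whole list
        assert total == sum(xs)
        return total

    print(sum_of([3, 1, 4, 1, 5]))  # 14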
Abstract:
This paper aims at clarifying the nature of Frege's system of logic, as presented in the first volume of the Grundgesetze. We undertake a rational reconstruction of this system by distinguishing its propositional and predicate fragments. This allows us to emphasise the differences and similarities between this system and a modern system of classical second-order logic.
Abstract:
Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle for using the above-mentioned formal methods.

In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel, using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
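As a toy sketch of the guarded-command execution model described above (added here for illustration; it is not a formalism or tool from the thesis), each action pairs a guard with a command over a shared state, and any enabled action is chosen nondeterministically until none remains enabled:

    import random

    # Two guarded actions over a shared state: move one or two units from x to y.
    state = {"x": 5, "y": 0}
    actions = [
        (lambda s: s["x"] >= 1, lambda s: s.update(x=s["x"] - 1, y=s["y"] + 1)),
        (lambda s: s["x"] >= 2, lambda s: s.update(x=s["x"] - 2, y=s["y"] + 2)),
    ]

    while True:
        enabled = [cmd for guard, cmd in actions if guard(state)]
        if not enabled:                    # no guard holds: execution terminates
            break
        random.choice(enabled)(state)      # nondeterministic choice among enabled actions

    print(state)  # the invariant x + y == 5 holds in every reachable state; here x == 0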
Abstract:
Money laundering is the transfer of assets acquired through criminal activity into the legitimate financial system. Money laundering is an accessory act and always requires a predicate offence. The predicate offence can be any crime from which assets are obtained, such as a drug offence or an economic crime. Using the legal-dogmatic method, the thesis examines the objectives and content of the Finnish anti-money laundering act and the reporting procedure, as well as the influence of international anti-money laundering regulation on the Finnish rules. Empirical material was collected through thematic interviews and used to investigate how the parties subject to the reporting obligation in Finland comply in practice with the duties laid down in the law and the reporting procedure. According to the findings, the obliged entities are subject to direct normative guidance. They comply with the regulation, accept it, and believe in it. This assists the Finnish authorities in their anti-money laundering work. Finnish anti-money laundering regulation has been developed over the years to meet international requirements. The regulation still has shortcomings, and there are also other external factors that hamper the effective prevention of money laundering.
Abstract:
Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Book review
Abstract:
This paper analyzes the concept of emerging power as it has become established in the understanding of international affairs. The work observes that the use of the label 'emerging' - applied to markets, countries or powers - as a qualifier for a range of international relations phenomena has become a constituent part of the field. Despite this, the empirical denotation of the predicate has run ahead of the effort devoted to its theoretical contextualization. Our methodological hypothesis is that rationally setting aside the concept's prevailing connotative spectrum, while acknowledging the accumulated knowledge about cognate phenomena, yields a theoretical framework for its accurate use.
Abstract:
If you want to know whether a property is true or not in a specific algebraic structure, you need to test that property on the given structure. This can be done by hand, which can be cumbersome and error-prone. In addition, the time consumed in testing depends on the size of the structure where the property is applied. We present an implementation of a system for finding counterexamples and testing properties of models of first-order theories. This system is intended to provide a convenient and paperless environment for researchers and students investigating or studying such models, and algebraic structures in particular. To implement a first-order theory in the system, a suitable first-order language and some axioms are required. The components of a language are given by a collection of variables, a set of predicate symbols, and a set of operation symbols. Variables and operation symbols are used to build terms. Terms, predicate symbols, and the usual logical connectives are used to build formulas. A first-order theory then consists of a language together with a set of closed formulas, i.e. formulas without free occurrences of variables. The set of formulas is also called the axioms of the theory. The system uses several different formats to allow the user to specify languages, to define axioms and theories, and to create models. Besides the obvious operations and tests on these structures, we have introduced the notion of a functor between classes of models in order to generate more complex models from given ones automatically. As an example, we will use the system to create several lattice structures starting from a model of the theory of pre-orders.
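As a minimal sketch of the underlying idea (added here for illustration; it is not the system's own input format or code), a finite structure can be represented by its carrier together with interpretations of its operation and predicate symbols, and a universally quantified property can be tested, or refuted with counterexamples, by brute force:

    from itertools import product

    # A tiny finite structure: carrier, one binary operation, one binary predicate.
    carrier = [0, 1, 2, 3]
    op = lambda a, b: (a + b) % 4     # interpretation of a binary operation symbol
    leq = lambda a, b: a <= b         # interpretation of a binary predicate symbol

    def forall(n, prop):
        # Check a property under every assignment of n variables from the carrier.
        return all(prop(*vs) for vs in product(carrier, repeat=n))

    def counterexamples(n, prop):
        # List every assignment that falsifies the property.
        return [vs for vs in product(carrier, repeat=n) if not prop(*vs)]

    print(forall(2, lambda a, b: op(a, b) == op(b, a)))       # commutativity holds: True
    print(counterexamples(2, lambda a, b: leq(a, op(a, b))))  # property fails, e.g. at (1, 3)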
Abstract:
Département de linguistique et de traduction