926 results for Default Logic
Abstract:
The "recursive" definition of Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the "recursive" fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning of the fixed-point is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because unlike the original "recursive" definition of Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults.
Abstract:
The nonmonotonic logic called Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning (or rather, the disquotation) of that set of sentences is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because, unlike the original Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults, thus allowing such defaults to produce quantified consequences. Furthermore, this generalization properly treats such quantifiers, since both the Barcan Formula and its converse hold.
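For readers unfamiliar with them, the Barcan Formula and its converse referred to here are the standard schemata, shown below in their usual form.

```latex
% The Barcan formula and its converse (standard formulations), which the
% quantified generalization is stated to satisfy.
\begin{align*}
\text{Barcan:}\qquad & \forall x\, \Box \varphi(x) \;\rightarrow\; \Box\, \forall x\, \varphi(x) \\
\text{Converse Barcan:}\qquad & \Box\, \forall x\, \varphi(x) \;\rightarrow\; \forall x\, \Box \varphi(x)
\end{align*}
```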
Abstract:
Reflective Logic and Default Logic are both generalized so as to allow universally quantified variables to cross modal scopes, whereby the Barcan formula and its converse hold. This is done by representing both the fixed-point equation for Reflective Logic and the fixed-point equation for Default Logic as necessary equivalences in the Modal Quantificational Logic Z, and then inserting universal quantifiers before the defaults. The two resulting systems, called Quantified Reflective Logic and Quantified Default Logic, are then compared by deriving metatheorems of Z that express their relationships. The main result is to show that every solution to the equivalence for Quantified Default Logic is a strongly grounded solution to the equivalence for Quantified Reflective Logic. It is further shown that Quantified Reflective Logic and Quantified Default Logic have exactly the same solutions when no default has an entailment condition.
Abstract:
Distinguishing argumentation from explanation is a complicated but necessary task for several reasons. One of them is the need to incorporate explanation into a dialogue move as the result of a dialectical obligation. Various dialogue systems have been proposed that explore this distinction by emphasizing pragmatic aspects. In the present work I deal with structural aspects of explanation, analyzed within the framework of default logic, which makes it possible to characterize certain objections in the dialogue. I also argue that the operational version of default logic constitutes an adequate approach both to the construction of explanations and to the representation of the dialogue instance in the dialectical exchange.
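To make the "operational version of default logic" concrete, here is a minimal Python sketch of the usual process-based (operational) view, in the style of Antoniou's operational semantics: a sequence of defaults is successful if each prerequisite is derivable when the default is applied and no used justification is refuted by the final conclusions. The entailment test `entails` is an assumed placeholder (e.g. a SAT solver or theorem prover supplied elsewhere), and the formula encoding is purely illustrative.

```python
# Sketch of the operational (process) semantics of default logic.
# `entails(premises, formula)` is an assumed placeholder for a propositional
# or first-order entailment check; it is NOT implemented here.

from dataclasses import dataclass
from typing import Callable, FrozenSet, List

Formula = str  # formulas kept abstract; any representation accepted by `entails` works


@dataclass(frozen=True)
class Default:
    prerequisite: Formula
    justification: Formula
    consequent: Formula


def is_successful_process(
    facts: FrozenSet[Formula],
    process: List[Default],
    entails: Callable[[FrozenSet[Formula], Formula], bool],
) -> bool:
    """Check that each default is applicable when applied (its prerequisite
    follows from what is already In) and that the final In-set refutes no
    justification used along the way (In and Out stay disjoint)."""
    in_set = set(facts)
    out_set = set()
    for d in process:
        if not entails(frozenset(in_set), d.prerequisite):
            return False                       # default not applicable at this point
        in_set.add(d.consequent)
        out_set.add(f"~({d.justification})")   # this negation must never become derivable
    # success: no formula that must stay out is entailed by the final In-set
    return not any(entails(frozenset(in_set), f) for f in out_set)
```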
Abstract:
After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:
- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are applicable as well to the well-founded semantics of normal logic programs.
- Two approaches for dealing with the contradiction that may appear once explicit negation is introduced into logic programs, together with a proof of their equivalence. One approach consists in avoiding contradiction and is based on restrictions on the adoption of abductive hypotheses; the other consists in removing contradiction and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for the contradiction.
Abstract:
Many common human reasoning capabilities, such as temporal and non-monotonic reasoning, have not yet been fully captured in deployed systems, even though some theoretical breakthroughs have already been accomplished. This is mainly due to the inherent computational complexity of the theoretical approaches. In the particular area of fault diagnosis in power systems, however, several systems that attempt to solve the problem have been deployed, using methodologies such as production-rule-based expert systems, neural networks, recognition of chronicles, fuzzy expert systems, etc. SPARSE (a Portuguese acronym meaning expert system for incident analysis and restoration support) was one of these systems and, in the course of its development, came the need to cope with incomplete and/or incorrect information, as well as with the traditional problems of power system fault diagnosis based on SCADA (supervisory control and data acquisition) information retrieval, namely real-time operation, huge amounts of information, etc. This paper presents an architecture for a decision support system that solves these problems, using a symbiosis of the event calculus and default-reasoning rule-based system paradigms, ensuring soft real-time operation and the ability to handle incomplete, incorrect or domain-incoherent information. A prototype implementation of this system is already at work in the control centre of the Portuguese Transmission Network.
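As a point of reference, the event calculus mentioned here is typically built around axioms of the kind shown below; this is the standard HoldsAt axiom of the simple event calculus, given for illustration only, and is not claimed to be the exact formalization used in SPARSE.

```latex
% Representative simple event calculus axiom (standard form): a fluent f holds
% at time t if some event e initiated it at an earlier time t1 and nothing has
% clipped (terminated) it in between.
\begin{align*}
\mathit{HoldsAt}(f, t) \;\leftarrow\;{} & \mathit{Happens}(e, t_1) \wedge \mathit{Initiates}(e, f, t_1) \\
& {}\wedge\; t_1 < t \;\wedge\; \neg \mathit{Clipped}(t_1, f, t)
\end{align*}
```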
Abstract:
One common problem in all basic techniques of knowledge representation is handling the trade-off between precision of inferences and resource constraints, such as time and memory. Michalski and Winston (1986) suggested the Censored Production Rule (CPR) as an underlying representation and computational mechanism to enable logic-based systems to exhibit variable precision, in which certainty varies while specificity stays constant. As an extension of CPR, the Hierarchical Censored Production Rules (HCPRs) system of knowledge representation, proposed by Bharadwaj & Jain (1992), exhibits both variable certainty and variable specificity and offers mechanisms for handling the trade-off between the two. An HCPR has the form: Decision If (preconditions) Unless (censor) Generality (general_information) Specificity (specific_information). As an attempt towards evolving a generalized knowledge representation, an Extended Hierarchical Censored Production Rules (EHCPRs) system is suggested in this paper. With the inclusion of new operators, an Extended Hierarchical Censored Production Rule (EHCPR) takes the general form: Concept If (Preconditions) Unless (Exceptions) Generality (General-Concept) Specificity (Specific-Concepts) Has_part (default: structural-parts) Has_property (default: characteristic-properties) Has_instance (instances). How semantic networks and frames are represented in terms of EHCPRs is shown. Multiple inheritance, inheritance with and without cancellation, recognition with partial match, and a few default logic problems are shown to be tackled efficiently in the proposed system.
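To make the rule shape concrete, the following is a small, hypothetical Python sketch of how the slots of an EHCPR listed above might be held in a data structure. Only the slot names come from the general form quoted in the abstract; the class, the field types, and the "birds fly" example are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical data structure mirroring the slots of an Extended Hierarchical
# Censored Production Rule (EHCPR) as described in the abstract above.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class EHCPR:
    concept: str                                                  # Concept
    preconditions: List[str] = field(default_factory=list)        # If (...)
    exceptions: List[str] = field(default_factory=list)           # Unless (...) censors
    generality: List[str] = field(default_factory=list)           # Generality (...)
    specificity: List[str] = field(default_factory=list)          # Specificity (...)
    has_part: List[str] = field(default_factory=list)             # Has_part (default parts)
    has_property: Dict[str, str] = field(default_factory=dict)    # Has_property (defaults)
    has_instance: List[str] = field(default_factory=list)         # Has_instance (...)


# Example: the classic "birds fly" default with censors for exceptional cases.
bird_flies = EHCPR(
    concept="flies(X)",
    preconditions=["bird(X)"],
    exceptions=["penguin(X)", "wounded(X)"],   # if a censor holds, the conclusion is withdrawn
    generality=["animal(X)"],
    specificity=["sparrow(X)", "eagle(X)"],
    has_property={"covering": "feathers"},
)
```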
Abstract:
Nonmonotonic Logics such as Autoepistemic Logic, Reflective Logic, and Default Logic are usually defined in terms of set-theoretic fixed-point equations defined over deductively closed sets of sentences of First Order Logic. Such systems may also be represented as necessary equivalences in a Modal Logic stronger than S5, with the added advantage that such representations may be generalized to allow quantified variables crossing modal scopes, resulting in a Quantified Autoepistemic Logic, a Quantified Autoepistemic Kernel, a Quantified Reflective Logic, and a Quantified Default Logic. Quantifiers in all these generalizations obey all the normal laws of logic, including both the Barcan formula and its converse. Herein, we address the problem of solving some necessary equivalences containing universal quantifiers over modal scopes. Solutions obtained by these methods are then compared to related results obtained in the literature by Circumscription in Second Order Logic, since the disjunction of all the solutions of a necessary equivalence containing just normal defaults in these Quantified Logics is equivalent to that system.
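The Circumscription referred to for comparison is the usual second-order schema, shown here in its standard McCarthy form; the precise variant used in the cited literature may differ.

```latex
% Second-order circumscription of a predicate P in a theory A(P):
% P is minimized, i.e. no strictly smaller p also satisfies A.
\begin{align*}
\mathrm{Circ}[A; P] \;\equiv\; A(P) \,\wedge\, \forall p\, \big( A(p) \wedge (p \le P) \rightarrow (P \le p) \big),
\qquad\text{where } p \le P \;\equiv\; \forall x\, \big(p(x) \rightarrow P(x)\big).
\end{align*}
```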
Abstract:
In this work we provide a theoretical basis (syntax and semantics) and a practical implementation of a framework for encoding reasoning over a fuzzy representation of the world (as human beings understand it). The interest in this work comes from two sources: removing the complexity involved when doing this in a general-purpose programming language (one without special constructs for representing fuzzy information), and providing a tool intelligent enough to answer expressive queries over conventional data in a constructive way. The framework, RFuzzy, allows rules and queries to be encoded in a syntax very close to the natural language humans use to express their thoughts, but it is more than that. It can encode very useful concepts, such as fuzzifications (functions that turn crisp concepts into fuzzy ones), default values (used to provide results that are less adequate but still valid when the information needed for better results is missing), similarity between attributes (used to search for individuals with a characteristic similar to the one we are looking for), and synonyms or antonyms, and it allows the set of connectives and modifiers (including negation) usable in rules and queries to be extended. The personalization of the definition of fuzzy concepts (very useful for dealing with the subjective character of fuzziness, where a concept like "tall" depends on the height of the person performing the query) is another of the facilities included. Besides, RFuzzy implements the multi-adjoint semantics. Its interest is that, in addition to obtaining the grade of satisfaction of a consequent from a rule, its credibility, and the grade of satisfaction of the antecedents, we can determine from a set of data how much credibility must be assigned to a rule for it to model the behaviour of that data. Thus we can automatically determine the credibility of a rule for a particular situation. Although the theoretical contribution is interesting by itself, especially the inclusion of the negation modifier, its practical uses are equally important. Among the different uses given to the framework we highlight emotion recognition, RoboCup control, granularity control in parallel/distributed computing, and flexible searches in databases.
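As an illustration of two of the ingredients mentioned (fuzzification of a crisp attribute and a default truth value when data are missing), here is a small, hypothetical Python sketch. It does not use RFuzzy's actual (Prolog-based) syntax, and all names and thresholds in it are invented for the example.

```python
# Hypothetical sketch of two concepts from the abstract: a fuzzification function
# turning a crisp attribute (height in cm) into a degree for the fuzzy concept
# "tall", and a default value returned when the crisp datum is missing.
# This is NOT RFuzzy syntax; it is a plain-Python illustration of the ideas.

from typing import Optional


def fuzzify_tall(height_cm: Optional[float],
                 low: float = 160.0,
                 high: float = 190.0,
                 default: float = 0.5) -> float:
    """Map a height to a truth degree in [0, 1] for 'tall'.

    If the height is unknown, fall back to a default degree, mirroring the
    'default values' feature: a less informative but still usable answer.
    """
    if height_cm is None:
        return default                          # default value: data missing
    if height_cm <= low:
        return 0.0
    if height_cm >= high:
        return 1.0
    return (height_cm - low) / (high - low)     # linear ramp between low and high


# Usage: degrees for a known and an unknown height.
print(fuzzify_tall(185.0))  # ~0.83 -> fairly tall
print(fuzzify_tall(None))   # 0.5   -> default degree, no data available
```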
Abstract:
Default invariance is the idea that default does not change at any scale of law and finance. Default is a conserved quantity in a universe where fundamental principles of law and finance operate. It exists at the micro-level as part of the fundamental structure of every financial transaction, and at the macro-level as a fixed critical point within the relatively stable phases of the law and finance cycle. A key point is that default is equivalent, at the micro-level, to maximizing uncertainty and, at the macro-level, to the phase transition where unbearable fluctuations occur in all forms of risk transformation, including maturity, liquidity and credit. As such, default invariance is the glue that links the micro and macro structures of law and finance. In this essay, we apply naïve category theory (NCT), a type of mapping logic, to these types of phenomena. The purpose of using NCT is to introduce a rigorous (but simple) mathematical methodology to law and finance discourse and to show that these structural considerations are of prime practical importance and significance to law and finance practitioners. These mappings imply a number of novel areas of investigation. From the micro-structure, three macro-approximations are implied. These approximations form the core analytical framework which we will use to examine the phenomena and hypothesize rules governing law and finance. Our observations from these approximations are grouped into five findings. While the entirety of the five findings can be encapsulated by the three approximations, since the intended audience of this paper is the non-specialist in law, finance and category theory, for ease of access we will illustrate the use of the mappings with relatively common concepts drawn from law and finance, focusing especially on financial contracts, derivatives, Shadow Banking, credit rating agencies and credit crises.
Abstract:
Disconnectivity between the Default Mode Network (DMN) nodes can cause clinical symptoms and cognitive deficits in Alzheimer's disease (AD). We aimed to examine the structural connectivity between DMN nodes, to verify the extent to which white matter disconnection affects cognitive performance. MRI data of 76 subjects (25 mild AD, 21 amnestic Mild Cognitive Impairment subjects and 30 controls) were acquired on a 3.0 T scanner. ExploreDTI software (fractional anisotropy threshold = 0.25; angular threshold = 60°) was used to calculate axial, radial, and mean diffusivities, fractional anisotropy and streamline count. AD patients showed lower fractional anisotropy (P=0.01) and streamline count (P=0.029), and higher radial diffusivity (P=0.014) than controls in the cingulum. After correction for white matter atrophy, only fractional anisotropy and radial diffusivity remained significantly different between AD patients and controls (P=0.003 and P=0.05). In the parahippocampal bundle, AD patients had lower mean and radial diffusivities (P=0.048 and P=0.013) compared to controls, of which only radial diffusivity survived adjustment for white matter atrophy (P=0.05). Regression models revealed that cognitive performance is also accounted for by white matter microstructural values. Structural connectivity within the DMN is important to the execution of high-complexity tasks, probably due to its relevant role in the integration of the network.
Abstract:
The search for an Alzheimer's disease (AD) biomarker is one of the most relevant contemporary research topics due to the high prevalence and social costs of the disease. Functional connectivity (FC) of the default mode network (DMN) is a plausible candidate for such a biomarker. We evaluated 22 patients with mild AD and 26 age- and gender-matched healthy controls. All subjects underwent resting functional magnetic resonance imaging (fMRI) in a 3.0 T scanner. To identify the DMN, seed-based FC of the posterior cingulate was calculated. We also measured the sensitivity/specificity of the method, and verified a correlation with cognitive performance. We found a significant difference between patients with mild AD and controls in average z-scores: DMN, whole cortical positive (WCP) and absolute values. DMN individual values showed a sensitivity of 77.3% and specificity of 70%. DMN and WCP values were correlated to global cognition and episodic memory performance. We showed that individual measures of DMN connectivity could be considered a promising method to differentiate AD, even at an early phase, from normal aging. Further studies with larger numbers of participants, as well as validation of normal values, are needed for more definitive conclusions.
Abstract:
The experiences induced by psychedelics share a wide variety of subjective features, related to the complex changes in perception and cognition induced by this class of drugs. A remarkable increase in introspection is at the core of these altered states of consciousness. Self-oriented mental activity has been consistently linked to the Default Mode Network (DMN), a set of brain regions more active during rest than during the execution of a goal-directed task. Here we used fMRI to inspect the DMN during the psychedelic state induced by Ayahuasca in ten experienced subjects. Ayahuasca is a potion traditionally used by Amazonian Amerindians, composed of a mixture of compounds that increase monoaminergic transmission. In particular, we examined whether Ayahuasca changes the activity and connectivity of the DMN and the connection between the DMN and the task-positive network (TPN). Ayahuasca caused a significant decrease in activity through most parts of the DMN, including its most consistent hubs: the Posterior Cingulate Cortex (PCC)/Precuneus and the medial Prefrontal Cortex (mPFC). Functional connectivity within the PCC/Precuneus decreased after Ayahuasca intake. No significant change was observed in the DMN-TPN orthogonality. Altogether, our results support the notion that the altered state of consciousness induced by Ayahuasca, like those induced by psilocybin (another serotonergic psychedelic), meditation and sleep, is linked to the modulation of the activity and the connectivity of the DMN.