995 results for Default logic


Relevance:

100.00%

Publisher:

Abstract:

The "recursive" definition of Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the "recursive" fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning of the fixed-point is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because unlike the original "recursive" definition of Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults.

Relevance:

100.00%

Publisher:

Abstract:

The nonmonotonic logic called Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning or rather disquotation of that set of sentences is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because unlike the original Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults thus allowing such defaults to produce quantified consequences. Furthermore, this generalization properly treats such quantifiers since both the Barcan Formula and its converse hold.

Relevance:

100.00%

Publisher:

Abstract:

Reflective Logic and Default Logic are both generalized so as to allow universally quantified variables to cross modal scopes, whereby the Barcan formula and its converse hold. This is done by representing both the fixed-point equation for Reflective Logic and the fixed-point equation for Default Logic as necessary equivalences in the Modal Quantificational Logic Z, and then inserting universal quantifiers before the defaults. The two resulting systems, called Quantified Reflective Logic and Quantified Default Logic, are then compared by deriving metatheorems of Z that express their relationships. The main result is that every solution to the equivalence for Quantified Default Logic is a strongly grounded solution to the equivalence for Quantified Reflective Logic. It is further shown that Quantified Reflective Logic and Quantified Default Logic have exactly the same solutions when no default has an entailment condition.

Relevance:

100.00%

Publisher:

Abstract:

Distinguishing between argumentation and explanation is a difficult but necessary task, for several reasons. One of them is the need to incorporate explanation into a dialogue move as the result of a dialectical obligation. Various dialogue systems have been proposed that explore the distinction by emphasizing pragmatic aspects. In this paper I deal with structural aspects of explanation, analyzed within the framework of default logic, which makes it possible to characterize certain objections in the dialogue. I also argue that the operational version of default logic is an adequate approach for constructing the explanation and for representing the dialogue instance in the dialectical exchange.

Relevance:

70.00%

Publisher:

Abstract:

After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:

- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic logic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are applicable as well to the well-founded semantics of normal logic programs.
- By introducing explicit negation into logic programs, contradiction may appear. I present two approaches for dealing with contradiction and show their equivalence. One approach consists in avoiding contradiction and is based on restrictions in the adoption of abductive hypotheses. The other consists in removing contradiction and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for contradiction.
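As a standard illustration of the explicit negation discussed above (not the WFSX definition itself), an extended logic program can combine a default rule with an explicitly negative conclusion; a minimal sketch:

    % "not" is default (weak) negation, "\neg" is explicit (strong) negation
    \begin{aligned}
      \mathit{flies}(X) &\leftarrow \mathit{bird}(X),\ \mathit{not}\ \neg\mathit{flies}(X).\\
      \neg\mathit{flies}(X) &\leftarrow \mathit{penguin}(X).\\
      \mathit{bird}(X) &\leftarrow \mathit{penguin}(X). \qquad \mathit{bird}(\mathit{tweety}).\ \ \mathit{penguin}(\mathit{pingu}).
    \end{aligned}

Under a semantics of this kind, pingu is explicitly concluded not to fly, while tweety flies by default.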

Relevance:

60.00%

Publisher:

Abstract:

Facing the difficulty of propagating and synthesizing information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach embody a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules: a mechanism layout sketching method based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20-fold increase in design efficacy with these enhancement modules in the GPAL system on a general 3D CAD platform.

Relevance:

60.00%

Publisher:

Abstract:

One common problem in all basic techniques of knowledge representation is handling the trade-off between precision of inferences and resource constraints, such as time and memory. Michalski and Winston (1986) suggested the Censored Production Rule (CPR) as an underlying representation and computational mechanism to enable logic-based systems to exhibit variable precision, in which certainty varies while specificity stays constant. As an extension of CPR, the Hierarchical Censored Production Rules (HCPRs) system of knowledge representation, proposed by Bharadwaj & Jain (1992), exhibits both variable certainty and variable specificity and offers mechanisms for handling the trade-off between the two. An HCPR has the form: Decision If (preconditions) Unless (censor) Generality (general_information) Specificity (specific_information). As an attempt towards evolving a generalized knowledge representation, an Extended Hierarchical Censored Production Rules (EHCPRs) system is suggested in this paper. With the inclusion of new operators, an Extended Hierarchical Censored Production Rule (EHCPR) takes the general form: Concept If (Preconditions) Unless (Exceptions) Generality (General-Concept) Specificity (Specific-Concepts) Has_part (default: structural-parts) Has_property (default: characteristic-properties) Has_instance (instances). It is shown how semantic networks and frames are represented in terms of EHCPRs. Multiple inheritance, inheritance with and without cancellation, recognition with partial match, and a few default logic problems are shown to be tackled efficiently in the proposed system.
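To make the rule form above concrete, a minimal illustrative sketch in Python of a censored production rule with the EHCPR slots; the field names follow the abstract, while the fire function and its certainty values are hypothetical simplifications, not the published HCPR/EHCPR calculus:

    from dataclasses import dataclass, field

    @dataclass
    class EHCPR:
        concept: str                 # Decision / Concept
        preconditions: list          # If
        exceptions: list             # Unless (censors)
        general_concept: str = ""    # Generality
        specific_concepts: list = field(default_factory=list)   # Specificity
        has_part: list = field(default_factory=list)
        has_property: list = field(default_factory=list)
        has_instance: list = field(default_factory=list)

    def fire(rule, facts, check_censors=True):
        """Return (decision, certainty) for a set of known facts."""
        if not all(p in facts for p in rule.preconditions):
            return (None, 0.0)
        if check_censors:
            if any(e in facts for e in rule.exceptions):
                return (None, 0.0)
            return (rule.concept, 1.0)   # censors verified: full certainty
        return (rule.concept, 0.8)       # censors unchecked: hedged certainty

    bird_flies = EHCPR("flies", preconditions=["bird"], exceptions=["penguin", "injured"])
    print(fire(bird_flies, {"bird"}, check_censors=False))  # ('flies', 0.8)
    print(fire(bird_flies, {"bird", "penguin"}))            # (None, 0.0)

Skipping the Unless (censor) check trades certainty for speed: the rule still fires, but with a hedged certainty, which is the variable-precision behaviour the abstract describes.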

Relevance:

60.00%

Publisher:

Abstract:

Nonmonotonic Logics such as Autoepistemic Logic, Reflective Logic, and Default Logic are usually defined in terms of set-theoretic fixed-point equations over deductively closed sets of sentences of First Order Logic. Such systems may also be represented as necessary equivalences in a Modal Logic stronger than S5, with the added advantage that such representations may be generalized to allow quantified variables crossing modal scopes, resulting in a Quantified Autoepistemic Logic, a Quantified Autoepistemic Kernel, a Quantified Reflective Logic, and a Quantified Default Logic. Quantifiers in all these generalizations obey all the normal laws of logic, including both the Barcan formula and its converse. Herein, we address the problem of solving some necessary equivalences containing universal quantifiers over modal scopes. Solutions obtained by these methods are then compared to related results obtained in the literature by Circumscription in Second Order Logic, since the disjunction of all the solutions of a necessary equivalence containing just normal defaults in these Quantified Logics is equivalent to that system.

Relevance:

40.00%

Publisher:

Abstract:

The need to make default assumptions is frequently encountered in reasoning about incompletely specified worlds. Inferences sanctioned by default are best viewed as beliefs which may well be modified or rejected by subsequent observations. It is this property which leads to the non-monotonicity of any logic of defaults. In this paper we propose a logic for default reasoning. We then specialize our treatment to a very large class of commonly occurring defaults. For this class we develop a complete proof theory and show how to interface it with a top-down resolution theorem prover. Finally, we provide criteria under which the revision of derived beliefs must be effected.
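The "very large class of commonly occurring defaults" treated here is, presumably, the class of normal defaults; the canonical example, for reference:

    % Normal default: if \alpha is provable and \beta is consistent, conclude \beta
    \frac{\alpha : \beta}{\beta}
    \qquad\text{e.g.}\qquad
    \frac{\mathrm{Bird}(x) : \mathrm{Flies}(x)}{\mathrm{Flies}(x)}

Normal default theories are guaranteed to have at least one extension, which is part of what makes a complete proof theory for this class feasible.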

Relevance:

30.00%

Publisher:

Abstract:

In this work we provide a theoretical basis (syntax and semantics) and a practical implementation of a framework for encoding reasoning over a fuzzy representation of the world (as human beings understand it). The interest in this work comes from two sources: removing the complexity involved in doing this with a general-purpose programming language (one developed without special constructions for representing fuzzy information), and providing a tool intelligent enough to answer expressive queries over conventional data in a constructive way. The framework, RFuzzy, allows rules and queries to be encoded in a syntax very close to the natural language human beings use to express their thoughts, but it is more than that. It allows the encoding of very useful concepts, such as fuzzifications (functions to convert crisp concepts into fuzzy ones), default values (used to provide results that are less adequate but still valid when the information needed to compute better ones is missing), similarity between attributes (used to search for individuals with a characteristic similar to the one we are looking for), and synonyms or antonyms; it also allows the set of connectives and modifiers (including negation) usable in rules and queries to be extended. The personalization of the definition of fuzzy concepts (very useful for dealing with the subjective character of fuzziness, in which qualifying someone as "tall" depends on the height of the person performing the query) is another of the facilities included. Besides, RFuzzy implements the multi-adjoint semantics. Its interest is that, in addition to obtaining the degree of satisfaction of a rule's consequent from the degrees of satisfaction of its antecedents, we can determine from a set of data how much credibility must be assigned to a rule to model the behaviour of that data; in this way the credibility of a rule for a particular situation can be obtained automatically. Although the theoretical contribution of this thesis is interesting in itself, especially the inclusion of the negation modifier, its practical uses are equally important. Among the different uses given to the framework we highlight emotion recognition, robot control, granularity control in parallel/distributed computing, and flexible searches in databases.
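A language-agnostic sketch of two of the ideas described above, fuzzification of a crisp attribute and a default truth value; this is plain Python for illustration only, not the RFuzzy rule syntax:

    from typing import Optional

    def tall(height_cm: Optional[float], default: float = 0.5) -> float:
        """Fuzzify a crisp height into a degree of 'tall' in [0, 1].
        When the height is unknown, fall back to a default truth value."""
        if height_cm is None:
            return default                    # default value: less adequate, still valid
        if height_cm <= 160:
            return 0.0
        if height_cm >= 190:
            return 1.0
        return (height_cm - 160) / 30.0       # linear ramp between 160 cm and 190 cm

    print(tall(185.0))   # 0.833...: mostly "tall"
    print(tall(None))    # 0.5: the default value is returned

The ramp endpoints and the 0.5 default are arbitrary illustrative choices; in the framework itself such definitions are declared by the user and can be personalized.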

Relevance:

30.00%

Publisher:

Abstract:

Default invariance is the idea that default does not change at any scale of law and finance. Default is a conserved quantity in a universe where fundamental principles of law and finance operate. It exists at the micro-level as part of the fundamental structure of every financial transaction, and at the macro-level as a fixed critical point within the relatively stable phases of the law and finance cycle. A key point is that default is equivalent to maximizing uncertainty at the micro-level and, at the macro-level, is equivalent to the phase transition where unbearable fluctuations occur in all forms of risk transformation, including maturity, liquidity and credit. As such, default invariance is the glue that links the micro and macro structures of law and finance. In this essay, we apply naïve category theory (NCT), a type of mapping logic, to these types of phenomena. The purpose of using NCT is to introduce a rigorous (but simple) mathematical methodology to law and finance discourse and to show that these types of structural considerations are of prime practical importance and significance to law and finance practitioners. These mappings imply a number of novel areas of investigation. From the micro-structure, three macro-approximations are implied. These approximations form the core analytical framework which we will use to examine the phenomena and hypothesize rules governing law and finance. Our observations from these approximations are grouped into five findings. While the entirety of the five findings can be encapsulated by the three approximations, since the intended audience of this paper is the non-specialist in law, finance and category theory, for ease of access we will illustrate the use of the mappings with relatively common concepts drawn from law and finance, focusing especially on financial contracts, derivatives, Shadow Banking, credit rating agencies and credit crises.