16 results for "Almost always propositional logic"
Abstract:
Due to usage conditions, hazardous environments or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a ‘black-box’ agent modelled by a set of propositional logic rules, in which just a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent’s system provides the expected output for a given input; if the real output differs from that predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing for a fault, testing the whole system, finding possible faults and differentiating them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems, and propose and develop models with different logics that we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults); such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches such as SAT (which require substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, and also the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound), using a variety of logics, to improve an existing efficient tool based on SAT and ILP.
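To make the fault-testing notion concrete, here is a minimal sketch (the circuit, fault model and all names are illustrative, not taken from the dissertation): a stuck-at-0 fault on an AND gate is encoded with a fault flag, and a test for that fault is any input vector on which the faulty and fault-free outputs differ.

from itertools import product

def circuit(a, b, c, and_stuck_at_0=False):
    # toy circuit: out = (a AND b) OR c,
    # optionally with a stuck-at-0 fault on the AND gate
    g1 = 0 if and_stuck_at_0 else (a & b)
    return g1 | c

# a test vector for the fault is any input on which the predicted
# (fault-free) and observed (faulty) outputs disagree
tests = [v for v in product((0, 1), repeat=3)
         if circuit(*v) != circuit(*v, and_stuck_at_0=True)]
print(tests)  # -> [(1, 1, 0)]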
Abstract:
Dissertation submitted to obtain the degree of Master in Molecular Genetics and Biomedicine
Abstract:
Diabetes Mellitus is a chronic metabolic disease, involving impaired metabolism of carbohydrates, lipids and proteins, resulting from deficiencies in insulin secretion or action, or both, which, when not treated early and properly, may have very serious consequences. Given the worldwide incidence of Diabetes Mellitus, it is highly important to evaluate its whole context and to study carefully the criteria to be taken into consideration. The aim of this thesis is to study the biochemical parameters related to the disease - Glucose and Glycated Haemoglobin A1c (HbA1c) - and to analyse the results of the last five years (2008-2012) of the PNAEQ interlaboratory trials, run by the Department of Epidemiology of the National Institute of Health Dr. Ricardo Jorge. The methodologies used and the interlaboratory variations were also analysed, in order to understand which parameter or parameters are most suitable for diagnosis and control. The study covered the Portuguese laboratories, public and private, from mainland Portugal and the islands, plus one laboratory from Angola and another from Macau, that enrolled in the PNAEQ during these five years; the sample consisted of the number of participations. In the Clinical Chemistry programme 38 samples were distributed, and in the HbA1c programme 22 samples were distributed. For glucose, the overall level of performance in the trials was Excellent; however, whenever the sample concentration was at a pathological level, performance in most trials was lower - Good. The method of choice, with the lowest CV%, was the hexokinase method. For HbA1c, the overall level of performance was likewise Excellent, and the method of choice with the lowest CV% was HPLC. The CV% for glucose hovered around 3% from 2010 to 2012, and for HbA1c it was approximately 4.0% in 2012. HbA1c has proven to be a very useful, important and robust tool for monitoring diabetes, being nowadays almost always requested in the routine analyses of diabetics in order to prevent complications that might otherwise occur. In the future it may become an important parameter, if not the parameter of choice, for the diagnosis of diabetes; however, although much work has already gone into its standardisation, open questions remain, such as what all its interferents actually are, and what the true relationship is between HbA1c and estimated average glucose, across all populations and supported by epidemiological studies. The education of both the diabetic patient and the clinician should also be improved; for now, the Oral Glucose Tolerance Test (OGTT) and fasting glucose determinations should continue to be used, and the DGS Standard N.º 033/2011 is in line with current needs and with the state of the art of this parameter. The implementation of estimated average glucose will be an added value in monitoring diabetics, and should therefore be one of the priorities in this future standardisation, making the clinical decisions based on it uniform and minimising the difficulty of interpreting results from laboratory to laboratory.
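For reference, the CV% figures above are coefficients of variation; a generic computation (the formula is standard, but the exact PNAEQ evaluation protocol may differ, and the values below are made up) is:

from statistics import mean, stdev

def cv_percent(results):
    # coefficient of variation: dispersion relative to the mean, in %
    return 100 * stdev(results) / mean(results)

# hypothetical glucose results (mg/dL) reported by several laboratories
# for the same control sample
print(round(cv_percent([98, 101, 95, 103, 99, 97]), 1))  # -> 2.9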
Abstract:
After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:
- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are applicable as well to the well-founded semantics of normal logic programs.
- By introducing explicit negation into logic programs, contradiction may appear. I present two approaches for dealing with contradiction, and show their equivalence. One approach consists in avoiding contradiction, and is based on restrictions in the adoption of abductive hypotheses; the other consists in removing contradiction, and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for contradiction.
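As a hedged illustration of the coherence principle that sets WFSX apart (a generic textbook-style example, not necessarily one from the thesis), consider the extended logic program

\[
P = \{\ \neg a. \qquad a \leftarrow \mathit{not}\ b. \qquad b \leftarrow \mathit{not}\ a.\ \}
\]

Reading \(\neg a\) merely as a fresh atom, the well-founded semantics leaves \(a\) and \(b\) undefined in the mutual negation loop. WFSX's coherence principle (explicit negation implies default negation, so \(\neg a\) forces \(\mathit{not}\ a\)) breaks the loop: \(b\) becomes true and \(a\) false by default, giving the model in which \(\neg a\) and \(b\) hold.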
Abstract:
Work presented in the context of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering
Abstract:
Work presented in the context of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Work presented in the context of the doctoral programme in Informatics, as a partial requirement for obtaining the degree of Doctor in Informatics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Dissertation submitted to obtain the degree of Doctor in Mathematics - Logic and Foundations of Mathematics
Abstract:
Ontologies formalized by means of Description Logics (DLs) and rules in the form of Logic Programs (LPs) are two prominent formalisms in the field of Knowledge Representation and Reasoning. While DLs adhere to the Open World Assumption and are suited for taxonomic reasoning, LPs implement reasoning under the Closed World Assumption, so that default knowledge can be expressed. However, for many applications it is useful to have a means of reasoning over an open domain and expressing rules with exceptions at the same time. Hybrid MKNF knowledge bases make such a means available by formalizing DLs and LPs in a common logic, the Logic of Minimal Knowledge and Negation as Failure (MKNF). Since rules and ontologies are used in open environments such as the Semantic Web, inconsistencies cannot always be avoided. This poses a problem due to the Principle of Explosion, which holds in classical logics. Paraconsistent logics offer a solution to this issue by assigning meaningful models even to contradictory sets of formulas. Consequently, paraconsistent semantics for DLs and LPs have been investigated intensively. Our goal is to apply the paraconsistent approach to the combination of DLs and LPs in hybrid MKNF knowledge bases. In this thesis, a new six-valued semantics for hybrid MKNF knowledge bases is introduced, extending the three-valued approach by Knorr et al., which is based on the well-founded semantics for logic programs. Additionally, a procedural way of computing paraconsistent well-founded models for hybrid MKNF knowledge bases by means of an alternating fixpoint construction is presented, and the algorithm is proven sound and complete w.r.t. the model-theoretic characterization of the semantics. Moreover, it is shown that the new semantics is faithful w.r.t. well-studied paraconsistent semantics for DLs and LPs, respectively, and maintains the efficiency of the approach it extends.
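To give a flavour of the alternating fixpoint idea, here is a minimal sketch for plain normal logic programs only; the thesis's six-valued, paraconsistent construction for hybrid MKNF knowledge bases is considerably more involved.

def gamma(program, interp):
    # Gelfond-Lifschitz-style reduct w.r.t. interp, then its least model.
    # Rules are triples (head, positive_body, negative_body) over atoms.
    reduct = [(h, pos) for (h, pos, neg) in program
              if not set(neg) & interp]
    model, changed = set(), True
    while changed:
        changed = False
        for h, pos in reduct:
            if set(pos) <= model and h not in model:
                model.add(h)
                changed = True
    return model

def well_founded(program, atoms):
    # iterate the (monotone) square of the antitone gamma operator
    true, prev = set(), None
    while true != prev:
        prev = true
        true = gamma(program, gamma(program, true))
    false = atoms - gamma(program, true)
    return true, false  # remaining atoms are undefined

# p <- not q.  q <- not p.  r.  s <- not r.
prog = [("p", [], ["q"]), ("q", [], ["p"]), ("r", [], []), ("s", [], ["r"])]
print(well_founded(prog, {"p", "q", "r", "s"}))  # ({'r'}, {'s'}); p, q undefined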
Abstract:
Machine ethics is an interdisciplinary field of inquiry that emerges from the need to imbue autonomous agents with the capacity for moral decision-making. While some approaches provide implementations in Logic Programming (LP) systems, they have not exploited LP-based reasoning features that appear essential for moral reasoning. This PhD thesis aims at investigating further the appropriateness of LP, notably a combination of LP-based reasoning features, including techniques available in LP systems, to machine ethics. Moral facets, as studied in moral philosophy and psychology, that are amenable to computational modeling are identified, and mapped to appropriate LP concepts for representing and reasoning about them. The main contributions of the thesis are twofold. First, novel approaches are proposed for employing tabling in contextual abduction and updating – individually and combined – plus an LP approach to counterfactual reasoning; the latter being implemented on top of the aforementioned combined abduction and updating technique with tabling. They are all important for modeling various issues of the aforementioned moral facets. Second, a variety of LP-based reasoning features are applied to model the identified moral facets, through moral examples taken off the shelf from the morality literature. These applications include: (1) modeling moral permissibility according to the Doctrines of Double Effect (DDE) and Triple Effect (DTE), demonstrating deontological and utilitarian judgments via integrity constraints (in abduction) and preferences over abductive scenarios; (2) modeling moral reasoning under uncertainty of actions, via abduction and probabilistic LP; (3) modeling moral updating (which allows other – possibly overriding – moral rules to be adopted by an agent, on top of those it currently follows) via the integration of tabling in contextual abduction and updating; and (4) modeling moral permissibility and its justification via counterfactuals, where counterfactuals are used for formulating DDE.
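As a rough, purely illustrative sketch (the dilemma, names and numbers are hypothetical; the thesis works with actual LP systems, abduction and tabling rather than plain Python), abduction with integrity constraints and preferences can be read as: generate candidate scenarios from abducibles, reject those violating an integrity constraint, then prefer among the survivors.

# abducibles: candidate actions in a toy trolley-style dilemma
ABDUCIBLES = ["divert", "do_nothing"]

def consequences(scenario):
    # toy causal model: diverting kills 1 as a side effect;
    # doing nothing lets 5 die
    if "divert" in scenario:
        return {"deaths": 1, "harm_as_means": False}
    return {"deaths": 5, "harm_as_means": False}

def satisfies_integrity(scenario):
    # DDE-style integrity constraint: reject any scenario in which
    # harm is used as the means to the goal
    return not consequences(scenario)["harm_as_means"]

# deontological filter via the integrity constraint ...
permissible = [{a} for a in ABDUCIBLES if satisfies_integrity({a})]
# ... then a utilitarian preference over the remaining scenarios
best = min(permissible, key=lambda s: consequences(s)["deaths"])
print(best)  # -> {'divert'}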
Abstract:
The amorphous silicon photo-sensor studied in this thesis is a double pin structure (p(a-SiC:H)-i’(a-SiC:H)-n(a-SiC:H)-p(a-SiC:H)-i(a-Si:H)-n(a-Si:H)) sandwiched between two transparent contacts deposited over transparent glass, thus allowing illumination on both sides, and responding to wavelengths from the ultraviolet and visible to the near-infrared range. The frontal illumination surface, on the glass side, is used for light signal inputs. Both surfaces are used for optical bias, which changes the dynamic characteristics of the photo-sensor, resulting in different outputs for the same input. Experimental studies were made with the photo-sensor to evaluate its applicability in multiplexing and demultiplexing several data communication channels. The digital light signal was defined so as to implement simple logical operations such as NOT, AND and OR, and complex ones such as XOR, MAJ, the full-adder and a memory effect. A programmable pattern emission system was built, as well as systems for the validation and recovery of the obtained signals. This photo-sensor has applications in optical communications with several wavelengths, as a wavelength detector, and in executing logical operations directly over digital light input signals.
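Setting the optics aside, the Boolean functions the sensor is reported to realise are the standard ones; as a quick reference (generic definitions, not the device model), the full-adder combines a three-input XOR with the majority (MAJ) function:

from itertools import product

def full_adder(a, b, cin):
    # sum bit: three-input XOR; carry-out: majority of the three inputs
    s = a ^ b ^ cin
    maj = (a & b) | (a & cin) | (b & cin)
    return s, maj

for bits in product((0, 1), repeat=3):
    print(bits, "->", full_adder(*bits))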
Abstract:
The aim of this dissertation is the analysis of the rules on advertising in advocacy. This is presently a controversial issue that is far from consensual. As we demonstrate throughout the text, the arguments presented are, on the one hand, the safeguarding of the deontological values that govern this professional class and, on the other hand, the interests of legal service providers in the current context. Opinions differ substantially between professionals in individual practice, who defend balanced and fair rules that assert the true worth of the professional lawyer, and those who work in an organized structure, such as a law firm, who defend more flexible rules for advertising and promoting their offices. Currently the rules on advertising for lawyers are provided by article 89 of the Statute of the Portuguese Bar Association. However, these rules will soon undergo adjustments to take into account Law no. 2/2013 of January 10, which will extend the scope of advertising for public professional associations in order to increase competition among them, at national and European level. Following this logic, arguments such as unequal access to the available means of advertising for financial reasons, or that the best-publicized service is not always the most advantageous to the customer, will be analyzed and criticized.