890 results for Abductive reasoning
Abstract:
This paper introduces CSP-like communication mechanisms into Backus’ Functional Programming (FP) systems extended by nondeterministic constructs. Several new functionals are used to describe nondeterminism and communication in programs. The functionals union and restriction are introduced into FP systems to develop a simple algebra of programs with nondeterminism. The behaviour of the other functionals proposed in this paper is characterized by the properties of union and restriction. The axiomatic semantics of the communication constructs is presented. Examples show that it is possible to reason about a communicating program by first transforming it into a non-communicating program using the axioms of communication, and then reasoning about the resulting non-communicating version. It is also shown that communicating programs can be developed from non-communicating programs given as specifications by a transformational approach.
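The roles of the union and restriction functionals can be illustrated by modelling nondeterministic programs as set-valued functions. This is a minimal sketch of the idea, not the paper's FP notation; the helper names and example programs are invented:

```python
# Hedged sketch: nondeterministic programs as set-valued functions.
# `union` offers a nondeterministic choice between two programs;
# `restriction` keeps only the outcomes satisfying a predicate.

def union(f, g):
    """Nondeterministic choice: all outcomes of f or of g."""
    return lambda x: f(x) | g(x)

def restriction(f, p):
    """Restrict f to outcomes satisfying predicate p."""
    return lambda x: {y for y in f(x) if p(y)}

inc = lambda x: {x + 1}   # a deterministic program: one outcome
dbl = lambda x: {x * 2}

# nondeterministically increment or double, then keep even results
prog = restriction(union(inc, dbl), lambda y: y % 2 == 0)
print(sorted(prog(3)))
```

Here union collects every outcome either program can produce, and restriction filters them, mirroring the roles the abstract assigns to the two functionals in the algebra of programs.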
Abstract:
The distinction between a priori and a posteriori knowledge has been the subject of an enormous amount of discussion, but the literature is biased against recognizing the intimate relationship between these forms of knowledge. For instance, it seems almost impossible to find a sample of pure a priori or a posteriori knowledge. In this paper it will be suggested that distinguishing between a priori and a posteriori is more problematic than is often acknowledged, and that a priori and a posteriori resources are in fact used in parallel. We will define this relationship between a priori and a posteriori knowledge as the bootstrapping relationship. As we will see, this relationship gives us reason to seek an altogether novel definition of a priori and a posteriori knowledge. Specifically, we will have to analyse the relationship between a priori knowledge and a priori reasoning, and it will be suggested that the latter serves as a more promising starting point for the analysis of aprioricity. We will also analyse a number of examples from the natural sciences and consider the role of a priori reasoning in them. The focus of this paper is the analysis of the concepts of a priori and a posteriori knowledge rather than the epistemic domain of a posteriori and a priori justification.
Abstract:
Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Further, an automatic verification tool is necessary to reduce human errors in the reasoning process. In this regard, this paper is an attempt towards the development of DL, a novel executable specification language for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatic verification of satisfiability of DL specifications, and testing conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions to the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
Abstract:
In social selection the phenotype of an individual depends on its own genotype as well as on the phenotypes, and so genotypes, of other individuals. This makes it impossible to associate an invariant phenotype with a genotype: the social context is crucial. Descriptions of metazoan development, which is often viewed as the acme of cooperative social behaviour, ignore or downplay this fact. The implicit justification for doing so is based on a group-selectionist point of view: embryos are clones, therefore all cells have the same evolutionary interest, and the visible differences between cells result from a common strategy. The reasoning is flawed, because phenotypic heterogeneity within groups can result from contingent choices made by cells from a flexible repertoire, as in multicellular development. What makes that possible is phenotypic plasticity, namely the ability of a genotype to exhibit different phenotypes. However, cooperative social behaviour with division of labour requires that different phenotypes interact appropriately, not that they belong to the same genotype or have overlapping genetic interests. We sketch a possible route to the evolution of social groups that involves many steps: (a) individuals that happen to be in spatial proximity benefit simply by virtue of their number; (b) traits that are already present act as preadaptations and improve the efficiency of the group; and (c) new adaptations evolve under selection in the social context, that is, via interactions between individuals, and further strengthen group behaviour. The Dictyostelid or cellular slime mould amoebae (CSMs) become multicellular in an unusual way, by the aggregation of free-living cells. In nature the resulting group can be genetically homogeneous (clonal) or heterogeneous (polyclonal); in either case its development, which displays strong cooperation between cells (to the extent of so-called altruism), is not affected. This makes the CSMs exemplars for the study of social behaviour.
Abstract:
An intelligent computer-aided defect analysis (ICADA) system, based on artificial intelligence techniques, has been developed to identify the design, process or material parameters which could be responsible for the occurrence of defective castings in a manufacturing campaign. The data on defective castings for a particular time frame, which is an input to the ICADA system, has been analysed. It was observed that a large proportion, 50-80%, of all the defective castings produced in a foundry have two, three or four types of defects occurring above a threshold proportion, say 10%. Also, a large number of defect types are either not found at all or found in a very small proportion, below a threshold of 2%. An important feature of the ICADA system is the recognition of this pattern in the analysis. Thirty casting defect types, each with 50 to 70 causes as identified in the AFS analysis of casting defects (the standard reference source for the casting process), constituted the foundation for building the knowledge base. The scientific rationale underlying the formation of a defect during the casting process was identified and 38 metacauses were coded. Process, material and design parameters which contribute to the metacauses were systematically examined and 112 were identified as rootcauses. The interconnections between defects, metacauses and rootcauses were represented as a three-tier structured graph, and the handling of uncertainty in the occurrence of events such as defects, metacauses and rootcauses was achieved by Bayesian analysis. A hill-climbing search technique, associated with forward reasoning, was employed to recognize one or several rootcauses.
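A toy version of the three-tier reasoning can be sketched as follows. The defect, metacause and rootcause names and all probabilities are invented for illustration, and a greedy best-first ranking stands in for the full Bayesian analysis and hill-climbing search of the ICADA system:

```python
# Hedged sketch: a three-tier defect -> metacause -> rootcause graph with
# naive evidence propagation. All names and numbers are illustrative.

# Tier links: P(defect | metacause) and P(metacause | rootcause)
P_DEFECT_GIVEN_META = {
    ("blowhole", "gas_entrapment"): 0.8,
    ("blowhole", "low_permeability"): 0.5,
    ("shrinkage", "poor_feeding"): 0.9,
}
P_META_GIVEN_ROOT = {
    ("gas_entrapment", "high_moisture"): 0.7,
    ("low_permeability", "fine_sand"): 0.6,
    ("poor_feeding", "small_riser"): 0.8,
}

def rootcause_scores(observed_defects):
    """Score each rootcause by propagating evidence down the two tiers of links."""
    scores = {}
    for (meta, root), p_mr in P_META_GIVEN_ROOT.items():
        for defect in observed_defects:
            p_dm = P_DEFECT_GIVEN_META.get((defect, meta), 0.0)
            scores[root] = scores.get(root, 0.0) + p_dm * p_mr
    return scores

def rank_rootcauses(observed_defects):
    """Greedy best-first ranking of candidate rootcauses."""
    scores = rootcause_scores(observed_defects)
    return sorted(scores, key=scores.get, reverse=True)

print(rank_rootcauses(["blowhole"]))
```

The real system replaces the additive scores with proper Bayesian updating over the structured graph, but the flow of evidence from observed defects through metacauses to rootcauses is the same.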
Abstract:
We report results of statistical and dynamic analysis of the serrated stress-time curves obtained from compressive constant strain-rate tests on two metallic glass samples with different ductility levels in an effort to extract hidden information in the seemingly irregular serrations. Two distinct types of dynamics are detected in these two alloy samples. The stress-strain curve corresponding to the less ductile Zr65Cu15Ni10Al10 alloy is shown to exhibit a finite correlation dimension and a positive Lyapunov exponent, suggesting that the underlying dynamics is chaotic. In contrast, for the more ductile Cu47.5Zr47.5Al5 alloy, the distributions of stress drop magnitudes and their time durations obey a power-law scaling reminiscent of a self-organized critical state. The exponents also satisfy the scaling relation compatible with self-organized criticality. Possible physical mechanisms contributing to the two distinct dynamic regimes are discussed by drawing on the analogy with the serrated yielding of crystalline samples. The analysis, together with some physical reasoning, suggests that plasticity in the less ductile sample can be attributed to stick-slip of a single shear band, while that of the more ductile sample could be attributed to the simultaneous nucleation of a large number of shear bands and their mutual interactions.
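The power-law scaling test mentioned above can be sketched as a log-log fit of the stress-drop size distribution. The synthetic data, the binning scheme, and the plain least-squares fit are illustrative assumptions, not the paper's actual analysis pipeline:

```python
import math
import random

# Hedged sketch: estimate a power-law exponent from event magnitudes by
# fitting log(density) against log(size) over log-spaced histogram bins.

def powerlaw_exponent(samples, bins=10):
    """Fit log(density) = a - alpha*log(size); return alpha."""
    lo, hi = min(samples), max(samples)
    edges = [lo * (hi / lo) ** (i / bins) for i in range(bins + 1)]
    xs, ys = [], []
    for i in range(bins):
        n = sum(1 for s in samples if edges[i] <= s < edges[i + 1])
        if n > 0:
            width = edges[i + 1] - edges[i]
            xs.append(math.log((edges[i] + edges[i + 1]) / 2))
            ys.append(math.log(n / width))  # density, not raw count
    # ordinary least-squares slope of ys against xs
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return -slope

# synthetic magnitudes drawn from p(s) ~ s^-2 by inverse-transform sampling
random.seed(0)
samples = [1.0 / (1.0 - random.random()) for _ in range(20000)]
print(round(powerlaw_exponent(samples), 2))
```

For real serration data one would also check the time-duration distribution and the scaling relation between the exponents, as the abstract describes; maximum-likelihood estimators are preferred over histogram fits in practice.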
Abstract:
A framework based on the notion of "conflict-tolerance" was proposed in earlier work as a compositional methodology for developing and reasoning about systems that comprise multiple independent controllers. A central notion in this framework is that of a "conflict-tolerant" specification for a controller. In this work we propose a way of defining conflict-tolerant real-time specifications in Metric Interval Temporal Logic (MITL). We call our logic CT-MITL, for Conflict-Tolerant MITL. We then give a clock-optimal "delay-then-extend" construction for building a timed transition system for monitoring past-MITL formulas. We show how this monitoring transition system can be used to solve the associated verification and synthesis problems for CT-MITL.
Abstract:
A fuzzy logic intelligent system is developed for gas-turbine fault isolation. The gas path measurements used for fault isolation are exhaust gas temperature, low and high rotor speed, and fuel flow. These four measurements, also called the cockpit parameters, are found in almost all older and newer jet engines. The fuzzy logic system uses rules developed from a model of performance influence coefficients to isolate engine faults while accounting for uncertainty in gas path measurements. It automates the reasoning process of an experienced powerplant engineer. Tests with simulated data show that the fuzzy system isolates faults with an accuracy of 89% with only the four cockpit measurements. However, if additional pressure and temperature probes between the compressors and before the burner, which are often found in newer jet engines, are considered, the fault isolation accuracy rises to as high as 98%. These additional sensors also help keep the fault isolation system robust as the quality of the measured data deteriorates.
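Rule-based fuzzy fault isolation over the four cockpit parameters can be sketched as follows. The membership function, the deviation patterns, and the fault names are invented for illustration; the actual system derives its rules from performance influence coefficients:

```python
# Hedged sketch: fuzzy-rule fault isolation. Each fault is described by the
# expected sign of deviation in EGT, N1, N2 and fuel flow (wf), all in
# percent of nominal; rules fire by Mamdani-style min-AND matching.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def high(delta):
    """Degree to which a deviation (in %) counts as 'high'."""
    return tri(delta, 0.5, 2.0, 4.0)

# toy rule base: fault -> expected deviation pattern (signs are invented)
RULES = {
    "hpt_damage": {"egt": +1, "n1": 0, "n2": -1, "wf": +1},
    "fan_damage": {"egt": +1, "n1": -1, "n2": 0, "wf": +1},
}

def isolate(measured):
    """Score each fault by the weakest matching measurement; pick the best."""
    scores = {}
    for fault, pattern in RULES.items():
        degrees = [high(sign * measured[k]) for k, sign in pattern.items() if sign]
        scores[fault] = min(degrees)
    return max(scores, key=scores.get)

print(isolate({"egt": 2.0, "n1": 0.1, "n2": -2.0, "wf": 1.8}))
```

The min-AND combination is what lets the system tolerate measurement uncertainty: a fault is only asserted to the degree that every expected symptom is present.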
Abstract:
Satisfiability algorithms for propositional logic have improved enormously in recent years. This improvement increases the attractiveness of satisfiability methods for first-order logic that reduce the problem to a series of ground-level satisfiability problems. R. Jeroslow introduced a partial instantiation method of this kind that differs radically from the standard resolution-based methods. This paper lays the theoretical groundwork for an extension of his method that is general enough and efficient enough for general logic programming with indefinite clauses. In particular, we improve Jeroslow's approach by (1) extending it to logic with functions, (2) accelerating it through the use of satisfiers, as introduced by Gallo and Rago, and (3) simplifying it to obtain further speedup. We provide a similar development for a "dual" partial instantiation approach defined by Hooker and suggest a primal-dual strategy. We prove correctness of the primal and dual algorithms for full first-order logic with functions, as well as termination on unsatisfiable formulas. We also report some preliminary computational results.
Abstract:
Equilibrium concentrations of various condensed and gaseous phases have been thermodynamically calculated, using the free energy minimization criterion, for the metalorganic chemical vapour deposition (MOCVD) of copper films using bis(2,2,6,6-tetramethyl-3,5-heptadionato)copper(II) as the precursor material. From among the many chemical species that may possibly result from the CVD process, only those expected on the basis of mass spectrometric analysis and chemical reasoning to be present at equilibrium, under different CVD conditions, are used in the thermodynamic calculations. The study predicts the deposition of pure, carbon-free copper in the inert atmosphere of argon as well as in the reactive hydrogen atmosphere, over a wide range of substrate temperatures and total reactor pressures. Thin films of copper, grown on SiO2/Si(100) substrates from this metalorganic precursor by low pressure CVD have been characterized by XRD and AES. The experimentally determined composition of CVD-grown copper films is in reasonable agreement with that predicted by thermodynamic analysis.
Abstract:
A unique phenomenon of ‘autoacceleration’ was observed in the free radical polymerization of vinyl monomers with oxygen. Unlike the well-known autoacceleration phenomenon in polymerization processes, this unusual phenomenon is not readily explained by solution-viscosity-based reasoning. Surprisingly, we have observed this new autoacceleration during free radical oxidative polymerization of some vinyl monomers at low conversions, where the polymerization reaction is generally zero order, the conversion–time plots are linear and viscosity effects are negligible. In the present paper, we interpret the mechanism of this new autoacceleration phenomenon on the basis of the reactivity of the propagating radicals in terms of heat-of-formation data.
Abstract:
We know, from the classical work of Tarski on real closed fields, that elimination is, in principle, a fundamental engine for mechanized deduction. But, in practice, the high complexity of elimination algorithms has limited their use in the realization of mechanical theorem proving. We advocate qualitative theorem proving, where elimination is attractive since most processes of reasoning take place through the elimination of middle terms, and because the computational complexity of the proof is not an issue. Indeed, what we need is the existence of the proof and not its mechanization. In this paper, we treat the linear case and illustrate the power of this paradigm by giving extremely simple proofs of two central theorems in the complexity and geometry of linear programming.
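In the linear case, "elimination of middle terms" is essentially Fourier-Motzkin elimination of a variable from a system of linear inequalities. A minimal sketch, with an invented constraint encoding (each constraint is a pair `(coeffs, bound)` meaning `sum(coeffs[i]*x[i]) <= bound`):

```python
from fractions import Fraction

# Hedged sketch: Fourier-Motzkin elimination of variable j. Every lower
# bound on x_j is paired with every upper bound, cancelling x_j exactly.

def eliminate(constraints, j):
    """Return an equivalent system of inequalities with variable j eliminated."""
    lower, upper, rest = [], [], []
    for coeffs, b in constraints:
        c = coeffs[j]
        if c > 0:
            upper.append((coeffs, b))   # gives an upper bound on x_j
        elif c < 0:
            lower.append((coeffs, b))   # gives a lower bound on x_j
        else:
            rest.append((coeffs, b))    # x_j does not appear
    for lc, lb in lower:
        for uc, ub in upper:
            # scale so the x_j coefficients become -1 and +1, then add
            sl, su = Fraction(-1) / lc[j], Fraction(1) / uc[j]
            coeffs = [sl * a + su * c for a, c in zip(lc, uc)]
            coeffs[j] = Fraction(0)
            rest.append((coeffs, sl * lb + su * ub))
    return rest

# x0 >= 1 and x0 + x1 <= 3  ==>  x1 <= 2 after eliminating x0
system = [([Fraction(-1), Fraction(0)], Fraction(-1)),   # -x0 <= -1
          ([Fraction(1), Fraction(1)], Fraction(3))]     #  x0 + x1 <= 3
print(eliminate(system, 0))
```

Each elimination step can square the number of constraints, which is exactly the complexity the abstract says qualitative theorem proving can afford to ignore: only the existence of the derived inequality matters for the proof.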
Abstract:
Denial-of-service (DoS) attacks form a very important category of security threats that are prevalent in MIPv6 (mobile internet protocol version 6) today. Many schemes have been proposed to alleviate such threats, including one of our own [9]. However, reasoning about the correctness of such protocols is not trivial. In addition, new solutions to mitigate attacks may need to be deployed in the network on a frequent basis, as and when attacks are detected, since it is practically impossible to anticipate all attacks and provide solutions in advance. This makes it necessary to validate the solutions in a timely manner before deployment in the real network. However, the threshold schemes needed in group protocols make analysis complex, and model checking of threshold-based group protocols that employ cryptography has not been successful so far. Here, we propose a new simulation-based approach for validation using a tool called FRAMOGR that supports executable specification of group protocols that use cryptography. FRAMOGR allows one to specify attackers and track probability distributions of values or paths. We believe that infrastructure such as FRAMOGR will be required in future for validating new group-based threshold protocols that may be needed to make MIPv6 more robust.
Abstract:
This paper describes techniques to estimate the worst-case execution time of executable code on architectures with data caches. The underlying mechanism is Abstract Interpretation, which is used for the dual purposes of tracking address computations and cache behavior. A simultaneous numeric and pointer analysis using an abstraction for discrete sets of values computes safe approximations of access addresses, which are then used to predict cache behavior using Must Analysis. A heuristic is also proposed which generates likely worst-case estimates; it can be used in soft real-time systems and also for reasoning about the tightness of the safe estimate. The analysis methods can handle programs with non-affine access patterns, for which conventional Presburger Arithmetic formulations or Cache Miss Equations do not apply. The precision of the estimates is user-controlled and can be traded off against analysis time. Executables are analyzed directly, which, apart from enhancing precision, renders the method language-independent.
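The Must Analysis step can be sketched with the classic abstract domain for LRU caches: the abstract state maps each memory block to an upper bound on its LRU age, and an access is a guaranteed hit when that bound is below the associativity. The code below is a simplified single-set illustration, not the paper's implementation:

```python
# Hedged sketch: Must Analysis for one set of an LRU cache.
# state: block -> upper bound on its LRU age (0 = most recently used).

ASSOC = 4  # assume a 4-way set-associative cache

def access(state, block):
    """Abstract transformer for one access; returns (new_state, classification)."""
    hit = block in state and state[block] < ASSOC
    age = state.get(block, ASSOC)
    new = {}
    for b, a in state.items():
        if b != block:
            # only blocks definitely younger than the accessed block age by one
            new[b] = a + 1 if a < age else a
    new[block] = 0  # accessed block becomes most recently used
    return new, ("always-hit" if hit else "not-classified")

def join(s1, s2):
    """Must-join at control-flow merges: keep blocks known in BOTH states,
    with the worse (larger) age bound."""
    return {b: max(s1[b], s2[b]) for b in s1.keys() & s2.keys()}

state = {}
for b in ["a", "b", "a"]:
    state, cls = access(state, b)
print(cls)  # the second access to "a" is a guaranteed hit
```

In the paper's setting the accessed `block` is itself only known as a safe approximation (a discrete set of addresses) computed by the numeric and pointer analysis, so the transformer must be applied conservatively over all possible blocks.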
Abstract:
Mobile ad hoc networks (MANETs) are a successful wireless network paradigm offering unrestricted mobility without depending on any underlying infrastructure. MANETs have become an exciting and important technology in recent years because of the rapid proliferation of wireless devices and the increased use of ad hoc networks in various applications. Like any other network, MANETs are prone to a variety of attacks, chiefly on the routing side; most of the proposed secure routing solutions based on cryptography and authentication methods carry considerable overhead, which results in latency and resource-crunch problems, especially with respect to energy. The successful working of these mechanisms also depends on secure key management involving a trusted third authority, which is generally difficult to implement in a MANET environment due to the volatile topology. Designing a secure routing algorithm for MANETs that incorporates the notion of trust without maintaining any trusted third entity has been an interesting research problem in recent years. This paper proposes a new trust model based on cognitive reasoning, which associates the notion of trust with all the member nodes of a MANET using a novel Behaviors-Observations-Beliefs (BOB) model. These trust values are used for detection and prevention of malicious and dishonest nodes while routing the data. The proposed trust model works with the DTM-DSR protocol, which involves computation of direct trust between any two nodes using cognitive knowledge. We take care of trust fading over time, rewards, and penalties while computing the trustworthiness of a node and also of a route. A simulator was developed for testing the proposed algorithm; the results of experiments show that incorporating cognitive reasoning into the computation of trust in routing effectively detects intrusions in a MANET environment and generates more reliable routes for secure routing of data.
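A direct-trust update with fading, rewards and penalties can be sketched as follows. The constants and the update rule are invented for illustration; the BOB model described above is considerably richer:

```python
# Hedged sketch: direct trust with exponential fading of old evidence,
# a reward for observed good behaviour and a larger penalty for bad
# behaviour. All constants are illustrative.

FADE = 0.95      # per-step decay of accumulated trust
REWARD = 0.10    # trust gained on an observed good action
PENALTY = 0.30   # trust lost on an observed bad action (penalise harder)

def update_trust(trust, observation):
    """Fold one observation ('good' or 'bad') into a trust value in [0, 1]."""
    trust *= FADE  # older evidence fades over time
    if observation == "good":
        trust = min(1.0, trust + REWARD)
    else:
        trust = max(0.0, trust - PENALTY)
    return trust

def route_trust(node_trusts):
    """A route is only as trustworthy as its least trusted node."""
    return min(node_trusts)

t = 0.5  # neutral prior for a newly observed node
for obs in ["good", "good", "bad", "good"]:
    t = update_trust(t, obs)
print(round(t, 3))
```

Making the penalty larger than the reward means trust is slow to build and quick to lose, which is the usual design choice for detecting malicious nodes before they can do much damage.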