31 results for sociomoral reasoning
Abstract:
Satisfiability algorithms for propositional logic have improved enormously in recent years. This improvement increases the attractiveness of satisfiability methods for first-order logic that reduce the problem to a series of ground-level satisfiability problems. R. Jeroslow introduced a partial instantiation method of this kind that differs radically from the standard resolution-based methods. This paper lays the theoretical groundwork for an extension of his method that is general enough and efficient enough for general logic programming with indefinite clauses. In particular, we improve Jeroslow's approach by (1) extending it to logic with functions, (2) accelerating it through the use of satisfiers, as introduced by Gallo and Rago, and (3) simplifying it to obtain further speedup. We provide a similar development for a "dual" partial instantiation approach defined by Hooker and suggest a primal-dual strategy. We prove correctness of the primal and dual algorithms for full first-order logic with functions, as well as termination on unsatisfiable formulas. We also report some preliminary computational results.
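The abstract describes the reduce-to-ground-SAT strategy only at a high level. As a rough illustration of that general strategy (and not of Jeroslow's, Hooker's, or the authors' actual primal or dual algorithms), here is a minimal Python sketch that grounds a universally quantified clause set over increasingly deep Herbrand terms and tests each ground level for propositional satisfiability; the clause encoding and all names are illustrative assumptions.

```python
# Minimal sketch of instantiation-based refutation for universally
# quantified clause sets, in the spirit of (but much cruder than) the
# partial instantiation methods discussed in the abstract.
from itertools import product

def herbrand_terms(constants, functions, depth):
    """Ground terms built from constants and unary function symbols."""
    terms = set(constants)
    for _ in range(depth):
        terms |= {f"{f}({t})" for f in functions for t in terms}
    return terms

def ground(clauses, variables, terms):
    """Instantiate every clause with every assignment of terms to vars."""
    grounded = set()
    for clause in clauses:
        for combo in product(terms, repeat=len(variables)):
            sub = dict(zip(variables, combo))
            grounded.add(frozenset((sign, f"{pred}({sub[var]})")
                                   for sign, pred, var in clause))
    return grounded

def sat(ground_clauses):
    """Brute-force propositional SAT over the ground atoms (exponential;
    a real implementation would call a modern SAT solver here)."""
    atoms = sorted({a for cl in ground_clauses for _, a in cl})
    for bits in product([False, True], repeat=len(atoms)):
        model = dict(zip(atoms, bits))
        if all(any(model[a] == sign for sign, a in cl) for cl in ground_clauses):
            return True
    return False

# Toy example: P(x) together with ~P(x) is already unsatisfiable at depth 0.
# Clause format: list of (sign, predicate, variable) literals, x universal.
clauses = [[(True, "P", "x")], [(False, "P", "x")]]
for depth in range(3):
    terms = herbrand_terms({"c"}, {"f"}, depth)
    if not sat(ground(clauses, ["x"], terms)):
        print(f"unsatisfiable (detected at instantiation depth {depth})")
        break
```

By Herbrand's theorem this loop is a semi-decision procedure: it terminates on unsatisfiable inputs at some finite depth, mirroring the termination property the paper proves for its primal and dual algorithms.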
Abstract:
Equilibrium concentrations of various condensed and gaseous phases have been thermodynamically calculated, using the free energy minimization criterion, for the metalorganic chemical vapour deposition (MOCVD) of copper films using bis(2,2,6,6-tetramethyl-3,5-heptanedionato)copper(II) as the precursor material. From among the many chemical species that may possibly result from the CVD process, only those expected, on the basis of mass spectrometric analysis and chemical reasoning, to be present at equilibrium under different CVD conditions are used in the thermodynamic calculations. The study predicts the deposition of pure, carbon-free copper in the inert atmosphere of argon as well as in the reactive hydrogen atmosphere, over a wide range of substrate temperatures and total reactor pressures. Thin films of copper grown on SiO2/Si(100) substrates from this metalorganic precursor by low-pressure CVD have been characterized by XRD and AES. The experimentally determined composition of CVD-grown copper films is in reasonable agreement with that predicted by thermodynamic analysis.
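The free energy minimization criterion can be made concrete with a small numerical sketch. The species set, element-balance matrix, and dimensionless chemical potentials below are hypothetical placeholders (not the paper's data), SciPy's generic SLSQP optimizer stands in for whatever solver the authors used, and gas-phase activities are simplified to mole fractions at unit total pressure.

```python
# Sketch of equilibrium computation by Gibbs free energy minimization.
# All species and mu0/RT values are invented placeholders.
import numpy as np
from scipy.optimize import minimize

species = ["Cu(s)", "H2", "CH4", "C(s)"]           # illustrative only
mu0_RT  = np.array([0.0, 0.0, -12.0, 0.0])         # mu0_i / RT, placeholders
# Element balance matrix: rows = elements (Cu, H, C), cols = species.
A = np.array([[1, 0, 0, 0],    # Cu
              [0, 2, 4, 0],    # H
              [0, 0, 1, 1]])   # C
b = np.array([1.0, 4.0, 1.0])  # total moles of each element (placeholder)
gas = np.array([False, True, True, False])         # gaseous species

def G_RT(n):
    """Dimensionless Gibbs energy: n_i*(mu0_i/RT + ln x_i) for gases,
    n_i*mu0_i/RT for pure condensed phases (unit activity)."""
    n = np.maximum(n, 1e-12)                       # keep logs finite
    g = (n * mu0_RT).sum()
    g += (n[gas] * np.log(n[gas] / n[gas].sum())).sum()
    return g

cons = {"type": "eq", "fun": lambda n: A @ n - b}  # conserve each element
res = minimize(G_RT, x0=np.full(len(species), 0.5),
               bounds=[(0, None)] * len(species), constraints=cons,
               method="SLSQP")
print(dict(zip(species, res.x.round(4))))          # equilibrium mole numbers
```

The equilibrium composition is whichever nonnegative mole-number vector minimizes the total Gibbs energy while conserving every element, which is exactly the criterion the abstract invokes.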
Abstract:
A unique phenomenon of ‘autoacceleration’ was observed in the free radical polymerization of vinyl monomers with oxygen. Unlike the well-known autoacceleration phenomenon in polymerization processes, this unusual phenomenon is not readily conceivable in terms of solution-viscosity-based reasoning. Surprisingly, we have observed the manifestation of this new autoacceleration during free radical oxidative polymerization of some vinyl monomers at low conversions, where generally the polymerization reaction is zero order, the conversion–time plots are linear, and viscosity effects are negligible. In the present paper, we interpret the mechanism of this new autoacceleration phenomenon on the basis of the reactivity of the propagating radicals in terms of heat of formation data.
Abstract:
We know, from the classical work of Tarski on real closed fields, that elimination is, in principle, a fundamental engine for mechanized deduction. But, in practice, the high complexity of elimination algorithms has limited their use in the realization of mechanical theorem proving. We advocate qualitative theorem proving, where elimination is attractive since most processes of reasoning take place through the elimination of middle terms, and because the computational complexity of the proof is not an issue. Indeed, what we need is the existence of the proof and not its mechanization. In this paper, we treat the linear case and illustrate the power of this paradigm by giving extremely simple proofs of two central theorems in the complexity and geometry of linear programming.
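The abstract does not name its elimination procedure; the classical algorithm for eliminating a variable ("middle term") from a system of linear inequalities is Fourier-Motzkin elimination, sketched below on a toy system. This is offered to illustrate the paradigm only; the paper's own linear-case treatment may differ.

```python
# Minimal sketch of Fourier-Motzkin elimination: the classical way to
# eliminate one variable from a system of linear inequalities.
from fractions import Fraction

def eliminate(ineqs, j):
    """Eliminate variable j from rows (a, b) meaning a.x <= b.
    Returns an equivalent system over the remaining variables."""
    pos, neg, rest = [], [], []
    for a, b in ineqs:
        (pos if a[j] > 0 else neg if a[j] < 0 else rest).append((a, b))
    out = list(rest)
    for ap, bp in pos:          # combine each upper bound on x_j ...
        for an, bn in neg:      # ... with each lower bound on x_j
            lam, mu = -an[j], ap[j]   # positive multipliers cancelling x_j
            a = [lam * x + mu * y for x, y in zip(ap, an)]
            out.append((a, lam * bp + mu * bn))
    return out

# x + y <= 4, -x <= -1 (i.e. x >= 1), -y <= 0; eliminate x:
F = Fraction
system = [([F(1), F(1)], F(4)), ([F(-1), F(0)], F(-1)), ([F(0), F(-1)], F(0))]
print(eliminate(system, 0))   # -> constraints on y alone: y <= 3, -y <= 0
```

Each elimination step is itself a proof: the combined inequality is a nonnegative linear combination of the originals, which is why existence of a proof (rather than its mechanization) is all this paradigm requires.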
Abstract:
Denial-of-service (DoS) attacks form a very important category of security threats that are prevalent in MIPv6 (Mobile Internet Protocol version 6) today. Many schemes have been proposed to alleviate such threats, including one of our own [9]. However, reasoning about the correctness of such protocols is not trivial. In addition, new solutions to mitigate attacks may need to be deployed in the network frequently, as and when attacks are detected, since it is practically impossible to anticipate all attacks and provide solutions in advance. This makes it necessary to validate the solutions in a timely manner before deployment in the real network. However, the threshold schemes needed in group protocols make analysis complex, and model checking of threshold-based group protocols that employ cryptography has not been successful so far. Here, we propose a new simulation-based approach for validation using a tool called FRAMOGR that supports executable specification of group protocols that use cryptography. FRAMOGR allows one to specify attackers and track probability distributions of values or paths. We believe that infrastructure such as FRAMOGR will be required in the future for validating new group-based threshold protocols that may be needed to make MIPv6 more robust.
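Since the abstract singles out threshold schemes as what makes group-protocol analysis hard, a concrete example of the class may help: the sketch below is a minimal Shamir t-of-n secret-sharing implementation over a small prime field. It illustrates what a threshold scheme is; it is not FRAMOGR's specification language or any protocol from the paper.

```python
# Minimal Shamir t-of-n secret sharing over GF(P): any t shares
# reconstruct the secret, fewer than t reveal nothing about it.
import random

P = 2**13 - 1                               # small Mersenne prime (8191)

def share(secret, t, n):
    """Split secret into n shares, any t of which suffice."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    eval_at = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, eval_at(x)) for x in range(1, n + 1)]

def reconstruct(points):
    """Lagrange interpolation of the degree-(t-1) polynomial at x = 0."""
    secret = 0
    for j, (xj, yj) in enumerate(points):
        num = den = 1
        for m, (xm, _) in enumerate(points):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = share(1234, t=3, n=5)
print(reconstruct(shares[:3]))              # any 3 of 5 shares recover 1234
```

The difficulty the abstract alludes to is visible even here: correctness depends on which subsets of parties combine their shares, so the state space a model checker must cover grows combinatorially with n and t.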
Abstract:
This paper describes techniques to estimate the worst-case execution time of executable code on architectures with data caches. The underlying mechanism is abstract interpretation, which is used for the dual purposes of tracking address computations and cache behavior. A simultaneous numeric and pointer analysis using an abstraction for discrete sets of values computes safe approximations of access addresses, which are then used to predict cache behavior using Must Analysis. A heuristic is also proposed which generates likely worst-case estimates; it can be used in soft real-time systems and also for reasoning about the tightness of the safe estimate. The analysis methods can handle programs with non-affine access patterns, for which conventional Presburger arithmetic formulations or Cache Miss Equations do not apply. The precision of the estimates is user-controlled and can be traded off against analysis time. Executables are analyzed directly, which, apart from enhancing precision, renders the method language-independent.
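Of the techniques named above, Must Analysis has a compact core that can be sketched. The following is a minimal illustration of the standard must-cache abstract domain for a fully associative LRU cache (upper bounds on block ages; join keeps blocks in both states at their maximal age); the paper's analysis additionally tracks access addresses and real cache geometries, which this sketch omits.

```python
# Minimal Must Analysis sketch for a fully associative LRU cache.
# A block with abstract age < A is guaranteed to be cached (a must-hit).
A = 4  # associativity: ages 0 (youngest) .. A-1

def update(state, block):
    """Abstract LRU update: accessed block gets age 0; blocks that were
    younger than its old age grow one step older."""
    old = state.get(block, A)                 # A means "possibly not cached"
    new = {block: 0}
    for blk, age in state.items():
        if blk != block:
            aged = age + 1 if age < old else age
            if aged < A:                      # age >= A: no longer a must-hit
                new[blk] = aged
    return new

def join(s1, s2):
    """Must-join at control-flow merges: intersect, keep the maximal age."""
    return {b: max(s1[b], s2[b]) for b in s1.keys() & s2.keys()}

s = {}
for b in ["a", "b", "a"]:
    if s.get(b, A) < A:
        print(f"access {b}: guaranteed hit")   # second access to "a"
    s = update(s, b)
```

Because the domain only ever over-approximates ages, every predicted hit is safe: the concrete cache can be younger than the abstraction says, never older.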
Abstract:
Mobile ad hoc networks (MANETs) are one of the successful wireless network paradigms, offering unrestricted mobility without depending on any underlying infrastructure. MANETs have become an exciting and important technology in recent years because of the rapid proliferation of wireless devices and the increased use of ad hoc networks in various applications. Like any other network, MANETs are prone to a variety of attacks, mainly on the routing side; most of the proposed secure routing solutions based on cryptography and authentication methods carry high overhead, which results in latency problems and resource crunch, especially on the energy side. The successful working of these mechanisms also depends on secure key management involving a trusted third authority, which is generally difficult to implement in a MANET environment due to the volatile topology. Designing a secure routing algorithm for MANETs that incorporates the notion of trust without maintaining any trusted third entity has been an interesting research problem in recent years. This paper proposes a new trust model based on cognitive reasoning, which associates the notion of trust with all the member nodes of a MANET using a novel Behaviors-Observations-Beliefs (BOB) model. These trust values are used for detection and prevention of malicious and dishonest nodes while routing data. The proposed trust model works with the DTM-DSR protocol, which involves computation of direct trust between any two nodes using cognitive knowledge. We account for trust fading over time, rewards, and penalties while computing the trustworthiness of a node and of a route. A simulator was developed to test the proposed algorithm; the experimental results show that incorporating cognitive reasoning into the computation of trust for routing effectively detects intrusions in a MANET environment and generates more reliable routes for secure routing of data.
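The abstract names the ingredients of the trust computation (fading over time, rewards, penalties) without giving formulas. The sketch below is a hypothetical illustration of that bookkeeping; the class name, the exponential-fading form, and every constant are assumptions, not the BOB model's actual equations.

```python
# Hypothetical sketch of direct-trust bookkeeping with time fading,
# rewards, and penalties. All constants are assumed placeholders.
import math

class DirectTrust:
    def __init__(self, fade=0.05, reward=0.10, penalty=0.30):
        self.value, self.t = 0.5, 0.0        # start at a neutral trust level
        self.fade, self.reward, self.penalty = fade, reward, penalty

    def _decay(self, now):
        # Exponential fading toward the neutral value 0.5: old evidence
        # counts for less as time passes without fresh observations.
        w = math.exp(-self.fade * (now - self.t))
        self.value = 0.5 + (self.value - 0.5) * w
        self.t = now

    def observe(self, now, cooperative):
        self._decay(now)
        delta = self.reward if cooperative else -self.penalty
        self.value = min(1.0, max(0.0, self.value + delta))

trust = DirectTrust()
for t, ok in [(1, True), (2, True), (10, False)]:
    trust.observe(t, ok)
print(f"trust after observations: {trust.value:.3f}")
# Route trust could then be aggregated, e.g., as the min or the product
# of per-link direct trusts along the route.
```

Making the penalty larger than the reward, as here, encodes the usual asymmetry in trust systems: reputation is slow to build and quick to lose.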
Abstract:
The rapid emergence of infectious diseases calls for immediate attention to determine practical solutions for intervention strategies. To this end, it becomes necessary to obtain a holistic view of the complex host-pathogen interactome. Advances in omics and related technology have resulted in massive generation of data for the interacting systems at unprecedented levels of detail. Systems-level studies with the aid of mathematical tools contribute to a deeper understanding of biological systems, where intuitive reasoning alone does not suffice. In this review, we discuss different aspects of host-pathogen interactions (HPIs) and the available data resources and tools used to study them. We discuss in detail models of HPIs at various levels of abstraction, along with their applications and limitations. We also enlist a few case studies, which incorporate different modeling approaches, providing significant insights into disease. (c) 2013 Wiley Periodicals, Inc.
Abstract:
Moving shadow detection and removal from the extracted foreground regions of video frames aim to limit the risk of misclassifying moving shadows as part of moving objects; this enhances the accuracy of detection and classification of moving objects. With similar reasoning, the present paper proposes an efficient method for discriminating moving-object and moving-shadow regions in a video sequence with no human intervention. The method imposes a low computational burden and works effectively under dynamic traffic conditions on highways and street ways, with and without marking lines. Further, we use scale-invariant feature transform based features for the classification of moving vehicles (with and without shadow regions), which enhances the effectiveness of the proposed method. The potential of the method is tested on various data sets collected from different road traffic scenarios, and its superiority over existing methods is demonstrated by comparison. (C) 2013 Elsevier GmbH. All rights reserved.
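As background for the discrimination task described above, the sketch below implements one widely used cue: a cast shadow darkens the background while roughly preserving its chromaticity. This is a generic baseline for intuition only, not the paper's SIFT-based method, and all thresholds are invented.

```python
# Chromaticity-based moving-shadow cue: a shadow pixel is darker than
# the background but keeps nearly the same color proportions.
import numpy as np

def shadow_mask(frame, background, lo=0.4, hi=0.9, chroma_tol=0.05):
    """frame, background: float arrays in [0, 1] with shape (H, W, 3)."""
    eps = 1e-6
    ratio = frame.sum(-1) / (background.sum(-1) + eps)   # brightness ratio
    darker = (ratio > lo) & (ratio < hi)                 # dimmed, not black
    f_chroma = frame / (frame.sum(-1, keepdims=True) + eps)
    b_chroma = background / (background.sum(-1, keepdims=True) + eps)
    similar = np.abs(f_chroma - b_chroma).max(-1) < chroma_tol
    return darker & similar   # True where a moving-shadow pixel is likely

rng = np.random.default_rng(0)
bg = rng.uniform(0.3, 0.8, (4, 4, 3))
frame = bg.copy()
frame[:2] *= 0.6              # top half: synthetic shadow (uniform dimming)
print(shadow_mask(frame, bg).astype(int))   # 1s in the shadowed top half
```

A moving vehicle, by contrast, typically changes both brightness and chromaticity relative to the background, which is what lets such a cue separate the two classes before any feature-based classification.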
Abstract:
The presence of software bloat in large flexible software systems can hurt energy efficiency. However, identifying and mitigating bloat is fairly effort intensive. To enable such efforts to be directed where there is a substantial potential for energy savings, we investigate the impact of bloat on power consumption under different situations. We conduct the first systematic experimental study of the joint power-performance implications of bloat across a range of hardware and software configurations on modern server platforms. The study employs controlled experiments to expose different effects of a common type of Java runtime bloat, excess temporary objects, in the context of the SPECPower_ssj2008 workload. We introduce the notion of equi-performance power reduction to characterize the impact, in addition to peak power comparisons. The results show a wide variation in energy savings from bloat reduction across these configurations. Energy efficiency benefits at peak performance tend to be most pronounced when bloat affects a performance bottleneck and non-bloated resources have low energy proportionality. Equi-performance power savings are highest when bloated resources have a high degree of energy proportionality. We develop an analytical model that establishes a general relation between resource pressure caused by bloat and its energy efficiency impact under different conditions of resource bottlenecks and energy proportionality. Applying the model to different "what-if" scenarios, we predict the impact of bloat reduction and corroborate these predictions with empirical observations. Our work shows that the prevalent software-only view of bloat is inadequate for assessing its power-performance impact and instead provides a full systems approach for reasoning about its implications.
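The notion of equi-performance power reduction can be illustrated with invented numbers: compare power draw at the same offered load before and after bloat removal, under a simple linear power model. All capacities and wattages below are made up; only the comparison structure reflects the abstract.

```python
# Back-of-the-envelope reading of "equi-performance power reduction":
# hold throughput fixed and compare power, instead of comparing peaks.
def power(util, p_idle, p_peak):
    # Simple linear power model; energy proportionality improves as
    # p_idle shrinks relative to p_peak.
    return p_idle + (p_peak - p_idle) * util

target_ops = 60_000                        # fixed offered load (made up)
bloated_capacity, debloated_capacity = 80_000, 100_000

p_bloat = power(target_ops / bloated_capacity,  p_idle=120, p_peak=300)
p_lean  = power(target_ops / debloated_capacity, p_idle=120, p_peak=300)
print(f"equi-performance power saving: {p_bloat - p_lean:.1f} W")
```

Raising p_idle in this toy model shrinks the saving for the same utilization gap, which matches the abstract's observation that equi-performance savings are highest when the bloated resources are highly energy-proportional.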
Abstract:
We propose a framework for developing and reasoning about hybrid systems that are composed of a plant with multiple controllers, each of which controls the plant intermittently. The framework is based on the notion of a "conflict-tolerant" specification for a controller, and provides a modular way of developing and reasoning about such systems. We propose a novel mechanism for defining conflict-tolerant specifications for general hybrid systems, using "acceptor" and "advisor" components. We also give a decision procedure for verifying whether a controller satisfies its conflict-tolerant specification, in the special case when the components are modeled using initialized rectangular hybrid automata.
Abstract:
Dynamic analysis techniques have been proposed to detect potential deadlocks. Analyzing and comprehending each potential deadlock to determine whether it is feasible in a real execution requires significant programmer effort. Moreover, empirical evidence shows that existing analyses are quite imprecise. This imprecision further voids the manual effort invested in reasoning about non-existent defects. In this paper, we address the imprecision of existing analyses and the subsequent manual effort necessary to reason about deadlocks. We propose a novel approach for deadlock detection by designing a dynamic analysis that intelligently leverages execution traces. To reduce the manual effort, we replay the program by making the execution follow a schedule derived from the observed trace. For a real deadlock, its feasibility is automatically verified if the replay causes the execution to deadlock. We have implemented our approach as part of WOLF and have analyzed many large (up to 160 KLoC) Java programs. Our experimental results show that we are able to identify 74% of the reported defects as true (or false) positives automatically, leaving very few defects for manual analysis. The overhead of our approach is negligible, making it a compelling tool for practical adoption.
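For context on what a "potential deadlock" report is, the sketch below implements the classic dynamic lock-order-graph analysis that such tools build on: record which locks are held when each lock is acquired, and report cycles as potential deadlocks. This is the well-known baseline, not WOLF itself; WOLF's trace-guided replay step that confirms feasibility is not reproduced here.

```python
# Classic dynamic potential-deadlock analysis: cycles in the lock-order
# graph of an execution trace indicate potential deadlocks.
from collections import defaultdict

def lock_order_graph(trace):
    """trace: list of (thread, 'acquire'|'release', lock) events."""
    held = defaultdict(list)               # thread -> stack of held locks
    edges = set()
    for thread, op, lock in trace:
        if op == "acquire":
            for h in held[thread]:
                edges.add((h, lock))       # h was held while taking lock
            held[thread].append(lock)
        else:
            held[thread].remove(lock)
    return edges

def has_cycle(edges):
    graph = defaultdict(set)
    for u, v in edges:
        graph[u].add(v)
    seen, stack = set(), set()
    def dfs(u):
        seen.add(u); stack.add(u)
        for v in graph[u]:
            if v in stack or (v not in seen and dfs(v)):
                return True
        stack.discard(u)
        return False
    return any(n not in seen and dfs(n) for n in list(graph))

# Thread 1 takes A then B; thread 2 takes B then A: a potential deadlock,
# even though this particular interleaving completed without deadlocking.
trace = [(1, "acquire", "A"), (1, "acquire", "B"), (1, "release", "B"),
         (1, "release", "A"), (2, "acquire", "B"), (2, "acquire", "A"),
         (2, "release", "A"), (2, "release", "B")]
print("potential deadlock:", has_cycle(lock_order_graph(trace)))
```

The example also shows why such reports demand manual triage, the problem the paper targets: a cycle proves only that some interleaving might deadlock, and replaying a schedule derived from the trace is how feasibility gets confirmed automatically.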
Abstract:
Karnataka state in southern India supports a globally significant, and the country's largest, population of the Asian elephant Elephas maximus. A reliable map of Asian elephant distribution and measures of spatial variation in their abundance, both vital for conservation and management action, are unavailable not only in Karnataka but across the species' global range. Here, we use various data gathered between 2000 and 2015 to map the distribution of elephants in Karnataka at the scale of the smallest forest management unit, the `beat', while also presenting data on elephant dung density for a subset of `elephant beats'. Elephants occurred in 972 of the 2855 forest beats of Karnataka. Sixty percent of these 972 beats and 55% of the forest habitat lay outside notified protected areas (PAs), on lands designated for agricultural production and human dwelling. While median elephant dung density inside PAs was nearly three times that outside, elephants routinely occurred in or used habitats outside PAs where human density, the land fraction under cultivation, and the interface between human-dominated areas and forests were greater. Based on our data, it is clear that India's framework for elephant conservation, which legally protects the species wherever it occurs but protects only some of its habitats, while appropriate for furthering conservation within PAs, seriously falters where elephants reside in and/or seasonally use areas outside PAs. Attempts to further elephant conservation in production and dwelling areas have extracted high costs in human, elephant, material and monetary terms in Karnataka. In such settings, conservation planning exercises are necessary to determine where the needs of elephants or of humans must take priority over the other, and to achieve that in a manner based not only on reliable scientific data but also on a process of public reasoning. (C) 2015 Elsevier Ltd. All rights reserved.
Abstract:
We propose an architecture for dramatically enhancing the stress-bearing and energy-absorption capacities of a polymer-based composite. Different weight fractions of iron oxide nanoparticles (NPs) are mixed into a poly(dimethylsiloxane) (PDMS) matrix either uniformly or as several vertically aligned cylindrical pillars. These composites are compressed up to a strain of 60% at a strain rate of 0.01 s⁻¹, following which they are fully unloaded at the same rate. Load-bearing and energy-absorption capacities of the composite with a uniform distribution of NPs increase by ~50% upon addition of 5 wt% of NPs; however, these properties monotonically decrease with further addition of NPs, so much so that the load-bearing capacity of the composite becomes 1/6th that of PDMS upon addition of 20 wt% of NPs. On the contrary, the stress at a strain of 60% and the energy-absorption capacity of the composites with the pillar configuration monotonically increase with the weight fraction of NPs in the pillars, and the load-bearing capacity becomes 1.5 times that of PDMS when the pillars consist of 20 wt% of NPs. In situ mechanical testing of composites with pillars reveals outward bending of the pillars, wherein the pillars and the PDMS between two pillars located along a radius are significantly compressed. Reasoning based on the effects of compressive hydrostatic stress and the shape of the fillers is developed to explain the observed anomalous strengthening of the composite with the pillar architecture.