109 results for landslides, riskanalysis, landslide hazard, fuzzy-logic
Abstract:
Different seismic hazard components pertaining to Bangalore city, namely soil overburden thickness, effective shear-wave velocity, factor of safety against liquefaction potential, peak ground acceleration at the seismic bedrock, site response in terms of amplification factor, and predominant frequency, have been individually evaluated. The overburden thickness distribution, predominantly in the range of 5-10 m across the city, has been estimated through a sub-surface model derived from geotechnical bore-log data. The effective shear-wave velocity distribution, established through a Multichannel Analysis of Surface Waves (MASW) survey and subsequent dispersion analysis of the data, exhibits site class D (180-360 m/s), site class C (360-760 m/s), and site class B (760-1500 m/s) in compliance with the National Earthquake Hazards Reduction Program (NEHRP) nomenclature. The peak ground acceleration has been estimated through a deterministic approach, based on a maximum credible earthquake of M_W = 5.1 assumed to nucleate from the closest active seismic source (the Mandya-Channapatna-Bangalore Lineament). The 1-D site response factor, computed at each borehole through geotechnical analysis across the study region, ranges from an amplification of about one to as high as four. Correspondingly, the predominant frequency estimated from the Fourier spectrum lies predominantly in the range of 3.5-5.0 Hz. Soil liquefaction hazard has been assessed in terms of the factor of safety against liquefaction potential using standard penetration test data and the underlying soil properties, which indicates that 90% of the study region is non-liquefiable. The spatial distributions of the different hazard entities are placed on a GIS platform and subsequently integrated through the analytic hierarchy process. The resulting deterministic hazard map shows high hazard coverage in the western areas. The microzonation thus achieved is envisaged both as a first-cut assessment of site-specific hazard, laying out a framework for higher-order seismic microzonation, and as a useful decision-support tool in overall land-use planning and hazard management. (C) 2010 Elsevier Ltd. All rights reserved.
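For context, the factor of safety against liquefaction quoted above is conventionally the ratio of the soil's cyclic resistance (CRR, inferred from SPT data) to the earthquake-induced cyclic stress (CSR). A minimal Python sketch of the standard simplified procedure of Seed and Idriss, with illustrative values that are not taken from the paper:

```python
# Simplified-procedure liquefaction screening (Seed-Idriss style).
# All input values below are illustrative assumptions, not the paper's data.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Earthquake-induced cyclic stress ratio (CSR) at a given depth."""
    rd = 1.0 - 0.00765 * depth_m              # stress-reduction factor (z <= 9.15 m)
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def factor_of_safety(crr, csr):
    """FS = CRR / CSR; FS > 1 indicates a non-liquefiable layer."""
    return crr / csr

# PGA 0.15 g; total/effective vertical stress 95/60 kPa at 5 m depth;
# cyclic resistance ratio CRR = 0.22 inferred from SPT blow counts.
csr = cyclic_stress_ratio(0.15, 95.0, 60.0, 5.0)
print(round(factor_of_safety(0.22, csr), 2))  # 1.48 -> non-liquefiable here
```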
Abstract:
This paper describes a hardware implementation of two-way converter logic that enables conversion of numbers between positive-radix and negative-radix binary representations. Index terms: (i) negative radix, (ii) positive radix, (iii) two-way conversion.
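The conversion implemented in hardware by the paper can be sketched in software. A minimal Python version of the textbook base(-2) algorithm (repeated division by -2 with a forced non-negative remainder), offered as an illustration rather than as the paper's circuit:

```python
def to_negabinary(n: int) -> str:
    """Convert any integer to its base -2 (negabinary) representation."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:                 # force a non-negative remainder digit
            n += 1
            r += 2
        digits.append(str(r))
    return "".join(reversed(digits))

def from_negabinary(s: str) -> int:
    """Inverse conversion: digit d_i contributes d_i * (-2)**i."""
    return sum(int(d) * (-2) ** i for i, d in enumerate(reversed(s)))

print(to_negabinary(13))                   # '11101' = 16 - 8 + 4 + 1
print(from_negabinary(to_negabinary(-7)))  # -7 round-trips
```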
Abstract:
An algebraic generalization of the well-known binary q-function array to a multivalued q-function array is presented. Tree-structure realizations can be associated with both binary and multivalued q-functions. Synthesis of multivalued functions using this array is very simple.
Abstract:
The increasing variability in device leakage has made the design of keepers for wide OR structures a challenging task. Conventional feedback keepers (CONV) can no longer improve the performance of wide dynamic gates in future technologies. In this paper, we propose an adaptive keeper technique called the rate-sensing keeper (RSK) that enables faster switching and tracks variation across different process corners. It can switch up to 1.9x faster (for 20 legs) than CONV and can scale up to 32 legs, as against 20 legs for CONV, in a 130-nm 1.2-V process. The delay tracking is within 8% across the different process corners. We demonstrate the circuit operation of RSK using a 32 x 8 register file implemented in an industrial 130-nm 1.2-V CMOS process. The performance of individual dynamic logic gates is also evaluated on chip for various keeper techniques. We show that the RSK technique gives superior performance compared with alternatives such as the conditional keeper (CKP) and the current-mirror-based keeper (LCR).
Abstract:
Indian logic has a long history. It broadly covers the domains of two of the six schools (darsanas) of Indian philosophy, namely Nyaya and Vaisesika. The generally accepted definition of Indian logic over the ages is the science which ascertains valid knowledge either by means of the six senses or by means of the five members of the syllogism. In other words, perception and inference constitute the subject matter of logic. The science of logic evolved in India through three ages: the ancient, the medieval, and the modern, spanning almost thirty centuries. Over the past three decades, advances in Computer Science, and in particular in Artificial Intelligence, have drawn researchers in these areas to the basic problems of language, logic and cognition. In the 1980s, Artificial Intelligence evolved into knowledge-based and intelligent system design, and the knowledge base and inference engine became standard subsystems of an intelligent system. One of the important issues in the design of such systems is knowledge acquisition from humans who are experts in a branch of learning (such as medicine or law) and the transfer of that knowledge to a computing system. The second important issue in such systems is the validation of the knowledge base of the system, i.e. ensuring that the knowledge is complete and consistent. It is in this context that a comparative study of Indian logic with recent theories of logic, language and knowledge engineering will help computer scientists understand the deeper implications of the terms and concepts they are currently using and attempting to develop.
Abstract:
Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Further, an automatic verification tool is necessary to reduce human errors in the reasoning process. In this regard, this paper is an attempt towards the development of a novel executable specification language, DL, for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatic verification of the satisfiability of DL specifications, and testing conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions to the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
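For readers unfamiliar with the tableau method that the paper extends, a toy propositional version is sketched below in Python; the tuple encoding of formulas is our own, and the first-order and real-time machinery discussed in the paper is omitted:

```python
# Classical tableau satisfiability test, propositional fragment only.
# Formulas: ('var', p) | ('not', f) | ('and', f, g) | ('or', f, g).
def satisfiable(branch):
    """True iff the conjunction of the formulas on `branch` is satisfiable."""
    for f in branch:
        rest = [g for g in branch if g is not f]
        if f[0] == 'not' and f[1][0] == 'not':      # double negation
            return satisfiable(rest + [f[1][1]])
        if f[0] == 'and':                           # alpha rule
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == 'not' and f[1][0] == 'or':       # alpha: ~(A|B) -> ~A, ~B
            return satisfiable(rest + [('not', f[1][1]), ('not', f[1][2])])
        if f[0] == 'or':                            # beta rule: split the branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
        if f[0] == 'not' and f[1][0] == 'and':      # beta: ~(A&B) -> ~A or ~B
            return satisfiable(rest + [('not', f[1][1])]) or \
                   satisfiable(rest + [('not', f[1][2])])
    pos = {f[1] for f in branch if f[0] == 'var'}   # only literals remain:
    neg = {f[1][1] for f in branch if f[0] == 'not'}
    return pos.isdisjoint(neg)                      # open iff no complementary pair

p, q = ('var', 'p'), ('var', 'q')
print(satisfiable([('and', ('or', p, q), ('not', p))]))  # True  (take q)
print(satisfiable([('and', p, ('not', p))]))             # False (contradiction)
```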
Abstract:
The aim of logic synthesis is to produce circuits that satisfy the given Boolean function while meeting timing constraints and requiring the minimum silicon area. Logic synthesis involves two steps, namely logic decomposition and technology mapping. Existing methods treat the two as separate operations. The traditional approach is to minimize the number of literals without considering the target technology during the decomposition phase. The decomposed expressions are then mapped onto the target technology to optimize the area, and timing optimization is carried out subsequently. A new approach which treats logic decomposition and technology mapping as a single operation is presented. The logic decomposition is based on the parameters of the target technology, and the area and timing optimization is carried out during the logic decomposition phase itself. Results using MCNC circuits are presented to show that this method produces circuits which are 38% faster while requiring a 14% increase in area.
Abstract:
The conventional Cornell source-based approach to probabilistic seismic-hazard assessment (PSHA) has been employed all around the world, whilst many studies rely on the use of computer packages such as FRISK (McGuire, FRISK: a computer program for seismic risk analysis. Open-File Report 78-1007, United States Geological Survey, Department of the Interior, Washington, 1978) and SEISRISK III (Bender and Perkins, SEISRISK III: a computer program for seismic hazard estimation. Bulletin 1772, United States Geological Survey, Department of the Interior, Washington, 1987). A "black-box" syndrome may result if the user of the software does not have another simple and robust PSHA method that can be used to make comparisons. An alternative method for PSHA, namely the direct amplitude-based (DAB) approach, has been developed as a heuristic and efficient method enabling users to undertake their own sanity checks on outputs from computer packages. This paper experiments with the application of the DAB approach to three cities in China, Iran, and India, respectively, and compares the results with documented results computed by the source-based approach. Several insights regarding the procedure of conducting PSHA have also been obtained, which could be useful for future seismic-hazard studies.
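To make the comparison concrete, the source-based calculation against which DAB is checked reduces to Cornell's hazard integral. A toy single-source Python sketch, with an assumed attenuation relation and illustrative parameters rather than the paper's data:

```python
import numpy as np
from scipy.stats import norm

# Toy Cornell-type PSHA for one point source. The ground-motion model and all
# parameter values are illustrative assumptions.
nu = 0.05                        # annual rate of events with M >= m_min
m_min, m_max, b = 4.0, 7.0, 1.0
beta = b * np.log(10.0)

def gr_pdf(m):
    """Truncated-exponential (Gutenberg-Richter) magnitude density."""
    return beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def ln_pga_median(m, r_km):
    """Placeholder attenuation relation: ln of median PGA (in g)."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0)

def annual_exceedance(a_g, r_km=30.0, sigma=0.6):
    """Cornell integral: annual rate of PGA > a_g at distance r_km."""
    m = np.linspace(m_min, m_max, 400)
    p_exc = norm.sf((np.log(a_g) - ln_pga_median(m, r_km)) / sigma)
    return nu * np.trapz(p_exc * gr_pdf(m), m)

for a in (0.05, 0.10, 0.20):
    print(f"P(PGA > {a:.2f} g): {annual_exceedance(a):.2e} per year")
```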
Abstract:
Owing to increased customer demand for make-to-order products and shorter product life-cycles, assembly lines today are designed to ensure a quick switch-over from one product model to another, essential for a company's survival in the marketplace. The complexity of decisions pertaining to the type of training, the number of workers, and their exposure to different tasks, especially in the current era of customized production, is a serious problem facing managers and HRD professionals in industry. This paper aims to determine the amount of cross-training and the dynamic deployment policy entailed by workforce flexibility for a make-to-order assembly line. These issues have been dealt with by adopting the concept of an evolutionary fuzzy system, because of the linguistic nature of the attributes associated with product variety and task complexity. A fuzzy-system-based methodology is proposed to determine the amount of cross-training and the dynamic deployment policy. The proposed methodology is tested on 10 sample products of varying complexities, and the results obtained are in line with the conclusions drawn by previous researchers.
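To give a flavour of the fuzzy inference involved, a minimal Mamdani-style step is sketched below in Python; the membership functions, rule base, and variable ranges are our assumptions, not the paper's calibrated system:

```python
# Two linguistic inputs on [0, 10]; output is a crisp cross-training score
# on [0, 1]. Rules and membership shapes are illustrative assumptions.
def tri(x, a, b, c):
    """Triangular membership function rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cross_training_level(product_variety, task_complexity):
    variety_high    = tri(product_variety, 4, 10, 16)   # wide triangle ~ shoulder
    complexity_high = tri(task_complexity, 4, 10, 16)
    variety_low     = tri(product_variety, -6, 0, 6)
    # Rule 1: high variety AND high complexity -> heavy cross-training (0.9)
    # Rule 2: low variety                      -> light cross-training (0.2)
    w1 = min(variety_high, complexity_high)
    w2 = variety_low
    if w1 + w2 == 0.0:
        return 0.5                                      # no rule fires: neutral
    return (w1 * 0.9 + w2 * 0.2) / (w1 + w2)            # weighted-average defuzzification

print(cross_training_level(8, 7))   # high variety and complexity -> 0.9
```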
Abstract:
A framework based on the notion of "conflict-tolerance" was proposed as a compositional methodology for developing and reasoning about systems that comprise multiple independent controllers. A central notion in this framework is that of a "conflict-tolerant" specification for a controller. In this work we propose a way of defining conflict-tolerant real-time specifications in Metric Interval Temporal Logic (MITL). We call our logic CT-MITL, for Conflict-Tolerant MITL. We then give a clock-optimal "delay-then-extend" construction for building a timed transition system for monitoring past-MITL formulas. We show how this monitoring transition system can be used to solve the associated verification and synthesis problems for CT-MITL.
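As a much-simplified flavour of the monitoring problem addressed, a toy Python monitor for the untimed past operator "since" is given below; real MITL monitoring additionally requires the clock machinery of the delay-then-extend construction, which is not modelled here:

```python
# Online evaluation of the past-time formula "p since q" over a finite trace:
# it holds at step i iff q held at some j <= i and p has held at every step after j.
def monitor_since(trace, p, q):
    holds, out = False, []
    for state in trace:
        holds = bool(state.get(q, False) or (state.get(p, False) and holds))
        out.append(holds)
    return out

trace = [{'p': True}, {'q': True}, {'p': True}, {'p': True}, {}]
print(monitor_since(trace, 'p', 'q'))   # [False, True, True, True, False]
```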
Abstract:
The decision-making process for machine-tool selection and operation allocation in a flexible manufacturing system (FMS) usually involves multiple conflicting objectives. Thus, a fuzzy goal-programming model can be effectively applied to this decision problem. The paper addresses the application of a fuzzy goal-programming concept to model the problem of machine-tool selection and operation allocation, with explicit consideration given to the objectives of minimizing the total cost of machining operations, material handling, and set-up. Constraints pertaining to machine capacity, tool-magazine capacity, and tool life are included in the model. A genetic algorithm (GA)-based approach is adopted to optimize this fuzzy goal-programming model. An illustrative example is provided and some results of computational experiments are reported.
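A minimal sketch of how a GA can drive a fuzzy goal formulation follows, using an invented three-operation, three-machine cost table; the encoding and the linear membership function are illustrative assumptions, not the paper's model:

```python
import random

# Toy GA for fuzzy goal programming: choose a machine for each operation so
# that the fuzzy membership of the total-cost goal is maximized.
random.seed(1)
COSTS = [[5, 8, 6], [7, 4, 9], [6, 6, 5]]   # cost of operation i on machine j

def fitness(assign):
    """Linear goal membership: 1 if total cost <= 15, 0 if >= 24."""
    total = sum(COSTS[i][m] for i, m in enumerate(assign))
    return max(0.0, min(1.0, (24 - total) / 9.0))

def ga(pop_size=20, gens=50):
    pop = [[random.randrange(3) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 3)
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < 0.2:                 # mutation
                child[random.randrange(3)] = random.randrange(3)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga()
print(best, fitness(best))   # an assignment whose goal membership reaches 1.0
```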
Abstract:
Satisfiability algorithms for propositional logic have improved enormously in recent years. This improvement increases the attractiveness of satisfiability methods for first-order logic that reduce the problem to a series of ground-level satisfiability problems. R. Jeroslow introduced a partial instantiation method of this kind that differs radically from the standard resolution-based methods. This paper lays the theoretical groundwork for an extension of his method that is general enough and efficient enough for general logic programming with indefinite clauses. In particular, we improve Jeroslow's approach by (1) extending it to logic with functions, (2) accelerating it through the use of satisfiers, as introduced by Gallo and Rago, and (3) simplifying it to obtain further speedup. We provide a similar development for a "dual" partial instantiation approach defined by Hooker and suggest a primal-dual strategy. We prove correctness of the primal and dual algorithms for full first-order logic with functions, as well as termination on unsatisfiable formulas. We also report some preliminary computational results.
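The ground-level reduction that partial instantiation exploits can be illustrated naively: ground the clauses over a small Herbrand domain and test the resulting propositional problem. The Python toy below brute-forces that test; Jeroslow's method instantiates selectively rather than exhaustively, so this shows only the surrounding idea:

```python
from itertools import product

# Clauses are lists of literals (predicate, args, sign); lowercase argument
# names are variables, uppercase names are constants. Encoding is our own.
def ground_instances(clause, domain):
    vars_ = sorted({t for (_, args, _) in clause for t in args if t.islower()})
    for vals in product(domain, repeat=len(vars_)):
        env = dict(zip(vars_, vals))
        yield [(p, tuple(env.get(t, t) for t in args), s) for (p, args, s) in clause]

def ground_sat(clauses):
    """Brute-force satisfiability over the ground atoms of `clauses`."""
    atoms = sorted({(p, args) for c in clauses for (p, args, _) in c})
    for bits in product([False, True], repeat=len(atoms)):
        model = dict(zip(atoms, bits))
        if all(any(model[(p, args)] == s for (p, args, s) in c) for c in clauses):
            return True
    return False

# P(A) together with forall x. ~P(x): already unsatisfiable at ground level.
cnf = [[('P', ('A',), True)], [('P', ('x',), False)]]
ground = [g for c in cnf for g in ground_instances(c, ['A'])]
print(ground_sat(ground))   # False -> the first-order set is unsatisfiable
```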