891 results for PLC and SCADA programming
Abstract:
This report presents a system for generating a stable, feasible, and reachable grasp of a polyhedral object. A set of contact points on the object is found that can result in a stable grasp; a feasible grasp is found in which the robot contacts the object at those contact points; and a path is constructed from the initial configuration of the robot to the stable, feasible final grasp configuration. The algorithm described in the report is designed for the Salisbury hand mounted on a Puma 560 arm, but a similar approach could be used to develop grasping systems for other robots.
Abstract:
The primary goal of this research is to develop theoretical tools for the analysis, synthesis, and application of primitive manipulator operations. The primary method is to extend and apply traditional tools of classical mechanics. The results are general enough to address many different aspects of industrial robotics, including effector and sensor design, planning and programming tools, and the design of auxiliary equipment. Some of the manipulator operations studied are: (1) Grasping an object. The object will usually slide and rotate during the period between first contact and prehension. (2) Placing an object. The object may slip slightly in the fingers upon contact with the table as the base aligns with the table. (3) Pushing. Often the final stage of mating two parts involves pushing one object into the other.
Abstract:
This thesis presents a new high-level robot programming system. The programming system can be used to construct strategies consisting of compliant motions, in which a moving robot slides along obstacles in its environment. The programming system is referred to as high level because the user is spared many robot-level details, such as the specification of conditional tests, motion termination conditions, and compliance parameters. Instead, the user specifies task-level information, including a geometric model of the robot and its environment. The user may also have to specify some suggested motions. There are two main system components. The first component is an interactive teaching system which accepts motion commands from a user and attempts to build a compliant motion strategy using the specified motions as building blocks. The second component is an autonomous compliant motion planner, which is intended to spare the user from dealing with "simple" problems. The planner simplifies the representation of the environment by decomposing the configuration space of the robot into a finite state space, whose states are vertices, edges, faces, and combinations thereof. States are linked to each other by arcs, which represent reliable compliant motions. Using best-first search, states are expanded until a strategy is found from the start state to a goal state. This component represents one of the first implemented compliant motion planners. The programming system has been implemented on a Symbolics 3600 computer and tested on several examples. One of the resulting compliant motion strategies was successfully executed on an IBM 7565 robot manipulator.
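The planner's search idea (best-first expansion of configuration-space states linked by arcs until a strategy reaches the goal) can be sketched as follows. This is an illustrative toy, not the thesis's implementation; the state names, graph, and trivial heuristic are invented for the example.

```python
import heapq

def best_first_search(start, goal, neighbors, heuristic):
    """Expand the state with the lowest heuristic estimate first,
    until the goal state is reached; return the state sequence."""
    frontier = [(heuristic(start), start)]
    came_from = {start: None}
    while frontier:
        _, state = heapq.heappop(frontier)
        if state == goal:
            # Reconstruct the strategy (sequence of states) back to start.
            path = []
            while state is not None:
                path.append(state)
                state = came_from[state]
            return path[::-1]
        for nxt in neighbors(state):  # arcs = reliable compliant motions
            if nxt not in came_from:
                came_from[nxt] = state
                heapq.heappush(frontier, (heuristic(nxt), nxt))
    return None  # no strategy found

# Toy configuration-space graph: vertex/edge/face states as strings.
graph = {"start": ["face-A", "edge-1"], "face-A": ["edge-2"],
         "edge-1": ["goal"], "edge-2": ["goal"], "goal": []}
path = best_first_search("start", "goal", graph.__getitem__,
                         heuristic=lambda s: 0 if s == "goal" else 1)
print(path)
```

In the thesis the arcs are compliant motions verified to be reliable, so any path found by the search is itself an executable strategy.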
Abstract:
Conventional parallel computer architectures do not provide support for non-uniformly distributed objects. In this thesis, I introduce sparsely faceted arrays (SFAs), a new low-level mechanism for naming regions of memory, or facets, on different processors in a distributed, shared memory parallel processing system. Sparsely faceted arrays address the disconnect between the global distributed arrays provided by conventional architectures (e.g. the Cray T3 series), and the requirements of high-level parallel programming methods that wish to use objects that are distributed over only a subset of processing elements. A sparsely faceted array names a virtual globally-distributed array, but actual facets are lazily allocated. By providing simple semantics and making efficient use of memory, SFAs enable efficient implementation of a variety of non-uniformly distributed data structures and related algorithms. I present example applications which use SFAs, and describe and evaluate simple hardware mechanisms for implementing SFAs. Keeping track of which nodes have allocated facets for a particular SFA is an important task that suggests the need for automatic memory management, including garbage collection. To address this need, I first argue that conventional tracing techniques such as mark/sweep and copying GC are inherently unscalable in parallel systems. I then present a parallel memory-management strategy, based on reference-counting, that is capable of garbage collecting sparsely faceted arrays. I also discuss opportunities for hardware support of this garbage collection strategy. I have implemented a high-level hardware/OS simulator featuring hardware support for sparsely faceted arrays and automatic garbage collection. I describe the simulator and outline a few of the numerous details associated with a "real" implementation of SFAs and SFA-aware garbage collection. Simulation results are used throughout this thesis in the evaluation of hardware support mechanisms.
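The core SFA semantics (a name for a virtual globally-distributed array whose per-node facets are allocated only on first use) can be sketched in software. This Python class is a hypothetical illustration of the semantics only; the thesis describes hardware mechanisms, and the class and method names here are invented.

```python
class SparselyFacetedArray:
    """Software sketch of SFA semantics: the array logically has a
    facet on every node, but a facet's storage is allocated lazily,
    on first touch."""

    def __init__(self, num_nodes, facet_size):
        self.num_nodes = num_nodes
        self.facet_size = facet_size
        self.facets = {}  # node id -> backing store, allocated lazily

    def facet(self, node):
        if not 0 <= node < self.num_nodes:
            raise IndexError("no such node")
        if node not in self.facets:  # lazy allocation on first touch
            self.facets[node] = [0] * self.facet_size
        return self.facets[node]

    def allocated_nodes(self):
        """Bookkeeping a garbage collector needs: which nodes
        actually hold facets for this SFA."""
        return sorted(self.facets)

sfa = SparselyFacetedArray(num_nodes=1024, facet_size=4)
sfa.facet(3)[0] = 42          # only node 3's facet is materialized
print(sfa.allocated_nodes())  # [3]
```

The `allocated_nodes` bookkeeping is exactly the information the reference-counting collection strategy in the thesis must track to reclaim sparsely allocated facets.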
Abstract:
The work reported here lies in the area of overlap between artificial intelligence and software engineering. As research in artificial intelligence, it is a step towards a model of problem solving in the domain of programming. In particular, this work focuses on the routine aspects of programming which involve the application of previous experience with similar programs. I call this programming by inspection. Programming is viewed here as a kind of engineering activity. Analysis and synthesis by inspection are a prominent part of expert problem solving in many other engineering disciplines, such as electrical and mechanical engineering. The notion of inspection methods in programming developed in this work is motivated by similar notions in other areas of engineering. This work is also motivated by current practical concerns in the area of software engineering. The inadequacy of current programming technology is universally recognized. Part of the solution to this problem will be to increase the level of automation in programming. I believe that the next major step in the evolution of more automated programming will be interactive systems which provide a mixture of partially automated program analysis, synthesis and verification. One such system being developed at MIT, called the programmer's apprentice, is the immediate intended application of this work. This report concentrates on the knowledge base of the programmer's apprentice, which takes the form of a taxonomy of commonly used algorithms and data structures. To the extent that a programmer is able to construct and manipulate programs in terms of the forms in such a taxonomy, he may relieve himself of many details and generally raise the conceptual level of his interaction with the system, as compared with present-day programming environments.
Also, since it is practical to expend a great deal of effort pre-analyzing the entries in a library, the difficulty of verifying the correctness of programs constructed this way is correspondingly reduced. The feasibility of this approach is demonstrated by the design of an initial library of common techniques for manipulating symbolic data. This document also reports on the further development of a formalism called the plan calculus for specifying computations in a programming-language-independent manner. This formalism combines both data and control abstraction in a uniform framework that has facilities for representing multiple points of view and side effects.
Abstract:
Act2 is a highly concurrent programming language designed to exploit the processing power available from parallel computer architectures. The language supports advanced concepts in software engineering, providing high-level constructs suitable for implementing artificially-intelligent applications. Act2 is based on the Actor model of computation, consisting of virtual computational agents which communicate by message-passing. Act2 serves as a framework in which to integrate an actor language, a description and reasoning system, and a problem-solving and resource management system. This document describes issues in Act2's design and the implementation of an interpreter for the language.
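The Actor model that Act2 is based on (computational agents that communicate only by asynchronous message passing) can be illustrated with a minimal sketch. This is a generic illustration of the model in Python, not Act2 itself; the `Actor` class and the example behavior are invented for the example.

```python
import queue
import threading

class Actor:
    """Minimal actor: a mailbox plus a behavior applied to each
    message in arrival order, on the actor's own thread."""

    def __init__(self, behavior):
        self.mailbox = queue.Queue()
        self.behavior = behavior
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)  # asynchronous message passing

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:  # sentinel: shut the actor down
                break
            self.behavior(msg)

results = []
doubler = Actor(lambda msg: results.append(msg * 2))
for i in range(3):
    doubler.send(i)
doubler.send(None)
doubler.thread.join()
print(results)  # [0, 2, 4]
```

Because each actor serializes its own mailbox, no locks are needed around the per-actor state; concurrency arises from many actors running at once, which is the property Act2 exploits for parallel architectures.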
Abstract:
"The Structure and Interpretation of Computer Programs" is the entry-level subject in Computer Science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in Electrical Engineering or in Computer Science, as one fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to approximately 600 students each year. Most of these students have had little or no prior formal training in computation, although most have played with computers a bit and a few have had extensive programming or hardware design experience. Our design of this introductory Computer Science subject reflects two major concerns. First we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Secondly, we believe that the essential material to be addressed by a subject at this level, is not the syntax of particular programming language constructs, nor clever algorithms for computing particular functions of efficiently, not even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.
Abstract:
A model is developed for predicting the resolution of component pairs of interest and calculating the optimum temperature-programming conditions in comprehensive two-dimensional gas chromatography (GC x GC). Based on at least three isothermal runs, retention times and peak widths at half-height on both dimensions are predicted for any linear temperature-programmed run on the first dimension combined with isothermal runs on the second dimension. The calculation of the optimum temperature-programming conditions is based on the predicted resolution of the "difficult-to-separate" components in a given mixture. The resolution of all neighboring peaks on the first dimension is obtained from the predicted retention times and peak widths on the first dimension; the resolution on the second dimension is calculated only for adjacent components with insufficient resolution on the first dimension that elute within the same modulation period on the second dimension. The optimum temperature-programming conditions are those for which the resolutions of all components of interest in the GC x GC separation meet the analytical requirement and the analysis time is shortest. The validity of the model has been proven by using it to predict and optimize the GC x GC temperature-programming conditions for an alkylpyridine mixture. (c) 2005 Elsevier B.V. All rights reserved.
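The resolution the model predicts from retention times and half-height peak widths is conventionally computed as Rs = 1.18 (t2 - t1) / (wh1 + wh2) for Gaussian peaks. A minimal sketch, using the standard formula (not the paper's full prediction model) and invented peak values:

```python
def resolution_half_height(t1, w1, t2, w2):
    """Chromatographic resolution from retention times (t) and peak
    widths at half-height (w). For Gaussian peaks the half-height
    width is 1.699 * sigma, so the factor 1.18 (= 2 / 1.699)
    converts the usual base-width formula to half-height widths."""
    return 1.18 * abs(t2 - t1) / (w1 + w2)

# Hypothetical adjacent first-dimension peaks (seconds).
rs = resolution_half_height(t1=300.0, w1=4.0, t2=310.0, w2=4.0)
print(rs)  # compare against the common Rs >= 1.5 target
```

In the model's workflow, pairs whose first-dimension Rs falls below the target and that share a modulation period are the ones whose second-dimension resolution must then be checked.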
Abstract:
The gonadal steroids, in particular estradiol, exert an important action during the perinatal period on the regulation of sexual dimorphism and neuronal plasticity, and on the growth and development of the nervous system. Exposure of the developing female to estrogens during the perinatal period may have long-lasting effects that are now regarded as "programming" the female neuroendocrine axis to malfunction in adulthood. The purpose of this study was to describe the effect of a single administration of a low dose (10 μg) of β-estradiol 3-benzoate (EB) to female rats on the day of birth on brain and plasma concentrations of the neuroactive steroid allopregnanolone, on general behaviours, and on behavioural sensitivity to benzodiazepines. Neonatal administration of EB induced a dramatic reduction in the cerebrocortical and plasma levels of allopregnanolone and progesterone that was apparent in both juvenile (21-day-old) and adult (60-day-old) rats. In contrast, this treatment did not affect 17β-estradiol levels. Female rats treated with EB showed a delay in vaginal opening, acyclicity characterized by prolonged estrus, and ovarian failure. Given that allopregnanolone elicits anxiolytic, antidepressive, anticonvulsant, and sedative-hypnotic effects and facilitates social behaviour, we assessed whether this treatment might modify different emotional, cognitive and social behaviours. The treatment did not affect locomotor activity, anxiety- and mood-related behaviours, seizure sensitivity or spatial memory. In contrast, neonatally EB-treated rats showed a dominant, but not aggressive, behaviour and an increase in body investigation, especially anogenital investigation, characteristic of male appetitive behaviour. Moreover, neonatal administration of EB to female rats increased sensitivity to the anxiolytic, sedative, and amnesic effects of diazepam in adulthood.
These results indicate that the marked and persistent reduction in the cerebrocortical and peripheral concentrations of the neuroactive steroid allopregnanolone induced by neonatal treatment with β-estradiol 3-benzoate does not change baseline behaviours in adult rats. Rather, the low levels of allopregnanolone seem to be associated with changes in behavioural sensitivity to diazepam, a positive allosteric modulator of the GABAA receptor. These effects of estradiol suggest that it plays a major role in the pharmacological regulation both of GABAergic transmission and of the abundance of endogenous modulators of such transmission during development of the central nervous system.
Abstract:
King, R.D., Garrett, S.M. and Coghill, G.M. (2005). On the use of qualitative reasoning to simulate and identify metabolic pathways. Bioinformatics 21(9):2017-2026
Abstract:
Enot, D. and King, R. D. (2003) Application of Inductive Logic Programming to Structure-Based Drug Design. 7th European Conference on Principles and Practice of Knowledge Discovery in Databases (PKDD '03). Springer LNAI 2838 p156-167
Abstract:
Srinivasan, A., King, R. D. and Bain, M.E. (2003) An Empirical Study of the Use of Relevance Information in Inductive Logic Programming. Journal of Machine Learning Research. 4(Jul):369-383
Abstract:
David P. Enot and Ross D. King (2003). Structure based drug design with inductive logic programming. The ACS National Meeting Spring 2003, New Orleans
Abstract:
David P. Enot and Ross D. King (2002) The use of Inductive Logic Programming in drug design. Proceedings of the 14th EuroQSAR Symposium (EuroQSAR 2002). Blackwell Publishing, p247-250
Abstract:
Rowland, J.J. and Taylor, J. (2002). Adaptive denoising in spectral analysis by genetic programming. Proc. IEEE Congress on Evolutionary Computation (part of WCCI), May 2002. pp 133-138. ISBN 0-7803-7281-6