6 results for TELEOPERATION
in CentAUR: Central Archive University of Reading - UK
Abstract:
A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design results from two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyograph (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has only been considering highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, thus obtaining a simple-to-analyze succession of only locally controlled modules and robustness to time delays of up to 500 ms. The proposed designs were inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
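The abstract does not give the predictor equations, but the overall neuropredictive structure can be illustrated with a minimal sketch, assuming a generic second-order arm model in place of the modified Stark model; the class and function names, the EMG-derived activation signal, and all gains, time constants and the 500 ms horizon below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal sketch of a neuropredictive teleoperation loop (illustrative only).
# A simple second-order linear arm model stands in for the modified Stark model;
# all gains, time constants and signal names are assumptions.

DT = 0.001          # simulation step [s]
DELAY = 0.5         # assumed round-trip delay to compensate [s]

class ArmModel:
    """Second-order arm surrogate: activation -> joint angle."""
    def __init__(self, inertia=0.1, damping=0.5, stiffness=2.0):
        self.I, self.B, self.K = inertia, damping, stiffness
        self.theta, self.omega = 0.0, 0.0

    def step(self, activation, dt=DT):
        torque = activation - self.B * self.omega - self.K * self.theta
        self.omega += (torque / self.I) * dt
        self.theta += self.omega * dt
        return self.theta

def predict_arm_position(emg_activation, arm_state, horizon=DELAY):
    """Integrate a copy of the arm model 'horizon' seconds ahead, assuming
    the current EMG-derived activation is held (open-loop action plan)."""
    model = ArmModel()
    model.theta, model.omega = arm_state
    theta_pred = model.theta
    for _ in range(int(horizon / DT)):
        theta_pred = model.step(emg_activation)
    return theta_pred

def master_cycle(emg_activation, arm_state):
    """Master side: predict where the arm will be after the delay and send
    that predicted position over the delayed channel as the slave setpoint."""
    return predict_arm_position(emg_activation, arm_state)

def slave_control(setpoint, slave_theta, slave_omega, kp=20.0, kd=2.0):
    """Slave side: purely local PD control toward the (delayed) setpoint."""
    return kp * (setpoint - slave_theta) - kd * slave_omega

# Example: EMG-derived activation of 0.4 with the arm currently at rest
setpoint = master_cycle(0.4, arm_state=(0.0, 0.0))
torque_cmd = slave_control(setpoint, slave_theta=0.0, slave_omega=0.0)
```

The point reflected here is that the slave is driven only by a local control loop toward a setpoint predicted ahead of the communication delay, so no bilateral coupling across the delayed channel is required.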
Abstract:
Presents a method for model-based bilateral control of a master-slave arm with a time delay between the master and slave arms, where the system supports cooperative action between manual and automatic modes. The method achieves efficient master-slave arm control by combining the simplicity of a computer with the flexibility of a skilled human operator.
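As a rough illustration of the cooperative manual/automatic structure described above, the sketch below blends an operator command with an automatic, model-generated command before sending it over a delayed channel; the blending law, the delay buffer and all numbers are assumptions rather than the method in the paper.

```python
from collections import deque

# Illustrative sketch of cooperative manual/automatic master-slave control
# across a delayed channel. The blending law and all gains are assumptions.

class DelayLine:
    """Fixed-length FIFO modelling the communication delay."""
    def __init__(self, delay_steps, initial=0.0):
        self.buf = deque([initial] * delay_steps)

    def send(self, value):
        delayed = self.buf.popleft()   # command sent delay_steps ago arrives now
        self.buf.append(value)
        return delayed

def blended_command(manual_cmd, auto_cmd, alpha):
    """Cooperative action: alpha = 1 is fully manual, alpha = 0 fully automatic."""
    return alpha * manual_cmd + (1.0 - alpha) * auto_cmd

# Example: operator command blended with a model-based automatic trajectory
master_to_slave = DelayLine(delay_steps=200)   # e.g. 200 ms at a 1 kHz rate
slave_setpoint = master_to_slave.send(
    blended_command(manual_cmd=0.30, auto_cmd=0.25, alpha=0.7))
```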
Abstract:
Intelligent viewing systems are required if efficient and productive teleoperation is to be applied to dynamic manufacturing environments. These systems must automatically provide an operator with remote views that assist in completing the task. This assistance increases the productivity of the teleoperation task if the robot controller is responsive to the unpredictable dynamic evolution of the workcell. Behavioral controllers can be utilized to give reactive 'intelligence.' The inherent complex structure of current systems, however, places considerable time overheads on any redesign of the emergent behavior. In industry, where the remote environment and task frequently change, this continual redesign process becomes inefficient. We introduce a novel behavioral controller, based on an 'ego-behavior' architecture, to command an active camera (a camera mounted on a robot) within a remote workcell. Using this ego-behavioral architecture, the responses from individual behaviors are rapidly combined to produce an 'intelligent' responsive viewing system. The architecture is single-layered, each behavior being autonomous with no explicit knowledge of the number, description or activity of other behaviors present (if any). This lack of imposed structure decreases the development time, as it allows each behavior to be designed and tested independently before insertion into the architecture. The fusion mechanism allows each behavior to compete and/or co-operate with other behaviors for full or partial control of the active camera. Each behavior continually reassesses this degree of competition or co-operation by measuring its own success in controlling the active camera against pre-defined constraints. The ego-behavioral architecture is demonstrated through simulation and experimentation.
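A minimal sketch of how such an ego-behavior style fusion could look is given below, assuming each behavior proposes a camera command together with a self-assessed weight and the fusion is a weighted average; this weighting rule and the example behaviors are illustrative assumptions, not the paper's actual mechanism.

```python
# Illustrative sketch of an ego-behavior style fusion: each behavior proposes
# a camera command and a self-assessed weight; the fusion is a weighted
# average. The weighting rule is an assumption, not the paper's mechanism.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Behaviour:
    name: str
    propose: Callable[[dict], float]       # workcell state -> camera command
    success: Callable[[dict], float]       # self-assessed success in [0, 1]
    weight: float = 1.0

    def update_weight(self, state, gain=0.1):
        # Each behavior re-assesses its own degree of competition/co-operation
        # from its measured success against its own constraints.
        self.weight = max(0.0, self.weight + gain * (self.success(state) - 0.5))

def fuse(behaviours: List[Behaviour], state: dict) -> float:
    """Weighted combination of all behavior proposals."""
    total = sum(b.weight for b in behaviours) or 1.0
    return sum(b.weight * b.propose(state) for b in behaviours) / total

# Example: two independent behaviors with no knowledge of each other.
track = Behaviour("track-gripper",
                  propose=lambda s: s["gripper_offset"] * 0.5,
                  success=lambda s: 1.0 - abs(s["gripper_offset"]))
avoid = Behaviour("avoid-occlusion",
                  propose=lambda s: -s["occlusion"] * 0.8,
                  success=lambda s: 1.0 - s["occlusion"])

state = {"gripper_offset": 0.2, "occlusion": 0.1}
for b in (track, avoid):
    b.update_weight(state)
camera_cmd = fuse([track, avoid], state)
```

Each behavior here can be written and tested in isolation and simply added to the list passed to the fusion, which mirrors the single-layered, structure-free property claimed in the abstract.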
Abstract:
This paper presents a review of the design and development of the Yorick series of active stereo camera platforms and their integration into real-time closed loop active vision systems, whose applications span surveillance, navigation of autonomously guided vehicles (AGVs), and inspection tasks for teleoperation, including immersive visual telepresence. The mechatronic approach adopted for the design of the first system, including head/eye platform, local controller, vision engine, gaze controller and system integration, proved to be very successful. The design team comprised researchers with experience in parallel computing, robot control, mechanical design and machine vision. The success of the project has generated sufficient interest to sanction a number of revisions of the original head design, including the design of a lightweight compact head for use on a robot arm, and the further development of a robot head to look specifically at increasing visual resolution for visual telepresence. The controller and vision processing engines have also been upgraded, to include the control of robot heads on mobile platforms and control of vergence through tracking of an operator's eye movement. This paper details the hardware development of the different active vision/telepresence systems.
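As a rough sketch of the closed-loop vergence control mentioned above, the fragment below drives a symmetric vergence angle toward the angle required to fixate a tracked gaze target; the geometry, baseline and proportional controller are illustrative assumptions, not the Yorick implementation.

```python
import math

# Minimal sketch of symmetric vergence control for a stereo head
# (illustrative assumptions: symmetric geometry, proportional controller,
# fixation point taken from a tracked gaze target).

BASELINE = 0.30   # assumed distance between the two cameras [m]

def vergence_angle(target_depth):
    """Half-angle each camera must rotate inward to fixate a point
    target_depth metres straight ahead."""
    return math.atan2(BASELINE / 2.0, target_depth)

def vergence_command(current_angle, target_depth, kp=2.0):
    """Proportional closed-loop command driving the measured vergence
    angle toward the angle required for the gaze target."""
    return kp * (vergence_angle(target_depth) - current_angle)

# Example: the operator's tracked gaze lands on an object 1.2 m away
cmd = vergence_command(current_angle=0.10, target_depth=1.2)
```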
Abstract:
This paper discusses a new method of impedance control that has been successfully implemented on the master robot of a teleoperation system. The method involves calibrating the robot to quantify the effect of adjustable controller parameters on the impedances along its different axes. The empirical equations relating end-effector impedance to the controller's feedback gains are obtained by performing system-identification tests along individual axes of the robot. With these equations, online control of end-effector stiffness and damping is possible without monitoring joint torques or running complex algorithms. Hard-contact conditions and compliant interfaces have been effectively demonstrated on a telemanipulation test-bed using appropriate combinations of stiffness and damping settings obtained by this method.
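The calibration idea can be sketched as follows, assuming (purely for illustration) a linear empirical relation between a single position-feedback gain and the identified end-effector stiffness along one axis; the data, the linear form and the function names are assumptions, not the paper's empirical equations.

```python
import numpy as np

# Illustrative sketch of the calibration idea: fit an empirical (here assumed
# linear) relation between a feedback gain and the identified end-effector
# stiffness along one axis, then invert it to set the gain online.
# The linear form and all numbers are assumptions, not the paper's equations.

# System-identification data for one axis: tested position-feedback gains
# and the end-effector stiffness measured at each setting.
gains_tested    = np.array([ 5.0, 10.0, 15.0, 20.0, 25.0])       # controller gain
stiffness_found = np.array([180.0, 360.0, 540.0, 730.0, 900.0])  # [N/m]

# Least-squares fit: stiffness ~ a * gain + b
a, b = np.polyfit(gains_tested, stiffness_found, deg=1)

def gain_for_stiffness(k_desired):
    """Invert the empirical relation to choose the feedback gain that yields
    a desired end-effector stiffness (no joint-torque sensing needed)."""
    return (k_desired - b) / a

# Example: command a compliant interface of 250 N/m along this axis.
kp = gain_for_stiffness(250.0)
```

An analogous fit over a damping gain would give online control of end-effector damping in the same way, with the two settings combined for hard-contact or compliant-interface conditions as described above.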