981 results for Robot applications
Abstract:
We present a novel, simple and effective approach for tele-operation of aerial robotic vehicles with haptic feedback. Such feedback provides the remote pilot with an intuitive feel of the robot’s state and perceived local environment that will ensure simple and safe operation in cluttered 3D environments common in inspection and surveillance tasks. Our approach is based on energetic considerations and uses the concepts of network theory and port-Hamiltonian systems. We provide a general framework for addressing problems such as mapping the limited stroke of a ‘master’ joystick to the infinite stroke of a ‘slave’ vehicle, while preserving passivity of the closed-loop system in the face of potential time delays in communications links and limited sensor data.
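The energetic bookkeeping behind this kind of passivity preservation can be illustrated with a minimal sketch (hypothetical function and variable names, not the authors' implementation): track the net energy injected through the communication port each sample, and clip any command that would make the observed energy balance go negative.

```python
def passivity_filter(forces, velocities, dt):
    """Minimal passivity-observer sketch: never let the port inject
    more energy than has been absorbed so far."""
    energy = 0.0
    safe_forces = []
    for f, v in zip(forces, velocities):
        p = f * v * dt  # energy injected this sample
        if energy + p < 0.0:  # would violate passivity
            # dissipate only the energy stored so far (zero force if stalled)
            f = 0.0 if v == 0.0 else -energy / (v * dt)
            p = f * v * dt
        energy += p
        safe_forces.append(f)
    return safe_forces, energy
```

The same balance survives arbitrary communication delay, since the filter only ever reduces the commanded force on the local side.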
Abstract:
To date, most quad-rotor aerial robots have been based on flying toys. Although such systems can be used as prototypes, they are not sufficiently robust to serve as experimental robotics platforms. We have developed the X-4 Flyer, a quad-rotor robot using custom-built chassis and avionics with off-the-shelf motors and batteries, to be a highly reliable experimental platform. The vehicle uses tuned plant dynamics with an onboard embedded attitude controller to stabilise flight. A linear SISO controller was designed to regulate flyer attitude.
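A linear SISO attitude regulator of the kind described can be sketched, for one axis, as a proportional-derivative law (the gains below are illustrative placeholders, not the X-4 Flyer's tuned values):

```python
def attitude_pd(angle, rate, angle_ref, kp=2.0, kd=0.5):
    """One-axis PD attitude regulator: torque command from angle
    error and measured body rate (illustrative gains)."""
    return kp * (angle_ref - angle) - kd * rate
```

Running one such loop per axis on the embedded controller is what stabilises hover in the tuned-dynamics design.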
Abstract:
This paper proposes the use of optical flow from a moving robot to provide force feedback to an operator's joystick to facilitate collision-free teleoperation. Optical flow is measured by wide-angle cameras on board the vehicle and used to generate a virtual environmental force that is reflected to the user through the joystick, as well as feeding back into the control of the vehicle. The coupling between optical flow (velocity) and force is modelled as an impedance - in this case an optical impedance. We show that the proposed control is dissipative and prevents the vehicle colliding with the environment, as well as providing the operator with a natural feel for the remote environment. The paper focuses on applications to aerial robotic vehicles; however, the ideas apply directly to other force-actuated vehicles such as submersibles or space vehicles, and the authors believe the approach has potential for control of terrestrial vehicles and even teleoperation of manipulators. Experimental results are provided for a simulated aerial robot in a virtual environment controlled by a haptic joystick.
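The flow-to-force coupling can be sketched as follows (a hypothetical sign convention, not the paper's exact formulation): higher flow on one side implies nearer obstacles there, so the flow imbalance pushes the stick toward the clearer side, while the mean flow opposes forward speed.

```python
def reflected_force(left_flow, right_flow, impedance):
    """Map left/right optical-flow magnitudes (rad/s) to a reflected
    joystick force via a constant impedance (illustrative)."""
    lateral = impedance * (left_flow - right_flow)       # steer away from near side
    braking = -impedance * 0.5 * (left_flow + right_flow)  # oppose approach rate
    return lateral, braking
```

Because the force always opposes the motion generating the flow, a coupling of this shape is dissipative, which is the property the paper exploits.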
Abstract:
This paper describes a novel experiment in which two very different methods of underwater robot localization are compared. The first method is based on a geometric approach in which a mobile node moves within a field of static nodes, and all nodes are capable of estimating the range to their neighbours acoustically. The second method uses visual odometry, from stereo cameras, by integrating scaled optical flow. The fundamental algorithmic principles of each localization technique are described. We also present experimental results comparing acoustic localization with GPS for surface operation, and a comparison of acoustic and visual methods for underwater operation.
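The geometric range-based method rests on trilateration. A minimal 2D sketch (not the experiment's actual solver, which would use least squares over many nodes): subtract the first range equation from the others to linearize, then solve the resulting 2x2 system.

```python
def trilaterate(anchors, ranges):
    """Position from ranges to three known anchors, by linearizing
    the circle equations and applying Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    r0, r1, r2 = ranges
    # subtracting equation 0 cancels the quadratic terms in x, y
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With noisy acoustic ranges and more than three static nodes, the same linear system is simply solved in the least-squares sense.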
Abstract:
This paper demonstrates some interesting connections between the hitherto disparate fields of mobile robot navigation and image-based visual servoing. A planar formulation of the well-known image-based visual servoing method leads to a bearing-only navigation system that requires no explicit localization and directly yields desired velocity. The well known benefits of image-based visual servoing such as robustness apply also to the planar case. Simulation results are presented.
Abstract:
Managing livestock movement in extensive systems has environmental and production benefits. Currently, permanent wire fencing is used to control cattle; this is both expensive and inflexible. Cattle are known to respond to auditory and visual cues, and we investigated whether these can be used to manipulate their behaviour. Twenty-five Belmont Red steers with a mean live weight of 270 kg were each randomly assigned to one of five treatments. Treatments consisted of a combination of cues (audio, tactile and visual stimuli) and a consequence (electrical stimulation). The treatments were electrical stimulation alone, audio plus electrical stimulation, vibration plus electrical stimulation, light plus electrical stimulation, and an electrified fence (6 kV) plus electrical stimulation. Cue stimuli were administered for 3 s, followed immediately by electrical stimulation (the consequence) of 1 kV for 1 s. The experiment tested the operational efficacy of an on-animal control, or virtual fencing, system. A collar-halter device was designed to carry the electronics, batteries and equipment providing the audio, vibration, light and electrical stimuli of a prototype virtual fencing device. Cattle were allowed to travel along a 40 m alley towards a group of peers and feed while their rate of travel and response to the stimuli were recorded. The prototype virtual fencing system was successful in modifying the behaviour of the cattle. The rate of travel of cattle along the alley demonstrated the large variability in behavioural response associated with tactile, visual and audible cues. The experiment demonstrated that virtual fencing has potential for controlling cattle in extensive grazing systems. However, larger numbers of cattle need to be tested to derive a better understanding of the behavioural variance. Further controlled experimental work is also necessary to quantify the interaction between cues, consequences and cattle learning.
Abstract:
Performing reliable localisation and navigation within highly unstructured underwater coral reef environments is a difficult task at the best of times. Typical research and commercial underwater vehicles use expensive acoustic positioning and sonar systems which require significant external infrastructure to operate effectively. This paper is focused on the development of a robust vision-based motion estimation technique using low-cost sensors for performing real-time autonomous and untethered environmental monitoring tasks in the Great Barrier Reef without the use of acoustic positioning. The technique is experimentally shown to provide accurate odometry and terrain profile information suitable for input into the vehicle controller to perform a range of environmental monitoring tasks.
Abstract:
The 5th International Conference on Field and Service Robotics (FSR05) was held in Port Douglas, Australia, on 29th - 31st July 2005, and brought together the world's leading experts in field and service automation. The goal of the conference was to report and encourage the latest research and practical results towards the use of field and service robotics in the community, with particular focus on proven technology. The conference provided a forum for researchers, professionals and robot manufacturers to exchange up-to-date technical knowledge and experience. Field robots are robots which operate in outdoor, complex, and dynamic environments. Service robots are those that work closely with humans, with particular applications involving indoor and structured environments. A wide range of topics on field and service robots is presented in this issue, including: Agricultural and Forestry Robotics, Mining and Exploration Robots, Robots for Construction, Security & Defence Robots, Cleaning Robots, Autonomous Underwater Vehicles and Autonomous Flying Robots. This meeting was the fifth in the series and brought FSR back to Australia, where it was first held. FSR has been held every two years, starting with Canberra in 1997, followed by Pittsburgh 1999, Helsinki 2001 and Lake Yamanaka 2003.
Abstract:
Virtual fencing has the potential to control grazing livestock. Understanding and refining the cues that can alter behaviour is an integral part of autonomous animal control. A series of tests was completed to explore the relationship between temperament and control. Prior to exposure to virtual fencing control, the animals were scored for temperament using flight speed and a sociability index derived from contact logging devices. The behavioural responses of 30 Belmont Red steers were observed when the animals were presented with cues prior to receiving an electrical stimulation. A control and four treatments designed to interrupt the animal's movement down an alley were tested. The treatments consisted of sound plus electrical stimulation, vibration plus electrical stimulation, a visual cue plus electrical stimulation, and electrical stimulation by itself. The treatments were randomly applied to each animal over five consecutive trials. A control treatment in which no cues were applied was used to establish a basal behavioural pattern. A trial was considered complete after each animal had been retained behind the cue barrier for at least 60 s. All cues and electrical stimulation were applied manually from a laptop located on a portable 3.5 m tower immediately outside the alley. The electrical stimulation consisted of 1.0 kV. Cue and stimulation events, along with Global Positioning System (GPS) fixes recording each animal's path within the alley, were logged every second.
Abstract:
The development of autonomous air vehicles can be an expensive research pursuit. To alleviate some of the financial burden of this process, we have constructed a system consisting of four winches each attached to a central pod (the simulated air vehicle) via cables - a cable-array robot. The system is capable of precisely controlling the three dimensional position of the pod allowing effective testing of sensing and control strategies before experimentation on a free-flying vehicle. In this paper, we present a brief overview of the system and provide a practical control strategy for such a system.
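The geometry behind the cable-array rig is simple enough to sketch (hypothetical function names): each commanded cable length is just the straight-line distance from its winch anchor to the pod, so 3D position control reduces to four coordinated winch lengths.

```python
import math

def cable_lengths(pod, winches):
    """Inverse kinematics of a cable-array robot: commanded cable
    length for each winch is the Euclidean distance from that winch's
    anchor point to the desired pod position."""
    return [math.dist(pod, w) for w in winches]
```

A practical controller then servos each winch to its commanded length, with cable tension constraints keeping all four cables taut.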
Abstract:
A vast amount of research into autonomous underwater navigation has been, and is being, conducted around the world. However, typical research and commercial platforms have limited autonomy and are generally unable to navigate efficiently within coral reef environments without tethers and significant external infrastructure. This paper outlines the development, and presents experimental results into the performance evaluation, of a new robotic vehicle for underwater monitoring and surveying in highly unstructured environments. The hybrid AUV design developed by the CSIRO robotic reef monitoring team realises a compromise between endurance, manoeuvrability and functionality. The vehicle represents a new era in AUV design specifically focused at providing a truly low-cost research capability that will progress environmental monitoring through unaided navigation, cooperative robotics, sensor network distribution and data harvesting.
Abstract:
This paper introduces the application of a sensor network to navigate a flying robot. We have developed distributed algorithms and efficient geographic routing techniques to incrementally guide one or more robots to points of interest based on sensor gradient fields, or along paths defined in terms of Cartesian coordinates. The robot itself is an integral part of the localization process which establishes the positions of sensors which are not known a priori. We use this system in a large-scale outdoor experiment with Mote sensors to guide an autonomous helicopter along a path encoded in the network. A simple handheld device, using this same environmental infrastructure, is used to guide humans.
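The gradient-field guidance idea can be sketched as a greedy hop over the network (a simplified stand-in for the distributed routing the paper describes, with hypothetical names): from the current node, move toward the neighbour with the highest sensed value, stopping when no neighbour improves on the current reading.

```python
def next_waypoint(current, neighbors, values):
    """Greedy gradient hop over a sensor network: pick the neighbour
    with the highest sensed value, or stay put at a local maximum."""
    best = max(neighbors, key=lambda n: values[n])
    return best if values[best] > values[current] else current
```

Repeating this hop node by node walks the robot up the sensor gradient toward the point of interest, with the network itself supplying both the values and the localization.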