377 results for Helicopter automation
Abstract:
This paper describes a novel experiment in which two very different methods of underwater robot localization are compared. The first method is based on a geometric approach in which a mobile node moves within a field of static nodes, and all nodes are capable of estimating the range to their neighbours acoustically. The second method uses visual odometry, from stereo cameras, by integrating scaled optical flow. The fundamental algorithmic principles of each localization technique are described. We also present experimental results comparing acoustic localization with GPS for surface operation, and comparing the acoustic and visual methods for underwater operation.
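To make the second method concrete, here is a minimal sketch of how scaled optical flow might be integrated into a displacement estimate. The flow and depth inputs, the pinhole-scaling step, and the function name are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): integrating scaled optical
# flow into a planar displacement estimate. `flow_px` is the mean image-plane
# flow per frame (pixels) and `depth_m` the mean scene depth from stereo; both
# stand in for a real flow/stereo front end.
import numpy as np

def integrate_scaled_flow(flow_px, depth_m, focal_px):
    """Accumulate per-frame flow, scaled to metres, into a 2D displacement."""
    position = np.zeros(2)
    trajectory = [position.copy()]
    for f, z in zip(flow_px, depth_m):
        # Pinhole scaling: metres moved ~ pixel flow * depth / focal length.
        position += np.asarray(f) * z / focal_px
        trajectory.append(position.copy())
    return np.array(trajectory)

# Toy usage: constant 2-pixel/frame flow at 3 m depth, 800-pixel focal length.
traj = integrate_scaled_flow([(2.0, 0.0)] * 10, [3.0] * 10, focal_px=800.0)
print(traj[-1])  # ~[0.075, 0.0] m after 10 frames
```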
Abstract:
This paper demonstrates some interesting connections between the hitherto disparate fields of mobile robot navigation and image-based visual servoing. A planar formulation of the well-known image-based visual servoing method leads to a bearing-only navigation system that requires no explicit localization and directly yields the desired velocity. The well-known benefits of image-based visual servoing, such as robustness, also apply to the planar case. Simulation results are presented.
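For context, the classical image-based visual servoing law that such a planar formulation specialises can be written as follows. This is the standard textbook form, with the usual symbols and gain, not the paper's bearing-only derivation.

```latex
% Classical image-based visual servoing (textbook form, not the paper's
% planar variant): the image-feature error is driven to zero by a camera
% velocity computed through (an estimate of) the interaction matrix.
\begin{align}
  e       &= s - s^{*}                         && \text{error between current and desired image features} \\
  \dot{s} &= L_{s}\, v                         && \text{interaction matrix relates feature rates to camera velocity} \\
  v       &= -\lambda\, \widehat{L_{s}}^{+}\, e && \text{proportional control via the pseudo-inverse of } \widehat{L_{s}}
\end{align}
```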
Abstract:
This paper describes automation of the digging cycle of a mining rope shovel, covering autonomous dipper (bucket) filling and methods for detecting when to disengage the dipper from the bank. Novel techniques for overcoming dipper stall and for online estimation of dipper “fullness” are described, along with in-field experimental results of laser DTM generation, machine automation and digging using a 1/7th-scale model rope shovel.
Abstract:
Performing reliable localisation and navigation within highly unstructured underwater coral reef environments is a difficult task at the best of times. Typical research and commercial underwater vehicles use expensive acoustic positioning and sonar systems which require significant external infrastructure to operate effectively. This paper is focused on the development of a robust vision-based motion estimation technique using low-cost sensors for performing real-time autonomous and untethered environmental monitoring tasks in the Great Barrier Reef without the use of acoustic positioning. The technique is experimentally shown to provide accurate odometry and terrain profile information suitable for input into the vehicle controller to perform a range of environmental monitoring tasks.
Abstract:
The 5th International Conference on Field and Service Robotics (FSR05) was held in Port Douglas, Australia, on 29th-31st July 2005, and brought together the world's leading experts in field and service automation. The goal of the conference was to report and encourage the latest research and practical results towards the use of field and service robotics in the community, with a particular focus on proven technology. The conference provided a forum for researchers, professionals and robot manufacturers to exchange up-to-date technical knowledge and experience. Field robots are robots that operate in outdoor, complex and dynamic environments. Service robots are those that work closely with humans, with particular applications involving indoor and structured environments. A wide range of topics on field and service robots is presented in this issue, including: Agricultural and Forestry Robotics, Mining and Exploration Robots, Robots for Construction, Security & Defence Robots, Cleaning Robots, Autonomous Underwater Vehicles and Autonomous Flying Robots. This meeting was the fifth in the series and brought FSR back to Australia, where it was first held. FSR has been held every two years, starting with Canberra in 1997, followed by Pittsburgh in 1999, Helsinki in 2001 and Lake Yamanaka in 2003.
Abstract:
We consider the problem of monitoring and controlling the position of herd animals, viewing the animals as networked agents that have natural mobility but are not strictly controllable. By exploiting knowledge of individual and herd behavior, we would like to apply the vast body of theory in robotics and motion planning to achieving the constrained motion of a herd. In this paper we describe the concept of a virtual fence, which applies a stimulus to an animal as a function of its pose with respect to the fenceline. Multiple fence lines can define a region, and the fences can be static or dynamic. The fence algorithm is implemented by a small position-aware computer device worn by the animal, which we refer to as a Smart Collar. We describe a herd-animal simulator, the Smart Collar hardware and algorithms for tracking and controlling animals, as well as the results of on-farm experiments with up to ten Smart Collars.
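As an illustration of the virtual-fence idea, the sketch below checks an animal's position against a polygonal fenceline and returns a stimulus level. The point-in-polygon test, polygon representation and stimulus values are hypothetical and are not the Smart Collar's actual algorithm.

```python
# Illustrative sketch only (not the Smart Collar algorithm): a virtual fence
# as a polygon, with a stimulus triggered as a function of the animal's
# position relative to the fenceline. Uses a simple ray-casting inside test.
def inside_fence(point, fence):
    """Ray-casting point-in-polygon test; `fence` is a list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def stimulus(point, fence):
    """Return a stimulus level: none inside the fence, applied outside."""
    return 0.0 if inside_fence(point, fence) else 1.0

paddock = [(0, 0), (100, 0), (100, 100), (0, 100)]
print(stimulus((50, 50), paddock))   # 0.0 -> inside, no stimulus
print(stimulus((120, 50), paddock))  # 1.0 -> outside, stimulus applied
```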
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first autonomous underwater vehicles (AUVs) in the world where vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
The article describes an open-source toolbox for machine vision called the Machine Vision Toolbox (MVT). MVT includes more than 60 functions spanning image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration, and color space conversion. MVT can be used for research into machine vision but is also versatile enough to be usable for real-time work and even control. MVT, combined with MATLAB and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. The article illustrates the use of a subset of toolbox functions for some typical problems and describes MVT operations including the simulation of a complete image-based visual servo system.
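As a rough illustration of the kind of pipeline such a toolbox supports, the Python/OpenCV sketch below performs colour-space conversion, thresholding and blob feature extraction on a synthetic image. It deliberately does not use MVT's MATLAB API; the operations chosen are illustrative only.

```python
# Rough Python/OpenCV analogue of operations of the kind MVT provides in
# MATLAB (colour conversion, blob extraction); this is NOT the MVT API.
import numpy as np
import cv2

# Synthetic test image: two bright blobs on a dark background.
img = np.zeros((120, 160, 3), dtype=np.uint8)
cv2.circle(img, (40, 60), 15, (255, 255, 255), -1)
cv2.circle(img, (110, 40), 10, (255, 255, 255), -1)

# Colour-space conversion and thresholding.
grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(grey, 128, 255, cv2.THRESH_BINARY)

# Blob (connected-component) feature extraction: area and centroid per blob.
n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
for i in range(1, n):  # label 0 is the background
    print("blob", i, "area", stats[i, cv2.CC_STAT_AREA], "centroid", centroids[i])
```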
Abstract:
We present a novel vision-based technique for navigating an Unmanned Aerial Vehicle (UAV) through urban canyons. Our technique relies on both optic flow and stereo vision information. We show that the combination of stereo and optic-flow (stereo-flow) is more effective at navigating urban canyons than either technique alone. Optic flow from a pair of sideways-looking cameras is used to stay centered in a canyon and initiate turns at junctions, while stereo vision from a forward-facing stereo head is used to avoid obstacles to the front. The technique was tested in full on an autonomous tractor at CSIRO and in part on the USC autonomous helicopter. Experimental results are presented from these two robotic platforms operating in outdoor environments. We show that the autonomous tractor can navigate urban canyons using stereo-flow, and that the autonomous helicopter can turn away from obstacles to the side using optic flow. In addition, preliminary results show that a single pair of forward-facing fisheye cameras can be used for both stereo and optic flow. The center portions of the fisheye images are used for stereo, while flow is measured in the periphery of the images.
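The flow-balancing idea behind canyon centering can be sketched as follows; the sign convention, normalisation and gain are assumptions for illustration and are not the authors' controller.

```python
# Hedged sketch of flow balancing (not the authors' controller): the optic-flow
# magnitudes seen by left- and right-facing cameras are compared, and the
# difference steers the vehicle back toward the canyon centreline, since the
# nearer wall produces the larger flow.
def centering_yaw_command(left_flow_mag, right_flow_mag, gain=0.5):
    """Return a yaw-rate command (rad/s) that turns away from the nearer wall."""
    # Normalised flow difference: positive when the left wall is closer.
    balance = (left_flow_mag - right_flow_mag) / max(left_flow_mag + right_flow_mag, 1e-6)
    return gain * balance  # assumed convention: positive yaw turns right

print(centering_yaw_command(8.0, 2.0))  # 0.3 -> turn away from the closer left wall
print(centering_yaw_command(5.0, 5.0))  # 0.0 -> centred, no correction
```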
Abstract:
The highly unstructured nature of coral reef environments makes them difficult for current robotic vehicles to navigate efficiently. Typical research and commercial platforms have limited autonomy within these environments and generally require tethers and significant external infrastructure. This paper outlines the development of a new robotic vehicle for underwater monitoring and surveying in highly unstructured environments and presents experimental results illustrating the vehicle’s performance. The hybrid AUV design developed by the CSIRO robotic reef monitoring team realises a compromise between endurance, manoeuvrability and functionality. The vehicle represents a new era in AUV design, specifically focused on providing a truly low-cost research capability that will advance environmental monitoring through unaided navigation, cooperative robotics, sensor network distribution and data harvesting.
Abstract:
The development of autonomous air vehicles can be an expensive research pursuit. To alleviate some of the financial burden of this process, we have constructed a system consisting of four winches, each attached via a cable to a central pod (the simulated air vehicle): a cable-array robot. The system is capable of precisely controlling the three-dimensional position of the pod, allowing effective testing of sensing and control strategies before experimentation on a free-flying vehicle. In this paper, we present a brief overview of the system and provide a practical control strategy for it.
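The basic geometry of such a cable-array robot is simple: each commanded cable length is the distance from its winch anchor to the desired pod position. The sketch below shows this inverse-kinematics step under an assumed anchor layout; it is not the control strategy presented in the paper.

```python
# Simple geometric sketch (assumed setup, not the paper's controller): for a
# cable-array robot, the commanded length of each cable is the Euclidean
# distance from its winch anchor to the desired pod position.
import numpy as np

def cable_lengths(anchors, pod_position):
    """Inverse kinematics: required cable length per winch for a pod position."""
    anchors = np.asarray(anchors, dtype=float)
    pod = np.asarray(pod_position, dtype=float)
    return np.linalg.norm(anchors - pod, axis=1)

# Four winch anchors at the top corners of an assumed 10 m x 10 m x 10 m workspace.
anchors = [(0, 0, 10), (10, 0, 10), (10, 10, 10), (0, 10, 10)]
print(cable_lengths(anchors, (5, 5, 5)))  # all ~8.66 m at the workspace centre
```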
Abstract:
The Dynamic Data eXchange (DDX) is our third-generation platform for building distributed robot controllers. DDX allows a coalition of programs to share data at run-time through an efficient shared-memory mechanism managed by a store. Further, stores on multiple machines can be linked by means of a global catalog, and data is moved between the stores on an as-needed basis by multicasting. Heterogeneous computer systems are handled. We describe the architecture of DDX and the standard clients we have developed that let us rapidly build complex control systems with minimal coding.
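The store concept can be illustrated with a minimal in-process sketch. This is not the DDX API; a real deployment uses shared memory within a machine and multicast between machines, as the abstract describes, whereas here a plain dictionary stands in.

```python
# Conceptual sketch of the store pattern DDX is built around (NOT the DDX API):
# a catalogue of named variables that producer and consumer code reads and
# writes at run time.
import threading
import time

class Store:
    """A named-variable store with thread-safe read/write access."""
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def write(self, name, value):
        with self._lock:
            self._data[name] = (value, time.time())  # value plus timestamp

    def read(self, name):
        with self._lock:
            return self._data.get(name)  # (value, timestamp) or None

store = Store()
store.write("robot.pose", (1.0, 2.0, 0.5))
print(store.read("robot.pose"))
```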