942 results for computerised navigation
Abstract:
This paper describes an autonomous docking system and web interface that allow long-term unaided use of a sophisticated robot by untrained web users. These systems have been applied to the biologically inspired RatSLAM system as a foundation for testing both its long-term stability and its practicality. While docking and web interface systems already exist, this system allows for a significantly larger margin of error in docking accuracy due to its mechanical design, thereby increasing robustness against navigational errors. In addition, a standard vision sensor is used for both long-range and short-range docking, in contrast to the many systems that require both omni-directional cameras and high-resolution laser range finders for navigation. The web interface has been designed to accommodate the significant delays experienced on the Internet and to facilitate the non-Cartesian operation of the RatSLAM system.
Abstract:
Motion has been shown in biological studies to be a critical component of obstacle avoidance and navigation. In particular, optical flow is a powerful motion cue that has been exploited in many biological systems for survival. In this paper, we investigate an obstacle detection system that uses optical flow to obtain range information to objects. Our experimental results demonstrate that optical flow is capable of providing good obstacle information but has obvious failure modes. We acknowledge that our optical flow system has certain disadvantages and cannot be used on its own for navigation. Instead, we believe that optical flow is a critical visual subsystem when moving at reasonable speeds. When combined with other visual subsystems, considerable synergy can result.
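The abstract does not give implementation details, so the following is only a minimal sketch of one common way to turn dense optical flow into range estimates. It assumes OpenCV's Farneback flow, a camera translating roughly along its optical axis with a known forward speed, and the focus of expansion at the image centre; none of these choices are taken from the paper.

```python
# Minimal sketch (not the paper's method): range to obstacles from the
# radial expansion of dense optical flow, given a known forward speed.
import cv2
import numpy as np

def range_from_flow(prev_gray, curr_gray, forward_speed_mps, fps=30.0):
    h, w = prev_gray.shape
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Pixel offsets from the assumed focus of expansion (image centre).
    ys, xs = np.mgrid[0:h, 0:w]
    rx, ry = xs - w / 2.0, ys - h / 2.0
    radial_dist = np.hypot(rx, ry) + 1e-6
    # Radial component of the flow field (pixels per frame).
    radial_flow = (flow[..., 0] * rx + flow[..., 1] * ry) / radial_dist
    radial_flow = np.clip(radial_flow, 1e-3, None)   # keep expanding flow only
    # Time-to-contact in seconds: tau = r / (dr/dt), converted from frames.
    tau = (radial_dist / radial_flow) / fps
    # Range follows if the forward speed is known (e.g. from odometry).
    return forward_speed_mps * tau
```

The per-pixel range map this produces is exactly the kind of "good obstacle information with obvious failure modes" described above: it degrades near the focus of expansion and for independently moving objects, which is why flow is best treated as one subsystem among several.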
Abstract:
RatSLAM is a vision-based SLAM system based on extended models of the rodent hippocampus. RatSLAM creates environment representations that can be processed by the experience mapping algorithm to produce maps suitable for goal recall. The experience mapping algorithm also allows RatSLAM to map environments many times larger than could be achieved with a one-to-one correspondence between the map and the environment, by reusing the RatSLAM maps to represent multiple sections of the environment. This paper describes experiments investigating the effects of the environment-representation size ratio and visual ambiguity on mapping and goal navigation performance. The experiments demonstrate that system performance is weakly dependent on either parameter in isolation, but strongly dependent on their joint values.
Abstract:
The RatSLAM system can perform vision-based SLAM using a computational model of the rodent hippocampus. When the number of pose cells used to represent space in RatSLAM is reduced, artifacts are introduced that hinder its use for goal-directed navigation. This paper describes a new component for the RatSLAM system called an experience map, which provides a coherent representation for goal-directed navigation. Results are presented for two sets of real-world experiments, including comparison with the original goal memory system's performance in the same environment. Preliminary results are also presented demonstrating the ability of the experience map to adapt to simple short-term changes in the environment.
Abstract:
Approaches with Vertical Guidance (APV) can provide greater safety and cost savings to general aviation through accurate GPS horizontal and vertical navigation. However, GPS needs augmentation to achieve APV fault detection requirements. Aircraft Based Augmentation Systems (ABAS) fuse GPS with additional sensors at the aircraft. Typical ABAS designs assume high-quality inertial sensors with Kalman filters, but these are too expensive for general aviation. Instead of using high-quality (and expensive) sensors, the purpose of this paper is to investigate augmenting GPS with a low-quality MEMS IMU and an Aircraft Dynamic Model (ADM). The IMU and ADM are fused together using a multiple model fusion strategy in a bank of Extended Kalman Filters (EKF) with the Normalized Solution Separation (NSS) fault detection scheme. A tightly-coupled configuration with GPS is used and frequent GPS updates are applied to the IMU and ADM to compensate for their errors. Based upon a simulated APV approach, the performance of this architecture in detecting a GPS ramp fault is investigated, showing a performance improvement over a GPS-only "snapshot" implementation of the NSS method. The effect of fusing the IMU with the ADM is evaluated by comparing a GPS-IMU-ADM EKF with a GPS-IMU EKF, which shows a small improvement in protection levels.
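The abstract does not spell out the NSS test itself. As a rough illustration only, the sketch below applies a normalised solution separation check to the position states of a main EKF against a bank of sub-filters that each exclude one satellite; the variable names, the use of P_i - P_0 as the separation covariance, and the chi-square threshold allocation are assumptions, not the authors' formulation.

```python
# Minimal sketch of a normalised solution separation (NSS) fault test.
import numpy as np
from scipy.stats import chi2

def nss_test(x_main, P_main, sub_solutions, p_false_alarm=1e-5):
    """x_main: (3,) position of the all-satellite EKF, P_main: (3,3) covariance.
    sub_solutions: list of (x_i, P_i) from filters each excluding one satellite.
    Returns the per-subfilter statistics and a fault flag."""
    n = len(sub_solutions)
    threshold = chi2.ppf(1.0 - p_false_alarm / n, df=3)   # per-test allocation
    stats = []
    for x_i, P_i in sub_solutions:
        d = x_i - x_main                          # solution separation
        dP = P_i - P_main                         # separation covariance (assumed)
        lam = float(d @ np.linalg.solve(dP, d))   # normalised separation
        stats.append(lam)
    return stats, any(s > threshold for s in stats)
```

A ramp fault on one satellite inflates the separation of every sub-filter except the one that excludes that satellite, which is how the bank both detects and identifies the faulty measurement.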
Abstract:
An interactive installation with holographic 3D projections, satellite imagery, surround sound and intuitive body-driven interactivity. Remnant (v.1) was commissioned by the 2010 TreeLine ecoArt event, an initiative of the Sunshine Coast Council, and presented at a remnant block of subtropical rainforest called 'Mary Cairncross Scenic Reserve', located 100 km north of Brisbane near the township of Maleny. V2 was later commissioned for KickArts Gallery, Cairns, re-presenting the work in a new open format which allowed audiences both to experience the original power of the work and to understand the construction of its powerful illusory, visual spaces. This art-science project focused upon the idea of remnant landscapes: isolated blocks of forest (or other vegetation types) typically set within a patchwork quilt of surrounding farmed land. Participants peer into a mysterious, long tunnel of imagery while navigating entirely through gentle head movements, allowing them both to 'steer' in three dimensions and to 'alight', as a butterfly might, upon a sector of landscape, which in turn reveals an underlying 'landscape of mind'. The work challenges audiences to re-imagine our conceptions of country in ways that will lead us to better reconnect and sustain today's heavily divided landscapes. The research field involved developing new digital image projection methods, and alternate embodied interaction and engagement strategies for eco-political media arts practice. The context was the creation of improved embodied and improvisational experiences for participants, further informed by 'eco-philosophical' and sustainment theories. By engaging with deep conceptions of connectivity between apparently disparate elements, the work considered novel strategies for fostering new desires for understanding and re-thinking the requisite physical and ecological links between 'things' that have been historically shattered. The methodology was primarily practice-led and in concert with underlying theories. The work's knowledge contribution was to question how new media interactive experience and embodied interaction might prompt participants to reflect upon the appropriate resources and knowledges required to generate this substantive desire for new approaches to sustainment. This was accentuated through the power of learning implied by the work's strongly visual and kinaesthetic interface (i.e. the tunnel of imagery and the head- and torso-operated navigation). The work was commissioned by the 2010 TreeLine ecoArt event, an initiative of the Sunshine Coast Council; the second version was commissioned by KickArts Gallery, Cairns, specifically funded by a national optometrist chain. It was also funded in development by Arts Queensland and reviewed in RealTime.
Abstract:
Performing reliable localisation and navigation within highly unstructured underwater coral reef environments is a difficult task at the best of times. Typical research and commercial underwater vehicles use expensive acoustic positioning and sonar systems which require significant external infrastructure to operate effectively. This paper is focused on the development of a robust vision-based motion estimation technique using low-cost sensors for performing real-time autonomous and untethered environmental monitoring tasks in the Great Barrier Reef without the use of acoustic positioning. The technique is experimentally shown to provide accurate odometry and terrain profile information suitable for input into the vehicle controller to perform a range of environmental monitoring tasks.
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first autonomous underwater vehicles (AUVs) in the world where vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
If mobile robots are to perform useful tasks in the real world, they will require a catalog of fundamental navigation competencies and a means to select between them. In this paper we describe our work on strongly vision-based competencies: road-following, person or vehicle following, and pose and position stabilization. Results from experiments on an outdoor autonomous tractor, a car-like vehicle, are presented.
Abstract:
A vast amount of research into autonomous underwater navigation has been, and is being, conducted around the world. However, typical research and commercial platforms have limited autonomy and are generally unable to navigate efficiently within coral reef environments without tethers and significant external infrastructure. This paper outlines the development and presents experimental results into the performance evaluation of a new robotic vehicle for underwater monitoring and surveying in highly unstructured environments. The hybrid AUV design developed by the CSIRO robotic reef monitoring team realises a compromise between endurance, manoeuvrability and functionality. The vehicle represents a new era in AUV design specifically focused on providing a truly low-cost research capability that will progress environmental monitoring through unaided navigation, cooperative robotics, sensor network distribution and data harvesting.
Abstract:
This paper presents a technique for tracking road edges in a panoramic image sequence. The major contribution is that instead of unwarping the image to find parallel lines representing the road edges, we choose to warp the parallel groundplane lines into the image plane of the equiangular panospheric camera. Updating the parameters of the line thus involves searching a very small number of pixels in the panoramic image, requiring considerably less computation than unwarping. Results using real-world images, including shadows, intersections and curves, are presented.
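The projection itself is not spelled out in the abstract. The sketch below illustrates the general idea of warping a ground-plane line into a panoramic image so that only the pixels along the predicted curve need to be searched; it uses a simplified equiangular mapping (column proportional to azimuth, row proportional to elevation), with the camera height, line offset and field-of-view limits as illustrative parameters rather than the paper's calibrated camera model.

```python
# Minimal sketch (simplified geometry, not the paper's calibration): project a
# straight ground-plane road edge into panoramic pixel coordinates.
import numpy as np

def project_ground_line(lateral_offset_m, cam_height_m, img_w, img_h,
                        elev_min=-np.pi / 2, elev_max=0.0, n_samples=200):
    """Project the ground line x = lateral_offset (metres), running ahead of
    the vehicle at z = -cam_height below the camera, into panoramic pixels."""
    ahead = np.linspace(0.5, 50.0, n_samples)          # metres along the road
    x = np.full_like(ahead, lateral_offset_m)
    z = np.full_like(ahead, -cam_height_m)
    azimuth = np.arctan2(x, ahead)                     # angle around the axis
    elevation = np.arctan2(z, np.hypot(x, ahead))      # angle below the horizon
    cols = (azimuth + np.pi) / (2 * np.pi) * img_w     # equiangular columns
    rows = (elevation - elev_min) / (elev_max - elev_min) * img_h
    return np.column_stack([cols, rows])               # pixel curve to search
```

Because updating the line parameters only requires sampling intensities along this short pixel curve, the per-frame cost is far lower than unwarping the whole panoramic image first.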
Abstract:
In this paper we describe a low-cost flight control system for a small (60 class) helicopter which is part of a larger project to develop an autonomous flying vehicle. Our approach differs from that of others in not using an expensive inertial/GPS sensing system. The primary sensors for vehicle stabilization are a low-cost inertial sensor and a pair of CMOS cameras. We describe the architecture of our flight control system, the inertial and visual sensing subsystems and present some flight control results.
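The abstract does not describe how the inertial and visual measurements are combined. As one generic illustration of fusing a drifting low-cost gyro with an occasional drift-free vision-derived attitude fix, a complementary filter can be sketched as below; the gain and update structure are assumptions for illustration, not the authors' controller.

```python
# Minimal sketch of complementary filtering for one attitude axis.
def complementary_filter(angle_prev, gyro_rate, vision_angle, dt, k=0.02):
    """Blend the integrated gyro rate (smooth but drifting) with a vision
    attitude measurement (noisy but drift-free). k is an assumed gain."""
    predicted = angle_prev + gyro_rate * dt   # gyro integration
    if vision_angle is None:                  # no vision fix this step
        return predicted
    return (1.0 - k) * predicted + k * vision_angle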
Abstract:
In recent months the extremes of Australia's weather have affected and killed a significant number of people and caused millions of dollars in losses. Unlike a manned aircraft or helicopter, which have restricted air time, a UAS or a group of UAS could provide 24-hour coverage of a disaster area and be instrumented with infrared cameras to locate distressed people and relay information to emergency services. The solar-powered UAV is capable of carrying a 0.25 kg payload consuming 0.5 W, flying continuously at low altitude for 24 hours, collecting data and building up its spatial distribution. This system, named Green Falcon, is fully autonomous in navigation and power generation. Equipped with solar cells covering its wing, it retrieves energy from the sun to supply power to the propulsion system and the control electronics, and charges the battery with the surplus energy. During the night, the only energy available comes from the battery, which discharges slowly until the next morning, when a new cycle starts. The prototype airplane was exhibited at the Melbourne Museum from November 2009 to February 2010.
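Only the 0.5 W payload figure appears in the text; the back-of-envelope sketch below shows how such a day/night energy budget can be checked. The cruise power, daylight hours and battery efficiency are purely illustrative assumptions, not Green Falcon's actual specifications.

```python
# Back-of-envelope day/night energy balance for a solar-powered UAV.
PAYLOAD_W  = 0.5    # from the abstract
CRUISE_W   = 25.0   # assumed propulsion + avionics draw
DAYLIGHT_H = 10.0   # assumed usable sun hours
NIGHT_H    = 14.0
CHARGE_EFF = 0.85   # assumed battery round-trip efficiency

night_energy_wh = (CRUISE_W + PAYLOAD_W) * NIGHT_H        # supplied by battery
day_load_wh     = (CRUISE_W + PAYLOAD_W) * DAYLIGHT_H
solar_needed_wh = day_load_wh + night_energy_wh / CHARGE_EFF
print(f"Battery energy needed overnight: {night_energy_wh:.0f} Wh")
print(f"Average solar power needed in daylight: {solar_needed_wh / DAYLIGHT_H:.0f} W")
```

Under these assumed numbers the wing-mounted array would need to average roughly 68 W during daylight to carry the aircraft through the night, which is the sizing trade-off the abstract alludes to.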