3 results for insect visual guidance

in CentAUR: Central Archive, University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Locomoting through the environment typically involves anticipating impending changes in heading trajectory in addition to maintaining the current direction of travel. We explored the neural systems involved in the “far road” and “near road” mechanisms proposed by Land and Horwood (1995) using simulated forward or backward travel where participants were required to gauge their current direction of travel (rather than directly control it). During forward egomotion, the distant road edges provided future path information, which participants used to improve their heading judgments. During backward egomotion, the road edges did not enhance performance because they no longer provided prospective information. This behavioral dissociation was reflected at the neural level, where only simulated forward travel increased activation in a region of the superior parietal lobe and the medial intraparietal sulcus. Providing only near road information during a forward heading judgment task resulted in activation in the motion complex. We propose a complementary role for the posterior parietal cortex and motion complex in detecting future path information and maintaining current lane positioning, respectively. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

Relevance:

30.00%

Publisher:

Abstract:

Near-ground maneuvers, such as hover, approach, and landing, are key elements of autonomy in unmanned aerial vehicles. Such maneuvers have been tackled conventionally by measuring or estimating the velocity and the height above the ground, often using ultrasonic or laser range finders. Near-ground maneuvers are naturally mastered by flying birds and insects, as objects below may be of interest for food or shelter. These animals perform such maneuvers efficiently using only the available vision and vestibular sensory information. In this paper, the time-to-contact (Tau) theory, which conceptualizes the visual strategy with which many species are believed to approach objects, is presented as a solution for Unmanned Aerial Vehicle (UAV) relative ground distance control. The paper shows how such an approach can be visually guided without knowledge of height and velocity relative to the ground. A control scheme that implements the Tau strategy is developed employing only visual information from a monocular camera and an inertial measurement unit. To achieve reliable visual information at a high rate, a novel filtering system is proposed to complement the control system. The proposed system is implemented on board an experimental quadrotor UAV and shown not only to successfully approach the ground and land, but also to enable the user to choose the dynamic characteristics of the approach. The methods presented in this paper are applicable to both aerial and space autonomous vehicles.
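The core idea of the Tau strategy can be illustrated with a minimal simulation. Tau is the ratio of the gap to its closure rate (here, height over vertical velocity), and a classic landing strategy is to hold its time derivative at a constant k with 0 < k < 0.5, so that height and closing speed reach zero together. The sketch below is not the paper's controller or filter; the function name, parameters, and the simple Euler integration are illustrative assumptions.

```python
def simulate_constant_tau_dot(z0=10.0, v0=-2.0, k=0.4, dt=0.01):
    """Simulate a vertical descent that holds tau-dot constant at k.

    tau = z / v is negative while the gap is closing; ramping it
    linearly toward zero (tau-dot = k, 0 < k < 0.5) drives both the
    height z and the closing speed v to zero together: a soft landing.
    Returns a list of (t, z, v) samples.
    """
    z, v = z0, v0
    tau = z / v                 # initial time-to-contact (negative)
    t = 0.0
    history = [(t, z, v)]
    while z > 1e-3 and t < 60.0:
        tau += k * dt           # reference tau ramps linearly to zero
        v = z / tau             # velocity command realizing that tau
        z += v * dt             # Euler step of the vertical position
        t += dt
        history.append((t, z, v))
    return history
```

With these illustrative numbers the closing speed decays smoothly as the height shrinks, which is the "choose the dynamic characteristics of the approach" aspect the abstract mentions: smaller k gives a gentler, longer approach, while k near 0.5 produces a brisker one.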

Relevance:

30.00%

Publisher:

Abstract:

Near-ground maneuvers, such as hover, approach, and landing, are key elements of autonomy in unmanned aerial vehicles. Such maneuvers have been tackled conventionally by measuring or estimating the velocity and the height above the ground, often using ultrasonic or laser range finders. Near-ground maneuvers are naturally mastered by flying birds and insects because objects below may be of interest for food or shelter. These animals perform such maneuvers efficiently using only the available vision and vestibular sensory information. In this paper, the time-to-contact (tau) theory, which conceptualizes the visual strategy with which many species are believed to approach objects, is presented as a solution for relative ground distance control for unmanned aerial vehicles. The paper shows how such an approach can be visually guided without knowledge of height and velocity relative to the ground. A control scheme that implements the tau strategy is developed employing only visual information from a monocular camera and an inertial measurement unit. To achieve reliable visual information at a high rate, a novel filtering system is proposed to complement the control system. The proposed system is implemented on board an experimental quadrotor unmanned aerial vehicle and is shown not only to successfully approach the ground and land, but also to enable the user to choose the dynamic characteristics of the approach. The methods presented in this paper are applicable to both aerial and space autonomous vehicles.