195 results for Optical navigation
at Queensland University of Technology - ePrints Archive
Abstract:
The current stage of development of automatic control and guidance systems for small, reusable unmanned aerial vehicles (UAVs) is defined by demanding requirements for autonomy, accuracy, and miniaturisation. The contradictory nature of these requirements calls for functional and algorithmic tight coupling of several different onboard sensors into a single computational process based on methods of optimal filtering. Strapdown inertial navigation systems (INS) that fuse data from micro-electromechanical inertial measurement units, barometric pressure sensors, and global navigation satellite system (GNSS) receivers are now in widespread use. However, such systems do not fully meet requirements for jamming immunity, fault tolerance, autonomy, and navigation accuracy. At the same time, significant progress has recently been demonstrated by navigation systems that apply the correlation-extremal principle to optical image data and digital terrain maps. This article proposes a new architecture for an automatic navigation management system (ANMS) for small reusable UAVs, which combines algorithms of strapdown INS, satellite navigation, and an optical navigation system.
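The tight coupling described above rests on optimal filtering. As a minimal sketch of the idea, the Python fragment below fuses a high-rate inertial position prediction with intermittent position fixes (from GNSS or an optical subsystem) using a one-dimensional Kalman filter. The noise parameters q and r, the update rate, and the simulated fixes are illustrative assumptions, not values or models from the article.

import numpy as np

def kalman_fuse(x, p, u, z, dt, q=0.1, r=4.0):
    """One fusion cycle: predict position with the inertial velocity estimate u,
    then correct with an external position fix z (GNSS or optical) if available.
    Illustrative 1-D example only; q and r are assumed noise parameters."""
    # Predict: propagate position and grow the uncertainty.
    x_pred = x + u * dt
    p_pred = p + q * dt
    # Correct: blend in the external fix when one arrives.
    if z is not None:
        k = p_pred / (p_pred + r)            # Kalman gain
        x_pred = x_pred + k * (z - x_pred)
        p_pred = (1.0 - k) * p_pred
    return x_pred, p_pred

# Usage: inertial updates every cycle, an external fix only every tenth cycle.
x, p = 0.0, 1.0
for step in range(100):
    fix = 0.5 * step if step % 10 == 0 else None   # simulated intermittent fix
    x, p = kalman_fuse(x, p, u=0.5, z=fix, dt=1.0)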
Abstract:
This thesis investigates the problem of robot navigation using only landmark bearings. The proposed system allows a robot to move to a ground target location specified by the sensor values observed at that ground target position. The control actions are computed from the difference between the current landmark bearings and the target landmark bearings. No Cartesian coordinates with respect to the ground are computed by the control system; the robot navigates using solely information from the bearing sensor space.

Most existing robot navigation systems require a ground frame (a 2D Cartesian coordinate system) in order to navigate from a ground point A to a ground point B. Commonly used sensors such as laser range scanners, sonar, infrared, and vision do not directly provide the 2D ground coordinates of the robot. Existing systems use the sensor measurements to localise the robot with respect to a map, a set of 2D coordinates of the objects of interest. It is more natural to navigate between the points in sensor space corresponding to A and B without requiring the Cartesian map and the localisation process.

Research on animals has revealed how insects are able to exploit very limited computational and memory resources to successfully navigate to a desired destination without computing Cartesian positions. For example, a honeybee balances the left and right optical flows to navigate in a narrow corridor. Unlike many other ants, Cataglyphis bicolor does not secrete pheromone trails in order to find its way home but instead uses the sun as a compass to keep track of its home direction vector. The home vector can be inaccurate, so the ant also uses landmark recognition: it takes snapshots and compass headings of some landmarks, and to return home it tries to line up the landmarks exactly as they were before it started wandering.

This thesis introduces a navigation method based on reflex actions in sensor space. The sensor vector is made of the bearings of some landmarks, and the reflex action is a gradient descent with respect to the distance in sensor space between the current sensor vector and the target sensor vector. Our theoretical analysis shows that, except for some fully characterised pathological cases, any point is reachable from any other point by reflex action in the bearing sensor space, provided the environment contains three landmarks and is free of obstacles.

The trajectories of a robot using reflex navigation, like those of other image-based visual control strategies, do not necessarily correspond to the shortest paths on the ground, because it is the sensor error that is minimised, not the distance travelled on the ground. However, we show that the use of a sequence of waypoints in sensor space can address this problem. In order to identify relevant waypoints, we train a Self-Organising Map (SOM) from a set of observations uniformly distributed with respect to the ground. This SOM provides a sense of location to the robot and allows a form of path planning in sensor space. The proposed navigation system is analysed theoretically, and evaluated both in simulation and with experiments on a real robot.
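A minimal sketch of the reflex action, assuming the robot can measure bearings to three landmarks: the control step is gradient descent on the squared bearing-space distance between the current and target sensor vectors. The landmark positions, step size, and finite-difference gradient estimate are illustrative assumptions, not the thesis' exact formulation.

import numpy as np

LANDMARKS = np.array([[0.0, 10.0], [10.0, 0.0], [-8.0, -6.0]])  # assumed positions

def bearings(pos):
    """Sensor vector: bearing angle from the robot to each landmark."""
    d = LANDMARKS - pos
    return np.arctan2(d[:, 1], d[:, 0])

def reflex_step(pos, target_bearings, step=0.2, eps=1e-3):
    """One reflex action: numerically estimate the gradient of the squared
    bearing-space error and move the robot a small step downhill."""
    def err(p):
        diff = np.angle(np.exp(1j * (bearings(p) - target_bearings)))  # wrap to [-pi, pi]
        return np.sum(diff ** 2)
    grad = np.array([(err(pos + [eps, 0.0]) - err(pos - [eps, 0.0])) / (2 * eps),
                     (err(pos + [0.0, eps]) - err(pos - [0.0, eps])) / (2 * eps)])
    return pos - step * grad

# Usage: drive from a start position toward the sensor values observed at the goal.
goal = np.array([3.0, 4.0])
target = bearings(goal)
pos = np.array([-5.0, -5.0])
for _ in range(200):
    pos = reflex_step(pos, target)

Note that only bearings appear in the control law; no ground coordinates of the robot are ever estimated, which is the point of navigating in sensor space.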
Abstract:
Motion has been identified in biology as a critical component for obstacle avoidance and navigation. In particular, optical flow is a powerful motion cue that has been exploited in many biological systems for survival. In this paper, we investigate an obstacle detection system that uses optical flow to obtain range information to objects. Our experimental results demonstrate that optical flow is capable of providing good obstacle information but has obvious failure modes. We acknowledge that our optical flow system has certain disadvantages and cannot be used on its own for navigation. Instead, we believe that optical flow is a critical visual subsystem for moving at reasonable speeds; when combined with other visual subsystems, considerable synergy can result.
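As a concrete illustration of how flow yields range, the sketch below uses the standard relation for a forward-translating camera: depth is roughly proportional to the pixel's distance from the focus of expansion times the distance travelled per frame, divided by the flow magnitude. The OpenCV Farnebäck parameters, the image-centre focus of expansion, and the obstacle threshold are assumptions for illustration, not the paper's system.

import cv2
import numpy as np

def flow_range(prev_gray, gray, speed, dt=1.0 / 30.0):
    """Coarse per-pixel range from dense optical flow between two grayscale
    frames, assuming pure forward translation at `speed` (m/s) with frame
    interval `dt`. Illustrative approximation only."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Pixel distance from the image centre, taken as the focus of expansion.
    r = np.hypot(xs - w / 2.0, ys - h / 2.0)
    mag = np.hypot(flow[..., 0], flow[..., 1])
    # Z ~ r * (distance travelled per frame) / flow magnitude.
    return r * speed * dt / np.maximum(mag, 1e-3)

def obstacle_mask(depth, threshold_m=10.0):
    """Flag pixels whose estimated range falls below the threshold (metres)."""
    return depth < threshold_m

The obvious failure modes mentioned above show up directly in this formulation: near the focus of expansion r and the flow both vanish, and textureless regions give unreliable flow, so range there is meaningless.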
Abstract:
This paper presents a visual SLAM method for temporary satellite dropout navigation, applied here to fixed-wing aircraft. It is designed for flight altitudes beyond typical stereo ranges, but within the range of distance measurement sensors. The proposed visual SLAM method consists of a common localization step with monocular camera resectioning, and a mapping step which incorporates radar altimeter data for absolute scale estimation. As a result, neither the map nor the estimated flight path suffers from scale drift. The method does not require simplifications such as known landmarks and is thus suitable for unknown and nearly arbitrary terrain. The method is tested with sensor datasets from a manned Cessna 172 aircraft. With a 5% absolute scale error from the radar measurements causing approximately 2-6% accumulated error over the flown distance, stable positioning is achieved over several minutes of flight time. The main limitations are flight altitudes above the radar range of 750 m, where the monocular method suffers from scale drift, and, depending on the flight speed, flights below 50 m, where image processing becomes difficult with a downwards-looking camera due to the high optical flow rates and the low image overlap.
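The scale-anchoring step can be pictured as follows: a monocular reconstruction is known only up to scale, so the ratio between the radar-measured height above ground and the camera's (unitless) height in the reconstruction gives a factor that rescales the whole map and trajectory into metres. The sketch below is an assumed simplification of that idea, not the paper's estimator.

import numpy as np

def absolute_scale(radar_altitude_m, camera_height_map_units):
    """Scale factor turning map units of the monocular reconstruction into
    metres, anchored by a radar altimeter height above ground."""
    return radar_altitude_m / camera_height_map_units

def rescale(points_map, trajectory_map, scale):
    """Apply one common scale factor to the map points and the estimated path."""
    return points_map * scale, trajectory_map * scale

# Usage with made-up numbers: radar reads 320 m while the camera sits 8.0 map
# units above the reconstructed ground plane, so one map unit equals 40 m.
scale = absolute_scale(320.0, 8.0)
points_m, path_m = rescale(np.random.rand(100, 3), np.random.rand(50, 3), scale)

Because the scale factor is re-anchored by radar rather than propagated from frame to frame, a fixed percentage error in the radar reading maps into a bounded percentage error in distance rather than an ever-growing drift.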
Abstract:
Person tracking systems to date have relied on either motion detection or optical flow as the basis for person detection and tracking; systems that utilise both techniques have not yet been developed. We propose a person tracking system that uses both, made possible by a novel hybrid optical flow and motion detection technique that we have developed. This provides the system with two methods of person detection, helping to avoid missed detections and the need to predict position, which can lead to tracking errors and mistakes when handling occlusions. Our results show that our system is able to track people accurately, with an average error of less than four pixels, and that it outperforms the current CAVIAR benchmark system.
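A rough sketch of combining the two cues with OpenCV: frame differencing flags changed pixels, dense optical flow flags coherently moving pixels, and the two masks are fused so that either cue can trigger a detection. The thresholds, the Farnebäck parameters, and the simple union rule are assumptions for illustration, not the authors' hybrid technique.

import cv2
import numpy as np

def hybrid_motion_mask(prev_gray, gray, diff_thresh=25, flow_thresh=1.0):
    """Fuse a frame-difference mask with an optical-flow-magnitude mask so a
    moving person is detected when either cue responds. Illustrative only."""
    # Cue 1: motion detection by absolute frame differencing.
    diff_mask = cv2.absdiff(prev_gray, gray) > diff_thresh
    # Cue 2: dense optical flow magnitude (Farnebäck).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    flow_mask = np.hypot(flow[..., 0], flow[..., 1]) > flow_thresh
    # Union of the two cues, cleaned up with a morphological opening.
    mask = (diff_mask | flow_mask).astype(np.uint8) * 255
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))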