113 results for Inertial navigation
Abstract:
This paper presents a visual SLAM method for temporary satellite dropout navigation, here applied to fixed-wing aircraft. It is designed for flight altitudes beyond typical stereo ranges but within the range of distance measurement sensors. The proposed visual SLAM method consists of a common localization step with monocular camera resectioning and a mapping step which incorporates radar altimeter data for absolute scale estimation. This prevents scale drift of the map and of the estimated flight path. The method does not require simplifications such as known landmarks and is thus suitable for unknown and nearly arbitrary terrain. The method is tested with sensor datasets from a manned Cessna 172 aircraft. With a 5% absolute scale error from the radar measurements causing approximately 2-6% accumulated error over the flown distance, stable positioning is achieved over several minutes of flight time. The main limitations are flight altitudes above the radar range of 750 m, where the monocular method suffers from scale drift, and, depending on the flight speed, flights below 50 m, where image processing becomes difficult with a downward-looking camera due to the high optical flow rates and the low image overlap.
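As a rough illustration of the scale-estimation idea described above (not the authors' implementation), the sketch below shows how a radar altimeter reading could fix the absolute scale of a monocular reconstruction; the function names, the smoothing parameter and the simple blending scheme are assumptions for illustration only.

```python
import numpy as np

def estimate_map_scale(radar_altitude_m, slam_height_units, prior_scale=1.0, blend=0.2):
    """Estimate metres per SLAM unit by comparing the radar altimeter reading
    with the (scale-free) camera height above the reconstructed terrain,
    smoothed against the previous estimate to avoid sudden jumps."""
    raw_scale = radar_altitude_m / slam_height_units
    return (1.0 - blend) * prior_scale + blend * raw_scale

def apply_scale(map_points, trajectory, scale):
    """Convert map points and the estimated flight path from SLAM units to metres."""
    return np.asarray(map_points) * scale, np.asarray(trajectory) * scale
```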
Abstract:
This paper presents an unmanned aircraft system (UAS) that uses a probabilistic model for autonomous front-on environmental sensing or photography of a target. The system is based on low-cost and readily available sensors, operates in dynamic environments, and is intended to improve the capabilities of dynamic waypoint-based navigation systems for low-cost UAS. The behavioural dynamics of target movement are modelled for the design of a Kalman filter and Markov model-based prediction algorithm. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to predict a future state of the target, thus delivering a new waypoint for autonomous navigation. Results of the application to aerial filming with a low-cost UAS are presented, achieving the desired goal of a maintained front-on perspective without significant constraint on the route or pace of target movement.
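For concreteness, here is a minimal sketch of the geometric step: the Haversine relation for great-circle distance, and a spherical projection of the target's most likely position along its current bearing to obtain a new waypoint. The projection routine, parameter names and the constant-velocity assumption are illustrative, not taken from the paper.

```python
import math

R_EARTH_M = 6371000.0  # mean Earth radius in metres

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two positions given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R_EARTH_M * math.asin(math.sqrt(a))

def project_waypoint(lat, lon, bearing_deg, speed_mps, horizon_s):
    """Project a waypoint ahead of the target along its current bearing,
    assuming constant-velocity (maximum likelihood) motion over the horizon."""
    d = speed_mps * horizon_s / R_EARTH_M   # angular distance travelled
    phi1, lmb1 = math.radians(lat), math.radians(lon)
    theta = math.radians(bearing_deg)
    phi2 = math.asin(math.sin(phi1) * math.cos(d)
                     + math.cos(phi1) * math.sin(d) * math.cos(theta))
    lmb2 = lmb1 + math.atan2(math.sin(theta) * math.sin(d) * math.cos(phi1),
                             math.cos(d) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lmb2)
```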
Abstract:
Background: Cervical Spinal Manipulation (CSM) is considered a high-level skill of the central nervous system because it requires bimanual coordinated rhythmical movements, and it therefore necessitates training to achieve proficiency. The objective of the present study was to investigate the effect of real-time feedback on the performance of CSM. Methods: Six postgraduate physiotherapy students attending a training workshop on the Cervical Spine Manipulation Technique (CSMT) using inertial sensor derived real-time feedback participated in this study. The key variables were pre-manipulative position, angular displacement of the thrust and angular velocity of the thrust. Differences between variables before and after training were investigated using t-tests. Results: There were no significant differences after training for the pre-manipulative position (rotation p = 0.549; side bending p = 0.312) or for thrust displacement (rotation p = 0.247; side bending p = 0.314). Thrust angular velocity demonstrated a significant difference following training for rotation (pre-training mean (SD) 48.9°/s (35.1); post-training mean (SD) 96.9°/s (53.9); p = 0.027) but not for side bending (p = 0.521). Conclusion: Real-time feedback using an inertial sensor may be valuable in the development of specific manipulative skill. Future studies investigating manipulation could consider a randomized controlled trial using inertial sensor real-time feedback compared to traditional training.
Abstract:
We address the problem of rangefinder-based avoidance of unforeseen static obstacles during a visual navigation task. We extend previous strategies which are efficient in most cases but are still hampered by some drawbacks (e.g., risks of collision or of local minima in some particular cases). The key idea is to complete the control strategy by adding a controller that provides the robot with anticipative skills to guarantee non-collision, and by defining more general transition conditions to deal with local minima. Simulation results show the efficiency of the proposed strategy.
Abstract:
A number of hurdles must be overcome in order to integrate unmanned aircraft into civilian airspace for routine operations. The ability of the aircraft to land safely in an emergency is essential to reduce the risk to people, infrastructure and aircraft. To date, few field-demonstrated systems have been presented that show online re-planning and repeatability from failure to touchdown. This paper presents the development of the Guidance, Navigation and Control (GNC) component of an Automated Emergency Landing System (AELS) intended to address this gap, suited to a variety of fixed-wing aircraft. Field-tested on both a fixed-wing UAV and a Cessna 172R during repeated emergency landing experiments, the system uses a trochoid-based path planner to compute feasible trajectories and a simplified control system to execute the manoeuvres required to guide the aircraft towards touchdown on a predefined landing site. This is achieved in zero-thrust conditions, with the engine forced to idle to simulate failure. During an autonomous landing, the controller uses airspeed, inertial and GPS data to track motion and maintains essential flight parameters to guarantee flyability, while the planner monitors the glide ratio and re-plans to ensure an approach at the correct altitude. Simulations show the reliability of the system in a variety of wind conditions and its repeated ability to land within the boundary of a predefined landing site. Results from field tests of the two aircraft demonstrate the effectiveness of the proposed GNC system in live operation and show that it is capable of guiding the aircraft to close proximity of a predefined keyhole in nearly 100% of cases.
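The trochoid segments mentioned above arise because a constant-bank turn, circular in the air mass, drifts with the wind in the ground frame. A small sketch of that geometry is given below; it only illustrates the shape of such a segment, with made-up parameter names, and is not the authors' planner.

```python
import numpy as np

def trochoid_segment(x0, y0, psi0, airspeed, turn_rate, wind, duration, dt=0.1):
    """Sample a constant turn-rate arc flown in a steady wind.
    In the ground frame the circular air-mass path drifts with the wind,
    tracing a trochoid; turn_rate near zero gives a straight glide segment."""
    t = np.arange(0.0, duration + dt, dt)
    wx, wy = wind
    if abs(turn_rate) > 1e-6:
        x = x0 + (airspeed / turn_rate) * (np.sin(psi0 + turn_rate * t) - np.sin(psi0)) + wx * t
        y = y0 - (airspeed / turn_rate) * (np.cos(psi0 + turn_rate * t) - np.cos(psi0)) + wy * t
    else:
        x = x0 + (airspeed * np.cos(psi0) + wx) * t
        y = y0 + (airspeed * np.sin(psi0) + wy) * t
    return x, y
```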
Abstract:
Acoustic recordings play an increasingly important role in monitoring terrestrial and aquatic environments. However, rapid advances in technology make it possible to accumulate thousands of hours of recordings, more than ecologists can ever listen to. Our approach to this big-data challenge is to visualize the content of long-duration audio recordings on multiple scales, from minutes and hours to days and years. The visualization should facilitate navigation and yield ecologically meaningful information prior to listening to the audio. To construct images, we calculate acoustic indices, statistics that describe the distribution of acoustic energy and reflect content of ecological interest. We combine various indices to produce false-color spectrogram images that reveal acoustic content and facilitate navigation. The technical challenge we investigate in this work is how to navigate recordings that are days or even months in duration. We introduce a method of zooming through multiple temporal scales, analogous to Google Maps. However, the “landscape” to be navigated is not geographical and therefore not intrinsically visual, but rather a graphical representation of the underlying audio. We describe solutions to navigating spectrograms that range over three orders of magnitude of temporal scale. We make three sets of observations: (1) at least ten intermediate scale steps are required to zoom over three orders of magnitude of temporal scale; (2) three different visual representations are required to cover the range of temporal scales; and (3) we present a solution to the problem of maintaining visual continuity when stepping between different visual representations. Finally, we demonstrate the utility of the approach with four case studies.
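As a loose sketch of how index matrices might be fused into a false-color image and then coarsened for zooming (the specific indices, normalisation and zoom scheme used in the paper may differ):

```python
import numpy as np

def false_color_spectrogram(index_r, index_g, index_b):
    """Combine three acoustic-index matrices (frequency bins x time segments)
    into one RGB image, after normalising each index to [0, 1]."""
    def normalise(m):
        lo, hi = np.percentile(m, [2, 98])           # clip outliers before scaling
        return np.clip((m - lo) / (hi - lo + 1e-9), 0.0, 1.0)
    return np.dstack([normalise(index_r), normalise(index_g), normalise(index_b)])

def zoom_out(image, factor):
    """Coarsen the time axis by averaging `factor` adjacent segments,
    one step of a multi-scale zoom."""
    t = (image.shape[1] // factor) * factor
    return image[:, :t].reshape(image.shape[0], -1, factor, image.shape[2]).mean(axis=2)
```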
Abstract:
There is increasing interest in the use of Unmanned Aerial Vehicles (UAVs) for wildlife and feral animal monitoring around the world. This paper describes a novel system in which a predictive dynamic application places the UAV ahead of a user, while a low-cost thermal camera and a small onboard computer identify heat signatures of a target animal from a predetermined altitude and transmit the target's GPS coordinates. A map is generated, and various data sets and graphs are displayed using a GUI designed for easy use. The paper describes the hardware and software architecture and the probabilistic model for the downward-facing camera used for the detection of an animal. The behavioral dynamics of target movement inform the design of a Kalman filter and Markov model-based prediction algorithm used to place the UAV ahead of the user. Geometrical concepts and the Haversine formula are applied to the maximum likelihood case in order to predict a future state of the user, thus delivering a new waypoint for autonomous navigation. Results show that the system is capable of autonomously locating animals from a predetermined height and of generating a map showing the location of the animals ahead of the user.
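A minimal sketch of the constant-velocity Kalman step that such a prediction algorithm could build on, working on a locally flat east/north plane; the noise parameters and state layout are assumptions for illustration, not the system's actual filter.

```python
import numpy as np

def cv_kalman_step(x, P, z, dt, q=0.5, r=3.0):
    """One predict/update cycle of a constant-velocity Kalman filter.
    State x = [east, north, v_east, v_north]; z is a GPS fix of the tracked target."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)                      # simplistic process noise
    R = r * np.eye(2)                      # GPS measurement noise (m^2)
    x, P = F @ x, F @ P @ F.T + Q          # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)                # update with the new fix
    P = (np.eye(4) - K @ H) @ P
    return x, P
```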
Abstract:
This thesis examined passengers' intuitive navigation in airports, aiming to ensure that passengers can navigate quickly and efficiently through these complex environments. Field research was conducted at two Australian international airports, where participants wore eye-tracking glasses while finding their way through the terminal. Insight was gained into the intuitive use of navigation elements in the airport environment. With a detailed understanding of how passengers navigate, the findings from this research can be used to improve airport design and planning, assisting passengers who do not fly regularly as well as frequent flyers.