Abstract:
The advantages of a spherical imaging model are increasingly well recognized within the robotics community. Perhaps less well known is the use of the sphere for attitude estimation, control and scene structure estimation. This paper proposes the sphere as a unifying concept, not just for cameras, but for sensor fusion, estimation and control. We review and summarize relevant work in these areas and illustrate it with simulation examples of spherical visual servoing and scene structure estimation.
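As a rough illustration of the spherical camera model referred to above (a sketch under assumed values, not code from the paper), the snippet below maps pixel coordinates through placeholder camera intrinsics onto the unit view sphere and forms a simple centroid feature of the kind used in spherical visual servoing.

```python
# Minimal sketch: project pixel coordinates onto the unit view sphere and
# form a centroid feature. Intrinsics K, the pixel values, and the feature
# choice are illustrative placeholders, not values from the paper.
import numpy as np

def pixels_to_sphere(uv, K):
    """Map Nx2 pixel coordinates to unit bearing vectors on the view sphere."""
    uv1 = np.hstack([uv, np.ones((uv.shape[0], 1))])   # homogeneous pixels
    rays = (np.linalg.inv(K) @ uv1.T).T                 # normalised image rays
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0,   0.0,   1.0]])                 # assumed intrinsics
    uv = np.array([[300.0, 220.0], [350.0, 260.0], [310.0, 250.0]])
    s = pixels_to_sphere(uv, K)
    centroid = s.mean(axis=0)                           # spherical centroid feature
    print("points on sphere:\n", s)
    print("centroid feature:", centroid)
```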
Abstract:
To date, most quad-rotor aerial robots have been based on flying toys. Although such systems can be used as prototypes, they are not sufficiently robust to serve as experimental robotics platforms. We have developed the X-4 Flyer, a quad-rotor robot with a custom-built chassis and avionics and off-the-shelf motors and batteries, to be a highly reliable experimental platform. The vehicle uses tuned plant dynamics with an onboard embedded attitude controller to stabilise flight. A linear SISO controller was designed to regulate flyer attitude.
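The abstract does not give the controller structure or gains; the following is only a minimal sketch of a linear SISO attitude loop of the kind described, with a single roll axis modelled as a rigid-body double integrator and illustrative PD gains.

```python
# Hedged sketch of a SISO attitude loop (not the X-4 Flyer's actual controller
# or parameters). One axis is treated as a double integrator driven by a
# control torque; inertia, gains and rates are placeholders.
import numpy as np

dt = 0.002                      # 500 Hz control loop (assumed)
I_xx = 0.05                     # roll inertia, kg m^2 (placeholder)
kp, kd = 4.0, 0.8               # PD gains (placeholders, not from the paper)

phi, phi_dot = 0.2, 0.0         # initial roll error of about 11 degrees
for _ in range(int(2.0 / dt)):  # simulate 2 s
    tau = -kp * phi - kd * phi_dot        # PD control torque
    phi_ddot = tau / I_xx                 # rigid-body roll dynamics
    phi_dot += phi_ddot * dt
    phi += phi_dot * dt
print("roll after 2 s: %.4f rad" % phi)   # should settle near zero
```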
Abstract:
Fast thrust changes are important for authoritive control of VTOL micro air vehicles. Fixed-pitch rotors that alter thrust by varying rotor speed require high-bandwidth control systems to provide adequate performace. We develop a feedback compensator for a brushless hobby motor driving a custom rotor suitable for UAVs. The system plant is identified using step excitation experiments. The aerodynamic operating conditions of these rotors are unusual and so experiments are performed to characterise expected load disturbances. The plant and load models lead to a proportional controller design capable of significantly decreasing rise-time and propagation of disturbances, subject to bus voltage constraints.
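A hedged sketch of the workflow the abstract outlines: take a first-order speed model (as a step excitation experiment would yield) and close a proportional loop around it, with saturation standing in for the bus voltage constraint. The plant parameters and gain below are invented for illustration only.

```python
# Illustrative first-order rotor-speed plant with a proportional speed loop.
# All numbers are placeholders, not identified values from the paper.
import numpy as np

# Assumed plant: omega(s)/u(s) = K / (tau*s + 1)
K_plant, tau = 120.0, 0.15          # rad/s per unit drive, seconds
dt, T = 0.001, 1.0
t = np.arange(0.0, T, dt)
omega_ref = 100.0                   # target speed, rad/s

# Open-loop step sized to settle at omega_ref (no feedback).
omega_ol = omega_ref * (1.0 - np.exp(-t / tau))
rise_ol = t[np.argmax(omega_ol >= 0.9 * omega_ref)]

# Proportional loop: u = Kp*(omega_ref - omega), saturated at full bus voltage.
Kp = 0.2                            # illustrative gain
omega, hist = 0.0, []
for _ in t:
    u = np.clip(Kp * (omega_ref - omega), 0.0, 1.0)
    omega += (K_plant * u - omega) / tau * dt      # first-order plant update
    hist.append(omega)
rise_cl = t[np.argmax(np.array(hist) >= 0.9 * omega_ref)]

print(f"90% rise time: open-loop {rise_ol:.3f} s, proportional loop {rise_cl:.3f} s")
```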
Abstract:
We present details and results obtained with an underwater system comprising two different autonomous underwater robots (AUVs) and ten static underwater nodes (USNs) networked together optically and acoustically. The AUVs can locate and hover above the static nodes for data upload, and they can perform network maintenance functions such as deployment, relocation, and recovery. The AUVs can also locate each other, dock, and move using coordinated control that takes advantage of each AUV's strengths.
Abstract:
This paper investigates a mobile, wireless sensor/actuator network application for use in the cattle breeding industry. Our goal is to prevent fighting between bulls in on-farm breeding paddocks by autonomously applying appropriate stimuli when one bull approaches another bull. This is an important application because fighting between high-value animals such as bulls during breeding seasons causes significant financial loss to producers. Furthermore, there are significant challenges in this type of application because it requires dynamic animal state estimation, real-time actuation and efficient mobile wireless transmissions. We designed and implemented an animal state estimation algorithm based on a state-machine mechanism for each animal. Autonomous actuation is performed based on the estimated states of an animal relative to other animals. A simple, yet effective, wireless communication model has been proposed and implemented to achieve high delivery rates in mobile environments. We evaluated the performance of our design by both simulations and field experiments, which demonstrated the effectiveness of our autonomous animal control system.
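The paper's state-machine algorithm is not reproduced here; the sketch below is a hypothetical miniature of the idea, estimating an animal's state from its distance to another animal and deciding when to apply a stimulus. The states, thresholds and sensor fields are all assumptions.

```python
# Toy state machine for animal state estimation and autonomous actuation.
# States and thresholds are invented for illustration, not from the paper.
from dataclasses import dataclass

APPROACH_RADIUS = 15.0   # metres (placeholder threshold)
CONTACT_RADIUS = 3.0     # metres (placeholder threshold)

@dataclass
class AnimalState:
    name: str = "GRAZING"

def update_state(state: AnimalState, distance_to_other: float, speed: float) -> bool:
    """Advance the state machine; return True if a stimulus should be applied."""
    if distance_to_other < CONTACT_RADIUS:
        state.name = "CONFRONTING"
    elif distance_to_other < APPROACH_RADIUS and speed > 0.5:
        state.name = "APPROACHING"
    else:
        state.name = "GRAZING"
    # Actuate only while approaching, before a confrontation develops.
    return state.name == "APPROACHING"

if __name__ == "__main__":
    s = AnimalState()
    for d, v in [(40.0, 0.2), (12.0, 1.1), (2.5, 1.4), (30.0, 0.3)]:
        stimulate = update_state(s, d, v)
        print(f"distance={d:5.1f} m  state={s.name:12s}  stimulus={stimulate}")
```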
Abstract:
Managing livestock movement in extensive systems has environmental and production benefits. Currently, permanent wire fencing is used to control cattle; this is both expensive and inflexible. Cattle are known to respond to auditory and visual cues, and we investigated whether these can be used to manipulate their behaviour. Twenty-five Belmont Red steers with a mean live weight of 270 kg were each randomly assigned to one of five treatments. Treatments consisted of a combination of cues (audio, tactile and visual stimuli) and a consequence (electrical stimulation). The treatments were electrical stimulation alone, audio plus electrical stimulation, vibration plus electrical stimulation, light plus electrical stimulation and an electrified fence (6 kV) plus electrical stimulation. Cue stimuli were administered for 3 s, followed immediately by electrical stimulation (the consequence) of 1 kV for 1 s. The experiment tested the operational efficacy of an on-animal control, or virtual fencing, system. A collar-halter device was designed to carry the electronics, batteries and stimulus-delivery equipment (audio, vibration, light and electrical) of a prototype virtual fencing device. Cattle were allowed to travel along a 40 m alley towards a group of peers and feed while their rate of travel and response to the stimuli were recorded. The prototype virtual fencing system was successful in modifying the behaviour of the cattle. The rate of travel of cattle along the alley demonstrated the large variability in behavioural response associated with tactile, visual and audible cues. The experiment demonstrated that virtual fencing has potential for controlling cattle in extensive grazing systems. However, larger numbers of cattle need to be tested to derive a better understanding of the behavioural variance. Further controlled experimental work is also necessary to quantify the interaction between cues, consequences and cattle learning.
Abstract:
This paper considers the question of designing a fully image-based visual servo control for a class of dynamic systems. The work is motivated by the ongoing development of image-based visual servo control of small aerial robotic vehicles. The kinematics and dynamics of a rigid-body dynamical system (such as a vehicle airframe) maneuvering over a flat target plane with observable features are expressed in terms of an unnormalized spherical centroid and an optic flow measurement. The image-plane dynamics with respect to force input are dependent on the height of the camera above the target plane. This dependence is compensated by introducing virtual height dynamics and adaptive estimation in the proposed control. A fully nonlinear adaptive control design is provided that ensures asymptotic stability of the closed-loop system for all feasible initial conditions. The choice of control gains is based on an analysis of the asymptotic dynamics of the system. Results from a realistic simulation are presented that demonstrate the performance of the closed-loop system. To the author's knowledge, this paper documents the first time that an image-based visual servo control has been proposed for a dynamic system using vision measurement for both position and velocity.
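Purely as an informal aid (not the paper's adaptive control law), the snippet below forms the unnormalised spherical centroid of observed target directions, differentiates it numerically as a crude stand-in for the optic flow measurement, and applies a PD-style force command; the gains, desired feature and sample data are placeholders.

```python
# Sketch: unnormalised spherical centroid feature, a numeric rate as a crude
# optic-flow style measurement, and a PD-like force command. Values invented.
import numpy as np

def spherical_centroid(directions):
    """Unnormalised centroid of unit bearing vectors (rows of `directions`)."""
    unit = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    return unit.sum(axis=0)

dt = 0.02
prev = np.array([[0.10, 0.10, 1.0], [-0.10, 0.10, 1.0], [0.00, -0.10, 1.0]])
curr = np.array([[0.12, 0.10, 1.0], [-0.08, 0.10, 1.0], [0.02, -0.10, 1.0]])

q_prev, q = spherical_centroid(prev), spherical_centroid(curr)
q_dot = (q - q_prev) / dt                    # crude centroid "optic flow"
q_star = np.array([0.0, 0.0, 3.0])           # desired feature (placeholder)

kp, kd = 2.0, 0.8                            # illustrative gains
force_cmd = -kp * (q - q_star) - kd * q_dot  # PD-style image-based command
print("feature error:", q - q_star)
print("force command:", force_cmd)
```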
Abstract:
Controlling free-ranging livestock requires low-stress cues to alter animal behaviour. Recently, modulated sound and electric shock were demonstrated to be effective in controlling free-ranging cattle. In this study, the behaviour of 60 Belmont Red heifers (300 kg) was observed for behavioural changes when the animals were presented with cues designed to impede their movement through an alley. The heifers were given an overnight drylot shrink, off feed but not drinking water, prior to being tested. Individual cattle were allowed to move down a 6.5 m wide alley towards a pen of peers and feed located 71 m from their point of release. Each animal was allowed to move through the alley unimpeded five times to establish a basal behavioural pattern. Animals were then randomly assigned to treatments consisting of sound plus shock, vibration plus shock, a visual cue plus shock, shock by itself and a control. The time each animal required to reach the pen of peers and feed was recorded. If an animal was prevented from reaching the pen of peers and feed, by not penetrating the cue barrier at set points along the alley for at least 60 s, the test was stopped and the animal was returned to peers located behind the release pen. Cues and shock were applied manually from a laptop while animals were observed from a 3.5 m tower located outside the alley. Electric shock, sound, vibration and Global Positioning System (GPS) hardware were housed in a neck collar. Results and implications will be discussed.
Abstract:
The article described an open-source toolbox for machine vision called Machine Vision Toolbox (MVT). MVT includes more than 60 functions including image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration, and color space conversion. MVT can be used for research into machine vision but is also versatile enough to be usable for real-time work and even control. MVT, combined with MATLAB and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. The article illustrated the use of a subset of toolbox functions for some typical problems and described MVT operations including the simulation of a complete image-based visual servo system.
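MVT itself is a MATLAB toolbox and its API is not reproduced here; as a loose Python/OpenCV analogue of one operation it provides (blob feature extraction), the following self-contained sketch thresholds a synthetic image and reports blob centroids and areas.

```python
# Loose analogue of a blob feature extraction step, using OpenCV rather than
# MVT. The synthetic image and threshold are placeholders for illustration.
import numpy as np
import cv2

# Synthetic greyscale image with two bright blobs.
img = np.zeros((120, 160), dtype=np.uint8)
cv2.circle(img, (40, 60), 12, 255, -1)
cv2.circle(img, (110, 40), 8, 255, -1)

# Threshold, label connected components, then report blob centroids and areas.
_, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)
n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
for i in range(1, n):                        # label 0 is the background
    area = stats[i, cv2.CC_STAT_AREA]
    print(f"blob {i}: centroid={centroids[i]}, area={area}")
```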
Abstract:
Virtual fencing has the potential to control grazing livestock. Understanding and refining the cues that can alter behaviour is an integral part of autonomous animal control. A series of tests was completed to explore the relationship between temperament and control. Prior to exposure to virtual fencing control, the animals were scored for temperament using flight speed, and a sociability index was derived using contact logging devices. The behavioural response of 30 Belmont Red steers was observed for behavioural changes when the animals were presented with cues prior to receiving an electrical stimulation. A control and four treatments designed to interrupt the animals' movement down an alley were tested. The treatments consisted of sound plus electrical stimulation, vibration plus electrical stimulation, a visual cue plus electrical stimulation and electrical stimulation by itself. The treatments were randomly applied to each animal over five consecutive trials. A control treatment in which no cues were applied was used to establish a basal behavioural pattern. A trial was considered completed after the animal had been retained behind the cue barrier for at least 60 sec. All cues and electrical stimulation were applied manually from a laptop located on a portable 3.5 m tower immediately outside the alley. The electrical stimulation consisted of 1.0 kV. Electrical stimulation, sound and vibration events, along with Global Positioning System (GPS) data used to autonomously record each animal's path within the alley, were recorded every second.
Abstract:
This paper describes some new wireless sensor hardware developed for pastoral and environmental applications. Our early experiments with Mote hardware inspired us to develop our own devices with improved radio range, solar power capability, mechanical and electrical robustness, and unique combinations of sensors. Here we describe the design and evolution of a small family of devices: a radio/processor board, a soil moisture sensor interface, and a single-board multi-sensor unit for animal tracking experiments.
Abstract:
This paper investigates the automatic attitude and depth control of a torpedo-shaped submarine. Both experimental results and dynamic simulations are used to tune feedback control loops in order to obtain stable control of yaw, pitch and roll of the craft.
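The vehicle's tuned loops and coefficients are not given in the abstract; the sketch below is a generic stand-in showing a PI depth loop acting on a crude first-order heave model, with all parameters invented for illustration.

```python
# Generic PI depth loop on a crude heave model (mass plus linear drag).
# Gains, setpoint and model parameters are placeholders, not the vehicle's.
import numpy as np

dt = 0.05
depth, depth_rate = 0.0, 0.0
depth_ref = 5.0                      # metres (placeholder setpoint)
kp, ki = 1.2, 0.15                   # illustrative PI gains
integ = 0.0
drag, mass = 8.0, 30.0               # crude heave model parameters (placeholders)

for _ in range(int(120.0 / dt)):     # simulate 120 s
    err = depth_ref - depth
    integ += err * dt
    thrust = kp * err + ki * integ            # vertical thrust command
    accel = (thrust - drag * depth_rate) / mass
    depth_rate += accel * dt
    depth += depth_rate * dt
print("depth after 120 s: %.2f m" % depth)    # should approach the 5 m setpoint
```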
Abstract:
Visual servoing has been a viable method of robot manipulator control for more than a decade. Initial developments involved position-based visual servoing (PBVS), in which the control signal exists in Cartesian space. The younger method, image-based visual servoing (IBVS), has seen considerable development in recent years. PBVS and IBVS offer tradeoffs in performance, and neither can solve all tasks that may confront a robot. In response to these issues, several methods have been devised that partition the control scheme, allowing some motions to be performed in the manner of a PBVS system, while the remaining motions are performed using an IBVS approach. To date, there has been little research that explores the relative strengths and weaknesses of these methods. In this paper we present such an evaluation. We have chosen three recent visual servo approaches for evaluation in addition to the traditional PBVS and IBVS approaches. We posit a set of performance metrics that measure quantitatively the performance of a visual servo controller for a specific task. We then evaluate each of the candidate visual servo methods for four canonical tasks with simulations and with experiments in a robotic work cell.
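For readers unfamiliar with the IBVS baseline that the compared methods build on, here is a compact sketch of the textbook point-feature interaction matrix and control law; it is not any of the partitioned methods evaluated in the paper, and the depth, gain and feature values are placeholders.

```python
# Classical IBVS for point features: stack per-point interaction matrices and
# command a camera velocity v = -lambda * pinv(L) * (s - s*). Values invented.
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of a normalised image point at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

# Current and desired normalised point features (placeholders), assumed depth.
s      = np.array([0.10, -0.05, -0.12, 0.08])
s_star = np.array([0.00,  0.00,  0.00, 0.00])
Z = 1.5

L = np.vstack([interaction_matrix(s[0], s[1], Z),
               interaction_matrix(s[2], s[3], Z)])
lam = 0.5
v = -lam * np.linalg.pinv(L) @ (s - s_star)   # camera velocity screw [vx vy vz wx wy wz]
print("commanded camera velocity:", v)
```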