Abstract:
This paper describes a novel experiment in which two very different methods of underwater robot localization are compared. The first method is based on a geometric approach in which a mobile node moves within a field of static nodes, and all nodes are capable of estimating the range to their neighbours acoustically. The second method uses visual odometry, from stereo cameras, by integrating scaled optical flow. The fundamental algorithmic principles of each localization technique are described. We also present experimental results comparing acoustic localization with GPS for surface operation, and a comparison of acoustic and visual methods for underwater operation.
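As a rough, illustrative sketch of the geometric approach (not the paper's implementation), the snippet below estimates a mobile node's planar position by least-squares trilateration from acoustic ranges to static anchor nodes; the function name, the 2D simplification and the NumPy dependency are assumptions.

```python
import numpy as np

def trilaterate(anchor_positions, ranges):
    """Least-squares planar position estimate of a mobile node from acoustic
    ranges to static anchor nodes.  Differencing each range equation against
    the first anchor's removes the quadratic term in the unknown position,
    leaving the linear system A x = b.

    anchor_positions: (N, 2) known anchor positions
    ranges:           (N,)   measured ranges to each anchor
    """
    anchors = np.asarray(anchor_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], r[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - r[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position
```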
Abstract:
This paper demonstrates some interesting connections between the hitherto disparate fields of mobile robot navigation and image-based visual servoing. A planar formulation of the well-known image-based visual servoing method leads to a bearing-only navigation system that requires no explicit localization and directly yields the desired velocity. The well-known benefits of image-based visual servoing, such as robustness, also apply in the planar case. Simulation results are presented.
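For context, the sketch below shows the classical image-based visual servoing velocity law, v = -gain * pinv(L) (s - s*), of which the paper's planar, bearing-only scheme is a specialisation; the function and argument names are illustrative only.

```python
import numpy as np

def ibvs_velocity(features, desired_features, interaction_matrix, gain=0.5):
    """Classical image-based visual servoing law: command the velocity
    v = -gain * pinv(L) @ (s - s*), where s is the measured feature vector,
    s* the desired feature vector and L the interaction (image Jacobian)
    matrix evaluated at the current features.
    """
    s = np.asarray(features, dtype=float)
    s_star = np.asarray(desired_features, dtype=float)
    L = np.asarray(interaction_matrix, dtype=float)
    return -gain * np.linalg.pinv(L) @ (s - s_star)
```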
Abstract:
Managing livestock movement in extensive systems has environmental and production benefits. Currently permanent wire fencing is used to control cattle; this is both expensive and inflexible. Cattle are known to respond to auditory and visual cues, and we investigated whether these can be used to manipulate their behaviour. Twenty-five Belmont Red steers with a mean live weight of 270 kg were each randomly assigned to one of five treatments. Treatments consisted of a combination of cues (audio, tactile and visual stimuli) and a consequence (electrical stimulation). The treatments were electrical stimulation alone, audio plus electrical stimulation, vibration plus electrical stimulation, light plus electrical stimulation and an electrified fence (6 kV) plus electrical stimulation. Cue stimuli were administered for 3 s, followed immediately by electrical stimulation (the consequence) of 1 kV for 1 s. The experiment tested the operational efficacy of an on-animal control or virtual fencing system. A collar-halter device was designed to carry the electronics, batteries and equipment providing the stimuli, including the audio, vibration, light and electrical stimuli of a prototype virtual fencing device. Cattle were allowed to travel along a 40 m alley to a group of peers and feed while their rate of travel and response to the stimuli were recorded. The prototype virtual fencing system was successful in modifying the behaviour of the cattle. The rate of travel of cattle along the alley demonstrated the large variability in behavioural response associated with tactile, visual and audible cues. The experiment demonstrated that virtual fencing has potential for controlling cattle in extensive grazing systems. However, larger numbers of cattle need to be tested to derive a better understanding of the behavioural variance. Further controlled experimental work is also necessary to quantify the interaction between cues, consequences and cattle learning.
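A minimal sketch of the cue-then-consequence timing described above (a 3 s cue followed immediately by a 1 s electrical stimulus); the `cue` and `stimulator` driver objects and their start()/stop() methods are hypothetical stand-ins, not the prototype's actual API.

```python
import time

def apply_cue_then_consequence(cue, stimulator, cue_seconds=3.0, shock_seconds=1.0):
    """Apply a cue (audio, vibration or light) for cue_seconds, then the
    electrical consequence for shock_seconds, mirroring the 3 s / 1 s timing
    used in the experiment.  `cue` and `stimulator` are hypothetical drivers.
    """
    cue.start()
    time.sleep(cue_seconds)
    cue.stop()
    # Consequence follows the cue immediately.
    stimulator.start()
    time.sleep(shock_seconds)
    stimulator.stop()
```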
Abstract:
This paper considers the question of designing a fully image-based visual servo control for a class of dynamic systems. The work is motivated by the ongoing development of image-based visual servo control of small aerial robotic vehicles. The kinematics and dynamics of a rigid-body dynamical system (such as a vehicle airframe) maneuvering over a flat target plane with observable features are expressed in terms of an unnormalized spherical centroid and an optic flow measurement. The image-plane dynamics with respect to force input are dependent on the height of the camera above the target plane. This dependence is compensated by introducing virtual height dynamics and adaptive estimation in the proposed control. A fully nonlinear adaptive control design is provided that ensures asymptotic stability of the closed-loop system for all feasible initial conditions. The choice of control gains is based on an analysis of the asymptotic dynamics of the system. Results from a realistic simulation are presented that demonstrate the performance of the closed-loop system. To the author's knowledge, this paper documents the first time that an image-based visual servo control has been proposed for a dynamic system using vision measurement for both position and velocity.
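The unnormalized spherical centroid mentioned above is commonly defined in this line of work as the sum of the unit bearing vectors to the observed target points; a minimal sketch of that computation follows, with the function name and array layout assumed.

```python
import numpy as np

def unnormalised_spherical_centroid(feature_rays):
    """Sum of the unit bearing vectors from the camera to each observed
    target point, i.e. the spherical projections of the image features.

    feature_rays: (N, 3) array of (not necessarily unit-length) camera rays.
    """
    p = np.asarray(feature_rays, dtype=float)
    p = p / np.linalg.norm(p, axis=1, keepdims=True)  # project onto the unit sphere
    return p.sum(axis=0)                              # unnormalised centroid feature
```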
Abstract:
Performing reliable localisation and navigation within highly unstructured underwater coral reef environments is a difficult task at the best of times. Typical research and commercial underwater vehicles use expensive acoustic positioning and sonar systems which require significant external infrastructure to operate effectively. This paper is focused on the development of a robust vision-based motion estimation technique using low-cost sensors for performing real-time autonomous and untethered environmental monitoring tasks in the Great Barrier Reef without the use of acoustic positioning. The technique is experimentally shown to provide accurate odometry and terrain profile information suitable for input into the vehicle controller to perform a range of environmental monitoring tasks.
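As a hedged illustration of scale-resolved visual odometry (not the vehicle's actual pipeline), the sketch below converts per-frame optical flow into metric displacement using stereo depth and the pinhole model, then accumulates the displacements; the names and the planar-motion assumption are illustrative.

```python
import numpy as np

def integrate_scaled_flow(flows_px, depths_m, focal_px):
    """Accumulate a rough seafloor-relative displacement from per-frame mean
    optical flow, scaled to metres with stereo depth via the pinhole model
    (dx = Z * du / f).  Assumes motion roughly parallel to the seafloor.

    flows_px:  (N, 2) mean image flow per frame, in pixels
    depths_m:  (N,)   stereo-derived scene depth per frame, in metres
    focal_px:  camera focal length, in pixels
    """
    flows = np.asarray(flows_px, dtype=float)
    depths = np.asarray(depths_m, dtype=float)
    displacements = flows * depths[:, None] / focal_px  # metres per frame
    return displacements.sum(axis=0)                    # accumulated (x, y) offset
```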
Abstract:
We consider the problem of monitoring and controlling the position of herd animals, and view animals as networked agents that have natural mobility but are not strictly controllable. By exploiting knowledge of individual and herd behavior we would like to apply a vast body of theory in robotics and motion planning to achieving the constrained motion of a herd. In this paper we describe the concept of a virtual fence, which applies a stimulus to an animal as a function of its pose with respect to the fenceline. Multiple fence lines can define a region, and the fences can be static or dynamic. The fence algorithm is implemented by a small position-aware computer device worn by the animal, which we refer to as a Smart Collar. We describe a herd-animal simulator, the Smart Collar hardware, and algorithms for tracking and controlling animals, as well as the results of on-farm experiments with up to ten Smart Collars.
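A minimal sketch of applying a stimulus as a function of the animal's position relative to one fenceline; the thresholds, stimulus levels and function name are illustrative assumptions, not the Smart Collar's actual logic.

```python
import numpy as np

def fence_stimulus(position, fence_a, fence_b, warn_m=5.0):
    """Choose a stimulus level from the animal's position relative to a
    fence segment running from fence_a to fence_b (2D coordinates).
    Returns 'none', 'audio' when approaching the line, or 'shock' once the
    line has been crossed.  Thresholds and levels are illustrative only.
    """
    p = np.asarray(position, dtype=float)
    a = np.asarray(fence_a, dtype=float)
    b = np.asarray(fence_b, dtype=float)
    d = b - a
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to the fenceline
    signed_dist = float(np.dot(p - a, n))            # >0 inside, <0 outside (by convention)
    if signed_dist < 0.0:
        return "shock"
    if signed_dist < warn_m:
        return "audio"
    return "none"
```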
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first autonomous underwater vehicles (AUVs) in the world where vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
Controlling free-ranging livestock requires low-stress cues to alter animal behaviour. Recently, modulated sound and electric shock were demonstrated to be effective in controlling free-ranging cattle. In this study the behaviour of 60 Belmont Red heifers (300 kg) was observed for behavioural changes when they were presented with cues designed to impede their movement through an alley. The heifers were given an overnight drylot shrink (off feed, but with access to drinking water) prior to being tested. Individual cattle were allowed to move down a 6.5 m wide alley towards a pen of peers and feed located 71 m from their point of release. Each animal was allowed to move through the alley unimpeded five times to establish a basal behavioural pattern. Animals were then randomly assigned to treatments consisting of sound plus shock, vibration plus shock, a visual cue plus shock, shock by itself and a control. The time each animal required to reach the pen of peers and feed was recorded. If the animal was prevented from reaching the pen of peers and feed, by not penetrating the cue barrier at set points along the alley for at least 60 sec, the test was stopped and the animal was returned to peers located behind the release pen. Cues and shock were manually applied from a laptop while animals were observed from a 3.5 m tower located outside the alley. Electric shock, sound, vibration and Global Positioning System (GPS) hardware were housed in a neck collar. Results and implications will be discussed.
Abstract:
Virtual fencing has the potential to control grazing livestock. Understanding and refining the cues that can alter behaviour is an integral part of autonomous animal control. A series of tests has been completed to explore the relationship between temperament and control. Prior to exposure to virtual fencing control, the animals were scored for temperament using flight speed and a sociability index derived from contact logging devices. The behavioural responses of 30 Belmont Red steers were observed when the animals were presented with cues prior to receiving an electrical stimulation. A control and four treatments designed to interrupt the animal's movement down an alley were tested. The treatments consisted of sound plus electrical stimulation, vibration plus electrical stimulation, a visual cue plus electrical stimulation and electrical stimulation by itself. The treatments were randomly applied to each animal over five consecutive trials. A control treatment in which no cues were applied was used to establish a basal behavioural pattern. A trial was considered complete after the animal had been retained behind the cue barrier for at least 60 sec. All cues and electrical stimulation were manually applied from a laptop located on a portable 3.5 m tower immediately outside the alley. The electrical stimulation consisted of 1.0 kV. Electrical stimulation, sound and vibration events, along with data from the Global Positioning System (GPS) hardware used to autonomously record the animal's path within the alley, were recorded every second.
Abstract:
In this paper we discuss how a network of sensors and robots can cooperate to solve important robotics problems such as localization and navigation. We use a robot to localize sensor nodes, and we then use these localized nodes to navigate robots and humans through the sensorized space. We explore these novel ideas with results from two large-scale sensor network and robot experiments involving 50 motes, two types of flying robot: an autonomous helicopter and a large indoor cable array robot, and a human-network interface. We present the distributed algorithms for localization, geographic routing, path definition and incremental navigation. We also describe how a human can be guided using a simple hand-held device that interfaces to this same environmental infrastructure.
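As an illustration of the geographic routing step (a sketch, not the paper's distributed implementation), the snippet below picks the next hop greedily as the neighbour closest to the destination, using node positions obtained from localization; the names and data layout are assumptions.

```python
import math

def greedy_geographic_next_hop(current_pos, destination_pos, neighbours):
    """Pick the next hop as the neighbour whose (localised) position is
    closest to the destination, provided it makes progress; return None at
    a local minimum, where a real protocol would fall back to recovery.

    neighbours: dict mapping node id -> (x, y) position.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    best_id, best_d = None, dist(current_pos, destination_pos)
    for node_id, pos in neighbours.items():
        d = dist(pos, destination_pos)
        if d < best_d:
            best_id, best_d = node_id, d
    return best_id
```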
Abstract:
The development of autonomous air vehicles can be an expensive research pursuit. To alleviate some of the financial burden of this process, we have constructed a system consisting of four winches, each attached to a central pod (the simulated air vehicle) via cables - a cable-array robot. The system is capable of precisely controlling the three-dimensional position of the pod, allowing effective testing of sensing and control strategies before experimentation on a free-flying vehicle. In this paper, we present a brief overview of the system and provide a practical control strategy for such a system.
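A minimal inverse-kinematics sketch for such a cable-array robot, assuming the pod is treated as a point and the cables run straight from the winch anchors to the pod: each commanded cable length is then simply the anchor-to-pod distance. The function name and array layout are assumptions, and the real controller must also manage cable tension and pod dynamics.

```python
import numpy as np

def cable_lengths(pod_position, winch_anchors):
    """With the pod modelled as a point and the cables as straight lines,
    the commanded length of each cable is the distance from its winch
    anchor to the desired pod position.

    pod_position:  desired (x, y, z) of the pod
    winch_anchors: (4, 3) array of winch anchor positions
    """
    p = np.asarray(pod_position, dtype=float)
    anchors = np.asarray(winch_anchors, dtype=float)
    return np.linalg.norm(anchors - p, axis=1)  # one length per cable
```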