123 results for ground-based measurement


Relevance: 40.00%

Abstract:

For many years, computer vision has lured researchers with promises of a low-cost, passive, lightweight and information-rich sensor suitable for navigation purposes. The prime difficulty in vision-based navigation is that the navigation solution will continually drift with time unless external information is available, whether it be cues from the appearance of the scene, a map of features (whether built online or known a priori), or an externally-referenced sensor. It is not merely position that is of interest in the navigation problem. Attitude (i.e. the angular orientation of a body with respect to a reference frame) is integral to a vision-based navigation solution and is often of interest in its own right (e.g. flight control). This thesis examines vision-based attitude estimation in an aerospace environment, and two methods are proposed for constraining drift in the attitude solution: one through a novel integration of optical flow and the detection of the sky horizon, and the other through a loosely-coupled integration of Visual Odometry and GPS position measurements.

In the first method, roll angle, pitch angle and the three aircraft body rates are recovered through a novel method of tracking the horizon over time and integrating the horizon-derived attitude information with optical flow. An image processing front-end is used to select several candidate lines in an image that may or may not correspond to the true horizon, and the optical flow is calculated for each candidate line. Using an Extended Kalman Filter (EKF), the previously estimated aircraft state is propagated using a motion model, and a candidate horizon line is associated using a statistical test based on the optical flow measurements and the location of the horizon in the image. Once associated, the selected horizon line, along with the associated optical flow, is used as a measurement to the EKF. To evaluate the accuracy of the algorithm, two flights were conducted: one using a highly dynamic Uninhabited Airborne Vehicle (UAV) in clear flight conditions, and the other in a human-piloted Cessna 172 in conditions where the horizon was partially obscured by terrain, haze and smoke. The UAV flight resulted in pitch and roll error standard deviations of 0.42° and 0.71° respectively when compared with a truth attitude source. The Cessna 172 flight resulted in pitch and roll error standard deviations of 1.79° and 1.75° respectively.

In the second method for estimating attitude, a novel integrated GPS/Visual Odometry (GPS/VO) navigation filter is proposed, using a structure similar to a classic loosely-coupled GPS/INS error-state navigation filter. Under such an arrangement, the error dynamics of the system are derived and a Kalman Filter is developed for estimating the errors in position and attitude. Through analysis similar to the GPS/INS problem, it is shown that the proposed filter is capable of recovering the complete attitude (i.e. pitch, roll and yaw) of the platform when subjected to acceleration not parallel to velocity, for both the monocular and stereo variants of the filter. Furthermore, it is shown that under general straight-line motion (e.g. constant velocity), only the component of attitude in the direction of motion is unobservable. Numerical simulations are performed to demonstrate the observability properties of the GPS/VO filter in both the monocular and stereo camera configurations. Furthermore, the proposed filter is tested on imagery collected using a Cessna 172 to demonstrate the observability properties on real-world data. The proposed GPS/VO filter does not require additional restrictions or assumptions such as platform-specific dynamics, map-matching, feature-tracking, visual loop-closing, a gravity vector, or additional sensors such as an IMU or magnetic compass. Since no platform-specific dynamics are required, the proposed filter is not limited to the aerospace domain and has the potential to be deployed on other platforms such as ground robots or mobile phones.
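The loosely-coupled error-state structure described for the GPS/VO filter can be illustrated with a small sketch. This is a simplified planar toy (identity error dynamics, position-only GPS measurements), written as an assumption-laden illustration rather than the thesis implementation:

```python
import numpy as np

# Minimal sketch of a loosely-coupled GPS/VO error-state update in 2D, under
# simplifying assumptions (planar motion, identity error dynamics). The thesis
# filter estimates full 3D position and attitude errors; this only shows the
# structure: VO dead-reckoning drifts, GPS position fixes observe the drift.

def propagate(delta_x, P, Q):
    """Time update: the error state is propagated (here with identity dynamics)
    and its covariance grows with VO drift noise Q."""
    F = np.eye(3)                        # [dpx, dpy, dpsi] error dynamics (simplified)
    return F @ delta_x, F @ P @ F.T + Q

def gps_update(delta_x, P, p_vo, p_gps, R):
    """Measurement update: the VO-minus-GPS position difference observes the
    position error directly. In the full filter, heading error becomes
    observable through correlations built during accelerated motion; this toy
    only corrects position error."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])       # GPS observes position error only
    y = (p_vo - p_gps) - H @ delta_x      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    delta_x = delta_x + K @ y
    P = (np.eye(3) - K @ H) @ P
    return delta_x, P

# Toy usage: a drifting VO position corrected by a noisy GPS fix.
delta_x, P = np.zeros(3), np.eye(3) * 0.1
Q, R = np.eye(3) * 0.01, np.eye(2) * 4.0
delta_x, P = propagate(delta_x, P, Q)
delta_x, P = gps_update(delta_x, P, p_vo=np.array([10.2, 5.1]),
                        p_gps=np.array([10.0, 5.0]), R=R)
print(delta_x)                            # estimated position/heading error
```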

Relevance: 40.00%

Abstract:

The future emergence of many types of airborne vehicles and unpiloted aircraft in the national airspace means collision avoidance is of primary concern in an uncooperative airspace environment. The ability to replicate a pilot's see-and-avoid capability using cameras, coupled with vision-based avoidance control, is an important part of an overall collision avoidance strategy. Unfortunately, without range information, collision avoidance has no direct way to guarantee a level of safety. Collision scenario flight tests with two aircraft and a monocular camera threat detection and tracking system were used to study the accuracy of image-derived angle measurements. The effect of image-derived angle errors on reactive vision-based avoidance performance was then studied by simulation. The results show that whilst large angle measurement errors can significantly affect minimum ranging characteristics across a variety of initial conditions and closing speeds, the minimum range is always bounded and a collision never occurs.
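The kind of simulation described above can be sketched as a toy model. Everything below (the avoidance law, speeds, geometry and noise levels) is my own illustrative assumption, not the paper's simulation setup; it only shows how noise on the image-derived bearing feeds into the achieved minimum range:

```python
import numpy as np

# Toy reactive avoidance: the ownship flies straight until it reacts to the
# camera-derived bearing of an intruder by turning away from it. Gaussian noise
# on the measured bearing mimics image-derived angle error; we record minimum range.

def min_range(angle_noise_std_deg, dt=0.05, t_end=60.0, seed=0):
    rng = np.random.default_rng(seed)
    own_pos, own_hdg, own_spd = np.array([0.0, 0.0]), 0.0, 50.0        # m, rad, m/s
    intr_pos, intr_hdg, intr_spd = np.array([3000.0, 20.0]), np.pi, 45.0
    turn_gain, max_turn_rate = 2.0, np.radians(10.0)                   # rad/s limit
    closest = np.inf
    for _ in range(int(t_end / dt)):
        rel = intr_pos - own_pos
        closest = min(closest, np.linalg.norm(rel))
        bearing = np.arctan2(rel[1], rel[0]) - own_hdg                 # true relative bearing
        bearing += np.radians(rng.normal(0.0, angle_noise_std_deg))    # image-derived angle error
        # Reactive avoidance: steer away from the measured bearing of the threat,
        # turning hardest when the threat is ahead and not at all when it is behind.
        turn = np.clip(-turn_gain * np.sign(bearing) * (1.0 - abs(bearing) / np.pi),
                       -max_turn_rate, max_turn_rate)
        own_hdg += turn * dt
        own_pos = own_pos + own_spd * dt * np.array([np.cos(own_hdg), np.sin(own_hdg)])
        intr_pos = intr_pos + intr_spd * dt * np.array([np.cos(intr_hdg), np.sin(intr_hdg)])
    return closest

for sigma in (0.0, 1.0, 5.0):
    print(f"angle noise {sigma} deg -> min range {min_range(sigma):.0f} m")
```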

Relevance: 40.00%

Abstract:

In this paper, we present a monocular vision based autonomous navigation system for Micro Aerial Vehicles (MAVs) in GPS-denied environments. The major drawback of monocular systems is that the depth scale of the scene cannot be determined without prior knowledge or other sensors. To address this problem, we minimize a cost function consisting of a drift-free altitude measurement and an up-to-scale position estimate obtained using the visual sensor. We evaluate the scale estimator, state estimator and controller performance by comparing with ground truth data acquired using a motion capture system. All resources, including source code, tutorial documentation and system models, are available online.
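The scale-estimation step lends itself to a small sketch. The closed-form least-squares alignment below is an assumed formulation, not the authors' exact cost function, but it shows how a drift-free altitude measurement can pin down the metric scale of an up-to-scale visual estimate:

```python
import numpy as np

# Minimal sketch (assumed formulation): estimate the unknown metric scale s of a
# monocular visual position estimate by least-squares alignment of the visual
# (up-to-scale) altitude against a drift-free metric altitude sensor, i.e.
# minimise sum_t (z_alt[t] - s * z_vis[t])^2.

def estimate_scale(z_vis, z_alt):
    z_vis, z_alt = np.asarray(z_vis), np.asarray(z_alt)
    return float(z_vis @ z_alt) / float(z_vis @ z_vis)    # closed-form least squares

# Toy usage: visual odometry reports altitude in arbitrary units, altimeter in metres.
z_vis = np.array([0.10, 0.21, 0.29, 0.42, 0.50])
z_alt = np.array([0.52, 1.01, 1.48, 2.05, 2.49])           # roughly 5x scale, with noise
s = estimate_scale(z_vis, z_alt)
print(f"estimated scale: {s:.2f}")                         # metric position = s * visual estimate
```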

Relevance: 40.00%

Abstract:

Increasing global competition, rapid technological changes, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate the lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring improvements in performance for dynamic lean supply chain situations. Therefore, the need for appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of the product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy.

This thesis develops an effective method to measure the performance of supply chains consisting of both quantitative and qualitative metrics and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics and the effects of various lean tools on the performance metrics mentioned in the SCOR framework have been investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean as well as quantitative and qualitative metrics are incorporated. The values of the appropriate metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy based method is then applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products have been evaluated.

To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
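The fuzzy TOPSIS step can be illustrated with a minimal sketch using triangular fuzzy numbers. The ratings below are invented and the criteria simplified; this is not the thesis' SCOR-based model, only the general ranking mechanics of closeness to the fuzzy positive and negative ideal solutions:

```python
import numpy as np

# Hedged sketch of fuzzy TOPSIS with triangular fuzzy numbers (TFNs). Each rating
# is a TFN (a, b, c); alternatives are ranked by closeness to the fuzzy positive
# ideal solution (FPIS) and distance from the fuzzy negative ideal solution (FNIS).

def vertex_distance(m, n):
    """Distance between two TFNs by the vertex method."""
    return np.sqrt(np.mean((np.asarray(m) - np.asarray(n)) ** 2))

def fuzzy_topsis(ratings):
    """ratings[i][j] = TFN rating of alternative i on benefit criterion j,
    already normalised to [0, 1]."""
    fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
    cc = []
    for alt in ratings:
        d_pos = sum(vertex_distance(r, fpis) for r in alt)
        d_neg = sum(vertex_distance(r, fnis) for r in alt)
        cc.append(d_neg / (d_pos + d_neg))        # closeness coefficient in [0, 1]
    return cc

# Toy usage: two supply chain configurations scored on time, quality, flexibility.
lean     = [(0.6, 0.8, 1.0), (0.5, 0.7, 0.9), (0.6, 0.8, 1.0)]
non_lean = [(0.3, 0.5, 0.7), (0.4, 0.6, 0.8), (0.2, 0.4, 0.6)]
print(fuzzy_topsis([lean, non_lean]))             # higher = closer to the ideal
```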

Relevance: 40.00%

Abstract:

Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. On the other hand, the metrics to quantify process compliance have only been defined recently. A major criticism points to the fact that existing measures appear to be unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions like trace equivalence, and they can be calculated efficiently. As a validation, we present a respective implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
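A simplified sketch of the behavioural-profile idea is given below. It derives an order relation per activity pair from traces and scores a log by agreement with the model's profile; this is a hedged approximation of the technique, not the paper's exact metric or implementation:

```python
from itertools import combinations

# Hedged sketch: for every activity pair, decide whether one strictly precedes the
# other, whether they interleave, or whether they never co-occur, and score a log
# by the fraction of pairs whose relation agrees with the normative model's profile.

def weak_order(traces):
    """Pairs (a, b) such that a occurs before b in at least one trace."""
    rel = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                rel.add((a, b))
    return rel

def profile(traces):
    """Map each unordered activity pair to 'strict', 'reverse', 'interleaving'
    or 'exclusive' based on the weak order relation."""
    order = weak_order(traces)
    acts = sorted({a for t in traces for a in t})
    prof = {}
    for a, b in combinations(acts, 2):
        ab, ba = (a, b) in order, (b, a) in order
        prof[(a, b)] = ('interleaving' if ab and ba else
                        'strict' if ab else
                        'reverse' if ba else 'exclusive')
    return prof

def compliance(model_traces, log_traces):
    model, log = profile(model_traces), profile(log_traces)
    shared = [p for p in model if p in log]
    return sum(model[p] == log[p] for p in shared) / len(shared) if shared else 1.0

# Toy usage: the model requires A before B; the log sometimes violates this.
model_traces = [["A", "B", "C"], ["A", "C", "B"]]
log_traces   = [["A", "B", "C"], ["B", "A", "C"]]
print(compliance(model_traces, log_traces))
```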

Relevance: 40.00%

Abstract:

Understanding the load applied on the residuum through the prosthesis of individuals with transfemoral amputation (TFA) is essential to address a number of concerns that can strongly reduce their quality of life (e.g., residuum skin lesions, prosthesis fitting, alignment). This inner prosthesis loading can be estimated in a typical gait laboratory relying on inverse dynamics equations. Alternatively, technological advances proposed over the last decade have enabled direct measurement of this kinetic information in a broad variety of situations that could potentially be more relevant in clinical settings. The purposes of this presentation are (A) to review the literature on recent developments in the measurement and analysis of inner prosthesis loading of TFA, and (B) to extract information that could potentially contribute to a better evidence-based practice.

Relevance: 40.00%

Abstract:

The demand for an evidence-based clinical practice involving lower limb amputees is increasing. Some of the critical care decisions are related to the loading applied on the residuum, which is partially responsible for comfort and functional outcome. This loading can be assessed using inverse dynamics equations. Typically, this method requires a gait laboratory (e.g., a 3D motion analysis system and force-plates). It is mainly suited to the analysis of only a few steps of walking, while being expensive and labour intensive. However, recent scientific and industrial developments have demonstrated that discrete and light portable sensors can be placed within the prosthesis to accurately measure the loading during an unlimited number of steps and activities of daily living. Several studies indicate that methods based on direct measurements might provide more realistic results. Furthermore, it is a user-friendly method more accessible to clinicians, such as prosthetists. The purpose of this symposium will be to give an overview of these additional opportunities for clinicians to obtain relevant data for evidence-based practice. The three main aims will be:
• To present some of the equipment used for direct measurements,
• To propose ways to analyse some key data sets,
• To give some practical examples of data sets for transtibial and transfemoral amputees.
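For readers unfamiliar with the laboratory approach mentioned above, the sketch below shows a generic single-segment, two-dimensional inverse-dynamics calculation. The numbers are made-up illustrative values and the formulation is a textbook Newton-Euler balance, not data or methods from the studies reviewed:

```python
import numpy as np

# Hedged, generic 2D inverse-dynamics sketch for one rigid segment (e.g. a
# prosthetic shank), illustrating the kind of equations a gait laboratory solves
# to estimate the load transmitted toward the residuum.

def cross2d(r, f):
    return r[0] * f[1] - r[1] * f[0]

def proximal_load(m, I, a_com, alpha, g, F_dist, M_dist, r_prox, r_dist):
    """Newton-Euler for one segment: solve for the force and moment at the
    proximal joint from the distal loads and the segment kinematics."""
    F_prox = m * a_com - F_dist - m * g                              # Newton (translation)
    M_prox = (I * alpha - M_dist
              - cross2d(r_prox, F_prox) - cross2d(r_dist, F_dist))   # Euler (rotation about COM)
    return F_prox, M_prox

# Toy usage with illustrative values (SI units); r_prox and r_dist point from the
# segment's centre of mass to the proximal and distal joints respectively.
F, M = proximal_load(m=3.5, I=0.06,
                     a_com=np.array([0.2, -1.0]), alpha=2.0,
                     g=np.array([0.0, -9.81]),
                     F_dist=np.array([20.0, 700.0]), M_dist=-15.0,
                     r_prox=np.array([0.0, 0.22]), r_dist=np.array([0.0, -0.22]))
print(F, M)
```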

Relevance: 40.00%

Abstract:

This research improved the measurement of public transport accessibility by capturing travellers' behaviour, the diversity of public transport modes, and the subjectivity of travellers' decisions in complex transport networks. The results of this research not only highlighted the importance of considering public transport network characteristics but also revealed the impact of public transport diversity in the modelling of public transport accessibility. The research developed a hybrid discrete choice model with a nested logit structure to treat the correlation among the public transport mode choices, and a logit correction factor to rectify the correlation among the stop choices.
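A minimal sketch of the nested logit component is given below. It implements the standard nested logit choice probabilities only; the utilities, nesting and scale parameters are illustrative assumptions, not the estimated hybrid model from this research:

```python
import numpy as np

# Hedged sketch of standard nested logit choice probabilities: alternatives inside
# a nest share unobserved attributes, so their errors are correlated; the nest
# scale parameter lambda in (0, 1] controls the strength of that correlation.

def nested_logit(utilities, nests, lambdas):
    """utilities: dict alt -> systematic utility V; nests: dict nest -> list of alts;
    lambdas: dict nest -> scale parameter."""
    iv = {n: np.log(sum(np.exp(utilities[a] / lambdas[n]) for a in alts))
          for n, alts in nests.items()}                       # inclusive values
    denom = sum(np.exp(lambdas[n] * iv[n]) for n in nests)
    probs = {}
    for n, alts in nests.items():
        p_nest = np.exp(lambdas[n] * iv[n]) / denom           # probability of the nest
        within = sum(np.exp(utilities[a] / lambdas[n]) for a in alts)
        for a in alts:
            probs[a] = p_nest * np.exp(utilities[a] / lambdas[n]) / within
    return probs

# Toy usage: bus and train correlated within a 'public transport' nest, car alone.
V = {"bus": -1.2, "train": -1.0, "car": -0.8}
nests = {"pt": ["bus", "train"], "car": ["car"]}
print(nested_logit(V, nests, lambdas={"pt": 0.5, "car": 1.0}))
```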

Relevance: 40.00%

Abstract:

Online dynamic load modelling has become possible with the availability of Static Var Compensator (SVC) and Phasor Measurement Unit (PMU) devices. The power of the load response to the small, random, bounded voltage fluctuations caused by the SVC can be measured by the PMU for modelling purposes. The aim of this paper is to illustrate the capability of identifying an aggregated load model at the high voltage substation level in an online environment. The induction motor is used as the main test subject since it contributes the majority of the dynamic loads. A test system in which a simple electromechanical generator model serves dynamic loads through the transmission network is used to verify the proposed method. Also, dynamic loads with multiple induction motors are modelled to achieve a more realistic load representation.
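The identification step can be sketched generically. The code below fits a simple least-squares ARX relation between PMU-measured power and the SVC-injected voltage fluctuations on simulated data; it is a hedged illustration of online load-model identification, not the paper's induction motor formulation:

```python
import numpy as np

# Hedged sketch of online load-model identification from PMU data: a generic
# least-squares ARX fit relating measured active power P[k] to the small voltage
# fluctuations V[k] injected by the SVC, P[k] = a*P[k-1] + b0*V[k] + b1*V[k-1].

def fit_arx(P, V):
    y = P[1:]
    X = np.column_stack([P[:-1], V[1:], V[:-1]])
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)    # [a, b0, b1]
    return theta

# Toy usage: simulate a first-order load response to random bounded voltage wiggles.
rng = np.random.default_rng(1)
V = 1.0 + 0.01 * rng.uniform(-1, 1, 500)             # per-unit voltage from the SVC
P = np.zeros_like(V)
for k in range(1, len(V)):
    P[k] = 0.9 * P[k - 1] + 0.5 * V[k] - 0.3 * V[k - 1]
print(fit_arx(P, V))                                  # recovers roughly [0.9, 0.5, -0.3]
```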

Relevance: 30.00%

Abstract:

The measurement of ICT (information and communication technology) integration is emerging as an area of research interest, with systems such as Education Queensland including it in their recently released list of research priorities. Studies to trial differing integration measurement instruments have taken place within Australia in the last few years, particularly in Western Australia (Trinidad, Clarkson, & Newhouse, 2004; Trinidad, Newhouse & Clarkson, 2005), Tasmania (Fitzallen, 2005) and Queensland (Finger, Proctor, & Watson, 2005). This paper will add to these investigations by describing an alternative and original methodological approach which was trialled in a small-scale pilot study conducted jointly by the Queensland Catholic Education Commission (QCEC) and the Centre of Learning Innovation, Queensland University of Technology (QUT) in late 2005. The methodology described is based on tasks which, through a process of profiling, can be seen as artefacts that embody the internal and external factors enabling and constraining ICT integration.

Relevance: 30.00%

Abstract:

A range of influences, technical and organizational, has encouraged the widespread adoption of Enterprise Systems (ES). Nevertheless, there is a growing consensus that Enterprise Systems have in many cases failed to provide the expected benefits to organizations. This paper presents ongoing research which analyzes the benefits realization approach of the Queensland Government. This approach applies a modified Balanced Scorecard. First, the history and background of the Queensland Government's Enterprise Systems initiative are introduced. Second, the most common reasons for ES underperformance are reviewed. Third, relevant performance measurement models, and the Balanced Scorecard in particular, are discussed. Finally, the Queensland Government initiative is evaluated in light of this overview of current work in the area. In current and future work, the authors aim to use their active involvement in the Queensland Government's benefits realization initiative for an Action Research based project investigating the appropriateness of the Balanced Scorecard for the purposes of Enterprise Systems benefits realization.

Relevance: 30.00%

Abstract:

Principal Topic: It is well known that most new ventures suffer from a significant lack of resources, which increases the risk of failure (Shepherd, Douglas and Shanley, 2000) and makes it difficult to attract stakeholders and financing for the venture (Bhide & Stevenson, 1999). The Resource-Based View (RBV) (Barney, 1991; Wernerfelt, 1984) is a dominant theoretical base increasingly drawn on within Strategic Management. While theoretical contributions applying RBV in the domain of entrepreneurship can arguably be traced back to Penrose (1959), there has been renewed attention recently (e.g. Alvarez & Busenitz, 2001; Alvarez & Barney, 2004). This said, empirical work is in its infancy. In part, this may be due to a lack of well developed measuring instruments for testing ideas derived from RBV. The purpose of this study is to develop measurement scales that can serve to assist such empirical investigations. In so doing we try to overcome three deficiencies in the current empirical measures used for the application of RBV to the entrepreneurship arena. First, measures need to be developed for the resource characteristics and configurations associated with typical competitive advantages found in entrepreneurial firms. These include such things as alertness and industry knowledge (Kirzner, 1973), flexibility (Ebben & Johnson, 2005), strong networks (Lee et al., 2001) and, within knowledge intensive contexts, unique technical expertise (Wiklund and Shepherd, 2003). Second, the RBV has the important limitations of being relatively static and modelled on large, established firms. In that context, traditional RBV focuses on competitive advantages. However, newly established firms often face disadvantages, especially those associated with the liabilities of newness (Aldrich & Auster, 1986). It is therefore important in entrepreneurial contexts to expand to an investigation of responses to competitive disadvantage through an RBV lens. Conversely, recent research has suggested that resource constraints actually have a positive effect on firm growth and performance under some circumstances (e.g., George, 2005; Katila & Shane, 2005; Mishina et al., 2004; Mosakowski, 2002; cf. also Baker & Nelson, 2005). Third, current empirical applications of RBV measure the levels or amounts of particular resources available to a firm. They infer that these resources deliver firms a competitive advantage by establishing a relationship between these resource levels and performance (e.g. via regression on profitability). However, there is the opportunity to directly measure the characteristics of the resource configurations that deliver competitive advantage, such as Barney's well known VRIO (Valuable, Rare, Inimitable and Organized) framework (Barney, 1997).

Key Propositions and Methods: The aim of our study is to develop and test scales for measuring resource advantages (and disadvantages) and inimitability for entrepreneurial firms. The study proceeds in three stages. The first stage developed our initial scales based on the earlier literature. Where possible, we adapted scales based on previous work. The first block of the scales related to the level of resource advantages and disadvantages. Respondents were asked the degree to which each resource category represented an advantage or disadvantage relative to other businesses in their industry on a 5-point response scale: Major Disadvantage, Slight Disadvantage, No Advantage or Disadvantage, Slight Advantage and Major Advantage. Items were developed as follows. Network capabilities (3 items) were adapted from Madsen, Alsos, Borch, Ljunggren and Brastad (2006). Knowledge resources for marketing expertise / customer service (3 items) and technical expertise (3 items) were adapted from Wiklund and Shepherd (2003). Flexibility (2 items) and costs (4 items) were adapted from JIBS B97. New scales were developed for industry knowledge / alertness (3 items) and product / service advantages. The second block asked the respondent to nominate the most important resource advantage (and disadvantage) of the firm. For the advantage, they were then asked four questions on a 5-point Likert scale to determine how easy it would be for other firms to imitate and/or substitute this resource. For the disadvantage, they were asked corresponding questions related to overcoming this disadvantage. The second stage involved two pre-tests of the instrument to refine the scales. The first was an online convenience sample of 38 respondents. The second pre-test was a telephone interview with a random sample of 31 Nascent firms and 47 Young firms (< 3 years in operation) generated using a PSED method of randomly calling households (Gartner et al., 2004). Several items were dropped or reworded based on the pre-tests. The third stage (currently in progress) is part of Wave 1 of CAUSEE (Nascent Firms) and FEDP (Young Firms), a PSED-type study being conducted in Australia. The scales will be tested and analysed with random samples of approximately 700 Nascent and Young firms respectively. In addition, a judgement sample of approximately 100 high potential businesses in each category will be included.

Findings and Implications: The paper will report the results of the main study (stage 3; data collection is currently in progress), which will allow comparison of the level of resource advantage / disadvantage across various sub-groups of the population. Of particular interest will be a comparison of the high potential firms with the random sample. Based on the smaller pre-tests (N=38 and N=78), the factor structure of the items confirmed the distinctiveness of the constructs. The reliabilities are within an acceptable range: Cronbach's alpha ranged from 0.701 to 0.927. The study will provide an opportunity for researchers to better operationalize RBV theory in studies within the domain of entrepreneurship. This is a fundamental requirement for the ability to test hypotheses derived from RBV in systematic, large scale research studies.
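The reliability figures quoted above are Cronbach's alpha values; the sketch below shows how that statistic is computed. The item scores are invented for illustration and are not the study's survey data:

```python
import numpy as np

# Hedged sketch of Cronbach's alpha, the internal-consistency statistic reported
# for the pre-tests (values between 0.701 and 0.927).

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items) of Likert-type scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Toy usage: 6 respondents rating a hypothetical 3-item scale on 1-5 points.
scores = np.array([[4, 5, 4],
                   [3, 3, 4],
                   [5, 5, 5],
                   [2, 3, 2],
                   [4, 4, 5],
                   [3, 2, 3]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```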

Relevance: 30.00%

Abstract:

Evidence-based Practice (EBP) has recently emerged as a topic of discussion amongst professionals within the library and information services (LIS) industry. Simply stated, EBP is the process of using formal research skills and methods to assist in decision making and establishing best practice. The emerging interest in EBP within the library context serves to remind the library profession that research skills and methods can help ensure that the library industry remains current and relevant in changing times.

The LIS sector faces ongoing challenges in terms of the expectation that financial and human resources will be managed efficiently, particularly if library budgets are reduced and accountability to the principal stakeholders is increased. Library managers are charged with the responsibility to deliver relevant and cost effective services, in an environment characterised by rapidly changing models of information provision, information access and user behaviours. Consequently, they are called upon not only to justify the services they provide, or plan to introduce, but also to measure the effectiveness of these services and to evaluate the impact on the communities they serve. The imperative for innovation in and enhancements to library practice is accompanied by the need for a strong understanding of the processes of review, measurement, assessment and evaluation.

In 2001 the Centre for Information Research was commissioned by the Chartered Institute of Library and Information Professionals (CILIP) in the UK to conduct an examination into the research landscape for library and information science. The examination concluded that research is "important for the LIS [library and information science] domain in a number of ways" (McNicol & Nankivell, 2001, p.77). At the professional level, research can inform practice, assist in the future planning of the profession, and raise the profile of the discipline, and indeed the reputation and standing of the library and information service itself. At the personal level, research can "broaden horizons and offer individuals development opportunities" (McNicol & Nankivell, 2001, p.77). The study recommended that "research should be promoted as a valuable professional activity for practitioners to engage in" (McNicol & Nankivell, 2001, p.82).

This chapter will consider the role of EBP within the library profession. A brief review of key literature in the area is provided. The review considers issues of definition and terminology, highlights the importance of research in professional practice and outlines the research approaches that underpin EBP. The chapter concludes with a consideration of the specific application of EBP within the dynamic and evolving field of information literacy (IL).

Relevance: 30.00%

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although 'SQ' can be broadly defined as "a global overarching judgment or attitude relating to the overall excellence or superiority of a service" (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and beg the question of whether it is possible at some higher level to define SQ broadly such that it spans all service types and industries.

This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the 'what' questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ.

The candidate's research has been conducted within, and seeks to contribute to, the 'IS-Impact' research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is "to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice." The 'IS-Impact' research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfil the track's vision. Results of this study will help future researchers in the 'IS-Impact' research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?

Results from the candidate's research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model's three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the 'physical environment quality' primary dimension to 'environment quality' so as to better encompass both physical and virtual scenarios (e.g. web sites). The candidate does not rule out the global feasibility of the B&C (2001) model's nine sub-dimensions, but acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the 'expertise', 'design' and 'valence' sub-dimensions are supportive representations of the 'interaction', 'physical environment' and 'outcome' primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely 'interaction', 'physical environment' and 'outcome', based on the 'expertise', 'design' and 'valence' sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a 'service' and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate's study.

Results from the candidate's research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach would depend on the objective(s) of the study. Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead in the process. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it has the ability to identify areas that need improvement.
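The contrast between the disconfirmation and perceptions-only approaches can be made concrete with a small sketch. Dimension names follow SERVQUAL (with responsiveness as the fifth dimension); the ratings are invented for illustration, not findings from the candidate's research:

```python
import numpy as np

# Hedged sketch: the disconfirmation approach scores SQ as perception minus
# expectation per dimension (gap scores), while the perceptions-only approach
# averages the perception ratings directly.

DIMENSIONS = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]

def disconfirmation_scores(perceptions, expectations):
    return {d: float(np.mean(perceptions[d]) - np.mean(expectations[d]))
            for d in DIMENSIONS}

def perceptions_only_score(perceptions):
    return float(np.mean([np.mean(perceptions[d]) for d in DIMENSIONS]))

# Toy usage: 7-point ratings from a handful of respondents.
P = {d: [5, 6, 5] for d in DIMENSIONS}
E = {d: [6, 6, 7] for d in DIMENSIONS}
print(disconfirmation_scores(P, E))   # negative gaps flag dimensions needing improvement
print(perceptions_only_score(P))      # single overall evaluation of SQ
```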

Relevance: 30.00%

Abstract:

This thesis investigates the problem of robot navigation using only landmark bearings. The proposed system allows a robot to move to a ground target location specified by the sensor values observed at this ground target position. The control actions are computed based on the difference between the current landmark bearings and the target landmark bearings. No Cartesian coordinates with respect to the ground are computed by the control system. The robot navigates using solely information from the bearing sensor space.

Most existing robot navigation systems require a ground frame (2D Cartesian coordinate system) in order to navigate from a ground point A to a ground point B. The commonly used sensors such as laser range scanners, sonar, infrared, and vision do not directly provide the 2D ground coordinates of the robot. The existing systems use the sensor measurements to localise the robot with respect to a map, a set of 2D coordinates of the objects of interest. It is more natural to navigate between the points in the sensor space corresponding to A and B without requiring the Cartesian map and the localisation process.

Research on animals has revealed how insects are able to exploit very limited computational and memory resources to successfully navigate to a desired destination without computing Cartesian positions. For example, a honeybee balances the left and right optical flows to navigate in a narrow corridor. Unlike many other ants, Cataglyphis bicolor does not secrete pheromone trails in order to find its way home but instead uses the sun as a compass to keep track of its home direction vector. The home vector can be inaccurate, so the ant also uses landmark recognition. More precisely, it takes snapshots and compass headings of some landmarks. To return home, the ant tries to line up the landmarks exactly as they were before it started wandering.

This thesis introduces a navigation method based on reflex actions in sensor space. The sensor vector is made of the bearings of some landmarks, and the reflex action is a gradient descent with respect to the distance in sensor space between the current sensor vector and the target sensor vector. Our theoretical analysis shows that, except for some fully characterized pathological cases, any point is reachable from any other point by reflex action in the bearing sensor space, provided the environment contains three landmarks and is free of obstacles.

The trajectories of a robot using reflex navigation, like those of other image-based visual control strategies, do not necessarily correspond to the shortest paths on the ground, because it is the sensor error that is minimized, not the moving distance on the ground. However, we show that the use of a sequence of waypoints in sensor space can address this problem. In order to identify relevant waypoints, we train a Self Organising Map (SOM) from a set of observations uniformly distributed with respect to the ground. This SOM provides a sense of location to the robot and allows a form of path planning in sensor space. The proposed navigation system is analysed theoretically, and evaluated both in simulation and with experiments on a real robot.
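The reflex action in bearing sensor space can be sketched as follows. The landmark layout, gain and finite-difference gradient are illustrative assumptions; the thesis' analysis and robot implementation are not reproduced here, only the idea of descending the sensor-space error between the current and target bearings:

```python
import numpy as np

# Hedged sketch of reflex navigation in bearing sensor space: the robot descends
# (numerically) the squared difference between its current landmark bearings and
# the bearings recorded at the target location. Landmark positions are only used
# to simulate the bearing sensor; the controller never computes ground coordinates.

LANDMARKS = np.array([[0.0, 10.0], [10.0, 0.0], [-8.0, -6.0]])   # assumed three-landmark layout

def bearings(p):
    d = LANDMARKS - p
    return np.arctan2(d[:, 1], d[:, 0])             # bearing of each landmark from position p

def sensor_error(p, target_bearings):
    diff = bearings(p) - target_bearings
    diff = np.arctan2(np.sin(diff), np.cos(diff))   # wrap angle differences to [-pi, pi]
    return float(np.sum(diff ** 2))

def reflex_step(p, target_bearings, gain=2.0, eps=1e-4):
    """One reflex action: finite-difference gradient descent on the sensor-space error."""
    grad = np.array([(sensor_error(p + eps * e, target_bearings)
                      - sensor_error(p - eps * e, target_bearings)) / (2 * eps)
                     for e in np.eye(2)])
    return p - gain * grad

# Toy usage: drive from A towards the bearing 'snapshot' recorded at B.
target = bearings(np.array([3.0, 2.0]))             # sensor signature of the goal
p = np.array([-4.0, -3.0])
for _ in range(200):
    p = reflex_step(p, target)
print(p)                                            # converges near (3, 2) in this obstacle-free toy
```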