860 results for Vision-based row tracking algorithm


Relevance:

100.00%

Publisher:

Abstract:

There is a range of studies in the low-carbon arena that use various 'futures'-based techniques as ways of exploring uncertainties. These techniques range from 'scenarios' and 'roadmaps' through to 'transitions' and 'pathways', as well as 'vision'-based techniques. The overall aim of the paper is therefore to compare and contrast these techniques in order to develop a simple working typology, with the further objective of identifying the implications of this analysis for RETROFIT 2050. Using recent examples of city-based and energy-based studies throughout, the paper compares and contrasts these techniques and finds that the distinctions between them have often been blurred in the field of low carbon. Visions, for example, have been used in both transition theory and futures/Foresight methods, and scenarios have also been used in transition-based studies as well as futures/Foresight studies. Moreover, Foresight techniques that capture expert knowledge and map existing knowledge to develop a set of scenarios and roadmaps, which can in turn inform the development of transitions and pathways, can not only potentially help overcome any 'disconnections' that may exist between the social and the technical lenses through which such future trajectories are mapped, but also promote a strong 'co-evolutionary' content.

Relevance:

100.00%

Publisher:

Abstract:

ERA-Interim reanalysis data from the past 35 years have been used with a newly developed feature-tracking algorithm to identify Indian monsoon depressions originating in or near the Bay of Bengal. These were then rotated, centralised and combined to give a fully three-dimensional 106-depression composite structure, a considerably larger sample than any previous detailed study of monsoon depressions and their structure. Many known features of depression structure are confirmed, particularly the existence of a maximum to the southwest of the centre in rainfall and other fields, and a westward axial tilt in others. Additionally, the depressions are found to have significant asymmetry due to the presence of the Himalayas; a bimodal mid-tropospheric potential vorticity core; a separation near the surface into thermally cold (~-1.5 K) and neutral (~0 K) cores with distinct properties; and a centre with very large CAPE and very small CIN. Variability as a function of background state has also been explored, with land/coast/sea, diurnal, ENSO, active/break and Indian Ocean Dipole contrasts considered. Depressions are found to be markedly stronger during the active phase of the monsoon, as well as during La Niña. Depressions on land are shown to be more intense and more tightly constrained to the central axis. A detailed schematic diagram of a vertical cross-section through a composite depression is also presented, showing its inherent asymmetric structure.
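As a rough illustration of the rotate-centralise-combine step described above, the sketch below averages a set of tracked 2-D fields. The array names, the fixed composite size and the use of SciPy interpolation are assumptions for illustration and are not details taken from the study.

```python
import numpy as np
from scipy.ndimage import shift, rotate

def composite_depressions(fields, centres, headings, size=81):
    """Average tracked 2-D fields after recentring each on its depression
    centre and rotating it to a common propagation heading (degrees)."""
    half = size // 2
    stack = []
    for field, (ci, cj), heading in zip(fields, centres, headings):
        # Shift so the tracked centre sits at the middle of the grid.
        recentred = shift(field,
                          (field.shape[0] / 2.0 - ci, field.shape[1] / 2.0 - cj),
                          order=1, mode="nearest")
        # Rotate so every case shares the same propagation direction.
        aligned = rotate(recentred, angle=-heading, reshape=False,
                         order=1, mode="nearest")
        i0 = aligned.shape[0] // 2 - half
        j0 = aligned.shape[1] // 2 - half
        stack.append(aligned[i0:i0 + size, j0:j0 + size])
    return np.mean(stack, axis=0)
```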

Relevance:

100.00%

Publisher:

Abstract:

One of the top ten most influential data mining algorithms, k-means, is known for being simple and scalable. However, it is sensitive to initialization of prototypes and requires that the number of clusters be specified in advance. This paper shows that evolutionary techniques conceived to guide the application of k-means can be more computationally efficient than systematic (i.e., repetitive) approaches that try to get around the above-mentioned drawbacks by repeatedly running the algorithm from different configurations for the number of clusters and initial positions of prototypes. To do so, a modified version of a (k-means based) fast evolutionary algorithm for clustering is employed. Theoretical complexity analyses for the systematic and evolutionary algorithms under interest are provided. Computational experiments and statistical analyses of the results are presented for artificial and text mining data sets. (C) 2010 Elsevier B.V. All rights reserved.
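For context, the systematic (repetitive) baseline that the paper argues against can be sketched as below: k-means is rerun for every candidate number of clusters and several random initialisations, and the best run is kept. This is only the brute-force comparison point; the paper's evolutionary algorithm is not reproduced here, and the use of the silhouette score as the selection criterion is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def systematic_kmeans(X, k_range=range(2, 11), restarts=10, seed=0):
    """Repetitive baseline: rerun k-means for every k and several random
    initialisations, keeping the run with the best silhouette score."""
    rng = np.random.RandomState(seed)
    best_model, best_score = None, -1.0
    for k in k_range:
        for _ in range(restarts):
            km = KMeans(n_clusters=k, n_init=1,
                        random_state=rng.randint(2**31 - 1)).fit(X)
            score = silhouette_score(X, km.labels_)
            if score > best_score:
                best_model, best_score = km, score
    return best_model, best_score
```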

Relevance:

100.00%

Publisher:

Abstract:

Wooden railway sleeper inspections in Sweden are currently performed manually by a human operator and are based on visual analysis. A machine-vision approach has been developed to emulate the visual abilities of the human operator and enable automation of the process. Through this process, defective sleepers are identified and a spot of a specific colour (blue in the present case) is marked on the rail so that maintenance operators can find the spot and replace the sleeper. The aim of this thesis is to help operators identify the sleepers marked with coloured spots, using an "intelligent vehicle" capable of running on the track. By capturing video while running on the track and segmenting the object of interest (the spot), this work can be automated and human intervention minimised. The video acquisition process depends on camera position and light source to obtain adequate brightness, so four different combinations of camera position and light source were tested to record video and assess the validity of the proposed method. A sequence of real-time rail frames is extracted from these videos, and further processing (depending on the data acquisition setup) is performed to identify the spots. After a spot is identified, each frame is divided into nine regions so that the particular region containing the spot is known and overlap with noise is avoided. The proposed method reports in which of the nine regions of each frame the spot lies. From the generated results, some classifications were made regarding data collection techniques, efficiency, time and speed. In this report, extensive experiments using image sequences from a particular camera are reported; the experiments were carried out with the intelligent vehicle as well as a test vehicle. The results show a 95% success rate in identifying the spots when the video is used as recorded. In an alternative method, some frames are skipped during pre-processing to increase processing speed; the segmentation success rate then falls to 85%, but the processing time is much shorter than before. This demonstrates the validity of the proposed method for identifying spots on wooden railway sleepers, allowing a trade-off between time and efficiency to obtain the desired result.
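A minimal sketch of the colour-spot segmentation and nine-region assignment described above might look like the following. The HSV thresholds and the use of OpenCV image moments are assumptions for illustration, not the thesis' actual parameters.

```python
import cv2
import numpy as np

# Hypothetical HSV range for the blue marking spot; real thresholds would
# be tuned to the camera position and light source combination used.
LOWER_BLUE = np.array([100, 120, 60])
UPPER_BLUE = np.array([130, 255, 255])

def find_spot_region(frame_bgr):
    """Segment the coloured spot and report which of the 3x3 frame
    regions it falls in, or None when no spot is detected."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_BLUE, UPPER_BLUE)
    moments = cv2.moments(mask, binaryImage=True)
    if moments["m00"] == 0:
        return None
    cx = moments["m10"] / moments["m00"]   # spot centroid (x)
    cy = moments["m01"] / moments["m00"]   # spot centroid (y)
    h, w = mask.shape
    row, col = int(3 * cy / h), int(3 * cx / w)
    return 3 * row + col                   # region index 0..8
```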

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a computer-vision-based, marker-free method for gait-impairment detection in patients with Parkinson's disease (PWP). The system is based on the idea that a normal human body attains equilibrium during gait by aligning the body posture with the axis of gravity (AOG), using the feet as the base of support. In contrast, PWP appear to be falling forward, as they are less able to align their body with the AOG due to rigid muscular tone. A normal gait exhibits periodic stride cycles with a stride angle of around 45° between the legs, whereas PWP walk with a shortened stride angle and high variability between stride cycles. In order to analyse Parkinsonian gait (PG), subjects were videotaped over several gait cycles. The subject's body was segmented using a colour-segmentation method to form a silhouette, and the silhouette was skeletonised for motion-cue extraction. The motion cues analysed were stride cycles (based on the cyclic leg motion of the skeleton) and posture lean (based on the angle between the leaned torso of the skeleton and the AOG). Cosine similarity between an imaginary perfect gait pattern and the subjects' gait patterns produced a 100% recognition rate of PG for 4 normal controls and 3 PWP. The results suggest that the method is a promising tool for PG assessment in a home environment.
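A minimal sketch of the final similarity step is shown below, assuming hypothetical gait feature vectors (mean stride angle, stride-angle variability, posture-lean angle); the actual features and the decision rule would come from the paper's skeleton analysis.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two gait feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical feature vectors: [mean stride angle (deg),
# stride-angle variability (deg), posture-lean angle (deg)].
ideal_gait   = np.array([45.0, 1.0, 0.0])
subject_gait = np.array([28.0, 6.5, 12.0])

score = cosine_similarity(ideal_gait, subject_gait)
# A score well below those of normal controls would flag Parkinsonian gait.
```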

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the state of the art of a system for automated deduction called SAD is described. The architecture of SAD corresponds well to a modern vision of the Evidence Algorithm programme initiated by Academician V. Glushkov.

Relevance:

100.00%

Publisher:

Abstract:

The pumping of fluids through pipelines is the most economical and safest form of transporting fluids. This explains why, in 1999, Europe had about 30,000 km [7] of pipelines of several diameters, transporting millions of cubic metres of crude oil and refined products, operated by members of CONCAWE (the European oil companies' association for environment, health and safety, which brings together several petroleum companies). In Brazil there are about 18,000 km of pipelines transporting millions of cubic metres of liquids and gases. In 1999, nine accidents were reported to CONCAWE, one of which involved a fatality. The oil lost amounted to 171 m³, equivalent to 0.2 parts per million of the total transported volume. Even so, the costs involved in an accident can be high. An accident of great proportions can bring loss of human lives, severe environmental damage, loss of drained product, loss of profit, damage to the company's image and high recovery costs. In line with this, and in some cases because of legal requirements, companies are increasingly investing in pipeline leak detection systems based on computer algorithms that operate in real time, seeking to further minimise the drained volumes and thus reduce environmental impact and cost. In general, all software-based systems produce some type of false alarm, and there is typically a trade-off between the sensitivity of the system and the number of false alarms. The objective of this work is to review the existing methods and to concentrate on the analysis of a specific system, namely the one based on hydraulic noise, Pressure Point Analysis (PPA). We show which aspects are most important in the implementation of a Leak Detection System (LDS), from the initial risk analysis phase, through the design bases, design, and choice of the field instrumentation required by the various LDS, to implementation and testing. We analyse events (noises) originating from the flow system that can generate false alarms, and we present a computer algorithm that filters out those noises automatically.
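As an illustration of the kind of real-time test a pressure-based LDS performs, the sketch below flags a sudden pressure drop relative to recent statistics. It is a simplified stand-in, not the proprietary Pressure Point Analysis algorithm, and the window sizes and alarm threshold are hypothetical tuning parameters.

```python
from collections import deque
import statistics

class PressureDropDetector:
    """Compare a short recent window of pressure samples against a longer
    reference window and raise an alarm on a statistically large drop."""

    def __init__(self, short_n=10, long_n=200, threshold=4.0):
        self.short = deque(maxlen=short_n)
        self.long = deque(maxlen=long_n)
        self.threshold = threshold

    def update(self, pressure):
        self.long.append(pressure)
        self.short.append(pressure)
        if len(self.long) < self.long.maxlen:
            return False  # still building the reference statistics
        mu = statistics.mean(self.long)
        sigma = statistics.pstdev(self.long) or 1e-9
        drop = (mu - statistics.mean(self.short)) / sigma
        return drop > self.threshold  # True signals a possible leak
```

In practice the threshold controls the trade-off the abstract mentions: a lower value increases sensitivity but also raises the number of false alarms caused by ordinary flow noise.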

Relevance:

100.00%

Publisher:

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Some analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analysed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modelling the classes in order to obtain an expression for the probability that a point belongs to one of them. Experiments with several values of Na and dt are performed on test sets, and the results are analysed in order to study the robustness of the method and to derive heuristics for the choice of the correct threshold. Aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
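A minimal sketch of the two-stage idea is shown below, assuming k-means as the vector quantization step and plain Euclidean distance between prototypes; the work itself studies divergence measures, including a Rényi-entropy-based one, which are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

def link_auxiliary_clusters(X, n_aux, d_t, seed=0):
    """Quantize the data into n_aux auxiliary prototypes, then link
    prototypes closer than the threshold d_t and treat each connected
    group of prototypes as one final class."""
    km = KMeans(n_clusters=n_aux, n_init=10, random_state=seed).fit(X)
    protos = km.cluster_centers_

    # Union-find over prototypes whose pairwise distance is below d_t.
    parent = list(range(n_aux))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n_aux):
        for j in range(i + 1, n_aux):
            if np.linalg.norm(protos[i] - protos[j]) < d_t:
                parent[find(i)] = find(j)

    roots = {find(i) for i in range(n_aux)}
    label_of = {r: c for c, r in enumerate(sorted(roots))}
    return np.array([label_of[find(k)] for k in km.labels_])
```

The number of final classes falls out of the linkage step, which mirrors how the threshold dt lets the method find the number of classes automatically.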

Relevance:

100.00%

Publisher:

Abstract:

A challenge that remains in the robotics field is how to make a robot react in real time to visual stimuli. Traditional computer-vision algorithms used to overcome this problem are still very expensive, taking too long when run on common computer processors; even very simple algorithms such as image filtering or mathematical morphology operations may take too long. Researchers have implemented image-processing algorithms in highly parallel hardware devices in order to cut down the time spent on processing, with good results. By using hardware-implemented image-processing techniques and a platform-oriented system based on the Nios II processor, we propose an approach that uses hardware processing and event-based programming to simplify vision-based systems while at the same time accelerating some parts of the algorithms used.
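To make concrete the kinds of per-pixel operations that are typically offloaded to hardware, the sketch below shows a mean filter and a binary erosion in software form; it is only illustrative and does not reflect the Nios II or FPGA implementation described in the work.

```python
import numpy as np
from scipy.ndimage import convolve, binary_erosion

def smooth_and_erode(gray, binary):
    """3x3 mean filter on a grayscale image followed by a 3x3 binary
    erosion on a binary mask, both purely local, highly parallel
    operations well suited to hardware implementation."""
    kernel = np.full((3, 3), 1.0 / 9.0)
    smoothed = convolve(gray.astype(float), kernel, mode="nearest")
    eroded = binary_erosion(binary, structure=np.ones((3, 3), bool))
    return smoothed, eroded
```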

Relevance:

100.00%

Publisher:

Abstract:

In this work we propose a software architecture for robotic boats intended to operate fully autonomously in diverse aquatic environments, sending telemetry to a base station while carrying out the assigned mission. The proposal is intended for the N-Boat project of the NatalNet DCA laboratory, whose goal is to enable a sailboat to navigate autonomously. The constituent components of this architecture are the memory, strategy, communication, sensing, actuation, energy, security and surveillance modules, which together make up the boat and base-station systems. To validate the architecture, a simulator was developed in the C language using the OpenGL graphics API. The main results were obtained in the implementation of the memory, actuation and strategy modules, more specifically data sharing, control of sails and rudder, and the planning of short routes based on a navigation algorithm. The experimental results presented in this study indicate the feasibility of real-world use of the developed software architecture and its application in the area of autonomous mobile robotics.
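A very small sketch of the shared-data, module-per-subsystem idea is given below; the blackboard API, module names and stub behaviour are hypothetical illustrations and are not taken from the N-Boat code, which is written in C.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Blackboard:
    """Shared state that modules read and write; what gets written here
    is also what would be serialised as telemetry to the base station."""
    state: Dict[str, object] = field(default_factory=dict)

    def write(self, key, value):
        self.state[key] = value

    def read(self, key, default=None):
        return self.state.get(key, default)

class StrategyModule:
    """Plans a short route as a list of waypoints (hypothetical stub)."""
    def step(self, bb: Blackboard):
        pos = bb.read("gps", (0.0, 0.0))
        goal = bb.read("goal", (10.0, 10.0))
        midpoint = ((pos[0] + goal[0]) / 2, (pos[1] + goal[1]) / 2)
        bb.write("route", [pos, midpoint, goal])

class ActuationModule:
    """Turns the planned route into sail and rudder commands (stub)."""
    def step(self, bb: Blackboard):
        route: List = bb.read("route", [])
        bb.write("rudder_cmd", 0.0 if len(route) < 2 else 5.0)
```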

Relevance:

100.00%

Publisher:

Abstract:

The study of aerodynamic loading variations has many engineering applications, including helicopter rotor blades, wind turbines and turbomachinery. This work uses a vortex method to give a Lagrangian description of the two-dimensional interaction between an airfoil and an incident wake vortex. The flow is incompressible, Newtonian and homogeneous, and the Reynolds number is 5x10^5. The airfoil is a NACA 0018 at angles of attack of 0° and 5°, simulated with a panel method using constant-density vorticity panels, with a generation point located near each panel. A protection layer is created so that no vortex enters the body. The convection of the Lamb vortices is computed with the Euler method (first order) and the Adams-Bashforth method (second order), and the random walk method is used to simulate diffusion. The circular wake has 366 vortices, each of positive or negative vorticity, located at different heights with respect to the airfoil chord. The lift was calculated with the algorithm developed by Ricci (2002). The simulation uses an existing algorithm previously validated for a single body without an incident wake. The results are compared with an experimental work, and the comparison shows good agreement between the experimental results and this paper.
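A generic single time step of a 2-D discrete vortex method, with first-order (Euler) convection of Lamb vortices and random-walk diffusion, is sketched below; the core radius is hypothetical, and the panel method, protection layer and the lift calculation of Ricci (2002) are not reproduced.

```python
import numpy as np

def vortex_step(positions, gammas, dt, nu, core=1e-3, rng=None):
    """One explicit step: convect Lamb vortices in their mutually induced
    field (first-order Euler), then add a random-walk displacement that
    models viscous diffusion.  positions are complex numbers x + i*y and
    gammas are the circulations of the vortices."""
    rng = rng if rng is not None else np.random.default_rng(0)
    dz = positions[:, None] - positions[None, :]
    r2 = np.abs(dz) ** 2
    np.fill_diagonal(r2, np.inf)                 # suppress self-induction
    smooth = 1.0 - np.exp(-r2 / core**2)         # Lamb (viscous) core
    w = np.sum(-1j * gammas[None, :] * np.conj(dz) / (2 * np.pi * r2) * smooth,
               axis=1)                           # w = u - i*v at each vortex
    convected = positions + np.conj(w) * dt
    sigma = np.sqrt(2.0 * nu * dt)               # random-walk diffusion scale
    jiggle = rng.normal(0.0, sigma, positions.shape) \
        + 1j * rng.normal(0.0, sigma, positions.shape)
    return convected + jiggle
```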

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a multi-objective approach for assessing the steady-state performance of distribution systems with embedded generators, based on heuristics and power system analysis, is proposed. The proposed hybrid performance index describes the quality of the operating state in each distribution network configuration considered. In order to represent the system state, the loss allocation in the distribution systems is determined using the Z-bus loss allocation method and a compensation-based power flow algorithm. An investigation of the impact of the integration of embedded generators on the overall steady-state performance of the distribution systems is also performed. Results obtained from several case studies are presented and discussed. Copyright (C) 2004 John Wiley & Sons, Ltd.
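For reference, one common formulation of Z-bus loss allocation, assuming a converged power flow that provides the net bus current injections, is sketched below; the paper's exact variant and its compensation-based power flow are not reproduced here.

```python
import numpy as np

def zbus_loss_allocation(Zbus, I):
    """Allocate total active losses to buses using the Z-bus formula
    L_k = Re{ conj(I_k) * sum_j R_kj * I_j }, with R = Re(Zbus) and I the
    vector of complex net injected bus currents.  The allocations sum to
    the total active losses Re{ I^H R I }."""
    R = np.real(Zbus)
    return np.real(np.conj(I) * (R @ I))
```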

Relevance:

100.00%

Publisher:

Abstract:

This work analyses the application and execution time of a numerical algorithm that simulates incompressible and isothermal flows. The explicit scheme of the Characteristic-Based Split (CBS) algorithm was used together with the Artificial Compressibility (AC) scheme for coupling the pressure and velocity equations. The discretisation was carried out with the finite element method on a grid of bilinear elements. The free software GNU Octave was used for the implementation and execution of the routines. The results were analysed using the classic lid-driven cavity problem, with tests for several Reynolds numbers. The results of these tests show good agreement with previous ones obtained from the literature. The analysis of the code's runtime also shows that matrix assembly is the most time-consuming part of the implementation.
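As a simplified illustration of the artificial-compressibility coupling, the sketch below performs the explicit pressure update on a uniform finite-difference grid with periodic boundaries; the work itself uses bilinear finite elements within the CBS scheme, so this is not its implementation.

```python
import numpy as np

def ac_pressure_update(p, u, v, dx, dy, dt, beta=1.0):
    """Explicit artificial-compressibility pressure update,
    p^{n+1} = p^n - dt * beta^2 * div(u), which drives div(u) -> 0 as the
    pseudo-time iteration converges.  Central differences with periodic
    wrap-around are used here only to make the step concrete."""
    dudx = (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / (2 * dx)
    dvdy = (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)) / (2 * dy)
    return p - dt * beta**2 * (dudx + dvdy)
```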