35 results for "Vision-based row tracking algorithm"
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Various molecular systems are available for epidemiological, genetic, evolutionary, taxonomic and systematic studies of innumerable fungal infections, especially those caused by the opportunistic pathogen C. albicans. A total of 75 independent oral isolates were selected in order to compare Multilocus Enzyme Electrophoresis (MLEE), Electrophoretic Karyotyping (EK) and Microsatellite Markers (Simple Sequence Repeats - SSRs) in their abilities to differentiate and group C. albicans isolates (discriminatory power), and to evaluate the concordance and similarity of the groups of strains determined by cluster analysis for each fingerprinting method. Isoenzyme typing was performed using eleven enzyme systems: Adh, Sdh, M1p, Mdh, Idh, Gdh, G6pdh, Asd, Cat, Po, and Lap (data previously published). The EK method consisted of chromosomal DNA separation by pulsed-field gel electrophoresis using a CHEF system. The microsatellite markers were investigated by PCR using three polymorphic loci: EF3, CDC3, and HIS3. Dendrograms were generated by the SAHN method and the UPGMA algorithm based on similarity matrices (S(SM)). The discriminatory power of the three methods was over 95%; however, a paired analysis among them showed a parity of 19.7-22.4% in the identification of strains. Weak correlation was also observed among the genetic similarity matrices (S(SM)(MLEE) x S(SM)(EK) x S(SM)(SSRs)). Clustering analyses showed a mean of 9 +/- 12.4 isolates per cluster (3.8 +/- 8 isolates/taxon) for MLEE, 6.2 +/- 4.9 isolates per cluster (4 +/- 4.5 isolates/taxon) for SSRs, and 4.1 +/- 2.3 isolates per cluster (2.6 +/- 2.3 isolates/taxon) for EK. A total of 45 (13%), 39 (11.2%), 5 (1.4%) and 3 (0.9%) cluster pairs out of 347 showed similarity (Si) of 0.1-10%, 10.1-20%, 20.1-30% and 30.1-40%, respectively. Clinical and molecular epidemiological correlation involving the opportunistic pathogen C.
albicans may thus be drawn in a manner dependent on each genotyping method (i.e., MLEE, EK, and SSRs), supplemented with similarity and grouping analysis. Therefore, the use of genotyping systems whose results show minimal disparity, or the combination of the results of these systems, can provide greater security and consistency in the determination of strains and their genetic relationships. (C) 2010 Elsevier B.V. All rights reserved.
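The clustering step described above (UPGMA dendrograms built from similarity matrices) can be sketched with SciPy; the similarity values below are invented for illustration and are not taken from the study:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy pairwise similarity matrix for 4 hypothetical isolates, standing in
# for the S(SM) simple-matching matrices described in the abstract.
similarity = np.array([
    [1.0, 0.9, 0.2, 0.1],
    [0.9, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.8],
    [0.1, 0.2, 0.8, 1.0],
])

# UPGMA operates on distances; convert similarity to distance.
distance = 1.0 - similarity
np.fill_diagonal(distance, 0.0)

# 'average' linkage is the UPGMA algorithm.
tree = linkage(squareform(distance), method="average")

# Cut the dendrogram into two clusters.
clusters = fcluster(tree, t=2, criterion="maxclust")
print(clusters)  # isolates 1-2 and 3-4 group separately
```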
Abstract:
The study of robust design methodologies and techniques has become a topical area in design optimization across nearly all engineering and applied science disciplines over the last 10 years, owing to the inevitable imprecision and uncertainty that exist in real-world design problems. To develop a fast optimizer for robust design, a methodology based on polynomial chaos and the tabu search algorithm is proposed. In this methodology, polynomial chaos is employed as a stochastic response surface model of the objective function to efficiently evaluate the robust performance parameter, while a mechanism that assigns expected fitness only to promising solutions is introduced into tabu search to minimize the need to determine robust metrics for intermediate solutions. The proposed methodology is applied to the robust design of a practical inverse problem with satisfactory results.
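The fitness-assignment idea can be illustrated with a bare-bones tabu search in which a cheap surrogate (here a simple analytic function standing in for the polynomial-chaos model) ranks neighbours, and the expensive robust metric (here a Monte Carlo average) is evaluated only for the single most promising one; the objective and all parameters are invented:

```python
import random

random.seed(0)

def nominal(x):
    # Cheap surrogate of the objective (stand-in for the polynomial-chaos model).
    return (x - 3) ** 2

def robust_metric(x, samples=200, sigma=0.5):
    # Expensive robust measure: expected objective under input uncertainty.
    return sum(nominal(x + random.gauss(0, sigma)) for _ in range(samples)) / samples

def tabu_search(start=0, iters=30, tabu_len=5):
    current, best = start, start
    best_val = robust_metric(best)
    tabu = []
    for _ in range(iters):
        neighbors = [n for n in (current - 1, current + 1) if n not in tabu]
        if not neighbors:
            break
        # Rank neighbours by the cheap surrogate; evaluate the expensive robust
        # metric only for the most promising one (the key idea in the abstract).
        cand = min(neighbors, key=nominal)
        val = robust_metric(cand)
        if val < best_val:
            best, best_val = cand, val
        current = cand
        tabu.append(cand)
        tabu = tabu[-tabu_len:]
    return best, best_val

best, val = tabu_search()
print(best)  # the robust minimum of (x - 3)^2 is at x = 3
```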
Abstract:
Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes. Neural networks and wavenets have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. In this paper, it is shown how feedforward neural networks can be built using a different type of activation function, referred to as the PPS-wavelet. An algorithm is presented to generate a family of PPS-wavelets that can be used to efficiently construct feedforward networks for function approximation.
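The general wavenet recipe can be sketched as follows; since the PPS-wavelet family itself is defined only in the paper, the Mexican-hat wavelet is used here as a stand-in activation, and all network sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def mexican_hat(t):
    # Generic wavelet activation used as a stand-in for the PPS-wavelet.
    return (1 - t**2) * np.exp(-t**2 / 2)

# Target: approximate sin(x) on [-pi, pi] from samples.
x = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(x).ravel()

# One hidden layer of wavelet units with random dilations/translations;
# output weights fitted by linear least squares (a common wavenet recipe).
hidden = 30
a = rng.uniform(0.5, 2.0, hidden)       # dilations
b = rng.uniform(-np.pi, np.pi, hidden)  # translations
H = mexican_hat((x - b) / a)            # (200, 30) hidden activations
w, *_ = np.linalg.lstsq(H, y, rcond=None)

approx = H @ w
rmse = np.sqrt(np.mean((approx - y) ** 2))
print(rmse)  # small residual on the training grid
```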
Abstract:
A low-cost computational procedure to determine the orbit of an artificial satellite using short-arc data from an onboard GPS receiver is proposed. Pseudoranges are used as measurements to estimate the orbit via a recursive least squares method. The algorithm applies orthogonal Givens rotations to solve recursive and sequential orbit determination problems. To assess the procedure, it was applied to the TOPEX/POSEIDON satellite for data batches of one orbital period (approximately two hours), with force modelling based on the full JGM-2 gravity field model. When compared with the reference Precision Orbit Ephemeris (POE) of JPL/NASA, the results indicate that a precision better than 9 m is easily obtained, even when short batches of data are used. Copyright (c) 2007.
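The sequential least squares update via Givens rotations can be sketched on a toy linear problem (not the orbit dynamics themselves): each new observation row is folded into an upper-triangular system by plane rotations, so the estimate can be refreshed measurement by measurement.

```python
import numpy as np

def givens_update(R, d, a_row, b_val):
    """Fold one new observation (a_row, b_val) into the triangular system
    R x ~ d using Givens rotations, as in sequential least squares."""
    n = R.shape[0]
    a = a_row.astype(float).copy()
    b = float(b_val)
    for i in range(n):
        r, z = R[i, i], a[i]
        if z == 0.0:
            continue
        rho = np.hypot(r, z)
        c, s = r / rho, z / rho
        # Rotate row i of R against the incoming row; this zeroes a[i].
        Ri, Ai = R[i, :].copy(), a.copy()
        R[i, :] = c * Ri + s * Ai
        a = -s * Ri + c * Ai
        di, bi = d[i], b
        d[i] = c * di + s * bi
        b = -s * di + c * bi
    return R, d

# Toy problem: recover x_true from scalar observations, one at a time.
rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0, 0.5])
n = 3
R = np.zeros((n, n))
d = np.zeros(n)
for _ in range(20):
    a_row = rng.normal(size=n)
    givens_update(R, d, a_row, a_row @ x_true)

x_hat = np.linalg.solve(R, d)
print(x_hat)  # close to [2, -1, 0.5]
```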
Abstract:
Common sense tells us that the future is an essential element in any strategy. In addition, there is a good deal of literature on scenario planning, an important tool for considering the future in terms of strategy. However, in many organizations there is serious resistance to the development of scenarios, and they are not broadly implemented by companies. Yet even organizations that do not rely heavily on the development of scenarios do, in fact, construct visions to guide their strategies. But what happens when this vision is not consistent with the future? To address this problem, the present article proposes a method for checking the content and consistency of an organization's vision of the future, no matter how it was conceived. The proposed method is grounded in theoretical concepts from the field of futures studies, which are described in this article. This study was motivated by the search for new ways of improving and using scenario techniques as a method for making strategic decisions. The method was tested on a company in the field of information technology in order to check its operational feasibility. The test showed that the proposed method is operationally feasible and was capable of analyzing the vision of the company studied, indicating both its shortcomings and its points of inconsistency. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The problem of dynamic camera calibration considering moving objects in close-range environments, using straight lines as references, is addressed. A mathematical model for the correspondence of a straight line in the object and image spaces is discussed. This model is based on the equivalence between the vector normal to the interpretation plane in the image space and the vector normal to the rotated interpretation plane in the object space. In order to solve the dynamic camera calibration, Kalman Filtering is applied; an iterative process based on the recursive property of the Kalman Filter is defined, using the sequentially estimated camera orientation parameters to feed back into the feature extraction process in the image. For the dynamic case, e.g. an image sequence of a moving object, a state prediction and a covariance matrix for the next instant are obtained using the available estimates and the system model. Filtered state estimates of good quality can then be computed from these predicted estimates using the Kalman Filtering approach, based on the system model parameters, for each instant of an image sequence. The proposed approach was tested with simulated and real data. Experiments with real data were carried out in a controlled environment, considering a sequence of images of a moving cube in a linear trajectory over a flat surface.
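The predict/update cycle described above can be sketched with a minimal constant-velocity Kalman filter; this tracks a 1-D position rather than the paper's camera orientation parameters, and all noise levels are invented:

```python
import numpy as np

# Constant-velocity Kalman filter tracking the 1-D position of a moving
# object, mirroring the predict/update cycle over an image sequence.
dt = 1.0
F = np.array([[1, dt], [0, 1]])   # state transition (position, velocity)
H = np.array([[1, 0]])            # we observe position only
Q = 1e-4 * np.eye(2)              # process noise covariance
Rm = np.array([[0.05]])           # measurement noise covariance

x = np.array([0.0, 0.0])          # initial state estimate
P = np.eye(2)                     # initial covariance

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 1.0
for k in range(50):
    true_pos += true_vel * dt
    z = true_pos + rng.normal(0, 0.2)   # noisy measurement
    # Predict: propagate state and covariance to the next instant.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: fold in the new measurement.
    y = z - H @ x
    S = H @ P @ H.T + Rm
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(x)  # estimated [position, velocity], near [50, 1]
```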
Abstract:
Methods based on visual estimation are still the most widely used for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared, and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results with those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results of 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s), and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. A three-way ANOVA by playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half. ©Journal of Sports Science and Medicine (2007).
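The core measurement an automatic tracking system accumulates is the frame-to-frame path length of each player; a minimal sketch, with an invented 4-point trajectory:

```python
import math

def distance_covered(track):
    """Total distance along a sequence of (x, y) pitch coordinates in metres,
    summed frame to frame as a tracking system would."""
    return sum(math.dist(p, q) for p, q in zip(track, track[1:]))

# Hypothetical 4-frame trajectory (metres).
track = [(0.0, 0.0), (3.0, 4.0), (3.0, 8.0), (6.0, 8.0)]
print(distance_covered(track))  # 5 + 4 + 3 = 12.0
```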
Abstract:
This letter describes a novel algorithm based on autoregressive decomposition and pole tracking, used to recognize two patterns in speech data: normal voice and dysphonic voice caused by nodules. The presented method relates the poles to the peaks of the signal spectrum, which represent the periodic components of the voice. The results show that the perturbation contained in the signal is clearly depicted by the positions of the poles, whose variability is related to jitter and shimmer. The pole dispersion for pathological voices is about 20% higher than for normal voices; therefore, the proposed approach is a more trustworthy measure than the classical ones. © 2007.
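Extracting poles from an autoregressive decomposition can be sketched as follows; this fits an AR model by the Yule-Walker equations to a synthetic damped-noise signal (not real voice data), and the model order and frequency are invented:

```python
import numpy as np

def ar_poles(signal, order):
    """Fit an AR model via the Yule-Walker equations and return its poles."""
    s = signal - np.mean(signal)
    r = np.correlate(s, s, mode="full")[len(s) - 1:]
    r = r / r[0]
    # Toeplitz system R a = r[1:order+1]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    # Poles are the roots of 1 - a1 z^-1 - ... - ap z^-p.
    return np.roots(np.concatenate(([1.0], -a)))

# Synthetic "voiced" signal: a sinusoid plus a little noise.
rng = np.random.default_rng(0)
n = np.arange(2000)
f = 0.05  # normalized frequency (cycles/sample)
signal = np.sin(2 * np.pi * f * n) + 0.01 * rng.normal(size=n.size)

poles = ar_poles(signal, order=2)
freqs = np.abs(np.angle(poles)) / (2 * np.pi)
print(freqs)  # both conjugate poles sit near the 0.05 sinusoid frequency
```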
ANN statistical image recognition method for computer vision in agricultural mobile robot navigation
Abstract:
The main application area of this project is to deploy image processing and segmentation techniques for computer vision, through an omnidirectional vision system, on agricultural mobile robots (AMR) used for trajectory navigation as well as localization. Computational methods based on the JSEG algorithm were used to provide the classification and characterization of such problems, together with Artificial Neural Networks (ANN) for image recognition. It was thus possible to run simulations and analyse the performance of the JSEG image segmentation technique on Matlab/Octave computational platforms, along with the application of a customized Back-propagation Multilayer Perceptron (MLP) algorithm and statistical methods as structured heuristic methods in a Simulink environment. Once these procedures were complete, it was possible to classify and characterize the HSV color space segments and to recognize the segmented images, for which reasonably accurate results were obtained. © 2010 IEEE.
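The HSV characterization step can be illustrated with a toy per-pixel classifier; the hue/saturation breakpoints and class names below are invented for illustration and are unrelated to the JSEG/MLP pipeline itself:

```python
import colorsys

def hsv_segment(r, g, b):
    """Classify an RGB pixel (components in [0, 1]) into a coarse HSV
    colour segment; thresholds are illustrative only."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if v < 0.2:
        return "shadow"       # dark pixels
    if s < 0.2:
        return "path"         # low saturation: bare soil / sky
    if 0.2 <= h <= 0.45:
        return "crop row"     # greenish hues
    return "other"

print(hsv_segment(0.1, 0.8, 0.1))  # green pixel -> "crop row"
```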
Abstract:
A significant body of information stored in different databases around the world can be shared through peer-to-peer databases. This yields a large knowledge base without the need for large investments, because existing databases and existing infrastructure are used. However, the structural characteristics of peer-to-peer networks make the process of finding such information complex. Moreover, these databases are often heterogeneous in their schemas but semantically similar in their content. A good peer-to-peer database system should allow the user to access information from databases scattered across the network and receive only the information really related to the topic of interest. This paper proposes the use of ontologies in peer-to-peer database queries to represent the semantics inherent to the data. The main contributions of this work are enabling integration between heterogeneous databases, improving the performance of such queries, and using the Ant Colony Optimization algorithm to solve the problem of locating information on peer-to-peer networks, which yields an 18% improvement in results. © 2011 IEEE.
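The Ant Colony Optimization idea can be sketched as a minimal path search on an invented 4-node weighted graph: ants choose edges with probability proportional to pheromone over cost, and shorter paths receive more pheromone.

```python
import random

random.seed(0)

# Tiny weighted graph: find the cheapest path 0 -> 3.
INF = float("inf")
cost = [
    [INF, 1.0, 4.0, INF],
    [INF, INF, 2.0, 6.0],
    [INF, INF, INF, 1.0],
    [INF, INF, INF, INF],
]
n = 4
tau = [[1.0] * n for _ in range(n)]  # pheromone levels

def walk():
    """One ant walks 0 -> 3, choosing edges with probability ~ pheromone/cost."""
    path, node = [0], 0
    while node != 3:
        nxt = [j for j in range(n) if cost[node][j] < INF]
        weights = [tau[node][j] / cost[node][j] for j in nxt]
        node = random.choices(nxt, weights=weights)[0]
        path.append(node)
    return path

best, best_cost = None, INF
for _ in range(100):  # 100 ants
    p = walk()
    c = sum(cost[a][b] for a, b in zip(p, p[1:]))
    # Evaporate, then deposit pheromone inversely proportional to path cost.
    for i in range(n):
        for j in range(n):
            tau[i][j] *= 0.9
    for a, b in zip(p, p[1:]):
        tau[a][b] += 1.0 / c
    if c < best_cost:
        best, best_cost = p, c

print(best, best_cost)  # cheapest path [0, 1, 2, 3] with cost 4.0
```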
Abstract:
Aiming to ensure greater reliability and consistency of the data stored in a database, the data cleaning stage is set early in the process of Knowledge Discovery in Databases (KDD) and is responsible for eliminating problems and adjusting the data for the later stages, especially the data mining stage. Such problems occur at both the instance and schema levels, namely missing values, null values, duplicate tuples, out-of-domain values, among others. Several algorithms have been developed to perform the cleaning step in databases; some of them were developed specifically to work with the phonetics of words, since a word can be written in different ways. Within this perspective, this work presents as its original contribution a multithreaded optimization of a phonetic algorithm for the detection of duplicate tuples in databases, which requires no trained data and is independent of the language being supported. © 2011 IEEE.
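The general scheme of phonetic duplicate detection with threads can be sketched as follows; this uses a simplified Soundex variant (in which vowels, h, w and y all separate repeated digits) purely as an example, not the paper's own algorithm:

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def soundex(word):
    """Simplified Soundex-style phonetic key, a common basis for
    detecting tuples that are spelled differently but sound alike."""
    codes = {"bfpv": "1", "cgjkqsxz": "2", "dt": "3",
             "l": "4", "mn": "5", "r": "6"}
    word = word.upper()
    digits = []
    for ch in word:
        for group, digit in codes.items():
            if ch.lower() in group:
                if not digits or digits[-1] != digit:
                    digits.append(digit)
                break
        else:
            digits.append("")  # vowels/h/w/y separate repeated digits
    key = "".join(d for d in digits[1:] if d)
    return (word[0] + key + "000")[:4]

names = ["Robert", "Rupert", "Ashcraft", "Ashcroft", "Tymczak"]

# Hash names to phonetic keys in parallel threads, then bucket duplicates.
with ThreadPoolExecutor(max_workers=4) as pool:
    keys = list(pool.map(soundex, names))

buckets = defaultdict(list)
for name, key in zip(names, keys):
    buckets[key].append(name)
print(dict(buckets))  # phonetically similar names share a bucket
```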
Abstract:
This paper presents evaluations of the most common MPPT techniques, making meaningful comparisons with respect to the amount of energy extracted from the photovoltaic (PV) panel (Tracking Factor - TF) in relation to the available power, PV voltage ripple, dynamic response, and use of sensors. Using MatLab/Simulink® and DSpace platforms, a digitally controlled boost DC-DC converter was implemented and connected to an Agilent Solar Array E4350B simulator in order to verify the analytical procedures. The main experimental results are presented, and a contribution to the implementation of the IC algorithm, called IC based on PI, is made. Moreover, the dynamic response and the tracking factor are also evaluated using a friendly user interface, which is capable of programming power curves online and computing the TF. Finally, a typical daily insolation profile is used in order to verify the experimental results for the main PV MPPT methods. © 2011 IEEE.
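The textbook Incremental Conductance (IC) rule the paper builds on can be sketched as follows; the PV curve, step size and starting point are invented, and this is the plain IC rule, not the paper's PI-based variant:

```python
def inc_cond_step(v, i, v_prev, i_prev, v_ref, dv_step=0.1):
    """One iteration of the Incremental Conductance rule: at the maximum
    power point dI/dV = -I/V; move the voltage reference toward it."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di > 0:
            v_ref += dv_step
        elif di < 0:
            v_ref -= dv_step
    else:
        g_inc = di / dv          # incremental conductance
        g = -i / v               # negative instantaneous conductance
        if g_inc > g:            # left of the MPP: raise the voltage
            v_ref += dv_step
        elif g_inc < g:          # right of the MPP: lower the voltage
            v_ref -= dv_step
    return v_ref

def pv_current(v):
    # Hypothetical panel curve with a single maximum power point
    # (P = 5V - 0.05V^3, so the MPP voltage is about 5.77 V).
    return max(5.0 - 0.05 * v * v, 0.0)

v, v_ref = 2.0, 2.1
i = pv_current(v)
for _ in range(200):
    v_prev, i_prev = v, i
    v = v_ref                 # assume the converter tracks v_ref each step
    i = pv_current(v)
    v_ref = inc_cond_step(v, i, v_prev, i_prev, v_ref)
print(v_ref)  # settles near the MPP voltage, about 5.77 V
```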
Abstract:
Augmented Reality (AR) systems that use optical tracking with fiducial markers for registration have had an important role in popularizing this technology, since only a personal computer with a conventional webcam is required. However, in most of these applications the virtual elements are shown only in the foreground: a real element does not occlude a virtual one. The method presented here enables AR environments based on fiducial markers to support mutual occlusion between a real element and many virtual ones, according to each element's position (depth) in the environment. © 2012 IEEE.
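The depth-based compositing at the heart of mutual occlusion can be sketched as a per-pixel z-test; the layers and depths below are invented 4x4 toy images, not the paper's marker-based depth estimation:

```python
import numpy as np

# Per-pixel depth test resolving mutual occlusion between a "real" element
# (with an estimated depth) and a virtual one, as in z-buffer compositing.
h, w = 4, 4
real_rgb = np.zeros((h, w, 3)); real_rgb[..., 0] = 1.0   # red real layer
virt_rgb = np.zeros((h, w, 3)); virt_rgb[..., 2] = 1.0   # blue virtual layer

real_depth = np.full((h, w), 2.0)
virt_depth = np.full((h, w), 3.0)
virt_depth[:2, :] = 1.0   # top half of the virtual object is nearer

# Keep whichever element is closer to the camera at each pixel.
out = np.where((virt_depth < real_depth)[..., None], virt_rgb, real_rgb)
print(out[0, 0], out[3, 3])  # virtual (blue) on top, real (red) at bottom
```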
Abstract:
Dental recognition is very important for forensic human identification, mainly in mass disasters, which have frequently happened due to tsunamis, airplane crashes, etc. Algorithms for automatic, precise, and robust teeth segmentation from radiograph images are crucial for dental recognition. In this work we propose the use of a graph-based algorithm to extract the teeth contours from panoramic dental radiographs, which are used as dental features. In order to assess our proposal, we carried out experiments using a database of 1126 tooth images, obtained from 40 panoramic dental radiograph images of 20 individuals. The results of the graph-based algorithm were qualitatively assessed by a human expert, who reported excellent scores. For dental recognition we propose the use of teeth shapes as biometric features, by means of BAS (Beam Angle Statistics) and Shape Context descriptors. The BAS descriptors showed, on the same database, a better performance (EER 14%) than the Shape Context descriptors (EER 20%). © 2012 IEEE.
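The Equal Error Rate (EER) metric used to compare the descriptors can be sketched as follows; the genuine/impostor distance distributions below are synthetic, not the study's matching scores:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """EER: the point where the false-accept and false-reject rates
    coincide; here a lower score means a better match."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best = (1.0, 0.0)
    for t in thresholds:
        frr = np.mean(genuine > t)    # genuine pairs rejected
        far = np.mean(impostor <= t)  # impostor pairs accepted
        gap = abs(far - frr)
        if gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

# Hypothetical matching distances for shape descriptors.
rng = np.random.default_rng(0)
genuine = rng.normal(0.3, 0.1, 500)   # same-person comparisons score low
impostor = rng.normal(0.7, 0.1, 500)  # different-person comparisons score high

eer = equal_error_rate(genuine, impostor)
print(eer)  # a few percent for these well-separated toy distributions
```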