828 results for robot mapping
Abstract:
This paper presents the implementation of a modified particle filter for vision-based simultaneous localization and mapping of an autonomous robot in a structured indoor environment. With this method, artificial landmarks such as multi-coloured cylinders can be tracked with a camera mounted on the robot, and the position of the robot can be estimated at the same time. Experimental results in simulation and in real environments show that this approach has advantages over the extended Kalman filter under ambiguous data association and various levels of odometric noise.
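As a rough illustration of the filtering described in this abstract, the sketch below implements one generic particle-filter cycle (predict from odometry, weight by a range-bearing landmark observation, resample). It covers only the localization side, with illustrative noise parameters; the paper's modified filter, which also estimates landmark positions, is not reproduced here.

```python
# Minimal particle-filter localization sketch (illustrative only; the paper's
# modified filter also estimates landmark positions, which is omitted here).
# Assumed model: 2D pose (x, y, heading), odometry motion, and a range-bearing
# observation of a single colour-identified landmark.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, odom, noise=(0.02, 0.02, 0.01)):
    """Propagate each particle with noisy odometry (dx, dy, dtheta)."""
    n = len(particles)
    particles[:, 0] += odom[0] + rng.normal(0, noise[0], n)
    particles[:, 1] += odom[1] + rng.normal(0, noise[1], n)
    particles[:, 2] += odom[2] + rng.normal(0, noise[2], n)
    return particles

def weight(particles, landmark, z_range, z_bearing, sigma_r=0.1, sigma_b=0.05):
    """Weight particles by the likelihood of a range-bearing measurement
    (angle wrapping omitted for brevity)."""
    dx = landmark[0] - particles[:, 0]
    dy = landmark[1] - particles[:, 1]
    r_hat = np.hypot(dx, dy)
    b_hat = np.arctan2(dy, dx) - particles[:, 2]
    w = (np.exp(-0.5 * ((z_range - r_hat) / sigma_r) ** 2) *
         np.exp(-0.5 * ((z_bearing - b_hat) / sigma_b) ** 2))
    return w / w.sum()

def resample(particles, w):
    """Resample particles proportionally to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx].copy()

# One filter cycle: predict from odometry, weight by observation, resample.
particles = np.zeros((500, 3))                 # all particles start at the origin
particles = predict(particles, odom=(0.1, 0.0, 0.0))
w = weight(particles, landmark=(2.0, 1.0), z_range=2.2, z_bearing=0.45)
particles = resample(particles, w)
print(particles.mean(axis=0))                  # pose estimate
```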
Abstract:
Probabilistic robotics, most often applied to the problem of simultaneous localisation and mapping (SLAM), requires measures of uncertainty to accompany observations of the environment. This paper describes how uncertainty can be characterised for a vision system that locates coloured landmarks in a typical laboratory environment. The paper describes a model of the uncertainty in segmentation, the internal camera model and the mounting of the camera on the robot. It explains the implementation of the system on a laboratory robot, and provides experimental results that show the coherence of the uncertainty model.
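The paper's specific segmentation, camera and mounting models are not reproduced here; the sketch below only shows the generic first-order (Jacobian) propagation of a pixel-level segmentation uncertainty into a bearing uncertainty for a pinhole camera, with illustrative numbers.

```python
# Generic first-order uncertainty propagation sketch (illustrative; the paper's
# actual segmentation/camera/mounting model is not reproduced here).
# A landmark centroid found at pixel column u (relative to the principal point)
# with segmentation std sigma_u is mapped to a bearing angle and its std.
import numpy as np

def bearing_and_sigma(u_px, sigma_u_px, focal_px):
    """Pinhole bearing = atan(u/f); propagate sigma via the Jacobian d(bearing)/du."""
    bearing = np.arctan2(u_px, focal_px)
    jac = focal_px / (focal_px**2 + u_px**2)   # derivative of atan(u/f) w.r.t. u
    sigma_bearing = abs(jac) * sigma_u_px
    return bearing, sigma_bearing

b, s = bearing_and_sigma(u_px=120.0, sigma_u_px=2.5, focal_px=600.0)
print(f"bearing = {np.degrees(b):.2f} deg, sigma = {np.degrees(s):.3f} deg")
```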
Abstract:
Simultaneous Localization and Mapping (SLAM) is a procedure used to determine the location of a mobile vehicle in an unknown environment, while constructing a map of the unknown environment at the same time. Mobile platforms that make use of SLAM algorithms have industrial applications in autonomous maintenance, such as the inspection of flaws and defects in oil pipelines and storage tanks. A typical SLAM system consists of four main components: experimental setup (data gathering), vehicle pose estimation, feature extraction, and filtering. Feature extraction is the process of identifying significant features of the unknown environment, such as corners, edges, walls, and interior features. In this work, an original feature extraction algorithm specific to distance measurements obtained from SONAR sensors is presented. This algorithm has been constructed by combining the SONAR Salient Feature Extraction Algorithm and the Triangulation Hough Based Fusion with point-in-polygon detection. The reconstructed maps obtained through simulations and experimental data with the fusion algorithm are compared to the maps obtained with existing feature extraction algorithms. Based on the results obtained, it is suggested that the proposed algorithm can be employed as an option for data obtained from SONAR sensors in environments where other forms of sensing are not viable. The algorithm fusion for feature extraction requires the vehicle pose estimation as an input, which is obtained from a vehicle pose estimation model. For the vehicle pose estimation, the author uses sensor integration to estimate the pose of the mobile vehicle. Different sensor combinations are studied (e.g., encoder only, gyroscope only, or encoder and gyroscope). The different sensor fusion techniques for the pose estimation are experimentally studied and compared. The vehicle pose estimation model that produces the least amount of error is used to generate inputs for the feature extraction algorithm fusion. In the experimental studies, two different environmental configurations are used, one without interior features and another with two interior features. Numerical and experimental findings are discussed. Finally, the SLAM algorithm is implemented along with the algorithms for feature extraction and vehicle pose estimation. Three different cases are experimentally studied, with the floor of the environment intentionally altered to induce slipping. Results obtained for implementations with and without SLAM are compared and discussed. The present work represents a step towards the realization of autonomous inspection platforms for performing concurrent localization and mapping in harsh environments.
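The fusion algorithm above relies on a point-in-polygon test; a standard ray-casting version is sketched below (the paper's exact variant and data structures are not specified here).

```python
# Standard ray-casting point-in-polygon test (a common choice for the
# point-in-polygon step mentioned above; the paper's exact variant may differ).
def point_in_polygon(px, py, polygon):
    """Return True if (px, py) lies inside the polygon given as [(x, y), ...]."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of the point.
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Example: a unit square and two test points.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, square))  # True
print(point_in_polygon(1.5, 0.5, square))  # False
```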
Abstract:
The first mechanical Automaton concept was found in a Chinese text written in the 3rd century BC, while Computer Vision was born in the late 1960s. Therefore, visual perception applied to machines (i.e. Machine Vision) is a young and exciting alliance. When robots came in, the new field of Robotic Vision was born, and these terms began to be erroneously interchanged. In short, we can say that Machine Vision is an engineering domain concerned with the industrial use of vision. Robotic Vision, instead, is a research field that tries to incorporate robotics aspects into computer vision algorithms. Visual Servoing, for example, is one of the problems that cannot be solved by computer vision alone. Accordingly, a large part of this work deals with boosting popular Computer Vision techniques by exploiting robotics: e.g. the use of kinematics to localize a vision sensor mounted as the robot end-effector. The remainder of this work is dedicated to the counterpart, i.e. the use of computer vision to solve real robotic problems such as grasping objects or navigating while avoiding obstacles. A brief survey of the mapping data structures most widely used in robotics is presented, along with SkiMap, a novel sparse data structure created both for robotic mapping and as a general-purpose 3D spatial index. Several approaches to implementing Object Detection and Manipulation that exploit the aforementioned mapping strategies are then proposed, along with a completely new Machine Teaching facility intended to simplify the training procedure of modern Deep Learning networks.
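SkiMap itself is not reproduced here; the sketch below only illustrates the general idea of a sparse, hash-indexed voxel map of the kind surveyed in this work, with illustrative class and parameter names.

```python
# Minimal sparse voxel-map sketch (illustrative of the general idea of a sparse
# 3D spatial index; this is NOT the SkiMap implementation described above).
from collections import defaultdict

class SparseVoxelMap:
    def __init__(self, resolution=0.05):
        self.resolution = resolution          # voxel edge length in metres
        self.hits = defaultdict(int)          # integer voxel key -> hit count

    def _key(self, x, y, z):
        r = self.resolution
        return (int(x // r), int(y // r), int(z // r))

    def integrate_point(self, x, y, z):
        """Mark the voxel containing a measured 3D point as occupied."""
        self.hits[self._key(x, y, z)] += 1

    def is_occupied(self, x, y, z, min_hits=1):
        return self.hits.get(self._key(x, y, z), 0) >= min_hits

m = SparseVoxelMap(resolution=0.1)
m.integrate_point(1.23, 0.47, 0.95)
print(m.is_occupied(1.25, 0.44, 0.91))   # True: same 10 cm voxel
print(m.is_occupied(3.00, 0.00, 0.00))   # False: no measurement there
```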
Abstract:
Dulce de leche samples available in the Brazilian market were submitted to sensory profiling by quantitative descriptive analysis and an acceptance test, as well as sensory evaluation using the just-about-right scale and purchase intent. External preference mapping and the ideal sensory characteristics of dulce de leche were determined. The results were also evaluated by principal component analysis, hierarchical cluster analysis, partial least squares regression, artificial neural networks, and logistic regression. Overall, significant product acceptance was related to intermediate scores of the sensory attributes in the descriptive test, and this trend was observed even after consumer segmentation. The results obtained by sensometric techniques showed that optimizing an ideal dulce de leche from the sensory standpoint is a multidimensional process, with necessary adjustments on the appearance, aroma, taste, and texture attributes of the product for better consumer acceptance and purchase. The optimum dulce de leche was characterized by high scores for the attributes sweet taste, caramel taste, brightness, color, and caramel aroma, in accordance with the preference mapping findings. In industrial terms, this means changing the parameters used in the thermal treatment and quantitative changes in the ingredients used in formulations.
Abstract:
The evolution and population dynamics of avian coronaviruses (AvCoVs) remain underexplored. In the present study, in-depth phylogenetic and Bayesian phylogeographic studies were conducted to investigate the evolutionary dynamics of AvCoVs detected in wild and synanthropic birds. A total of 500 samples, including tracheal and cloacal swabs collected from 312 wild birds belonging to 42 species, were analysed using molecular assays. A total of 65 samples (13%) from 22 bird species were positive for AvCoV. Molecular evolution analyses revealed that the sequences from samples collected in Brazil did not cluster with any of the AvCoV S1 gene sequences deposited in the GenBank database. Bayesian framework analysis estimated an AvCoV strain from Sweden (1999) as the most recent common ancestor of the AvCoVs detected in this study. Furthermore, the analysis inferred an increase in the AvCoV dynamic demographic population in different wild and synanthropic bird species, suggesting that birds may be potential new hosts responsible for spreading this virus.
Abstract:
Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast-growing analytical methodology in the life sciences. This method provides a multitude of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12-160 µm. Selected applications in medical research require an improved lateral resolution of laser-induced mass spectrometric techniques at the low micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology - laser microdissection associated with inductively coupled plasma mass spectrometry (LMD ICP-MS) - to obtain elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of uranium-stained and native mouse brain tissue samples are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.
Abstract:
QTL mapping provides useful information for breeding programs since it allows the estimation of genomic locations and genetic effects of chromosomal regions related to the expression of quantitative traits. The objective of this study was to map QTL related to several agronomically important traits associated with grain yield: ear weight (EW), prolificacy (PROL), ear number (NE), ear length (EL) and diameter (ED), number of rows on the ear (NRE) and number of kernels per row on the ear (NKPR). Four hundred F2:3 tropical maize progenies were evaluated in five environments in Piracicaba, Sao Paulo, Brazil. The genetic map was previously estimated and had 117 microsatellite loci with an average distance of 14 cM. Data were analysed using Composite Interval Mapping for each trait. Thirty-six QTL were mapped and related to the expression of EW (2), PROL (3), NE (2), EL (5), ED (5), NRE (10) and NKPR (5). Few QTL were mapped because there was high GxE interaction. The traits EW, PROL and NE showed high genetic correlation with grain yield, and several QTL mapped to similar genomic regions, which could cause the observed correlation. However, further analysis using appropriate statistical models is required to separate linked versus pleiotropic QTL. Five QTL (named Ew1, Ne1, Ed3, Nre3 and Nre10) had high genetic effects, explaining from 10.8% (Nre3) to 16.9% (Nre10) of the phenotypic variance, and could be considered in further studies.
Abstract:
The identification of alternatively spliced transcripts has contributed to a better comprehension of developmental mechanisms, tissue-specific physiological processes and human diseases. Polymerase chain reaction amplification of alternatively spliced variants commonly leads to the formation of heteroduplexes as a result of base pairing involving exons common between the two variants. S1 nuclease cleaves single-stranded loops of heteroduplexes and also nicks the opposite DNA strand. In order to establish a strategy for mapping alternative splice-prone sites in the whole transcriptome, we developed a method combining the formation of heteroduplexes between 2 distinct splicing variants and S1 nuclease digestion. Of the 20 consensuses identified here using this methodology, 5 revealed a conserved splice site after inspection of the cDNA alignment against the human genome (exact splice sites). For 8 other consensuses, conserved splice sites were mapped at 2 to 30 bp from the border, called proximal splice sites; for the other 7 consensuses, conserved splice sites were mapped at 40 to 800 bp, called distal splice sites. These latter cases showed a nonspecific activity of S1 nuclease in digesting double-stranded DNA. From the 20 consensuses identified here, 5 were selected for reverse transcription-polymerase chain reaction validation, confirming the splice sites. These data showed the potential of the strategy in mapping splice sites. However, the lack of specificity of the S1 nuclease enzyme is a significant obstacle that impedes the use of this strategy in large-scale studies.
Abstract:
A combined analytical and numerical study is performed of the mapping between strongly interacting fermions and weakly interacting spins, in the framework of the Hubbard, t-J, and Heisenberg models. While for spatially homogeneous models in the thermodynamic limit the mapping is thoroughly understood, we here focus on aspects that become relevant in spatially inhomogeneous situations, such as the effect of boundaries, impurities, superlattices, and interfaces. We consider parameter regimes that are relevant for traditional applications of these models, such as electrons in cuprates and manganites, and for more recent applications to atoms in optical lattices. The rate of the mapping as a function of the interaction strength is determined from the Bethe-Ansatz for infinite systems and from numerical diagonalization for finite systems. We show analytically that if translational symmetry is broken through the presence of impurities, the mapping persists and is, in a certain sense, as local as possible, provided the spin-spin interaction between two sites of the Heisenberg model is calculated from the harmonic mean of the onsite Coulomb interaction on adjacent sites of the Hubbard model. Numerical calculations corroborate these findings also in interfaces and superlattices, where analytical calculations are more complicated.
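The harmonic-mean prescription stated above matches the standard second-order strong-coupling exchange. Written in the usual notation (hopping t, onsite interactions U_i and U_j on the two sites of a bond; the paper's own notation may differ), a sketch of the relation is:

```latex
% Standard second-order exchange on a bond (i, j), written via the harmonic
% mean \bar{U}_{ij} of the onsite interactions (notation is illustrative).
J_{ij} = 2t^{2}\left(\frac{1}{U_i} + \frac{1}{U_j}\right)
       = \frac{4t^{2}}{\bar{U}_{ij}},
\qquad
\bar{U}_{ij} = \frac{2\,U_i U_j}{U_i + U_j},
```

which reduces to the familiar J = 4t^2/U in the homogeneous case U_i = U_j = U.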
Abstract:
This paper develops a Markovian jump model to describe fault occurrence in a three-joint manipulator robot. The model includes changes of operating points and the probability that a fault occurs in an actuator. After a fault, the robot works as a manipulator with free joints. Based on the developed model, a comparative study among three Markovian controllers, H₂, H∞, and mixed H₂/H∞, is presented, applied to an actual manipulator robot subject to one and two consecutive faults.
Abstract:
This work presents an automated system for the measurement of form errors of mechanical components using an industrial robot. A three-probe error separation technique was employed to decouple the measured form error from errors introduced by the robotic system. A mathematical model of the measuring system was developed to provide inspection results through the solution of a system of linear equations. A new self-calibration procedure, which employs redundant data from several runs, minimizes the influence of the probes' zero-adjustment on the final result. Experimental tests on the measurement of straightness errors of mechanical components were carried out and demonstrated the effectiveness of the methodology.
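The paper's measurement model is not given in the abstract; the sketch below only shows the generic shape of the computation it describes, i.e. solving an over-determined linear system assembled from redundant runs by least squares (the matrix contents are placeholders, not the actual three-probe model).

```python
# Generic least-squares solve for an over-determined linear system, of the kind
# described above (A and b are placeholders; the paper's actual measurement
# model relating probe readings to form error is not reproduced here).
import numpy as np

def solve_form_error(A, b):
    """Return the parameter vector minimizing ||A x - b||_2 over redundant runs."""
    x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
    return x, residuals, rank

# Toy example: 6 redundant readings constraining 3 unknown parameters.
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))
x_true = np.array([0.5, -0.2, 0.1])
b = A @ x_true + rng.normal(scale=1e-3, size=6)   # small measurement noise
x_est, _, _ = solve_form_error(A, b)
print(x_est)
```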
Abstract:
This paper reports the use of a non-destructive, continuous magnetic Barkhausen noise (CMBN) technique to investigate the size and thickness of volumetric defects in a 1070 steel. The magnetic behavior of the probe used was analyzed by numerical simulation using the finite element method (FEM). Results indicated that the presence of a ferrite coil core in the probe favors MBN emissions. The samples were scanned with different speeds and probe configurations to determine the effect of the flaw on the CMBN signal amplitude. A moving smooth window, based on a second-order statistical moment, was used for analyzing the time signal. The results show the technique's good repeatability and high capacity for detecting this type of defect.
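Reading the "moving smooth window based on a second-order statistical moment" as a sliding-window variance of the CMBN time signal, a minimal sketch under that assumption is:

```python
# Sliding-window second-order central moment (variance) of a 1D signal --
# a minimal sketch of the kind of moving-window statistic described above
# (the window length and the synthetic signal are illustrative assumptions).
import numpy as np

def moving_second_moment(signal, window=64):
    """Return the variance of the signal over each full-length sliding window."""
    signal = np.asarray(signal, dtype=float)
    out = np.empty(len(signal) - window + 1)
    for i in range(len(out)):
        w = signal[i:i + window]
        out[i] = np.mean((w - w.mean()) ** 2)   # second-order central moment
    return out

# Example: noise whose variance rises in the middle, mimicking a defect region.
rng = np.random.default_rng(2)
x = rng.normal(scale=1.0, size=3000)
x[1200:1800] *= 3.0                              # stronger Barkhausen activity
envelope = moving_second_moment(x, window=128)
print(int(np.argmax(envelope)))                  # index inside the high-activity span
```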
Abstract:
Simulated annealing (SA) is an optimization technique that can handle cost functions with varying degrees of nonlinearity, discontinuity and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied here to the problem of robot path planning. Three situations are considered: the path represented as a polyline, as a Bezier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution used to define the next candidate.
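A minimal simulated-annealing sketch for the polyline case is given below, with a toy cost that penalizes path length and intrusion into a single circular obstacle; the paper's sensitivity-driven proposal distribution is not reproduced, and all parameters are illustrative.

```python
# Minimal simulated-annealing sketch for a polyline path in 2D. The cost, the
# single circular obstacle and the proposal scale are illustrative assumptions;
# the paper's sensitivity-driven proposals are not reproduced here.
import math
import random

random.seed(0)
START, GOAL = (0.0, 0.0), (10.0, 0.0)
OBSTACLE, RADIUS = (5.0, 0.0), 1.5            # circular obstacle to avoid

def cost(waypoints):
    pts = [START] + waypoints + [GOAL]
    length = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    penalty = sum(max(0.0, RADIUS - math.dist(p, OBSTACLE)) for p in pts)
    return length + 50.0 * penalty            # heavily penalize intrusion

def neighbour(waypoints, scale):
    """Perturb one randomly chosen waypoint."""
    new = list(waypoints)
    i = random.randrange(len(new))
    new[i] = (new[i][0] + random.gauss(0, scale), new[i][1] + random.gauss(0, scale))
    return new

path = [(2.5, 0.0), (5.0, 0.0), (7.5, 0.0)]   # initial polyline through the obstacle
temp, cur = 5.0, cost(path)
for _ in range(20000):
    cand = neighbour(path, scale=0.3)
    c = cost(cand)
    # Metropolis rule: always accept improvements, sometimes accept worse moves.
    if c < cur or random.random() < math.exp((cur - c) / temp):
        path, cur = cand, c
    temp *= 0.9997                            # geometric cooling schedule
print(cur, path)
```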
Abstract:
Competition among companies depends on their ability to create and commercialize knowledge in a timely and cost-efficient manner. In this context, collaboration emerges as a response to environmental changes. Although strategic alliances and networks have been explored in the strategy literature for decades, the complexity and continued use of these cooperation structures, in a world of growing competition, justify the continuing interest in both themes. This article presents a scan of the contemporary academic production on strategic alliances and networks, covering the period from January 1997 to August 2007, based on the top five journals according to the 2006 Journal Citation Reports in the business and management categories simultaneously. The results point to a retraction in publications about strategic alliances and significant growth in the area of strategic networks. The joint view of strategic alliances and networks, cited by some authors as the evolutionary path of the field, did not yet appear salient. The most cited topics in the alliance literature are governance structure, cooperation, knowledge transfer, culture, control, trust, alliance formation, previous experience, resources, competition and partner selection. The network theme focuses mainly on structure, knowledge transfer and social networks, while the joint view is concentrated on alliance formation and governance choice.