754 results for "Vision algorithms for grasping"
Abstract:
OBJECTIVES: To assess the influence of Nd:YAG (neodymium:yttrium-aluminum-garnet) laser unilateral posterior capsulotomy on visual acuity and patients' perception of difficulties with vision-related activities of daily life. METHODS: We conducted an interventional survey that included 48 patients between 40 and 80 years of age with uni- or bilateral pseudophakia, posterior capsule opacification, and visual acuity <0.30 (logMAR) in one eye who were seen at a Brazilian university hospital. All patients underwent posterior capsulotomy using an Nd:YAG laser. Before and after the intervention, patients were asked to complete a questionnaire that was developed in an exploratory study. RESULTS: Before posterior capsulotomy, the median visual acuity (logMAR) of the included patients was 0.52 (range 0.30-1.60). After posterior capsulotomy, the median visual acuity of the included patients improved to 0.10 (range 0.0-0.52). According to the subjects' perceptions, their ability to perform most of their daily life activities improved after the intervention (p<0.05). CONCLUSIONS: After patients underwent posterior capsulotomy with an Nd:YAG laser, a significant improvement in the visual acuity of the treated eye was observed. Additionally, subjects felt that they experienced less difficulty performing most of their vision-dependent activities of daily living.
Abstract:
The trails formed by many ant species between nest and food source are two-way roads on which outgoing and returning workers meet and touch each other all along. The way to get back home, after grasping a food load, is to take the same route on which they have arrived from the nest. In many species such trails are chemically marked by pheromones providing orientation cues for the ants to find their way. Other species rely on their vision and use landmarks as cues. We have developed a method to stop foraging ants from shuttling on two-way trails. The only way to forage is to take two separate roads, as the ants cannot retrace their steps after arriving at the food or at the nest. The condition qualifies as a problem because all their orientation cues (chemical, visual or any other) are disrupted, since any of them would only lead the ants back to the route on which they arrived. We have found that workers of the leaf-cutting ant Atta sexdens rubropilosa can solve the problem. They could not only find the alternative way, but also used the unidirectional traffic system to forage effectively. We suggest that their ability is an evolutionary consequence of the need to deal with environmental irregularities that cannot be negotiated by means of excessively stereotyped behavior, and that it is but an example of a widespread phenomenon. We also suggest that our method can be adapted to other species, invertebrate and vertebrate, in the study of orientation, memory, perception, learning and communication.
Abstract:
We propose and analyze two different Bayesian online algorithms for learning in discrete Hidden Markov Models and compare their performance with that of the well-known Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of generalization, we draw learning curves for these algorithms in simplified situations and compare their performances.
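As a rough illustration of the generalization measure mentioned above (not the authors' implementation), the Kullback-Leibler divergence between a true and an estimated discrete distribution, such as a row of an HMM transition matrix, can be computed as follows; the example distributions are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical example: true HMM transition row vs. the row learned online.
print(kl_divergence([0.7, 0.2, 0.1], [0.6, 0.25, 0.15]))
```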
Abstract:
Today several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image data applications, self-organizing methods for unsupervised classification have been successfully applied for clustering pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named Enhanced Independent Component Analysis Mixture Model (EICAMM), which was built by modifying the Independent Component Analysis Mixture Model (ICAMM). These modifications address some of the model's limitations, identified by analyzing how it could be made more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) for image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segment images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results obtained by the proposals presented herein. (C) 2008 Published by Elsevier B.V.
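A minimal sketch of the kind of pre-processing described above, with a plain soft-threshold shrinkage standing in for Sparse Code Shrinkage (an assumption, since SCS itself is more involved) followed by the Sobel edge detector; the input image is synthetic.

```python
import numpy as np
from scipy import ndimage

def soft_shrink(img, threshold=0.1):
    """Soft-threshold shrinkage used here as a simplified stand-in for SCS denoising."""
    return np.sign(img) * np.maximum(np.abs(img) - threshold, 0.0)

def preprocess(image):
    """Denoise with the shrinkage stand-in, then compute the Sobel edge magnitude."""
    denoised = soft_shrink(image)
    gx = ndimage.sobel(denoised, axis=0)
    gy = ndimage.sobel(denoised, axis=1)
    return np.hypot(gx, gy)

# Hypothetical usage on a random image normalized to [0, 1].
edges = preprocess(np.random.rand(64, 64))
```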
Abstract:
Use of peripheral vision to organize and reorganize an interceptive action was investigated in young adults. Temporal errors and kinematic variables were evaluated in the interception of a virtual moving target, in situations in which its initial velocity was kept unchanged or was unexpectedly decreased. Target approach was observed through continuous visual pursuit (focal vision) or by keeping the visual focus at the origin of the trajectory or at the contact spot (peripheral vision). Results showed that visual focus at the contact spot led to temporal errors similar to those of focal vision, although with a distinct kinematic profile, while focus at the origin led to impoverished performance.
Abstract:
The goal of this study was to examine the coupling between visual information and body sway with binocular and monocular vision at two distances from the front wall of a moving room. Ten participants stood as still as possible inside a moving room facing the front wall in conditions that combined room movement with monocular/binocular vision and distance from the front wall (75 and 150 cm). The effect of visual information on body sway decreased with monocular vision and with increased distance from the front wall. In addition, the combination of monocular vision with the farther distance resulted in the smallest body sway response to the driving stimulus provided by the moving room. These results suggest that binocular vision near the front wall provides visual information of better quality than monocular vision far from the front wall. We discuss the results with respect to two modes of visual detection of body sway: ocular and extraocular. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Voltage and current waveforms of a distribution or transmission power system are not pure sinusoids. There are distortions in these waveforms that can be represented as a combination of the fundamental frequency, harmonics and high frequency transients. This paper presents a novel approach to identifying harmonics in distorted power system waveforms. The proposed method is based on Genetic Algorithms, an optimization technique inspired by genetics and natural evolution. GOOAL, a specially designed intelligent algorithm for optimization problems, was successfully implemented and tested. Two kinds of chromosome representation are used: binary and real. The results show that the proposed method is more precise than the traditional Fourier Transform, especially with the real representation of the chromosomes.
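The abstract does not detail GOOAL itself, so the following is only a generic sketch of a real-coded genetic algorithm fitting harmonic amplitudes to a distorted waveform by minimizing the reconstruction error; the signal, harmonic orders and GA parameters are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
f0, fs, n = 60.0, 3840.0, 256                  # fundamental (Hz), sampling rate, samples
t = np.arange(n) / fs
orders = [1, 3, 5, 7]                          # harmonic orders to estimate

# Synthetic distorted waveform standing in for a measured signal.
true_amp = np.array([1.0, 0.20, 0.10, 0.05])
signal = sum(a * np.sin(2 * np.pi * h * f0 * t) for a, h in zip(true_amp, orders))
signal += 0.01 * rng.standard_normal(n)

def fitness(amp):
    """Negative squared error between the reconstructed and the measured waveform."""
    rec = sum(a * np.sin(2 * np.pi * h * f0 * t) for a, h in zip(amp, orders))
    return -np.sum((rec - signal) ** 2)

# Minimal real-coded GA: tournament selection, blend crossover, Gaussian mutation, elitism.
pop = rng.uniform(0.0, 1.5, size=(40, len(orders)))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    best = pop[np.argmax(scores)].copy()
    idx = rng.integers(0, len(pop), size=(len(pop), 2))      # two entrants per tournament
    winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    alpha = rng.uniform(size=parents.shape)
    pop = alpha * parents + (1 - alpha) * parents[::-1]      # blend crossover
    pop += 0.02 * rng.standard_normal(pop.shape)             # Gaussian mutation
    pop[0] = best                                            # elitism
print(dict(zip(orders, np.round(pop[np.argmax([fitness(p) for p in pop])], 3))))
```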
Abstract:
This paper presents a strategy for solving the WDM optical network planning problem, specifically the problem of Routing and Wavelength Allocation (RWA) with the objective of minimizing the number of wavelengths used, known in this form as Min-RWA. Two meta-heuristics (Tabu Search and Simulated Annealing) are applied to obtain solutions of good quality and high performance. The key point is to allow degradation of the maximum load on the virtual links in favor of minimizing the number of wavelengths used; the objective is to find a good compromise between the virtual topology metric (load in Gb/s) and the physical topology metric (number of wavelengths). The simulations show good results when compared to others existing in the literature.
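As the abstract only names the meta-heuristics, the following is a generic simulated annealing skeleton, not the authors' implementation; for Min-RWA, `cost` would count the wavelengths used by an assignment and `neighbour` would reroute or recolor one lightpath, both of which are problem-specific placeholders here.

```python
import math
import random

def simulated_annealing(initial, cost, neighbour, t0=1.0, cooling=0.995, steps=10_000):
    """Generic simulated annealing minimization loop."""
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        # Accept improvements always; accept worse moves with temperature-dependent probability.
        if candidate_cost <= current_cost or random.random() < math.exp((current_cost - candidate_cost) / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling
    return best, best_cost

# Toy stand-in problem: minimize the number of distinct "wavelengths" in a random assignment.
def reassign_one(assignment):
    s = list(assignment)
    s[random.randrange(len(s))] = random.randrange(8)
    return s

solution, used = simulated_annealing(
    initial=[random.randrange(8) for _ in range(20)],
    cost=lambda s: len(set(s)),
    neighbour=reassign_one,
)
print(used)
```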
Abstract:
This technical note develops information filter and array algorithms for a linear minimum mean square error estimator of discrete-time Markovian jump linear systems. A numerical example for a two-mode Markovian jump linear system is provided to show the advantage of using array algorithms to filter this class of systems.
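For context (and only as a standard, single-mode sketch rather than the note's Markovian-jump or array formulations), the measurement update of a linear filter in information form can be written as follows; the matrices in the example are hypothetical.

```python
import numpy as np

def information_update(Y_prior, y_prior, H, R, z):
    """Information-form measurement update: Y = P^{-1}, y = Y x,
    Y_post = Y + H'R^{-1}H, y_post = y + H'R^{-1}z."""
    Rinv = np.linalg.inv(R)
    return Y_prior + H.T @ Rinv @ H, y_prior + H.T @ Rinv @ z

# Hypothetical two-state, one-measurement example.
P = np.diag([2.0, 1.0])                       # prior covariance
x = np.array([0.5, -0.2])                     # prior mean
Y, y = np.linalg.inv(P), np.linalg.inv(P) @ x
H, R, z = np.array([[1.0, 0.0]]), np.array([[0.1]]), np.array([0.7])
Y_post, y_post = information_update(Y, y, H, R, z)
x_post = np.linalg.solve(Y_post, y_post)      # recover the updated state estimate
```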
Abstract:
The continuous growth of peer-to-peer networks has made them responsible for a considerable portion of the current Internet traffic. For this reason, improvements in P2P network resource usage are of central importance. One effective approach for addressing this issue is the deployment of locality algorithms, which allow the system to optimize the peer selection policy for different network situations and, thus, maximize performance. To date, several locality algorithms have been proposed for use in P2P networks. However, they usually adopt heterogeneous criteria for measuring the proximity between peers, which hinders a coherent comparison between the different solutions. In this paper, we develop a thorough review of popular locality algorithms, based on three main characteristics: the adopted network architecture, distance metric, and resulting peer selection algorithm. As a result of this study, we propose a novel and generic taxonomy for locality algorithms in peer-to-peer networks, aiming to enable a better and more coherent evaluation of any individual locality algorithm.
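To make the "distance metric plus peer selection" ingredients concrete, here is a minimal, hypothetical locality policy that simply prefers the peers with the lowest measured round-trip time; real algorithms surveyed in the paper may instead use AS hops, geolocation or ISP information.

```python
def select_closest_peers(candidates, rtt_ms, k=3):
    """Pick the k candidate peers with the lowest measured round-trip time."""
    return sorted(candidates, key=lambda peer: rtt_ms[peer])[:k]

# Hypothetical measurements: peer id -> RTT in milliseconds.
rtt = {"peerA": 12.0, "peerB": 95.0, "peerC": 30.0, "peerD": 7.5}
print(select_closest_peers(rtt, rtt, k=2))  # ['peerD', 'peerA']
```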
Abstract:
In this paper a computational implementation of an evolutionary algorithm (EA) is presented to tackle the problem of reconfiguring radial distribution systems. The developed module considers power quality indices such as long duration interruptions and customer process disruptions due to voltage sags, by using the Monte Carlo simulation method. Power quality costs are modeled into the mathematical problem formulation and added to the cost of network losses. As for the proposed EA codification, a decimal representation is used. The EA operators considered for the reconfiguration algorithm, namely selection, recombination and mutation, are analyzed herein. A number of selection procedures are examined, namely tournament, elitism and a mixed technique using both elitism and tournament. The recombination operator was developed by considering a chromosome structure representation that maps the network branches and system radiality, and another structure that takes into account the network topology and feasibility of network operation to exchange genetic material. The topologies in the initial population are randomly generated so that radial configurations are produced through the Prim and Kruskal algorithms, which rapidly build minimum spanning trees. (C) 2009 Elsevier B.V. All rights reserved.
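A brief sketch of the initial-population idea described above: Kruskal's algorithm with randomly ordered branches yields a random spanning tree, that is, a radial configuration; the bus and branch data are hypothetical, and the Prim variant is omitted.

```python
import random

def kruskal_random_tree(nodes, branches, rng=random.Random(0)):
    """Build a random spanning tree (radial configuration) with Kruskal's algorithm
    over randomly ordered branches, using union-find to avoid loops."""
    parent = {n: n for n in nodes}

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]      # path compression
            n = parent[n]
        return n

    tree = []
    for u, v in sorted(branches, key=lambda _: rng.random()):
        ru, rv = find(u), find(v)
        if ru != rv:                           # adding this branch keeps the network radial
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Hypothetical 5-bus system with 7 candidate branches.
buses = [1, 2, 3, 4, 5]
candidate_branches = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 5), (4, 5), (1, 5)]
print(kruskal_random_tree(buses, candidate_branches))
```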
Abstract:
This paper presents a family of algorithms for approximate inference in credal networks (that is, models based on directed acyclic graphs and set-valued probabilities) that contain only binary variables. Such networks can represent incomplete or vague beliefs, lack of data, and disagreements among experts; they can also encode models based on belief functions and possibilistic measures. All algorithms for approximate inference in this paper rely on exact inferences in credal networks based on polytrees with binary variables, as these inferences have polynomial complexity. We are inspired by approximate algorithms for Bayesian networks; thus the Loopy 2U algorithm resembles Loopy Belief Propagation, while the Iterated Partial Evaluation and Structured Variational 2U algorithms are, respectively, based on Localized Partial Evaluation and variational techniques. (C) 2007 Elsevier Inc. All rights reserved.
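As a tiny, hypothetical example of set-valued probabilities, consider a binary variable X with a binary parent U whose probability is only known to lie in an interval; since P(X=1) is linear in P(U=1), its bounds are attained at the interval endpoints, which is the kind of vertex-based reasoning that exact 2U-style inference exploits.

```python
def marginal_bounds(p_x_given_u1, p_x_given_u0, p_u1_interval):
    """Lower/upper bounds on P(X=1) when P(U=1) lies in an interval and the
    conditionals are precise; the marginal is linear in P(U=1), so the bounds
    are attained at the interval endpoints."""
    lo, hi = p_u1_interval
    values = [p_x_given_u1 * p + p_x_given_u0 * (1 - p) for p in (lo, hi)]
    return min(values), max(values)

# Hypothetical credal specification: P(X=1) is bounded by roughly [0.41, 0.62].
print(marginal_bounds(p_x_given_u1=0.9, p_x_given_u0=0.2, p_u1_interval=(0.3, 0.6)))
```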
Abstract:
The flowshop scheduling problem with blocking in-process is addressed in this paper. In this environment, there are no buffers between successive machines; therefore, intermediate queues of jobs waiting in the system for their next operations are not allowed. Heuristic approaches are proposed to minimize the total tardiness criterion. A constructive heuristic that explores specific characteristics of the problem is presented. Moreover, a GRASP-based heuristic is proposed and coupled with a path relinking strategy to search for better outcomes. Computational tests are presented, and the comparisons made with an adaptation of the NEH algorithm and with a branch-and-bound algorithm indicate that the new approaches are promising. (c) 2007 Elsevier Ltd. All rights reserved.
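Since the paper's heuristics are only outlined above, the following is a generic GRASP skeleton (greedy randomized construction biased by due dates, followed by an adjacent-swap local search) for minimizing total tardiness; as a simplifying assumption it evaluates a plain permutation flowshop without the blocking constraint, and the path relinking and NEH components are omitted. The instance data are hypothetical.

```python
import random

def total_tardiness(seq, proc, due):
    """Total tardiness of a permutation flowshop sequence
    (simplified: blocking between machines is not modeled)."""
    m = len(proc[0])
    comp = [0.0] * m                            # completion time on each machine
    tardiness = 0.0
    for job in seq:
        for k in range(m):
            start = max(comp[k], comp[k - 1] if k else 0.0)
            comp[k] = start + proc[job][k]
        tardiness += max(0.0, comp[-1] - due[job])
    return tardiness

def grasp(proc, due, iters=200, rcl_size=3, rng=random.Random(1)):
    """GRASP: randomized earliest-due-date construction plus adjacent-swap local search."""
    jobs = sorted(range(len(proc)), key=lambda j: due[j])
    best_seq, best_cost = None, float("inf")
    for _ in range(iters):
        remaining, seq = list(jobs), []
        while remaining:                        # construction with a restricted candidate list
            pick = rng.choice(remaining[:rcl_size])
            remaining.remove(pick)
            seq.append(pick)
        cost, improved = total_tardiness(seq, proc, due), True
        while improved:                         # first-improvement local search
            improved = False
            for i in range(len(seq) - 1):
                seq[i], seq[i + 1] = seq[i + 1], seq[i]
                new_cost = total_tardiness(seq, proc, due)
                if new_cost < cost:
                    cost, improved = new_cost, True
                else:
                    seq[i], seq[i + 1] = seq[i + 1], seq[i]   # undo the swap
        if cost < best_cost:
            best_seq, best_cost = list(seq), cost
    return best_seq, best_cost

# Hypothetical 5-job, 3-machine instance: processing times per machine and due dates.
proc = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [5, 1, 3], [1, 4, 2]]
due = [10, 8, 12, 15, 6]
print(grasp(proc, due))
```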
Abstract:
When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated, containing one linkage group and 21 markers with a fixed distance of 3 cM between them. In all, 700 F(2) populations were randomly simulated with 100 and 400 individuals, with different combinations of dominant and co-dominant markers, as well as 10 and 20% of missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criterion may be used, even for a reduced population size. In the case of a smaller proportion of dominant markers, any of the algorithms and criteria investigated (except SALOD) may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between them increases and, in this case, use of the algorithms TRY and SER associated with RIPPLE and the criterion LHMC would provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
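To make the ordering criteria concrete, here is a small, hypothetical sketch computing SARF and PARF for a candidate marker order from a table of pairwise recombination fractions; lower values of both indicate a better order, and the SALOD and LHMC criteria are not shown.

```python
import math

def sarf(order, rf):
    """Sum of adjacent recombination fractions for a candidate marker order."""
    return sum(rf[a][b] for a, b in zip(order, order[1:]))

def parf(order, rf):
    """Product of adjacent recombination fractions for a candidate marker order."""
    return math.prod(rf[a][b] for a, b in zip(order, order[1:]))

# Hypothetical symmetric pairwise recombination fractions for four markers.
rf = {
    "M1": {"M2": 0.03, "M3": 0.06, "M4": 0.09},
    "M2": {"M1": 0.03, "M3": 0.03, "M4": 0.06},
    "M3": {"M1": 0.06, "M2": 0.03, "M4": 0.03},
    "M4": {"M1": 0.09, "M2": 0.06, "M3": 0.03},
}
order = ["M1", "M2", "M3", "M4"]
print(sarf(order, rf), parf(order, rf))
```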