93 results for Image processing - Digital techniques
Abstract:
In this paper we address the problem of positioning a camera attached to the end-effector of a robotic manipulator so that it becomes parallel to a planar object. This problem has long been treated in visual servoing. Our approach is based on attaching several laser pointers to the camera, in a configuration designed to produce a suitable set of visual features. The aim of using structured light is not only to ease the image processing and to allow low-textured objects to be treated, but also to produce a control scheme with desirable properties such as decoupling, stability, good conditioning and a good camera trajectory.
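As an illustration of the underlying visual servoing principle only (not the paper's laser-based feature design), the sketch below applies the generic image-based control law v = -λ L⁺ (s - s*); the function name, gain value and placeholder interaction matrix are assumptions made for the example.

```python
# Minimal sketch of a generic image-based visual servoing update: the camera
# velocity screw is chosen to drive the visual feature error to zero.
import numpy as np

def ibvs_velocity(s, s_star, L, gain=0.5):
    """Camera velocity (vx, vy, vz, wx, wy, wz) from a feature error.

    s, s_star : current and desired visual feature vectors.
    L         : interaction (image Jacobian) matrix, shape (len(s), 6).
    """
    error = s - s_star
    return -gain * np.linalg.pinv(L) @ error

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L = rng.normal(size=(8, 6))   # placeholder interaction matrix
    s_star = np.zeros(8)          # desired features
    s = rng.normal(size=8)        # current features
    print(ibvs_velocity(s, s_star, L))
```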
Abstract:
Image registration is an important component of image analysis used to align two or more images. In this paper, we present a new framework for image registration based on compression. The basic idea underlying our approach is the conjecture that two images are correctly registered when we can maximally compress one image given the information in the other. The contribution of this paper is twofold. First, we show that the image registration process can be dealt with from the perspective of a compression problem. Second, we demonstrate that the similarity metric, introduced by Li et al., performs well in image registration. Two different versions of the similarity metric have been used: the Kolmogorov version, computed using standard real-world compressors, and the Shannon version, calculated from an estimation of the entropy rate of the images.
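The compression conjecture can be illustrated with the Normalized Compression Distance computed by an off-the-shelf compressor. The sketch below, which uses zlib and searches over a simple horizontal shift, is a toy under those assumptions, not the authors' implementation.

```python
# Minimal sketch: two images are considered best registered at the shift
# where one compresses the other best, i.e. where their NCD is lowest.
import zlib
import numpy as np

def ncd(x: bytes, y: bytes) -> float:
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def register_shift(ref: np.ndarray, mov: np.ndarray, max_shift: int = 10) -> int:
    """Return the horizontal shift of `mov` that minimizes NCD against `ref`."""
    best_shift, best_score = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        score = ncd(ref.tobytes(), np.roll(mov, s, axis=1).tobytes())
        if score < best_score:
            best_shift, best_score = s, score
    return best_shift

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref = (rng.random((64, 64)) * 255).astype(np.uint8)
    mov = np.roll(ref, 3, axis=1)        # known 3-pixel displacement
    print(register_shift(ref, mov))      # expected: -3 (undoes the displacement)
```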
Abstract:
In this paper, an information theoretic framework for image segmentation is presented. This approach is based on the information channel that goes from the image intensity histogram to the regions of the partitioned image. It allows us to define a new family of segmentation methods which maximize the mutual information of the channel. Firstly, a greedy top-down algorithm which partitions an image into homogeneous regions is introduced. Secondly, a histogram quantization algorithm which clusters color bins in a greedy bottom-up way is defined. Finally, the resulting regions in the partitioning algorithm can optionally be merged using the quantized histogram.
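A minimal sketch of the information channel at the core of this family of methods, assuming intensities normalized to [0, 1): the mutual information between intensity bins and region labels is the quantity the greedy partitioning and quantization steps would seek to maximize. The function name and bin count are illustrative.

```python
# Mutual information of the channel (intensity bins -> region labels),
# estimated from the joint histogram of an image and its region-label map.
import numpy as np

def channel_mutual_information(intensities, regions, n_bins=16):
    """I(bins; regions) for an image and a region-label map of the same shape."""
    bins = np.minimum((intensities.ravel() * n_bins).astype(int), n_bins - 1)
    labels = regions.ravel()
    joint = np.zeros((n_bins, labels.max() + 1))
    np.add.at(joint, (bins, labels), 1)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.random((32, 32))
    regions = (img > 0.5).astype(int)   # toy two-region partition
    print(channel_mutual_information(img, regions))
```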
Abstract:
Mosaics have been commonly used as visual maps for undersea exploration and navigation. The position and orientation of an underwater vehicle can be calculated by integrating the apparent motion of the images which form the mosaic. A feature-based mosaicking method is proposed in this paper. The creation of the mosaic is accomplished in four stages: feature selection and matching, detection of points describing the dominant motion, homography computation and mosaic construction. In this work we demonstrate that the use of color and texture as discriminative properties of the image can improve, to a large extent, the accuracy of the constructed mosaic. The system is able to provide 3D metric information concerning the vehicle motion by using the knowledge of the intrinsic parameters of the camera while integrating the measurements of an ultrasonic sensor. The method has been tested experimentally on real images acquired by the GARBI underwater vehicle.
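A hedged sketch of the feature-matching and homography-computation stages using OpenCV's ORB features and RANSAC (the paper's color/texture-based feature selection is not reproduced); function names and file names are illustrative assumptions.

```python
# Minimal sketch: match features between two frames, estimate the dominant
# motion as a homography with RANSAC, and warp the new frame into the mosaic.
import cv2
import numpy as np

def pairwise_homography(img1, img2, max_features=500):
    """Estimate the homography taking img2 into the frame of img1."""
    orb = cv2.ORB_create(max_features)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # dominant motion
    return H, inliers

# Usage (hypothetical file names):
#   H, _ = pairwise_homography(cv2.imread("frame0.png", 0), cv2.imread("frame1.png", 0))
#   mosaic = cv2.warpPerspective(frame1, H, canvas_size)
```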
Abstract:
We present a georeferenced photomosaic of the Lucky Strike hydrothermal vent field (Mid-Atlantic Ridge, 37°18’N). The photomosaic was generated from digital photographs acquired using the ARGO II seafloor imaging system during the 1996 LUSTRE cruise, which surveyed a ~1 km² zone and provided a coverage of ~20% of the seafloor. The photomosaic has a pixel resolution of 15 mm and encloses the areas with known active hydrothermal venting. The final mosaic is generated after an optimization that includes the automatic detection of the same benthic features across different images (feature matching), followed by a global alignment of images based on the vehicle navigation. We also provide software to construct mosaics from large sets of images for which georeferencing information exists (location, attitude, and altitude per image), to visualize them, and to extract data. Georeferencing information can be provided by the raw navigation data (collected during the survey) or result from the optimization obtained from image matching. Mosaics based solely on navigation can be readily generated by any user, but the optimization and global alignment of the mosaic require a case-by-case approach for which no universal software is available. The Lucky Strike photomosaics (optimized and navigation-only) are publicly available through the Marine Geoscience Data System (MGDS, http://www.marine-geo.org). The mosaic-generating and viewing software is available through the Computer Vision and Robotics Group Web page at the University of Girona (http://eia.udg.es/_rafa/mosaicviewer.html).
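A minimal sketch of navigation-only placement under a simple pinhole-camera assumption: each image is scaled by its ground resolution (altitude over focal length), rotated by the vehicle heading, and pasted at its surveyed position on a metric canvas. All names and parameters (focal_px, canvas_res_m, origin) are assumptions for illustration, not the released tool's interface.

```python
# Sketch of placing one image on a georeferenced mosaic from raw navigation
# data (no feature-based optimization, no blending).
import numpy as np
from scipy import ndimage

def place_image(canvas, image, x_m, y_m, heading_deg, altitude_m,
                focal_px=1000.0, canvas_res_m=0.015, origin=(0.0, 0.0)):
    """Paste `image` onto `canvas` (both 2D arrays) using navigation only.

    Assumes the image footprint falls entirely inside the canvas.
    """
    ground_res_m = altitude_m / focal_px          # metres per image pixel (pinhole model)
    scale = ground_res_m / canvas_res_m
    warped = ndimage.zoom(image, scale, order=1)
    warped = ndimage.rotate(warped, heading_deg, reshape=True, order=1)
    r0 = int((y_m - origin[1]) / canvas_res_m) - warped.shape[0] // 2
    c0 = int((x_m - origin[0]) / canvas_res_m) - warped.shape[1] // 2
    canvas[r0:r0 + warped.shape[0], c0:c0 + warped.shape[1]] = warped
    return canvas
```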
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on using a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
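As a hedged illustration of the MLE starting point of this pipeline, the sketch below implements a plain Richardson-Lucy iteration for Poisson data; the FMAPE entropy prior, the wavelet/SOM segmentation and the hyperparameter assignment are not reproduced here.

```python
# Minimal Maximum Likelihood (Richardson-Lucy) deconvolution sketch.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
    """Plain MLE deconvolution for Poisson-noise data."""
    estimate = np.full(observed.shape, float(observed.mean()))
    psf_flip = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, eps)
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate
```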
Abstract:
Usual image fusion methods inject features from a high spatial resolution panchromatic sensor into every low spatial resolution multispectral band, trying to preserve spectral signatures and improve spatial resolution to that of the panchromatic sensor. The objective is to obtain the image that would be observed by a sensor with the same spectral response (i.e., spectral sensitivity and quantum efficiency) as the multispectral sensors and the spatial resolution of the panchromatic sensor. But in these methods, features from electromagnetic spectrum regions not covered by the multispectral sensors are injected into them, and the physical spectral responses of the sensors are not considered during this process. This produces some undesirable effects, such as over-injection of resolution and slightly modified spectral signatures in some features. The authors present a technique which takes into account the physical electromagnetic spectrum responses of the sensors during the fusion process, producing images closer to the image obtained by the ideal sensor than those obtained by usual wavelet-based image fusion methods. This technique is used to define a new wavelet-based image fusion method.
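A minimal sketch of the generic wavelet-injection scheme this paper improves upon (the spectral-response weighting it introduces is not reproduced), using PyWavelets; the wavelet choice and decomposition depth are assumptions.

```python
# Generic wavelet-based fusion: keep the multispectral approximation
# coefficients and inject the panchromatic detail coefficients.
import numpy as np
import pywt

def wavelet_fuse(ms_band_up, pan, wavelet="db4", levels=2):
    """Fuse one upsampled multispectral band with the panchromatic image."""
    ms_coeffs = pywt.wavedec2(ms_band_up, wavelet, level=levels)
    pan_coeffs = pywt.wavedec2(pan, wavelet, level=levels)
    fused = [ms_coeffs[0]] + list(pan_coeffs[1:])   # MS approximation + PAN details
    return pywt.waverec2(fused, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    pan = rng.random((128, 128))
    ms_up = rng.random((128, 128))   # multispectral band already resampled to the PAN grid
    print(wavelet_fuse(ms_up, pan).shape)
```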
Abstract:
A new approach to the local measurement of residual stress in microstructures is described in this paper. The presented technique takes advantage of the combined milling and imaging features of focused ion beam (FIB) equipment to scale down the widely known hole drilling method. This method consists of drilling a small hole in a solid with inherent residual stresses and measuring the strains/displacements caused by the local stress release that takes place around the hole. In the presented case, the displacements caused by the milling are determined by applying digital image correlation (DIC) techniques to high resolution micrographs taken before and after the milling process. The residual stress value is then obtained by fitting the measured displacements to the analytical solution of the displacement fields. The feasibility of this approach has been demonstrated on a micromachined silicon nitride membrane, showing that this method has high potential for applications in the field of mechanical characterization of micro/nanoelectromechanical systems.
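A hedged sketch of the DIC displacement measurement via FFT cross-correlation of a small subset taken before and after milling; sub-pixel refinement and the fit to the analytical displacement field are omitted, and the function name is illustrative.

```python
# Integer-pixel displacement of a subset from the peak of the cross-correlation.
import numpy as np

def subset_displacement(before, after):
    """Displacement (dy, dx) of the pattern in `after` relative to `before`."""
    cross = np.fft.ifft2(np.conj(np.fft.fft2(before)) * np.fft.fft2(after))
    peak = np.unravel_index(np.argmax(cross.real), cross.shape)
    shape = np.array(before.shape)
    shifts = np.array(peak, dtype=float)
    shifts[shifts > shape // 2] -= shape[shifts > shape // 2]   # undo FFT wrap-around
    return shifts

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    before = rng.random((64, 64))
    after = np.roll(before, (2, 3), axis=(0, 1))   # pattern displaced by (2, 3) pixels
    print(subset_displacement(before, after))      # expected: [2. 3.]
```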
Abstract:
In this work, a new one-class classification ensemble strategy called the approximate polytope ensemble is presented. The main contribution of the paper is threefold. First, the geometrical concept of the convex hull is used to define the boundary of the target class defining the problem. Expansions and contractions of this geometrical structure are introduced in order to avoid over-fitting. Second, the decision whether a point belongs to the convex hull model in high-dimensional spaces is approximated by means of random projections and an ensemble decision process. Finally, a tiling strategy is proposed in order to model non-convex structures. Experimental results show that the proposed strategy is significantly better than state-of-the-art one-class classification methods on over 200 datasets.
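A minimal sketch, under simplifying assumptions, of the random-projection approximation to convex-hull membership: a point is accepted only if it lies inside the 2-D convex hull of the projected target data in every random projection. The expansion/contraction and tiling steps are omitted, and the class name is illustrative.

```python
# Approximate membership in a high-dimensional convex hull via an ensemble
# of random 2-D projections (scipy's Delaunay is used for point-in-hull tests).
import numpy as np
from scipy.spatial import Delaunay

class ApproxHullEnsemble:
    def __init__(self, n_projections=25, seed=0):
        self.n_projections = n_projections
        self.rng = np.random.default_rng(seed)

    def fit(self, X):
        d = X.shape[1]
        self.projections = [self.rng.normal(size=(d, 2)) for _ in range(self.n_projections)]
        self.hulls = [Delaunay(X @ P) for P in self.projections]   # 2-D hulls of the target class
        return self

    def predict(self, X):
        """+1 if inside every projected hull (target), -1 otherwise (outlier)."""
        inside = np.ones(len(X), dtype=bool)
        for P, hull in zip(self.projections, self.hulls):
            inside &= hull.find_simplex(X @ P) >= 0
        return np.where(inside, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    target = rng.normal(size=(200, 10))
    model = ApproxHullEnsemble().fit(target)
    print(model.predict(np.vstack([target[:3], 10 + rng.normal(size=(3, 10))])))
```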
Abstract:
This correspondence addresses the problem of nondata-aided waveform estimation for digital communications. Based on the unconditional maximum likelihood criterion, the main contribution of this correspondence is the derivation of a closed-form solution to the waveform estimation problem in the low signal-to-noise ratio regime. The proposed estimation method is based on the second-order statistics of the received signal, and a clear link is established between maximum likelihood estimation and correlation matching techniques. Compression with the signal subspace is also proposed to improve the robustness against the noise and to mitigate the impact of abnormals or outliers.
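As a hedged illustration of second-order-statistics waveform estimation (not the paper's closed-form ML estimator), the sketch below assumes i.i.d. zero-mean symbols and a pulse confined to a single symbol interval; under those assumptions the covariance of symbol-rate snapshots is R = σ_a² g gᴴ + σ² I, so the pulse g can be recovered, up to scale and phase, from the dominant eigenvector of the sample covariance.

```python
# Waveform (pulse-shape) estimation from the dominant eigenvector of the
# snapshot covariance -- an illustration of the second-order-statistics idea.
import numpy as np

def estimate_waveform(snapshots):
    """snapshots: (n_symbols, samples_per_symbol) complex array of received blocks."""
    R = snapshots.conj().T @ snapshots / len(snapshots)   # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)
    return eigvecs[:, -1]                                  # dominant eigenvector

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    sps, n_sym = 8, 2000
    pulse = np.hanning(sps)                                # unknown waveform to recover
    symbols = rng.choice([-1, 1], size=n_sym)
    noise = 0.5 * (rng.normal(size=(n_sym, sps)) + 1j * rng.normal(size=(n_sym, sps)))
    received = symbols[:, None] * pulse[None, :] + noise
    est = estimate_waveform(received)
    # Correlation of the recovered shape with the true pulse (should be close to 1).
    print(np.abs(est / np.linalg.norm(est)) @ (pulse / np.linalg.norm(pulse)))
```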
Abstract:
A method for optimizing the strength of a parametric phase mask for a wavefront coding imaging system is presented. The method is based on an optimization process that minimizes a proposed merit function. The goal is to achieve modulation transfer function invariance while quantitatively maintaining final image fidelity. A parametric filter that copes with the noise present in the captured images is used to obtain the final images, and this filter is optimized. The whole process results in optimum phase mask strength and optimal parameters for the restoration filter. The results for a particular optical system are presented and tested experimentally in the laboratory. The experimental results show good agreement with the simulations, indicating that the procedure is useful.
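A minimal sketch, under standard scalar-diffraction assumptions, of how the MTF of a cubic phase mask of a given strength could be evaluated inside such an optimization loop; the merit function and the restoration-filter optimization are not reproduced, and the mask form and parameter names are assumptions.

```python
# MTF of a circular pupil with a cubic phase mask exp(i*alpha*(x^3 + y^3));
# comparing in-focus and defocused MTFs probes the invariance an optimizer
# would trade against restored-image fidelity.
import numpy as np

def cubic_mask_mtf(alpha, n=256, defocus=0.0):
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    pupil = (X**2 + Y**2 <= 1.0).astype(float)
    phase = alpha * (X**3 + Y**3) + defocus * (X**2 + Y**2)
    P = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(P)))**2
    otf = np.abs(np.fft.fft2(psf))
    return otf / otf.flat[0]              # normalize so MTF(0) = 1

if __name__ == "__main__":
    m0 = cubic_mask_mtf(30.0, defocus=0.0)
    m1 = cubic_mask_mtf(30.0, defocus=5.0)
    print(np.max(np.abs(m0 - m1)))        # small value => nearly defocus-invariant MTF
```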
Abstract:
The Cherenkov light flashes produced by Extensive Air Showers are very short in time. A high-bandwidth, fast-digitizing readout can therefore minimize the influence of the background from the light of the night sky and improve the performance of Cherenkov telescopes. The time structure of the Cherenkov image can further be used in single-dish Cherenkov telescopes as an additional parameter to reduce the background from unwanted hadronic showers. A description of an analysis method which makes use of the time information, and the subsequent improvement in the performance of the MAGIC telescope (especially after the upgrade with an ultra-fast 2 GSamples/s digitization system in February 2007), is presented. The use of timing information in the analysis of the new MAGIC data reduces the background by a factor of two, which in turn results in an enhancement of about a factor of 1.4 in the flux sensitivity to point-like sources, as tested on observations of the Crab Nebula.
Abstract:
The CORNISH project is the highest resolution radio continuum survey of the Galactic plane to date. It is the 5 GHz radio continuum part of a series of multi-wavelength surveys that focus on the northern GLIMPSE region (10° < l < 65°), observed by the Spitzer satellite in the mid-infrared. Observations with the Very Large Array in B and BnA configurations have yielded a 1.5'' resolution Stokes I map with a root mean square noise level better than 0.4 mJy beam⁻¹. Here we describe the data-processing methods and data characteristics, and present a new, uniform catalog of compact radio emission. This includes an implementation of automatic deconvolution that provides much more reliable imaging than standard CLEANing. A rigorous investigation of the noise characteristics and reliability of source detection has been carried out. We show that the survey is optimized to detect emission on size scales up to 14'' and for unresolved sources the catalog is more than 90% complete at a flux density of 3.9 mJy. We have detected 3062 sources above a 7σ detection limit and present their ensemble properties. The catalog is highly reliable away from regions containing poorly sampled extended emission, which comprise less than 2% of the survey area. Imaging problems have been mitigated by down-weighting the shortest spacings and potential artifacts flagged via a rigorous manual inspection with reference to the Spitzer infrared data. We present images of the most common source types found: H II regions, planetary nebulae, and radio galaxies. The CORNISH data and catalog are available online at http://cornish.leeds.ac.uk.
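As a generic illustration (not the CORNISH pipeline), the sketch below shows a 7σ island-detection pass of the kind a compact-source catalog relies on; the injected flux and rms value only echo the numbers quoted above, and the function name is illustrative.

```python
# Threshold an image at n-sigma times a local rms map, group pixels into
# islands, and report the peak flux density of each island.
import numpy as np
from scipy import ndimage

def detect_sources(image, rms_map, nsigma=7.0):
    """Return (label map, peak flux densities) of islands above nsigma * rms."""
    mask = image > nsigma * rms_map
    labels, n = ndimage.label(mask)
    peaks = ndimage.maximum(image, labels, index=np.arange(1, n + 1))
    return labels, np.asarray(peaks)

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    rms = 0.4e-3                               # 0.4 mJy/beam, in Jy/beam
    img = rng.normal(scale=rms, size=(512, 512))
    img[100, 100] += 3.9e-3                    # inject a 3.9 mJy point source
    labels, peaks = detect_sources(img, np.full(img.shape, rms))
    print(len(peaks), peaks)
```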
Abstract:
Phase-encoded nanostructures such as Quick Response (QR) codes made of metallic nanoparticles are suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase encoded QR codes. The system is illuminated using polarized light and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from the examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase encoded QR codes using their polarimetric signatures.
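A hedged sketch of the two classification routes mentioned, using SciPy's two-sample Kolmogorov-Smirnov test and scikit-learn's SVM; the speckle statistics and polarimetric descriptors used here are toy assumptions, not the paper's measurements.

```python
# KS-test authentication of a speckle-intensity distribution against a
# reference, plus an SVM trained on simple polarimetric descriptors.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.svm import SVC

def ks_authenticate(reference_speckle, test_speckle, alpha=0.01):
    """Accept the code if the KS test cannot distinguish the two distributions."""
    stat, p_value = ks_2samp(reference_speckle.ravel(), test_speckle.ravel())
    return p_value > alpha

def train_svm(features, labels):
    """features: (n_samples, n_features) polarimetric descriptors; labels: 1 authentic, 0 fake."""
    return SVC(kernel="rbf", gamma="scale").fit(features, labels)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    genuine = rng.exponential(1.0, size=(20, 256))   # toy speckle intensity stacks
    fake = rng.exponential(1.5, size=(20, 256))
    print(ks_authenticate(genuine[0], genuine[1]), ks_authenticate(genuine[0], fake[0]))
    X = np.vstack([genuine.mean(axis=1, keepdims=True), fake.mean(axis=1, keepdims=True)])
    y = np.array([1] * 20 + [0] * 20)
    clf = train_svm(X, y)
    print(clf.predict([[1.0], [1.5]]))
```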
Abstract:
In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancers. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of the useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammography. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract boundaries of calcifications with manually selected seed pixels. Taking into account that shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were addressed to obtaining a feature set related to the shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms with their real diagnosis known in advance from biopsies. This allowed identifying the following shape-based parameters as the relevant ones: number of clusters, number of holes, area, Feret elongation, roughness, and elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting the diagnosis and encourages continuing the investigation in the sense of adding new features not only related to the shape.
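A minimal sketch of the fixed-tolerance region growing step and a few of the listed shape descriptors (area, number of holes via the Euler number, elongation), assuming a normalized intensity image; Feret elongation, roughness, the cluster-level features and the classifier are omitted, and function names are illustrative.

```python
# Fixed-tolerance region growing from a seed pixel, followed by simple
# shape descriptors computed with scikit-image's regionprops.
import numpy as np
from collections import deque
from skimage.measure import label, regionprops

def region_grow(image, seed, tolerance=0.1):
    """Grow a binary region from `seed` (row, col) within an intensity tolerance."""
    mask = np.zeros(image.shape, dtype=bool)
    seed_value = image[seed]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if mask[r, c] or abs(image[r, c] - seed_value) > tolerance:
            continue
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and not mask[rr, cc]:
                queue.append((rr, cc))
    return mask

def shape_features(mask):
    props = regionprops(label(mask))[0]
    holes = 1 - props.euler_number                      # 2D Euler number = objects - holes
    elongation = props.major_axis_length / max(props.minor_axis_length, 1e-6)
    return {"area": props.area, "holes": holes, "elongation": elongation}
```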