Abstract:
Representing images and videos as compact codes has emerged as an important research interest in the vision community, in the context of web-scale image/video search. The recently proposed Vector of Locally Aggregated Descriptors (VLAD) has been shown to outperform existing retrieval techniques while giving the desired compact representation. VLAD aggregates the local features of an image in the feature space. In this paper, we propose to represent the local features extracted from an image as sparse codes over an over-complete dictionary obtained by the K-SVD dictionary training algorithm. The proposed VLAD aggregates the residuals in the space of these sparse codes to obtain a compact representation for the image. Experiments are performed on the 'Holidays' database using SIFT features, and the performance of the proposed method is compared with the original VLAD. A 4% increase in mean average precision (mAP) indicates the improved retrieval performance of the proposed sparse-coding-based VLAD.
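A minimal sketch of what the aggregation step could look like, assuming a pre-trained dictionary `D` and using scikit-learn's orthogonal matching pursuit as the sparse coder; this is an illustrative reading of the abstract, not the authors' code, and the exact residual definition is an assumption here:

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def sparse_vlad(descriptors, D, n_nonzero=5):
    """Aggregate residuals of local descriptors against their dominant atoms.

    descriptors: (n, d) local features (e.g. SIFT), one per row.
    D:           (d, k) over-complete dictionary (e.g. from K-SVD).
    """
    # Sparse-code every descriptor over the dictionary via OMP.
    codes = orthogonal_mp(D, descriptors.T, n_nonzero_coefs=n_nonzero).T  # (n, k)
    # Assign each descriptor to the atom with the largest coefficient.
    assign = np.abs(codes).argmax(axis=1)
    k = D.shape[1]
    vlad = np.zeros((k, descriptors.shape[1]))
    for j in range(k):
        members = descriptors[assign == j]
        if len(members):
            vlad[j] = (members - D[:, j]).sum(axis=0)  # residual accumulation
    # Standard VLAD post-processing: signed square root, then L2 normalise.
    v = vlad.ravel()
    v = np.sign(v) * np.sqrt(np.abs(v))
    return v / (np.linalg.norm(v) + 1e-12)
```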
B-Spline potential function for maximum a-posteriori image reconstruction in fluorescence microscopy
Abstract:
An iterative image reconstruction technique employing a B-spline potential function in a Bayesian framework is proposed for fluorescence microscopy images. B-splines are piecewise polynomials with smooth transitions and compact support, and are the shortest polynomial splines. Incorporating the B-spline potential function into the maximum a-posteriori reconstruction technique results in improved contrast, enhanced resolution, and substantial background reduction. The proposed technique is validated on simulated data as well as on images acquired from fluorescence microscopes (widefield, confocal laser scanning fluorescence, and super-resolution 4Pi microscopy). A comparative study of the proposed technique against the state-of-the-art maximum likelihood (ML) and maximum a-posteriori (MAP) techniques with a quadratic potential function shows its superiority over the others. The B-spline MAP technique can find applications in several fluorescence microscopy modalities such as selective plane illumination microscopy, localization microscopy, and STED.
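For orientation, a minimal sketch of a one-step-late MAP iteration of the kind described, assuming Poisson noise (standard in fluorescence microscopy); a quadratic deviation-from-local-mean potential stands in for the paper's B-spline potential, whose exact form the abstract does not give:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.signal import fftconvolve

def map_osl(y, psf, n_iter=50, beta=0.01, potential_grad=None):
    """One-step-late MAP iteration for Poisson data (Richardson-Lucy + prior).

    y: observed image (float), psf: point-spread function normalised to sum 1.
    potential_grad: gradient of the prior potential; a quadratic
    (deviation-from-local-mean) potential is used as a stand-in here.
    """
    if potential_grad is None:
        potential_grad = lambda x: x - uniform_filter(x, size=3)
    x = np.full(y.shape, y.mean())
    psf_flip = psf[::-1, ::-1]  # adjoint of the blur operator
    for _ in range(n_iter):
        blur = fftconvolve(x, psf, mode='same')
        ratio = fftconvolve(y / np.maximum(blur, 1e-12), psf_flip, mode='same')
        # One-step-late: the prior gradient is evaluated at the current x.
        x = x * ratio / (1.0 + beta * potential_grad(x))
    return x
```

Swapping in a B-spline potential would amount to replacing `potential_grad` with the derivative of that potential.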
Resumo:
In this paper, we propose a super resolution (SR) method for synthetic images using FeatureMatch. Existing state-of-the-art super resolution methods are learning based methods, where a pair of low-resolution and high-resolution dictionary pair are trained, and this trained pair is used to replace patches in low-resolution image with appropriate matching patches from the high-resolution dictionary. In this paper, we show that by using Approximate Nearest Neighbour Fields (ANNF), and a common source image, we can by-pass the learning phase, and use a single image for dictionary. Thus, reducing the dictionary from a collection obtained from hundreds of training images, to a single image. We show that by modifying the latest developments in ANNF computation, to suit super resolution, we can perform much faster and more accurate SR than existing techniques. To establish this claim we will compare our algorithm against various state-of-the-art algorithms, and show that we are able to achieve better and faster reconstruction without any training phase.
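A rough sketch of the single-image idea, substituting an exact (brute-force) nearest-neighbour search for the paper's fast ANNF computation; the patch sizes, degradation model, and reconstruction-by-averaging are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import zoom
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)
from sklearn.neighbors import NearestNeighbors

def single_image_sr(lr, scale=2, patch=7):
    """Super-resolve `lr` using the input itself as the only dictionary image.

    Assumes a grayscale float image with even dimensions. The exact NN
    search below is what ANNF replaces to make this fast.
    """
    up = zoom(lr, scale, order=3)                        # initial upscale
    # Build the "dictionary": degraded patches keyed to sharp originals.
    degraded = zoom(zoom(lr, 1.0 / scale, order=3), scale, order=3)
    keys = extract_patches_2d(degraded, (patch, patch)).reshape(-1, patch * patch)
    values = extract_patches_2d(lr, (patch, patch))
    nn = NearestNeighbors(n_neighbors=1).fit(keys)
    # Match every patch of the upscaled image to its nearest degraded key.
    q = extract_patches_2d(up, (patch, patch)).reshape(-1, patch * patch)
    idx = nn.kneighbors(q, return_distance=False)[:, 0]
    # Average the matched sharp patches back into the output image.
    return reconstruct_from_patches_2d(values[idx].astype(float), up.shape)
```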
Abstract:
Rapid reconstruction of multidimensional images is crucial for enabling real-time 3D fluorescence imaging, a key factor for capturing rapidly occurring events in the cellular environment. To facilitate real-time imaging, we have developed a graphics processing unit (GPU) based real-time maximum a-posteriori (MAP) image reconstruction system. The parallel processing capability of the GPU, which consists of a large number of small processing cores, together with the adaptability of the reconstruction algorithm to parallel processing (using multiple independent computing modules called threads), results in high temporal resolution. Moreover, the proposed quadratic-potential MAP algorithm effectively deconvolves the images and suppresses noise. The multi-node, multi-threaded GPU and the Compute Unified Device Architecture (CUDA) execute the iterative image reconstruction algorithm approximately 200-fold faster (for large datasets) than existing CPU-based systems.
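The paper's CUDA implementation is not shown in the abstract; as a rough illustration of the same parallelisation idea in Python, the MAP iteration sketched earlier can be ported to the GPU with CuPy (an assumption for illustration, not the authors' code):

```python
import cupy as cp

def map_osl_gpu(y, psf, n_iter=200, beta=0.01):
    """GPU port of a one-step-late MAP iteration using CuPy FFTs.

    y: observed image; psf: centred point-spread function with the same
    shape as y, normalised to sum 1.
    """
    y = cp.asarray(y, dtype=cp.float32)
    # Pre-compute the optical transfer function once on the device.
    otf = cp.fft.rfft2(cp.fft.ifftshift(cp.asarray(psf, dtype=cp.float32)))
    x = cp.full(y.shape, float(y.mean()), dtype=cp.float32)
    for _ in range(n_iter):
        blur = cp.fft.irfft2(cp.fft.rfft2(x) * otf, s=y.shape)
        ratio = cp.fft.irfft2(
            cp.fft.rfft2(y / cp.maximum(blur, 1e-6)) * cp.conj(otf), s=y.shape)
        # Quadratic-potential gradient via a discrete Laplacian (all on GPU).
        lap = 4 * x - (cp.roll(x, 1, 0) + cp.roll(x, -1, 0)
                       + cp.roll(x, 1, 1) + cp.roll(x, -1, 1))
        x = x * ratio / (1.0 + beta * lap)
    return cp.asnumpy(x)
```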
Abstract:
In big-data image/video analytics, we encounter the problem of learning an over-complete dictionary for sparse representation from a large training dataset that cannot be processed at once because of storage and computational constraints. To tackle dictionary learning in such scenarios, we propose an algorithm that exploits the inherent clustered structure of the training data and uses a divide-and-conquer approach. The fundamental idea is to partition the training dataset into smaller clusters and learn a local dictionary for each cluster. The local dictionaries are then merged to form a global dictionary, by solving another dictionary learning problem on the atoms of the locally trained dictionaries. We refer to this as the split-and-merge algorithm. We show that the proposed algorithm is efficient in its memory usage and computational complexity, and performs on par with the standard learning strategy, which operates on the entire dataset at once. As an application, we consider image denoising and present a comparative analysis of our algorithm against standard techniques that use the entire database at a time, in terms of training and denoising performance. We observe that the split-and-merge algorithm yields a remarkable reduction in training time without significantly affecting denoising performance.
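A minimal sketch of the split-and-merge structure, with scikit-learn's mini-batch dictionary learner standing in for the K-SVD-style learner; cluster counts and atom counts are illustrative:

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.decomposition import MiniBatchDictionaryLearning

def split_and_merge(X, n_clusters=8, local_atoms=64, global_atoms=256):
    """Split-and-merge dictionary learning (sketch).

    X: (n_samples, dim) training patches.
    """
    # Split: partition the data into clusters small enough to fit in memory.
    labels = MiniBatchKMeans(n_clusters=n_clusters, n_init=3).fit_predict(X)
    atoms = []
    for c in range(n_clusters):
        dl = MiniBatchDictionaryLearning(n_components=local_atoms)
        dl.fit(X[labels == c])          # local dictionary for this cluster
        atoms.append(dl.components_)    # (local_atoms, dim)
    pooled = np.vstack(atoms)
    # Merge: solve a second dictionary-learning problem on the pooled atoms.
    merger = MiniBatchDictionaryLearning(n_components=global_atoms).fit(pooled)
    return merger.components_           # the global dictionary
```

The merge step never touches the raw data again, which is where the memory saving comes from.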
Abstract:
Fringe tracking and fringe order assignment have become central topics of current research in digital photoelasticity. Isotropic points (IPs) appearing in low fringe order zones are often either overlooked or missed entirely, in conventional as well as digital photoelasticity. We aim to highlight image processing for characterizing IPs in an isochromatic fringe field. By resorting to a global analytical solution for a circular disk, the sensitivity of IPs to small changes in the far-field loading on the disk is highlighted. A local theory supplements the global closed-form solutions for the three-, four-, and six-point loading configurations of a circular disk. The local theoretical concepts developed in this paper are demonstrated through digital image analysis of isochromatics in circular disks subjected to three- and four-point loads.
Abstract:
We discuss a semiconductor assembly comprising titanium dioxide (TiO2) rods sensitized with cadmium sulfide (CdS) nanocrystals for potential applications in large-area electronics on three-dimensional (3-D) substrates. Vertically aligned TiO2 rods are grown on a substrate using a 150 degrees C process flow and then sensitized with CdS by the SILAR method at room temperature. This structure forms an effective photoconductor, as the photo-generated electrons are rapidly removed from the CdS via the TiO2, leaving the CdS hole-rich. Current-voltage characteristics are measured, and models identify space-charge-limited photocurrent as the mechanism of charge transport at moderate voltage bias. The assembly is stable and fast: the frequency response with a loading of 10 pF and 9 MΩ shows a half-power frequency of 100 Hz.
Abstract:
This paper proposes a denoising algorithm that performs non-local means (NLM) bilateral filtering. As the existing literature suggests, NLM is one of the most widely used denoising techniques, but it has the critical drawback of smoothing edges. To improve on this, we perform fast and efficient NLM using Approximate Nearest Neighbour Fields, and improve the edge content of the denoised output by formulating a joint bilateral filter. With the proposed joint bilateral filter, smooth regions are denoised using the NLM approach while efficient edge reconstruction is obtained from the bilateral filter. Furthermore, to avoid tedious parameter selection, we estimate the noise level before performing joint bilateral filtering. The proposed approach is observed to perform well on images with high noise.
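A small sketch of a joint bilateral filter of the kind described, assuming the guide image is an NLM pre-denoised version of the input (e.g. from `skimage.restoration.denoise_nl_means`); parameter values are illustrative:

```python
import numpy as np

def joint_bilateral(noisy, guide, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Joint bilateral filter: spatial averaging of `noisy`, with range
    weights computed on `guide` (e.g. an NLM pre-denoised image)."""
    H, W = noisy.shape
    out = np.zeros_like(noisy)
    norm = np.zeros_like(noisy)
    pad_n = np.pad(noisy, radius, mode='reflect')
    pad_g = np.pad(guide, radius, mode='reflect')
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sn = pad_n[radius + dy:radius + dy + H, radius + dx:radius + dx + W]
            sg = pad_g[radius + dy:radius + dy + H, radius + dx:radius + dx + W]
            # Range weight from the guide, spatial weight from the offset.
            w = np.exp(-(dy * dy + dx * dx) / (2 * sigma_s ** 2)) \
                * np.exp(-(sg - guide) ** 2 / (2 * sigma_r ** 2))
            out += w * sn
            norm += w
    return out / norm
```

Because the range weights come from the cleaner guide rather than the noisy input, edges in the guide are preserved while smooth regions keep the NLM-style averaging.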
Abstract:
Image inpainting is the process of filling in an unwanted region of an image marked by the user. It is used for restoring old paintings and photographs, removing red eyes from pictures, etc. In this paper, we propose an efficient inpainting algorithm that takes care of false edge propagation. We use the classical exemplar-based technique to compute the priority term for each patch. To ensure that the edge content of the nearest-neighbour patch, found by minimizing the L2 distance between patches, matches that of the target, we impose the additional constraint that the entropies of the patches be similar; the entropy of a patch acts as a good measure of its edge content. Additionally, we fill the image using overlapping patches to ensure smoothness in the output. We use the structural similarity index as the measure of similarity between the ground truth and the inpainted image. Results of the proposed approach on a number of real and synthetic images show the effectiveness of our algorithm in removing objects, thin scratches, and text written on images. It is also shown that the proposed approach is robust to the shape of the manually selected target. Our results compare favorably to those obtained by existing techniques.
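A sketch of the entropy-constrained match selection; handling of the unknown (masked) pixels inside the target patch is omitted for brevity, and the entropy tolerance is an assumed parameter:

```python
import numpy as np

def patch_entropy(p, bins=32):
    """Shannon entropy of a patch's intensity histogram, a cheap proxy
    for its edge content (intensities assumed in [0, 1])."""
    hist, _ = np.histogram(p, bins=bins, range=(0.0, 1.0))
    prob = hist[hist > 0] / hist.sum()
    return -(prob * np.log2(prob)).sum()

def best_match(target, candidates, tol=0.5):
    """L2-nearest candidate patch, restricted to candidates whose entropy
    is within `tol` bits of the target's (the false-edge guard)."""
    h = patch_entropy(target)
    ents = np.array([patch_entropy(c) for c in candidates])
    pool = np.flatnonzero(np.abs(ents - h) <= tol)
    if pool.size == 0:                   # fall back to the full candidate set
        pool = np.arange(len(candidates))
    dists = [np.sum((candidates[i] - target) ** 2) for i in pool]
    return candidates[pool[int(np.argmin(dists))]]
```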
Abstract:
The performance of two curved beam finite element models based on coupled polynomial displacement fields is investigated for the out-of-plane vibration of arches. These two-noded beam models employ curvilinear strain definitions and have three degrees of freedom per node, namely out-of-plane translation (v), out-of-plane bending rotation (theta_z), and torsion rotation (theta_s). The coupled polynomial interpolation fields are derived independently for the Timoshenko and Euler-Bernoulli beam elements using the force-moment equilibrium equations. The numerical performance of these elements for constrained and unconstrained arches is compared with conventional curved beam models based on independent polynomial fields. The formulation is shown to be free from spurious constraints in the limits of 'flexureless torsion' and 'torsionless flexure', and hence devoid of flexure and torsion locking. The resulting stiffness and consistent mass matrices generated from the coupled displacement models show excellent convergence of natural frequencies in locking regimes. The accuracy of the shear flexibility added to the elements is also demonstrated. The coupled polynomial models are shown to perform consistently over a wide range of flexure-to-shear (EI/GA) and flexure-to-torsion (EI/GJ) stiffness ratios, and are inherently devoid of flexure, torsion, and shear locking phenomena.
Abstract:
Purpose: A prior-image-based temporally constrained reconstruction (PITCR) algorithm was developed to obtain accurate temperature maps with better volume coverage and spatial and temporal resolution than other algorithms for highly undersampled data in magnetic resonance (MR) thermometry.
Methods: The proposed PITCR approach gives weight to a prior image and performs accurate reconstruction in a dynamic imaging environment. The PITCR method is compared with the temporally constrained reconstruction (TCR) algorithm using pork muscle data.
Results: The PITCR method provides superior performance compared to the TCR approach on highly undersampled data. The proposed approach is computationally more expensive than TCR, but this is offset by the advantage of reconstructing from fewer measurements. When reconstructing temperature maps from 16% of the fully sampled data, the PITCR approach was 1.57x slower than TCR, while the root mean square error with PITCR was 0.784, compared to 2.815 with the TCR scheme.
Conclusions: The PITCR approach performs more accurate reconstruction of temperature maps than the TCR approach with highly undersampled data in MR-guided high-intensity focused ultrasound.
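The abstract does not spell out the objective, but temporally constrained reconstructions of this family typically minimize a data-fidelity term plus a temporal-smoothness penalty, and PITCR can then be read as adding a prior-image term. The notation below is assumed for illustration, not taken from the paper:

```latex
\hat{x} = \arg\min_{x}\;
    \underbrace{\lVert A x - d \rVert_2^2}_{\text{data fidelity}}
  + \alpha \underbrace{\lVert \nabla_t x \rVert_2^2}_{\text{temporal constraint (TCR)}}
  + \beta  \underbrace{\lVert x - x_{\mathrm{prior}} \rVert_2^2}_{\text{prior-image term (PITCR)}}
```

where A is the undersampled acquisition operator, d the measured k-space data, and x_prior the fully sampled prior image.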
Abstract:
This paper presents the design and implementation of PolyMage, a domain-specific language and compiler for image processing pipelines. An image processing pipeline can be viewed as a graph of interconnected stages that process images successively, with each stage typically performing a point-wise, stencil, reduction, or data-dependent operation on image pixels. Individual stages in a pipeline typically exhibit abundant data parallelism that can be exploited with relative ease; however, the stages also require high memory bandwidth, preventing effective utilization of the parallelism available on modern architectures. For applications that demand high performance, the traditional options are to use optimized libraries such as OpenCV or to optimize manually. Using libraries precludes optimization across library routines, while manual optimization accounting for both parallelism and locality is very tedious. The focus of our system, PolyMage, is on automatically generating high-performance implementations of image processing pipelines expressed in a high-level declarative language. Our optimization approach relies primarily on the transformation and code generation capabilities of the polyhedral compiler framework. To the best of our knowledge, this is the first model-driven compiler for image processing pipelines that performs complex fusion, tiling, and storage optimization automatically. Experimental results on a modern multicore system show that the performance achieved by our automatic approach is up to 1.81x better than that achieved through manual tuning in Halide, a state-of-the-art language and compiler for image processing pipelines. For a camera raw image processing pipeline, our performance is comparable to that of a hand-tuned implementation.
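To make the stage taxonomy concrete, here is a toy two-stage pipeline in plain NumPy; this is not PolyMage syntax, only an illustration of the point-wise and stencil stage types that such a DSL would express declaratively and then fuse and tile automatically:

```python
import numpy as np
from scipy.ndimage import convolve

def pipeline(img):
    # Point-wise stage: each output pixel depends only on one input pixel.
    gamma = np.sqrt(img)
    # Stencil stage: each output pixel depends on a small neighbourhood.
    blur_kernel = np.full((3, 3), 1.0 / 9.0)
    return convolve(gamma, blur_kernel, mode='nearest')
```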
Abstract:
Among the many advantages and applications of remote sensing, one of the most important is crop classification, i.e., differentiating between crop types. Satellite images are a reliable source for investigating temporal changes in crop-cultivated areas. In this letter, we propose a novel bat algorithm (BA) based clustering approach for solving crop type classification problems using a multispectral satellite image. The proposed partitional clustering algorithm extracts information in the form of optimal cluster centers from training samples; the extracted cluster centers are then validated on test samples. A real-time multispectral satellite image and one benchmark dataset from the University of California, Irvine (UCI) repository are used to demonstrate the robustness of the proposed algorithm. The performance of the BA is compared with that of two other nature-inspired metaheuristics, namely the genetic algorithm and particle swarm optimization, as well as with an existing hybrid approach, the BA with K-means. From the results obtained, we conclude that the BA can be successfully applied to crop type classification problems.
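A simplified sketch of bat-algorithm-based partitional clustering, where each bat encodes a full set of k cluster centres and fitness is the within-cluster squared distance; fixed loudness and pulse rate are used here instead of the full adaptive schedules of the BA, and all parameter values are illustrative:

```python
import numpy as np

def cluster_fitness(centers, X):
    """Sum of squared distances from each sample to its nearest centre."""
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).sum()

def bat_clustering(X, k, n_bats=20, n_iter=100, fmin=0.0, fmax=2.0,
                   loudness=0.9, pulse=0.5, seed=0):
    """Bat-algorithm search for k cluster centres (simplified sketch)."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    lo, hi = X.min(0), X.max(0)
    pos = rng.uniform(lo, hi, size=(n_bats, k, dim))   # candidate centre sets
    vel = np.zeros_like(pos)
    fit = np.array([cluster_fitness(p, X) for p in pos])
    b = int(fit.argmin())
    best, best_fit = pos[b].copy(), fit[b]
    for _ in range(n_iter):
        for i in range(n_bats):
            f = fmin + (fmax - fmin) * rng.random()    # frequency draw
            vel[i] += (pos[i] - best) * f
            cand = pos[i] + vel[i]
            if rng.random() > pulse:                   # local walk near best
                cand = best + 0.01 * (hi - lo) * rng.standard_normal((k, dim))
            cand = np.clip(cand, lo, hi)
            fc = cluster_fitness(cand, X)
            if fc < fit[i] and rng.random() < loudness:  # greedy acceptance
                pos[i], fit[i] = cand, fc
                if fc < best_fit:
                    best, best_fit = cand.copy(), fc
    return best
```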
Abstract:
Fingerprints are used for identification in forensics, and fingerprint identification systems are classified as manual or automatic; automatic systems are further classified as latent or exemplar. A novel exemplar technique, Fingerprint Image Verification using Dictionary Learning (FIVDL), is proposed to improve performance on low-quality fingerprints, where dictionary learning reduces time complexity by using block processing instead of pixel processing. The dynamic range of an image is adjusted using the Successive Mean Quantization Transform (SMQT), and frequency-domain noise is reduced using spectral-frequency histogram equalization. An adaptive nonlinear dynamic range adjustment technique is then utilized to determine the local spectral features of the corresponding fingerprint ridge frequency and orientation. The dictionary is constructed using the spatial fundamental frequency determined from the spectral features. These dictionaries help remove the spurious noise present in fingerprints and reduce time complexity through block processing. The dictionaries are then used to reconstruct the image for matching. The proposed FIVDL is verified on the FVC database sets, and experimental results show an improvement over state-of-the-art techniques.
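Of the pipeline stages named above, the SMQT dynamic-range adjustment is standard enough to sketch; this follows the usual textbook definition of the transform, not the authors' implementation:

```python
import numpy as np

def smqt(img, levels=8):
    """Successive Mean Quantization Transform: each partition is recursively
    split at its mean and the comparison bit is accumulated, giving an
    output that is insensitive to gain and bias in the input."""
    out = np.zeros(img.shape, dtype=np.uint16)
    def split(mask, level):
        if level == 0 or not mask.any():
            return
        above = mask & (img > img[mask].mean())
        out[above] += 1 << (level - 1)       # set this level's bit
        split(above, level - 1)
        split(mask & ~above, level - 1)
    split(np.ones(img.shape, dtype=bool), levels)
    return out
```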
Abstract:
The bilateral filter is known to be quite effective at denoising images corrupted with small dosages of additive Gaussian noise. The denoising performance of the filter, however, is known to degrade quickly as the noise level increases. Several adaptations of the filter have been proposed in the literature to address this shortcoming, but often at substantial computational overhead. In this paper, we report a simple pre-processing step that can substantially improve the denoising performance of the bilateral filter at almost no additional cost. The modified filter is designed to be robust at large noise levels, and often performs poorly below a certain noise threshold. To get the best of the original and the modified filters, we propose to combine them in a weighted fashion, where the weights are chosen to minimize (a surrogate of) the oracle mean squared error (MSE). The optimally weighted filter is thus guaranteed to perform better than either component filter in terms of MSE, at all noise levels. We also provide a fast algorithm for the weighted filtering. Visual and quantitative denoising results on standard test images demonstrate that the improvement over the original filter is significant, both visually and in terms of PSNR. Moreover, the denoising performance of the optimally weighted bilateral filter is competitive with that of the computation-intensive non-local means filter.
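As a sketch of the combination step, here is the closed-form oracle weight for a single scalar weight; the paper minimises a *surrogate* of this oracle MSE since the clean image is unavailable in practice, and the surrogate itself is not reproduced here:

```python
import numpy as np

def oracle_weighted(f1, f2, clean):
    """MSE-optimal scalar weight for combining two denoised estimates f1, f2.

    Minimises ||w*f1 + (1-w)*f2 - clean||^2 over w, which has the
    closed-form solution below (clipped to [0, 1]).
    """
    e1, e2 = f1 - clean, f2 - clean
    w = np.sum((e2 - e1) * e2) / np.sum((e2 - e1) ** 2)
    w = float(np.clip(w, 0.0, 1.0))
    return w * f1 + (1 - w) * f2, w
```

By construction the combined estimate is never worse (in MSE) than the better of the two component filters, which mirrors the guarantee claimed in the abstract.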