983 results for Adaptive Image Binarization
Abstract:
Purpose: A prior-image-based temporally constrained reconstruction (PITCR) algorithm was developed to obtain accurate temperature maps with better volume coverage and better spatial and temporal resolution than other algorithms for highly undersampled data in magnetic resonance (MR) thermometry. Methods: The proposed PITCR algorithm assigns weight to a prior image and performs accurate reconstruction in a dynamic imaging environment. The PITCR method is compared with the temporally constrained reconstruction (TCR) algorithm using pork muscle data. Results: The PITCR method provides superior performance compared to the TCR approach with highly undersampled data. The proposed approach is computationally more expensive than the TCR approach, but this cost can be offset by the advantage of reconstructing from fewer measurements. When reconstructing temperature maps from 16% of the fully sampled data, the PITCR approach was 1.57x slower than the TCR approach, while the root-mean-square error using PITCR was 0.784, compared to 2.815 with the TCR scheme. Conclusions: The PITCR approach performs more accurate reconstructions of temperature maps than the TCR approach with highly undersampled data in MR-guided high-intensity focused ultrasound. (C) 2015 American Association of Physicists in Medicine.
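The structure described above can be sketched as a penalized least-squares problem on a per-pixel time series: a data-fidelity term on the sampled time points, a temporal-smoothness penalty (the TCR part), and an additional prior-image penalty (the term that distinguishes PITCR). The following toy sketch in plain Python, with a 1-D signal and a simple gradient-descent solver, illustrates that structure only; the function name, parameters, and solver are illustrative assumptions, not the authors' implementation.

```python
def pitcr_1d(y, mask, x_prior, alpha=1.0, beta=0.5, iters=2000, lr=0.1):
    """Toy PITCR-style reconstruction of a 1-D time series.

    Cost being minimized (sketch, not the paper's exact formulation):
        sum over sampled t of (x[t] - y[t])^2        # data fidelity
      + alpha * sum (x[t+1] - x[t])^2               # temporal smoothness (TCR term)
      + beta  * sum (x[t] - x_prior[t])^2           # prior-image term (PITCR addition)
    With beta = 0 this reduces to the TCR-style problem.
    """
    x = list(x_prior)  # start from the prior image
    n = len(x)
    for _ in range(iters):
        g = [0.0] * n
        for t in range(n):
            if mask[t]:                      # fidelity only where data was sampled
                g[t] += 2.0 * (x[t] - y[t])
            g[t] += 2.0 * beta * (x[t] - x_prior[t])
        for t in range(n - 1):               # gradient of the smoothness penalty
            d = x[t + 1] - x[t]
            g[t + 1] += 2.0 * alpha * d
            g[t]     -= 2.0 * alpha * d
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```

With full sampling and no penalties the solver simply recovers the data; with undersampling, the temporal penalty interpolates the missing time points and the prior term pulls the estimate toward the prior image.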
Abstract:
This paper presents the design and implementation of PolyMage, a domain-specific language and compiler for image processing pipelines. An image processing pipeline can be viewed as a graph of interconnected stages that process images successively. Each stage typically performs a point-wise, stencil, reduction, or data-dependent operation on image pixels. Individual stages in a pipeline typically exhibit abundant data parallelism that can be exploited with relative ease. However, the stages also require high memory bandwidth, preventing effective utilization of the parallelism available on modern architectures. For applications that demand high performance, the traditional options are to use optimized libraries like OpenCV or to optimize manually. While using libraries precludes optimization across library routines, manual optimization accounting for both parallelism and locality is very tedious. The focus of our system, PolyMage, is on automatically generating high-performance implementations of image processing pipelines expressed in a high-level declarative language. Our optimization approach relies primarily on the transformation and code generation capabilities of the polyhedral compiler framework. To the best of our knowledge, this is the first model-driven compiler for image processing pipelines that performs complex fusion, tiling, and storage optimization automatically. Experimental results on a modern multicore system show that the performance achieved by our automatic approach is up to 1.81x better than that achieved through manual tuning in Halide, a state-of-the-art language and compiler for image processing pipelines. For a camera raw image processing pipeline, our performance is comparable to that of a hand-tuned implementation.
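The kind of fusion such compilers perform can be illustrated with a deliberately tiny two-stage pipeline: a point-wise stage followed by a 3-tap stencil. The unfused version materializes the intermediate image between stages; the fused version recomputes the point-wise stage at its points of use inside the stencil loop, trading a little redundant arithmetic for locality. This plain-Python sketch mimics the idea only; PolyMage and Halide operate on whole 2-D pipelines with tiling and storage optimization.

```python
def brighten(row, k=10):
    # point-wise stage: add a constant to every pixel
    return [p + k for p in row]

def blur3(row):
    # 3-tap stencil stage with clamped borders
    n = len(row)
    return [(row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def pipeline_unfused(row, k=10):
    # materializes the intermediate image between the two stages
    return blur3(brighten(row, k))

def pipeline_fused(row, k=10):
    # inlines the point-wise stage into the stencil: no intermediate buffer,
    # at the cost of recomputing stage 1 up to three times per output pixel
    n = len(row)
    stage1 = lambda i: row[i] + k
    return [(stage1(max(i - 1, 0)) + stage1(i) + stage1(min(i + 1, n - 1))) / 3.0
            for i in range(n)]
```

Both versions compute identical results; the fused one touches each input value while it is still hot in cache, which is where the performance gain on real pipelines comes from.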
Abstract:
Among the many advantages and applications of remote sensing, one of the most important is crop classification, i.e., differentiating between various crop types. Satellite images are a reliable source for investigating temporal changes in crop cultivated areas. In this letter, we propose a novel bat algorithm (BA)-based clustering approach for solving crop type classification problems using a multispectral satellite image. The proposed partitional clustering algorithm is used to extract information, in the form of optimal cluster centers, from training samples. The extracted cluster centers are then validated on test samples. A real-time multispectral satellite image and one benchmark data set from the University of California, Irvine (UCI) repository are used to demonstrate the robustness of the proposed algorithm. The performance of the BA is compared with that of two other nature-inspired metaheuristic techniques, namely, the genetic algorithm and particle swarm optimization. The performance is also compared with an existing hybrid approach, the BA combined with K-means. From the results obtained, it can be concluded that the BA can be successfully applied to solve crop type classification problems.
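Metaheuristic partitional clustering of this kind reduces to three ingredients: encode candidate cluster centers, score them with a clustering fitness (e.g. within-cluster sum of squared distances), and let the metaheuristic perturb candidates while keeping the best. The sketch below uses a bare random-perturbation search as a stand-in for the bat dynamics (loudness, pulse rate, frequency are not reproduced here); all names and parameters are illustrative, and 1-D points stand in for multispectral pixel vectors.

```python
import random

def wcss(points, centers):
    # within-cluster sum of squared distances: the clustering fitness
    return sum(min((p - c) ** 2 for c in centers) for p in points)

def metaheuristic_centers(points, k, iters=300, seed=0):
    """Keep-best random search over cluster centers (bat-style stand-in)."""
    rng = random.Random(seed)
    best = [rng.choice(points) for _ in range(k)]   # initial candidate centers
    best_f = wcss(points, best)
    for _ in range(iters):
        # perturb the incumbent (a crude substitute for bat movement rules)
        cand = [c + rng.gauss(0.0, 0.5) for c in best]
        f = wcss(points, cand)
        if f < best_f:                              # greedy keep-best update
            best, best_f = cand, f
    return best, best_f
```

Because the incumbent is only replaced when the fitness improves, the search is monotone: more iterations can never worsen the returned fitness.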
Abstract:
The bilateral filter is known to be quite effective in denoising images corrupted with small dosages of additive Gaussian noise. The denoising performance of the filter, however, is known to degrade quickly with increasing noise level. Several adaptations of the filter have been proposed in the literature to address this shortcoming, but often at a substantial computational overhead. In this paper, we report a simple pre-processing step that can substantially improve the denoising performance of the bilateral filter at almost no additional cost. The modified filter is designed to be robust at large noise levels, but tends to perform poorly below a certain noise threshold. To get the best of the original and the modified filter, we propose to combine them in a weighted fashion, where the weights are chosen to minimize (a surrogate of) the oracle mean-squared error (MSE). The optimally weighted filter is thus guaranteed to perform better than either of the component filters in terms of MSE, at all noise levels. We also provide a fast algorithm for the weighted filtering. Visual and quantitative denoising results on standard test images demonstrate that the improvement over the original filter is significant both visually and in terms of PSNR. Moreover, the denoising performance of the optimally weighted bilateral filter is competitive with that of the computation-intensive non-local means filter.
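The oracle combination step has a closed form: for two estimates f1 and f2 of a clean signal x, the weight minimizing the MSE of w·f1 + (1−w)·f2 is a least-squares projection coefficient. The sketch below uses the true x directly (the oracle); the paper itself minimizes a surrogate of this quantity, since x is unavailable in practice, and the function names here are illustrative.

```python
def oracle_weight(x, f1, f2):
    # w* = argmin_w || w*f1 + (1-w)*f2 - x ||^2, clipped to [0, 1]
    num = sum((xi - b) * (a - b) for xi, a, b in zip(x, f1, f2))
    den = sum((a - b) ** 2 for a, b in zip(f1, f2))
    if den == 0.0:          # the two estimates coincide
        return 0.5
    return min(1.0, max(0.0, num / den))

def combine(x, f1, f2):
    # oracle-weighted convex combination of the two filter outputs
    w = oracle_weight(x, f1, f2)
    return [w * a + (1.0 - w) * b for a, b in zip(f1, f2)]

def mse(x, f):
    return sum((xi - fi) ** 2 for xi, fi in zip(x, f)) / len(x)
```

Since the MSE is quadratic in w, the combined estimate can never be worse than the better of the two component estimates, which is exactly the guarantee the abstract states.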
Abstract:
We address the problem of denoising images corrupted by multiplicative noise, which is assumed to follow a Gamma distribution. Compared with additive noise, the effect of multiplicative noise on the visual quality of images is quite severe. We consider the mean-square error (MSE) cost function and derive an expression for an unbiased estimate of the MSE. The resulting multiplicative noise unbiased risk estimator is referred to as MURE. The denoising operation is performed in the wavelet domain by considering the image-domain MURE. The parameters of the denoising function (typically, a shrinkage of wavelet coefficients) are optimized by minimizing MURE. We show that MURE is accurate and close to the oracle MSE, which makes MURE-based image denoising reliable and on par with oracle-MSE-based estimates. Analogous to the other popular risk-estimation approaches developed for additive, Poisson, and chi-squared noise degradations, the proposed approach does not assume any prior on the underlying noise-free image. We report denoising results for various noise levels and show that the quality of denoising obtained is on par with the oracle result and better than that obtained using some state-of-the-art denoisers.
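The MURE expression itself is derived in the paper; for intuition about the pattern, the additive-Gaussian analogue it builds on, Stein's unbiased risk estimate (SURE) for soft-thresholding, fits in a few lines. Minimizing the risk estimate over the threshold selects the shrinkage parameter without ever seeing the clean signal; the Gamma/multiplicative case follows the same recipe with a different unbiased estimator. This is a sketch of the analogue, not of MURE.

```python
def soft_threshold(y, t):
    # wavelet-style shrinkage of each coefficient toward zero
    return [(abs(v) - t) * (1.0 if v > 0 else -1.0) if abs(v) > t else 0.0
            for v in y]

def sure_soft(y, sigma, t):
    # Stein's unbiased estimate of E||soft_threshold(y, t) - x||^2
    # for y = x + N(0, sigma^2) noise (Donoho-Johnstone form)
    n = len(y)
    return (n * sigma ** 2
            - 2.0 * sigma ** 2 * sum(1 for v in y if abs(v) <= t)
            + sum(min(abs(v), t) ** 2 for v in y))

def best_threshold(y, sigma, grid):
    # pick the threshold minimizing the risk estimate: no clean image needed
    return min(grid, key=lambda t: sure_soft(y, sigma, t))
```

The key property, shared by MURE, is unbiasedness: averaged over noise realizations, the estimate tracks the true MSE, so minimizing it is nearly as good as minimizing the oracle MSE.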
Abstract:
Traditional methods for measuring the strain field in sheet-metal forming were examined, and their shortcomings and sources of error were identified. A digital image analysis method for measuring the strain field in sheet-metal forming is proposed; the measurement principle, the improvements of the new method over the traditional ones, and ways to reduce measurement error are described. The prospects of digital image analysis are discussed and suggestions for further improvement are given.
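The quantity both the traditional grid method and the digital method ultimately deliver can be stated in one line: the engineering strain of a grid segment is the relative change of its measured length. A minimal sketch, assuming the marker centroids come from the image-analysis step (the coordinates here are hypothetical):

```python
import math

def segment_length(p, q):
    # distance between two grid-marker centroids located in the image
    return math.hypot(q[0] - p[0], q[1] - p[1])

def engineering_strain(p0, q0, p1, q1):
    # (deformed length - original length) / original length
    l0 = segment_length(p0, q0)
    l1 = segment_length(p1, q1)
    return (l1 - l0) / l0
```

Repeating this over every segment of the imaged grid yields the strain field; the digital method's advantage is that centroid location in images is far more precise and repeatable than manual grid measurement.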
Abstract:
A comprehensive model of laser propagation in the atmosphere with a complete adaptive optics (AO) system for phase compensation is presented, and a corresponding computer program was written. A direct wave-front gradient control method is used to reconstruct the wave-front phase. With the long-exposure Strehl ratio as the evaluation parameter, a numerical simulation of an AO system in a stationary state with atmospheric propagation of a laser beam was conducted. It was found that, under certain conditions, the phase screen that describes turbulence in the atmosphere might not be isotropic. Numerical experiments show that the computational results for imaging through lenses obtained with the fast Fourier transform (FFT) method agree well with those computed by an integration method; however, the computer time required by the FFT method is one order of magnitude less than that of the integration method. Phase tailoring of the calculated phase is presented as a means to resolve the mismatch between the variance of the calculated residual phase and the correction effectiveness of an AO system. It is found, for the first time to our knowledge, that for a constant delay time of an AO system, when the lateral wind speed exceeds a threshold the compensation effectiveness of the AO system is better than that of complete phase conjugation. This finding indicates that better compensation capability of an AO system does not imply better correction effectiveness. (C) 2000 Optical Society of America.
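The Strehl ratio used as the evaluation parameter can be computed directly from the residual phase over the aperture, and for small residuals it is commonly approximated by the Maréchal formula exp(−σ²_φ). A minimal sketch over a flat list of phase samples, assuming uniform aperture weighting (the sampling and weighting are illustrative simplifications):

```python
import cmath
import math

def strehl_exact(phases):
    # |<exp(i*phi)>|^2 averaged over the aperture samples
    n = len(phases)
    field = sum(cmath.exp(1j * p) for p in phases) / n
    return abs(field) ** 2

def strehl_marechal(phases):
    # Marechal approximation: exp(-var(phi)), valid for small residual phase
    n = len(phases)
    mean = sum(phases) / n
    var = sum((p - mean) ** 2 for p in phases) / n
    return math.exp(-var)
```

For a perfectly corrected wavefront both expressions give 1, and for small residual variance they agree closely, which is why the approximation is a convenient evaluation parameter in AO simulations.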
Abstract:
It is well known that noise and detection error can affect the performance of an adaptive optics (AO) system. In this paper, the effects of noise and detection error on phase compensation effectiveness in a dynamic AO system are investigated by means of a purely numerical simulation. A theoretical model for numerically simulating the effects of noise and detection error in a static AO system, together with a corresponding computer program, was presented in a previous article. Here, that simulation is combined with our previous numerical simulation of a dynamic AO system, and a corresponding computer program has been compiled. The effects of detection error, readout noise, and photon noise are investigated by numerical simulation to find the preferred working conditions and the best achievable performance of a practical dynamic AO system. An approximate model is presented as well; under many practical conditions it is a good alternative to the more accurate one. A simple algorithm for reducing the effect of noise is also presented: when the signal-to-noise ratio is very low, it can be used to improve the performance of a dynamic AO system.
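The abstract does not spell out the noise-reduction algorithm, so as an illustrative assumption only, one simple strategy consistent with the description is temporal averaging of wavefront-sensor frames: for independent zero-mean readout and photon noise, averaging N frames cuts the noise variance by a factor of N, at the cost of temporal bandwidth.

```python
def average_frames(frames):
    # frames: list of equally sized wavefront-sensor readouts
    # (lists of slope or phase samples); returns their per-sample mean.
    # For independent zero-mean noise, the noise variance of the mean
    # is 1/len(frames) of a single frame's.
    n = len(frames)
    m = len(frames[0])
    return [sum(f[j] for f in frames) / n for j in range(m)]
```

The trade-off is the one the abstract's dynamic setting makes explicit: averaging adds effective delay, so it only pays off when the turbulence evolves slowly relative to the frame rate.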
Abstract:
A new particle image technique was developed to analyze the dispersion of tracer particles in an internally circulating fluidized bed (ICFB). The trajectories and concentration distribution of the tracer particles in the bed were imaged, and the degree of inhomogeneity of the tracer distribution was analyzed. The lateral and axial dispersion coefficients of the particles were calculated for various zones of the ICFB. Results indicate that the lateral diffusion coefficient in a fluidized bed with uneven air distribution is significantly higher than that in uniform bubbling beds with even air distribution. The dispersion coefficients vary along the bed length and height.
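From tracked tracer positions, lateral and axial dispersion coefficients are commonly estimated from mean-square displacements over a time step, an Einstein-type relation D = ⟨Δx²⟩ / (2Δt). The estimator below is a generic sketch of that calculation per axis and per zone, not the paper's specific procedure:

```python
def dispersion_coefficient(displacements, dt):
    # displacements: particle displacements along one axis (lateral or axial)
    # observed over a time interval dt between image frames.
    # Einstein-type estimate: D = <dx^2> / (2 * dt)
    msd = sum(d * d for d in displacements) / len(displacements)
    return msd / (2.0 * dt)
```

Applying this separately to the lateral and axial displacement samples from each zone of the bed yields the zone-wise coefficients the abstract compares.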
Abstract:
This paper proposes to use an extended Gaussian Scale Mixtures (GSM) model instead of the conventional ℓ1 norm to approximate the sparseness constraint in the wavelet domain. We combine this new constraint with subband-dependent minimization to formulate an iterative algorithm on two shift-invariant wavelet transforms: the Shannon wavelet transform and the dual-tree complex wavelet transform (DTCWT). This extended GSM model introduces spatially varying information into the deconvolution process and thus enables the algorithm to achieve better results with fewer iterations in our experiments. ©2009 IEEE.
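Swapping the sparseness model changes only the shrinkage rule inside an otherwise standard iterative deconvolution loop. For reference, the ℓ1 baseline the paper departs from is iterative soft-thresholding (ISTA): a gradient step on the data term followed by a shrinkage step. The matrix, step size, and dimensions below are toy choices; the GSM variant would replace `soft` with a spatially varying shrinkage.

```python
def soft(v, t):
    # soft-thresholding: the proximal operator of the l1 norm
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista(A, y, lam, step, iters=200):
    # minimize 0.5 * ||A x - y||^2 + lam * ||x||_1 by proximal gradient descent
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]  # A x - y
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]          # A^T (A x - y)
        x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]
    return x
```

With A the identity, ISTA collapses to plain soft-thresholding of the data, which makes the role of the shrinkage rule, and hence the effect of replacing it with a GSM-based one, easy to see.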