40 results for Image processing.
Abstract:
Complexity is conventionally defined as the level of detail or intricacy contained within a picture. The study of complexity has received relatively little attention, in part because of the absence of an acceptable metric. Traditionally, normative ratings of complexity have been based on human judgments. However, this study demonstrates that published norms for visual complexity are biased. Familiarity and learning influence the subjective complexity scores for nonsense shapes, with a significant training × familiarity interaction [F(1,52) = 17.53, p < .05]. Several image-processing techniques were explored as alternative measures of picture and image complexity. A perimeter detection measure correlates strongly with human judgments of the complexity of line drawings of real-world objects and nonsense shapes. It captures some of the processes important in judgments of subjective complexity while removing the bias due to familiarity effects.
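The abstract does not specify how the perimeter detection measure is computed. A minimal sketch of one plausible version, assuming edge pixels found by Sobel gradients stand in for the perimeter and that the score is normalised by image area:

    import numpy as np
    from scipy import ndimage

    def perimeter_complexity(image: np.ndarray) -> float:
        """Approximate visual complexity as the fraction of edge (perimeter)
        pixels in a 2-D greyscale array. Illustrative stand-in only; the
        paper's exact perimeter detection measure is not given here.
        """
        img = image.astype(float)
        # Sobel gradients in x and y capture local intensity change.
        gx = ndimage.sobel(img, axis=1)
        gy = ndimage.sobel(img, axis=0)
        magnitude = np.hypot(gx, gy)
        # Threshold the gradient magnitude to mark perimeter (edge) pixels.
        threshold = 0.5 * magnitude.max() if magnitude.max() > 0 else 1.0
        edges = magnitude > threshold
        # Normalise by area so scores are comparable across image sizes.
        return edges.sum() / edges.size

More contour detail yields more edge pixels and hence a higher score, which is the intuition behind using perimeter as a familiarity-free complexity proxy.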
Abstract:
An approach to spatialization is described in which the pixels of an image determine both spatial and other attributes of individual elements in a multi-channel musical texture. The application of this technique in the author’s composition Spaced Images with Noise and Lines is discussed in detail, and its relationship to existing image-to-sound mappings is examined. The particular advantage of modifying spatial properties with image filters is also considered.
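The abstract does not give the author's actual mapping; a toy sketch of the general idea, assuming (purely for illustration) that a pixel's column sets its spatial position across the output channels and its brightness sets its amplitude:

    import numpy as np

    def pixels_to_texture(image: np.ndarray, n_channels: int = 4):
        """Toy pixel-driven spatialization: each bright pixel becomes one
        element of a multi-channel texture. This is a hypothetical mapping,
        not the one used in Spaced Images with Noise and Lines.
        """
        rows, cols = image.shape
        elements = []
        for r, c in zip(*np.nonzero(image > image.mean())):
            position = c / (cols - 1)              # 0.0 .. 1.0 across the width
            channel = position * (n_channels - 1)  # fractional index for panning
            amplitude = image[r, c] / image.max()  # brightness -> loudness
            elements.append((channel, amplitude))
        return elements

Applying an image filter before this mapping (e.g. a blur or edge filter) then changes the spatial distribution of the resulting texture, which is the advantage the abstract highlights.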
Abstract:
This paper investigates sub-integer implementations of the adaptive Gaussian mixture model (GMM) for background/foreground segmentation, allowing the method to be deployed on low-cost, low-power processors that lack a Floating Point Unit (FPU). We propose two novel integer computer arithmetic techniques to update the Gaussian parameters. Specifically, the mean and variance of each Gaussian are updated by a redefined and generalised "round" operation that emulates the original updating rules for a large set of learning rates. Weights are represented by counters that are updated following stochastic rules to allow a wider range of learning rates, and the weight trend is approximated by a line or a staircase. We demonstrate that the memory footprint and computational cost of the GMM are significantly reduced, without appreciably affecting the performance of background/foreground segmentation.
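A minimal sketch of the integer-update idea, assuming the learning rate is expressed as a fraction alpha_num/alpha_den and that the generalised "round" is realised as stochastic rounding (the paper's exact operation and its counter-based weight rules are more elaborate):

    import random

    def update_mean_int(mean: int, sample: int,
                        alpha_num: int, alpha_den: int) -> int:
        """Integer-only emulation of the floating-point GMM mean update

            mean += alpha * (sample - mean)

        with alpha = alpha_num / alpha_den. Sketch of the idea in the
        abstract, not the paper's exact redefined 'round' operation.
        """
        delta = alpha_num * (sample - mean)
        # Integer quotient plus stochastic rounding of the remainder: round
        # up with probability remainder/alpha_den, so the update is unbiased
        # in expectation even when alpha * (sample - mean) is far below 1.
        quotient, remainder = divmod(delta, alpha_den)
        return mean + quotient + (random.randrange(alpha_den) < remainder)

Because the rounding is unbiased, repeated small updates drift toward the true mean on average instead of being truncated to zero, which is what lets the integer version track the floating-point rules across a wide range of learning rates.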
Abstract:
The Field Programmable Gate Array (FPGA) implementation of the widely used Histogram of Oriented Gradients (HOG) algorithm, which extracts features for object detection, is explored. A key focus has been the use of a new FPGA-based processor, IPPro, targeted at image processing. The paper details the mapping and scheduling factors that influence performance and the stages undertaken to deploy the algorithm on FPGA hardware while taking the specific IPPro architecture features into account. We show that multi-core IPPro performance can exceed that of state-of-the-art FPGA designs by up to 3.2 times, with reduced design and implementation effort and increased flexibility, all on a low-cost Zynq programmable system.
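For reference, a software sketch of the HOG feature stage that such designs accelerate (per-pixel gradients binned into per-cell orientation histograms); block normalisation and the IPPro-specific mapping and scheduling are omitted, and the cell size and bin count below are the common defaults, not values taken from the paper:

    import numpy as np

    def hog_cell_histograms(image: np.ndarray, cell: int = 8, bins: int = 9):
        """Per-cell orientation histograms, the core of HOG feature
        extraction. Reference sketch only; normalisation is omitted.
        """
        img = image.astype(float)
        # Centred [-1, 0, 1] gradients, as in the original HOG formulation.
        gx = np.zeros_like(img)
        gy = np.zeros_like(img)
        gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
        gy[1:-1, :] = img[2:, :] - img[:-2, :]
        magnitude = np.hypot(gx, gy)
        # Unsigned orientation in [0, 180) degrees.
        orientation = np.rad2deg(np.arctan2(gy, gx)) % 180.0
        bin_idx = np.minimum((orientation / (180.0 / bins)).astype(int),
                             bins - 1)
        n_rows, n_cols = img.shape[0] // cell, img.shape[1] // cell
        hist = np.zeros((n_rows, n_cols, bins))
        for r in range(n_rows):
            for c in range(n_cols):
                sl = np.s_[r*cell:(r+1)*cell, c*cell:(c+1)*cell]
                # Magnitude-weighted vote into the cell's histogram.
                hist[r, c] = np.bincount(bin_idx[sl].ravel(),
                                         weights=magnitude[sl].ravel(),
                                         minlength=bins)[:bins]
        return hist

The regular, per-pixel and per-cell structure of these loops is what makes HOG a natural fit for multi-core dataflow processors like IPPro.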
Abstract:
The Richardson–Lucy algorithm is one of the most important in image deconvolution. However, a drawback is its slow convergence. A significant acceleration was obtained using the technique proposed by Biggs and Andrews (BA), which is implemented in the deconvlucy function of the MATLAB Image Processing Toolbox. The BA method was developed heuristically, with no proof of convergence. In this paper, we introduce the heavy-ball (H-B) method for Poisson data optimization and extend it to a scaled H-B method, which includes the BA method as a special case. The method comes with a proof of an O(1/k^2) convergence rate, where k is the number of iterations. We demonstrate the superior convergence performance, with a speedup factor of five, of the scaled H-B method on both synthetic and real 3D images.
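A sketch of the acceleration idea, assuming a normalised point spread function and a fixed momentum weight beta; the paper's scaled H-B method (and the BA scheme it generalises) chooses this weight adaptively, so this is an illustration of the extrapolate-then-update structure, not the authors' exact algorithm:

    import numpy as np
    from scipy.signal import fftconvolve

    def rl_heavy_ball(observed, psf, n_iter=50, beta=0.9):
        """Richardson-Lucy deconvolution with a heavy-ball momentum step.
        Fixed-beta sketch; the scaled H-B / BA methods adapt the momentum.
        """
        observed = np.asarray(observed, dtype=float)
        psf = np.asarray(psf, dtype=float)
        psf_flip = psf[::-1, ::-1]          # adjoint of convolution with psf
        x = x_prev = np.full(observed.shape, observed.mean())
        for _ in range(n_iter):
            # Heavy-ball extrapolation: step from the momentum point.
            y = np.maximum(x + beta * (x - x_prev), 1e-12)
            blurred = fftconvolve(y, psf, mode='same')
            # Multiplicative RL update preserves non-negativity, as
            # required for Poisson data.
            ratio = fftconvolve(observed / np.maximum(blurred, 1e-12),
                                psf_flip, mode='same')
            x_prev, x = x, y * ratio
        return x

Plain RL is recovered by setting beta = 0; the momentum term reuses the previous step direction, which is the source of the reported speedup.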