47 results for Blur and noise removal

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited to this task. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and compare them on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
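As a concrete illustration of one of the solvers listed above, here is a minimal sketch of PWC denoising by iterated running medians, in plain NumPy. The window width, iteration count and test signal are illustrative choices, not values from the paper:

```python
import numpy as np

def running_median(x, w):
    """One pass of a running median (odd window w, reflect-padded edges)."""
    pad = w // 2
    xp = np.pad(x, pad, mode="reflect")
    return np.array([np.median(xp[i:i + w]) for i in range(len(x))])

def iterated_running_median(x, w=9, n_iter=30):
    """Repeat the running median until it reaches (approximately) a fixed point."""
    y = x.copy()
    for _ in range(n_iter):
        y_new = running_median(y, w)
        if np.allclose(y_new, y):
            break
        y = y_new
    return y

# Noisy two-level PWC signal: one jump at sample 50
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean + 0.1 * rng.standard_normal(100)
denoised = iterated_running_median(noisy)
```

The median, unlike a linear smoother, preserves the jump while flattening the noise on each plateau, which is why such nonlinear solvers suit PWC signals.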

Relevance:

100.00%

Publisher:

Abstract:

Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory building on results from the image processing community to show that the majority of these algorithms, and more proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem. That paper showed how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others, such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering, are touched on.
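One ingredient mentioned above, mean shift clustering, can be sketched in one dimension: each sample value repeatedly moves to the average of all sample values within a bandwidth, so the samples collapse onto the PWC levels. This is a generic flat-kernel mean shift, not the combined method introduced in the paper; the bandwidth and test signal are illustrative:

```python
import numpy as np

def mean_shift_levels(x, bandwidth, n_iter=50):
    """1-D mean shift on sample *values*: each sample moves to the mean of
    all samples within `bandwidth`, so values cluster at the PWC levels."""
    y = x.astype(float).copy()
    for _ in range(n_iter):
        d = np.abs(y[:, None] - y[None, :])   # pairwise distances
        w = (d < bandwidth).astype(float)     # hard (flat) kernel
        y = (w @ y) / w.sum(axis=1)
    return y

# Two-level PWC signal with Gaussian noise
rng = np.random.default_rng(1)
clean = np.concatenate([np.zeros(40), np.full(40, 2.0)])
noisy = clean + 0.15 * rng.standard_normal(80)
levels = mean_shift_levels(noisy, bandwidth=0.6)
```

Because the two levels are separated by more than the bandwidth, samples from different plateaus never interact, and each cluster collapses to (approximately) its own mean.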

Relevance:

100.00%

Publisher:

Abstract:

In many models of edge analysis in biological vision, the initial stage is a linear 2nd derivative operation. Such models predict that adding a linear luminance ramp to an edge will have no effect on the edge's appearance, since the ramp has no effect on the 2nd derivative. Our experiments did not support this prediction: adding a negative-going ramp to a positive-going edge (or vice versa) greatly reduced the perceived blur and contrast of the edge. The effects on a fairly sharp edge were accurately predicted by a nonlinear multi-scale model of edge processing [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision], in which a half-wave rectifier comes after the 1st derivative filter. But we also found that the ramp affected perceived blur more profoundly when the edge blur was large, and this greater effect was not predicted by the existing model. The model's fit to these data was much improved when the simple half-wave rectifier was replaced by a threshold-like transducer [May, K. A. & Georgeson, M. A. (2007). Blurred edges look faint, and faint edges look sharp: The effect of a gradient threshold in a multi-scale edge coding model. Vision Research, 47, 1705-1720]. This modified model correctly predicted that the interaction between ramp gradient and edge scale would be much larger for blur perception than for contrast perception. In our model, the ramp narrows an internal representation of the gradient profile, leading to a reduction in perceived blur. This in turn reduces perceived contrast, because estimated blur plays a role in the model's estimation of contrast. Interestingly, the model predicts that analogous effects should occur when the width of the window containing the edge is made narrower. This has already been confirmed for blur perception; here, we further support the model by showing a similar effect for contrast perception. © 2007 Elsevier Ltd. All rights reserved.
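The core nonlinearity discussed above can be demonstrated numerically: for a linear 2nd-derivative model a ramp changes nothing, but after half-wave rectification of the 1st derivative, a negative-going ramp narrows the region of positive gradient. A minimal sketch, with all stimulus parameters (blur, ramp slope, support threshold) chosen purely for illustration:

```python
import numpy as np

x = np.linspace(-50, 50, 1001)   # position, arbitrary units
dx = x[1] - x[0]
sigma = 10.0                     # edge blur (scale)

# Gaussian-blurred, positive-going edge: integral of a Gaussian
g = np.exp(-x**2 / (2 * sigma**2))
edge = np.cumsum(g) / g.sum()

def positive_gradient_width(profile):
    """Width of the region with positive luminance gradient, i.e. the support
    of the half-wave-rectified 1st derivative (the model's internal profile)."""
    grad = np.gradient(profile, dx)
    rectified = np.maximum(grad, 0.0)   # half-wave rectifier
    return np.count_nonzero(rectified > 1e-5) * dx

ramp = -0.002 * x                # negative-going luminance ramp
w_plain = positive_gradient_width(edge)
w_ramped = positive_gradient_width(edge + ramp)
# The ramp subtracts a constant from the gradient everywhere, so after
# rectification the internal gradient profile is narrower: w_ramped < w_plain
```

A purely linear 2nd-derivative stage would give identical outputs for both stimuli, since the ramp's 2nd derivative is zero; the rectifier is what breaks that invariance.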

Relevance:

100.00%

Publisher:

Abstract:

The Q parameter scales differently with the noise power for the signal-noise and the noise-noise beating terms in scalar and vector models. Some procedures for including noise in the scalar model largely underestimate the Q parameter.
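For readers unfamiliar with it, the Q parameter is conventionally estimated as Q = (mu1 - mu0)/(sigma1 + sigma0) over the received '1' and '0' samples. The sketch below only illustrates that standard definition with toy beating-noise statistics (signal-noise beating on the '1' rail, noise-noise beating on the '0' rail); the powers and distributions are illustrative assumptions, not the scalar or vector model of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
p_sig, p_noise = 1.0, 0.01   # hypothetical signal and ASE noise powers

# Toy received samples: the '1' rail is dominated by signal-noise beating
# (variance ~ 4*P_sig*P_noise), the '0' rail by noise-noise beating
ones = p_sig + np.sqrt(4 * p_sig * p_noise) * rng.standard_normal(10_000)
zeros = p_noise * rng.chisquare(2, 10_000) / 2

# Standard Q estimate: Q = (mu1 - mu0) / (sigma1 + sigma0)
q = (ones.mean() - zeros.mean()) / (ones.std() + zeros.std())
```

Because sigma1 grows as the square root of the noise power while sigma0 grows linearly with it, the two beating terms pull Q in different ways as noise power changes, which is why mishandling one of them biases the estimate.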

Relevance:

100.00%

Publisher:

Abstract:

Activated sludge basins (ASBs) are a key step in wastewater treatment processes, used to eliminate biodegradable pollution from the water discharged to the natural environment. Bacteria found in the activated sludge consume and assimilate nutrients such as carbon, nitrogen and phosphorus under specific environmental conditions. However, applying the appropriate agitation and aeration regimes to create the environmental conditions that promote bacterial growth is not easy. The agitation and aeration regimes applied to activated sludge basins strongly influence the efficacy of wastewater treatment processes. The major aims of agitation by submersible mixers are to improve the contact between biomass and wastewater and to prevent biomass settling. The mixers induce a horizontal flow in the oxidation ditch, which can be quantified by the mean horizontal velocity. Mean values of 0.3-0.35 m s-1 are recommended as a design criterion to ensure the best conditions for mixing and aeration (Da Silva, 1994). To achieve circulation velocities of this order of magnitude, the positioning and types of mixers are chosen from the plant constructors' experience and the suppliers' data for the impellers. Case studies of existing plants have shown that measured velocities were not in the range specified in the plant design. This illustrates that there is still a need for a design and diagnosis approach to improve process reliability by eliminating or reducing the number of short circuits, dead zones, and zones of inefficient mixing and poor aeration. The objective of the aeration is to facilitate the rapid degradation of pollutants by bacterial growth. To achieve these objectives a wastewater treatment plant must be adequately aerated; as a result, 60-80% of the plant's total energy consumption is dedicated to aeration alone (Juspin and Vasel, 2000).

An earlier study (Gillot et al., 1997) illustrated the influence that hydrodynamics have on aeration performance, as measured by the oxygen transfer coefficient. Therefore, optimising the agitation and aeration systems can enhance the oxygen transfer coefficient and consequently reduce the operating costs of the wastewater treatment plant. It is critically important to estimate the mass transfer coefficient correctly, as any errors could make the simulations of biological activity physically unrepresentative. Therefore, the transfer process was rigorously examined in several different types of process equipment to determine the impact that different hydrodynamic regimes and liquid-side film transfer coefficients have on the gas phase and the mass transfer of oxygen. To model the biological activity occurring in ASBs, several generic biochemical reaction models, known as Activated Sludge Models (ASM), have been developed to characterise different biochemical reaction processes (Henze et al., 2000). The ASM1 protocol was selected to characterise the impact of aeration on the bacteria consuming and assimilating ammonia and nitrate in the wastewater. However, one drawback of ASM protocols is that the hydrodynamics are assumed to be uniform, through the use of perfectly mixed or plug flow reactors, or of a number of perfectly mixed reactors in series. This makes it very difficult to identify the influence of mixing and aeration on oxygen mass transfer and biological activity. Therefore, to account for the impact of the local gas-liquid mixing regime on biochemical activity, Computational Fluid Dynamics (CFD) was used, applying the individual ASM1 reaction equations as source terms to a number of scalar equations. Thus, the application of ASM1 within CFD (FLUENT) enabled the investigation of oxygen transfer efficiency and biological carbon and nitrogen removal in pilot-scale (7.5 cubic metres) and plant-scale (6000 cubic metres) ASBs.

Both studies were used to validate the effect that the hydrodynamic regime (the circulation velocity and mass transfer coefficient) has on oxygen mass transfer, and the effect that this in turn has on the biological activity on pollutants such as ammonia and nitrate (Cartland Glover et al., 2005). The work presented here is one part of an overall approach to improving the understanding of ASBs and of the impact that they have, in terms of hydraulic and biological performance, on the overall wastewater treatment process.

References

CARTLAND GLOVER G., PRINTEMPS C., ESSEMIANI K., MEINHOLD J. (2005). Modelling of wastewater treatment plants: how far shall we go with sophisticated modelling tools? 3rd IWA Leading-Edge Conference & Exhibition on Water and Wastewater Treatment Technologies, 6-8 June 2005, Sapporo, Japan.

DA SILVA G. (1994). Eléments d'optimisation du transfert d'oxygène par fines bulles et agitateur séparé en chenal d'oxydation. PhD Thesis. CEMAGREF, Antony, France.

GILLOT S., DERONZIER G., HEDUIT A. (1997). Oxygen transfer under process conditions in an oxidation ditch equipped with fine bubble diffusers and slow speed mixers. WEFTEC, Chicago, USA.

HENZE M., GUJER W., MINO T., van LOOSDRECHT M. (2000). Activated Sludge Models ASM1, ASM2, ASM2D and ASM3, Scientific and Technical Report No. 9. IWA Publishing, London, UK.

JUSPIN H., VASEL J.-L. (2000). Influence of hydrodynamics on oxygen transfer in the activated sludge process. IWA, Paris, France.
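The oxygen balance underlying the mass transfer discussion above is commonly written as dC/dt = kLa (C_sat - C) - OUR, where kLa is the oxygen transfer coefficient and OUR the oxygen uptake rate of the biomass. A minimal forward-Euler sketch with purely illustrative parameter values (not taken from the study):

```python
kla = 5.0      # hypothetical oxygen mass transfer coefficient [1/h]
c_sat = 9.0    # hypothetical oxygen saturation concentration [mg/L]
our = 20.0     # hypothetical oxygen uptake rate of the biomass [mg/(L*h)]
dt, t_end = 0.001, 5.0   # Euler step and time horizon [h]

c = 0.0        # dissolved oxygen concentration [mg/L], starting anoxic
for _ in range(int(t_end / dt)):
    c += dt * (kla * (c_sat - c) - our)   # dC/dt = kLa*(C_sat - C) - OUR

# At steady state kLa*(C_sat - C) = OUR, so C -> C_sat - OUR/kLa = 5 mg/L here
```

The steady-state expression makes the sensitivity to kLa explicit: an error in the estimated transfer coefficient shifts the predicted dissolved-oxygen level and hence the simulated biological activity.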

Relevance:

100.00%

Publisher:

Abstract:

In the present paper we numerically study the instrumental impact on the statistical properties of a quasi-CW Raman fiber laser using a simple model of multimode laser radiation. The effects with the most influence are the limited electrical bandwidth of the measurement equipment and noise. To examine this influence, we developed a simple model of multimode quasi-CW generation with exponential statistics (i.e. uncorrelated modes). We found that the area near zero intensity in the probability density function (PDF) is strongly affected by both factors; for example, both lead to the formation of a negative wing of the intensity distribution. But the far-wing slope of the PDF is not affected by noise and, for a moderate mismatch between optical and electrical bandwidth, is only slightly affected by the bandwidth limitation. The generation spectrum often becomes broader at higher power in experiments, so the spectral/electrical bandwidth mismatch factor increases with power, which can lead to an artificial dependence of the PDF slope on power. It was also found that both effects influence the ACF background level: noise decreases it, while limited bandwidth increases it. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
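The basic model described above, many uncorrelated modes with random phases giving exponential intensity statistics, can be sketched as follows. The mode count, detuning range and detector noise level are illustrative assumptions, and only the additive-noise effect (the artificial negative wing of the measured PDF) is reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)
n_modes, n_samples = 200, 10_000
t = np.arange(n_samples)

# Sum of many uncorrelated modes with random phases and detunings: the total
# field is approximately complex Gaussian, so its intensity PDF is exponential
phases = rng.uniform(0.0, 2.0 * np.pi, n_modes)
freqs = rng.uniform(-0.4, 0.4, n_modes)   # detunings, cycles per sample
field = np.exp(1j * (2.0 * np.pi * freqs[:, None] * t[None, :]
                     + phases[:, None])).sum(axis=0)
intensity = np.abs(field) ** 2 / n_modes   # normalized so the mean is ~1

# Additive detector noise pushes some low-intensity samples below zero,
# producing a negative wing in the *measured* intensity distribution
measured = intensity + 0.1 * rng.standard_normal(n_samples)
negative_fraction = np.mean(measured < 0)
```

The true intensity is non-negative by construction; every sample below zero in `measured` is purely an instrumental artifact of the kind the paper analyses.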

Relevance:

100.00%

Publisher:

Abstract:

It is well known that even slight changes in nonuniform illumination lead to large image variability and are crucial for many visual tasks. This paper presents a new ICA-related probabilistic model, in which the number of sources exceeds the number of sensors, that performs image segmentation and illumination removal simultaneously. We model illumination and reflectance in log space by a generalized autoregressive process and a hidden Gaussian Markov random field, respectively. The model's ability to deal with segmentation of illuminated images is compared with a Canny edge detector and homomorphic filtering. We apply the model to two problems: synthetic image segmentation and sea surface pollution detection from intensity images.
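Homomorphic filtering, used above as a baseline, separates the two factors in log space: log I = log L + log R, with illumination L assumed slowly varying. A minimal 1-D sketch in which the illumination is estimated by a moving average of log-intensity (the kernel width and synthetic scene are illustrative choices):

```python
import numpy as np

def homomorphic_split(image_row, kernel_width=15):
    """Classical homomorphic decomposition in log space: slowly varying
    illumination is estimated by a moving average of log-intensity;
    the residual approximates (relative) log-reflectance."""
    log_i = np.log(image_row)
    k = np.ones(kernel_width) / kernel_width
    pad = kernel_width // 2
    padded = np.pad(log_i, pad, mode="edge")
    log_illum = np.convolve(padded, k, mode="valid")
    log_refl = log_i - log_illum
    return np.exp(log_illum), np.exp(log_refl)

# Synthetic row: a piecewise reflectance under a smooth illumination gradient
x = np.linspace(0.0, 1.0, 200)
reflectance = np.where(x < 0.5, 0.3, 0.8)
illumination = 1.0 + 0.8 * x
row = reflectance * illumination
illum_est, refl_est = homomorphic_split(row)
```

Note the known limitation that motivates the richer model in the paper: the residual keeps reflectance *edges* (it deviates from 1 at the step) but loses the absolute reflectance level in smooth regions.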

Relevance:

100.00%

Publisher:

Abstract:

A multi-scale model of edge coding based on normalized Gaussian derivative filters successfully predicts perceived scale (blur) for a wide variety of edge profiles [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision]. Our model spatially differentiates the luminance profile, half-wave rectifies the 1st derivative, and then differentiates twice more, to give the 3rd derivative of all regions with a positive gradient. This process is implemented by a set of Gaussian derivative filters with a range of scales. Peaks in the inverted normalized 3rd derivative across space and scale indicate the positions and scales of the edges. The edge contrast can be estimated from the height of the peak. The model provides a veridical estimate of the scale and contrast of edges that have a Gaussian integral profile. Therefore, since scale and contrast are independent stimulus parameters, the model predicts that the perceived value of either of these parameters should be unaffected by changes in the other. This prediction was found to be incorrect: reducing the contrast of an edge made it look sharper, and increasing its scale led to a decrease in the perceived contrast. Our model can account for these effects when the simple half-wave rectifier after the 1st derivative is replaced by a smoothed threshold function described by two parameters. For each subject, one pair of parameters provided a satisfactory fit to the data from all the experiments presented here and in the accompanying paper [May, K. A. & Georgeson, M. A. (2007). Added luminance ramp alters perceived edge blur and contrast: A critical test for derivative-based models of edge coding. Vision Research, 47, 1721-1731]. Thus, when we allow for the visual system's insensitivity to very shallow luminance gradients, our multi-scale model can be extended to edge coding over a wide range of contrasts and blurs. © 2007 Elsevier Ltd. All rights reserved.
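The scale-space step of such a model can be sketched numerically: smooth the profile at a range of scales, take the 3rd derivative, and normalize the peak response by scale^1.5 so that a Gaussian edge of blur b peaks at scale b. This is a simplified sketch that omits the half-wave rectification stage, so it applies to an isolated positive-going edge only; all numerical choices are illustrative:

```python
import numpy as np

def gaussian_smooth(profile, s, dx):
    """Convolve with a unit-sum Gaussian of scale s (edge-padded)."""
    half = int(4 * s / dx)
    xk = np.arange(-half, half + 1) * dx
    k = np.exp(-xk**2 / (2 * s**2))
    k /= k.sum()
    padded = np.pad(profile, half, mode="edge")
    return np.convolve(padded, k, mode="valid")

def estimate_edge_blur(profile, dx, scales):
    """Return the scale at which the scale-normalized (s^1.5) peak of the
    inverted 3rd derivative is largest; for a Gaussian-integral edge of
    blur b, this normalization puts the peak at scale s = b."""
    responses = []
    for s in scales:
        smoothed = gaussian_smooth(profile, s, dx)
        d3 = np.gradient(np.gradient(np.gradient(smoothed, dx), dx), dx)
        responses.append(s**1.5 * np.max(-d3))
    return scales[int(np.argmax(responses))]

# Gaussian edge (integral of a Gaussian) with blur 8 units
dx = 0.5
x = np.arange(-100, 100, dx)
true_blur = 8.0
edge = np.cumsum(np.exp(-x**2 / (2 * true_blur**2)))
edge /= edge[-1]

scales = np.arange(2.0, 20.0, 0.5)
blur_est = estimate_edge_blur(edge, dx, scales)
```

The normalization exponent follows from the fact that smoothing at scale s turns the edge's gradient into a Gaussian of width sqrt(b^2 + s^2), so the raw 3rd-derivative peak falls as (b^2 + s^2)^-1.5 and multiplying by s^1.5 places the maximum at s = b.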

Relevance:

100.00%

Publisher:

Abstract:

We describe a template model for perception of edge blur and identify a crucial early nonlinearity in this process. The main principle is to spatially filter the edge image to produce a 'signature', and then find which of a set of templates best fits that signature. Psychophysical blur-matching data strongly support the use of a second-derivative signature, coupled to Gaussian first-derivative templates. The spatial scale of the best-fitting template signals the edge blur. This model predicts blur-matching data accurately for a wide variety of Gaussian and non-Gaussian edges, but it suffers a bias when edges of opposite sign come close together in sine-wave gratings and other periodic images. This anomaly suggests a second general principle: the region of an image that 'belongs' to a given edge should have a consistent sign or direction of luminance gradient. Segmentation of the gradient profile into regions of common sign is achieved by implementing the second-derivative 'signature' operator as two first-derivative operators separated by a half-wave rectifier. This multiscale system of nonlinear filters predicts perceived blur accurately for periodic and aperiodic waveforms. We also outline its extension to 2-D images and infer the 2-D shape of the receptive fields.

Relevance:

100.00%

Publisher:

Abstract:

Edge blur is an important perceptual cue, but how does the visual system encode the degree of blur at edges? Blur could be measured by the width of the luminance gradient profile, the peak-to-trough separation in the 2nd derivative profile, or the ratio of 1st-to-3rd derivative magnitudes. In template models, the system would store a set of templates of different sizes and find which one best fits the 'signature' of the edge. The signature could be the luminance profile itself, or one of its spatial derivatives. I tested these possibilities in blur-matching experiments. In a 2AFC staircase procedure, observers adjusted the blur of Gaussian edges (30% contrast) to match the perceived blur of various non-Gaussian test edges. In experiment 1, test stimuli were mixtures of 2 Gaussian edges (e.g. 10 and 30 min of arc blur) at the same location, while in experiment 2, test stimuli were formed from a blurred edge sharpened to different extents by a compressive transformation. Predictions of the various models were tested against the blur-matching data, but only one model was strongly supported. This was the template model, in which the input signature is the 2nd derivative of the luminance profile, and the templates are applied to this signature at the zero-crossings. The templates are Gaussian derivative receptive fields that covary in width and length to form a self-similar set (i.e. same shape, different sizes). This naturally predicts that shorter edges should look sharper. As edge length gets shorter, responses of longer templates drop more than shorter ones, and so the response distribution shifts towards shorter (smaller) templates, signalling a sharper edge. The data confirmed this, including the scale-invariance implied by self-similarity, and a good fit was obtained from templates with a length-to-width ratio of about 1. The simultaneous analysis of edge blur and edge location may offer a new solution to the multiscale problem in edge detection.
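The 2AFC staircase procedure used in these experiments can be sketched with a simulated observer: a 1-up/1-down rule converges on the comparison blur that matches the test's perceived blur (the 50% point). All parameter values below, including the simulated observer's judgement noise, are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def looks_more_blurred(test_blur, match_blur, judge_noise=0.5):
    """Simulated observer: returns True if the comparison edge looks more
    blurred than the test; the internal response carries Gaussian noise."""
    return match_blur + judge_noise * rng.standard_normal() > test_blur

test_blur = 15.0     # perceived blur of the test edge (min of arc, illustrative)
match_blur = 5.0     # starting blur of the adjustable comparison edge
step = 1.0           # staircase step size
history = []
for trial in range(400):
    if looks_more_blurred(test_blur, match_blur):
        match_blur -= step   # looked too blurred: sharpen the comparison
    else:
        match_blur += step   # looked too sharp: blur the comparison
    history.append(match_blur)

# Average after the initial convergence: the point of subjective equality
estimate = float(np.mean(history[100:]))
```

The 1-up/1-down rule makes the comparison blur hover around the value the observer cannot reliably distinguish from the test, which is exactly the perceptual match the experiments measure.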

Relevance:

100.00%

Publisher:

Abstract:

Computing circuits composed of noisy logical gates, and their ability to represent arbitrary Boolean functions with a given level of error, are investigated within a statistical mechanics setting. Existing bounds on their performance are straightforwardly retrieved, generalized, and identified as the corresponding typical-case phase transitions. Results on error rates, function depth, and sensitivity, and their dependence on the gate type and noise model used, are also obtained.
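The classic recursion behind such bounds (von Neumann-style analysis of an eps-noisy 3-input majority gate, not the statistical mechanics treatment of the paper) already exhibits the threshold behaviour: below a critical gate noise the error per level stays bounded, above it the error drifts to 1/2 and the output carries no information. A minimal sketch:

```python
def noisy_majority_error(p, eps):
    """Probability that the output of an eps-noisy 3-input majority gate is
    wrong, given three independent inputs each wrong with probability p."""
    m = 3 * p**2 - 2 * p**3               # the clean majority vote is wrong
    return eps * (1 - m) + (1 - eps) * m  # the gate itself flips with prob eps

def iterate(eps, p0=0.2, n=60):
    """Error probability after n levels of noisy-majority restitution."""
    p = p0
    for _ in range(n):
        p = noisy_majority_error(p, eps)
    return p

p_below = iterate(eps=0.05)  # below this recursion's threshold: bounded error
p_above = iterate(eps=0.20)  # above it: the error converges to 1/2
```

For small eps the recursion has a stable fixed point slightly above eps (the residual error no amount of depth can remove), while for large eps the only stable fixed point is 1/2; the critical eps separating the two regimes is the simplest example of the phase transitions the abstract refers to.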

Relevance:

100.00%

Publisher:

Abstract:

This thesis is concerned with optimising the selection of hearing protectors. A computer model was used to estimate the reduction in noise exposure and in the risk of occupational deafness provided by the wearing of hearing protectors in industrial noise spectra. The model was used to show that low-attenuation hearing protectors can provide greater protection than high-attenuation protectors if the high-attenuation protectors are not worn for the total duration of noise exposure, or if even a small proportion of the population does not use them. The model was also used to show that high-attenuation protectors will not necessarily provide a significantly greater reduction in risk than low-attenuation protectors if the population has been exposed to the noise for many years prior to the provision of hearing protectors. The effects of earplugs and earmuffs on the localisation of sounds were studied to determine whether high-attenuation earmuffs are likely to have greater potential than the lower-attenuation earplugs for affecting personal safety. Laboratory studies and experiments at a foundry with normal-hearing office employees and noise-exposed foundrymen who had some experience of wearing hearing protectors showed that although earplugs reduced the ability of the wearer to determine the direction of warning sounds, earmuffs produced more total angular error and more confusions between left and right. It is concluded from the research findings that the key to the selection of hearing protectors is the provision of protectors that can be worn for a very high percentage of the exposure time by a high percentage of the exposed population, with the minimum effect on the personal safety of the wearers: the attenuation provided should be adequate, not a maximum value.
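The energy argument behind partial wear time can be sketched directly: if a protector giving A dB of attenuation is worn for a fraction f of the exposure, the effective attenuation is -10 log10((1 - f) + f * 10^(-A/10)). With illustrative numbers (not taken from the thesis), a 15 dB plug worn for the whole exposure outperforms a 30 dB muff worn 90% of the time:

```python
import math

def effective_attenuation(nominal_db, worn_fraction):
    """Energy-averaged attenuation when a protector rated at `nominal_db` dB
    is worn for only `worn_fraction` of the noise exposure time."""
    unprotected = 1.0 - worn_fraction                     # no protection at all
    protected = worn_fraction * 10.0 ** (-nominal_db / 10.0)
    return -10.0 * math.log10(unprotected + protected)

# High-attenuation muff worn 90% of the time vs low-attenuation plug worn always
high_part_time = effective_attenuation(30.0, 0.90)   # collapses to ~10 dB
low_full_time = effective_attenuation(15.0, 1.00)    # keeps its full 15 dB
```

The unprotected 10% of the exposure dominates the energy sum, which is exactly why the thesis concludes that wearability matters more than nominal attenuation.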

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a study of how edges are detected and encoded by the human visual system. The study begins with theoretical work on the development of a model of edge processing, and includes psychophysical experiments on humans, and computer simulations of these experiments, using the model. The first chapter reviews the literature on edge processing in biological and machine vision, and introduces the mathematical foundations of this area of research. The second chapter gives a formal presentation of a model of edge perception that detects edges and characterizes their blur, contrast and orientation, using Gaussian derivative templates. This model has previously been shown to accurately predict human performance in blur matching tasks with several different types of edge profile. The model provides veridical estimates of the blur and contrast of edges that have a Gaussian integral profile. Since blur and contrast are independent parameters of Gaussian edges, the model predicts that varying one parameter should not affect perception of the other. Psychophysical experiments showed that this prediction is incorrect: reducing the contrast makes an edge look sharper; increasing the blur reduces the perceived contrast. Both of these effects can be explained by introducing a smoothed threshold to one of the processing stages of the model. It is shown that, with this modification, the model can predict the perceived contrast and blur of a number of edge profiles that differ markedly from the ideal Gaussian edge profiles on which the templates are based. With only a few exceptions, the results from all the experiments on blur and contrast perception can be explained reasonably well using one set of parameters for each subject. In the few cases where the model fails, possible extensions to the model are discussed.
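The smoothed threshold mentioned above can be illustrated with one simple functional form, a shifted softplus: approximately zero for gradients well below the threshold, approximately linear well above it, with a smooth transition between. The thesis's actual transducer and parameter values differ; this only shows the qualitative shape:

```python
import numpy as np

def smoothed_threshold(g, t=0.1, sharpness=20.0):
    """Illustrative smoothed threshold (a shifted softplus): ~0 for g well
    below t, ~(g - t) for g well above t. The exact form and parameters
    used in the thesis's model are different; this is purely a sketch."""
    return np.log1p(np.exp(sharpness * (g - t))) / sharpness

gradients = np.linspace(0.0, 1.0, 101)
response = smoothed_threshold(gradients)

sub = smoothed_threshold(np.array([0.0]))[0]    # well below threshold -> ~0
supra = smoothed_threshold(np.array([1.0]))[0]  # well above -> ~ g - t = 0.9
```

Suppressing very shallow gradients is what lets such a transducer make low-contrast edges look sharper: the weak tails of the gradient profile fall below threshold, narrowing the internal representation of the edge.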