976 results for Filtering techniques
Abstract:
One of the significant advances in Nuclear Magnetic Resonance (NMR) spectroscopy for combating spectral complexity when deriving structural and conformational information is the incorporation of an additional dimension, spreading the information content over a two-dimensional space. This approach, together with manipulation of the dynamics of nuclear spins, permitted the design of appropriate pulse sequences and led to the evolution of diverse multidimensional NMR experiments. The desired spectral information can now be extracted in a simplified and orchestrated manner. The indirect detection of multiple quantum (MQ) NMR frequencies is a step in this direction. The MQ technique has been used extensively in the study of molecules aligned in liquid crystalline media to reduce spectral complexity and to determine molecular geometries. Unlike in dipolar coupled systems, the network of scalar coupled spins in isotropic solutions is not large, and MQ 1H detection is not routinely employed, although there are specific examples of spin topology filtering. In this brief review, we discuss our recent studies on the development and application of multiple quantum correlation and resolved techniques for the analysis of proton NMR spectra of scalar coupled spins.
Abstract:
In this paper we present an information filtering agent called the sharable instructable information filtering agent (SIIFA). It adopts the approach of sharable instructable agents. SIIFA provides comprehensible and flexible interaction for representing and filtering documents. The representation scheme in SIIFA is personalized; it can be fully or partly shared among the users of a stream without revealing their interests, and can be easily edited. SIIFA is evaluated on documents from the comp.ai.neural-nets Usenet newsgroup and compared with the vector space method.
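As context for the comparison above, the vector space method ranks documents by cosine similarity between term-count vectors. A minimal sketch (the whitespace tokenization, raw term counts, and threshold are illustrative assumptions, not the paper's exact weighting scheme):

```python
import math
from collections import Counter

def cosine_sim(doc_a: str, doc_b: str) -> float:
    """Cosine similarity between raw term-count vectors of two documents."""
    ca, cb = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    norm_a = math.sqrt(sum(v * v for v in ca.values()))
    norm_b = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def filter_stream(profile: str, docs: list[str], threshold: float = 0.2) -> list[str]:
    """Keep documents whose similarity to the user's interest profile exceeds a threshold."""
    return [d for d in docs if cosine_sim(profile, d) > threshold]
```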
Abstract:
A detailed study on the removal of pollutants (NOx, aldehydes and CO) from the exhaust of a stationary diesel engine is carried out using barrier discharge hybrid plasma techniques, with the objective of making a comparative analysis. For this purpose, the exhaust treatment was carried out in two stages. In the first stage, the exhaust was treated with a plasma process and a plasma-adsorbent hybrid process, and the effectiveness of the two processes with regard to NOx removal and by-product reduction is discussed. In the second stage, the exhaust was treated with a plasma process and a plasma-catalyst hybrid process, and the effectiveness of the two processes with regard to pollutant (NOx, CO) removal and by-product reduction is analyzed. Finally, a comprehensive comparison of the different techniques is made, and feasible plasma-based hybrid techniques for stationary and non-stationary engine exhaust treatment are proposed.
Abstract:
This article reviews recent developments in parametric acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including the various methods and models developed for AE testing of concrete structures. The aim is to provide an overview of the specific features of parameter-based AE techniques for concrete structures developed over the years, with emphasis on traditional parameter-based approaches. Recent studies, such as AE energy analysis and b-value analysis used to assess damage in concrete bridge beams, are also discussed, and the formation of the fracture process zone and the AE energy released during fracture in concrete beam specimens are summarised. A large body of experimental data on the AE characteristics of concrete has accumulated over the last three decades. This review of parametric AE techniques may help researchers and engineers to better understand the failure mechanisms of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention in concrete structures.
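Of the methods surveyed, b-value analysis is the most directly algorithmic: it fits the slope of log cumulative event frequency versus AE amplitude, a Gutenberg-Richter form borrowed from seismology. A minimal sketch, assuming amplitudes in dB and the common convention of scaling the slope by 20 (the bin width is an illustrative default):

```python
import numpy as np

def ae_b_value(amplitudes_db: np.ndarray, bin_width: float = 5.0) -> float:
    """AE b-value: negative slope of log10 N(>=A) vs amplitude A, scaled by 20 dB."""
    edges = np.arange(amplitudes_db.min(), amplitudes_db.max() + bin_width, bin_width)
    # Cumulative exceedance counts N(>= A) at each amplitude threshold
    counts = np.array([(amplitudes_db >= e).sum() for e in edges])
    keep = counts > 0
    slope, _ = np.polyfit(edges[keep], np.log10(counts[keep]), 1)
    return -20.0 * slope  # AE convention: amplitude in dB = 20*log10(linear amplitude)
```

A falling b-value over successive event windows is commonly read as localized macro-crack growth, while a high b-value indicates distributed micro-cracking.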
Abstract:
There has been growing interest in understanding energy metabolism in human embryos generated using assisted reproductive techniques (ART), with a view to improving the overall success rate of the method. Using NMR spectroscopy as a noninvasive tool, we studied human embryo metabolism to identify specific biomarkers for assessing the quality of embryos and their implantation potential. The study was based on estimating pyruvate, lactate and alanine levels in the growth medium, ISM1, used in the culture of embryos. An NMR study involving 127 embryos from 48 couples revealed that embryos transferred on Day 3 (after 72 h of in vitro culture) with successful implantation (pregnancy) exhibited significantly (p < 10⁻⁵) lower pyruvate/alanine ratios compared to those that failed to implant. Lactate levels in the media were similar for all embryos. This implies that, in addition to producing lactate, successfully implanted embryos use pyruvate to produce alanine and to support other cellular functions. While pyruvate and alanine have individually been used as biomarkers, the present study highlights the potential of combining them into a single parameter that correlates strongly with implantation potential. Copyright (C) 2012 John Wiley & Sons, Ltd.
Abstract:
Savitzky-Golay (S-G) filters are finite impulse response lowpass filters obtained by smoothing data with a local least-squares (LS) polynomial approximation. Savitzky and Golay proved in their hallmark paper that local LS fitting of polynomials and their evaluation at the mid-point of the approximation interval is equivalent to filtering with a fixed impulse response. The problem we address here is how to choose a pointwise minimum mean squared error (MMSE) S-G filter length or order for smoothing, while preserving the temporal structure of a time-varying signal. We solve the bias-variance tradeoff involved in the MMSE optimization using Stein's unbiased risk estimator (SURE). We observe that the 3-dB cutoff frequency of the SURE-optimal S-G filter is higher where the signal varies fast locally, and vice versa, essentially enabling us to suitably trade off the bias and variance, thereby resulting in near-MMSE performance. At low signal-to-noise ratios (SNRs), the performance of the adaptive filter-length algorithm improves when a regularization term is incorporated in the SURE objective function. We evaluate the algorithms on real-world electrocardiogram (ECG) signals, and the results exhibit considerable SNR improvement. Noise performance analysis shows that the proposed algorithms are comparable to, and in some cases better than, some standard denoising techniques available in the literature.
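The equivalence Savitzky and Golay proved — a local LS polynomial fit evaluated at the window mid-point equals convolution with a fixed impulse response — can be sketched as follows (the window length and polynomial order here are illustrative defaults; the paper's contribution is choosing them pointwise via SURE):

```python
import numpy as np

def savgol_coeffs(window: int, order: int) -> np.ndarray:
    """Impulse response of an S-G filter: the row of the LS projection that
    evaluates the fitted polynomial at the centre of the window."""
    half = window // 2
    # Columns are t^0, t^1, ..., t^order for sample positions t = -half..half
    A = np.vander(np.arange(-half, half + 1), order + 1, increasing=True)
    return np.linalg.pinv(A)[0]  # row 0 = value of the LS fit at t = 0

def savgol_smooth(x: np.ndarray, window: int = 11, order: int = 3) -> np.ndarray:
    """Smooth x by S-G filtering; identical to sliding local LS fits."""
    h = savgol_coeffs(window, order)
    return np.convolve(x, h[::-1], mode="same")
```

Because the filter reproduces polynomials up to the chosen order exactly, a quadratic passes through unchanged away from the edges, while high-frequency noise is attenuated.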
Abstract:
We address the classical problem of delta feature computation and interpret the operation involved in terms of Savitzky-Golay (SG) filtering. Features such as the mel-frequency cepstral coefficients (MFCCs), obtained from short-time spectra of the speech signal, are commonly used in speech recognition tasks. In order to incorporate the dynamics of speech, auxiliary delta and delta-delta features, computed as temporal derivatives of the original features, are used. Typically, the delta features are computed smoothly using local least-squares (LS) polynomial fitting on each feature vector component trajectory. In the light of the original work of Savitzky and Golay, and a recent article by Schafer in IEEE Signal Processing Magazine, we interpret the dynamic feature vector computation for arbitrary derivative orders as SG filtering with a fixed impulse response. This filtering equivalence brings significantly lower latency with no loss in accuracy, as validated by results on a TIMIT phoneme recognition task. The SG filters involved in dynamic parameter computation can also be viewed as modulation filters, as proposed by Hermansky.
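The equivalence for delta features can be sketched directly: the standard regression-based delta weights n/(2Σn²) are exactly the LS slope of a line fitted over the local window, applied as a fixed filter along each coefficient trajectory (the half-window of 2 frames is a common but illustrative choice):

```python
import numpy as np

def delta_filter(half_window: int = 2) -> np.ndarray:
    """Fixed impulse response of the regression-based delta: w[n] = n / (2*sum n^2)."""
    n = np.arange(-half_window, half_window + 1)
    return n / (2.0 * np.sum(np.arange(1, half_window + 1) ** 2))

def deltas(features: np.ndarray, half_window: int = 2) -> np.ndarray:
    """Delta features for a (frames x coefficients) array, by filtering along time."""
    h = delta_filter(half_window)
    padded = np.pad(features, ((half_window, half_window), (0, 0)), mode="edge")
    return np.apply_along_axis(lambda c: np.correlate(c, h, mode="valid"), 0, padded)
```

On a linearly ramping coefficient trajectory the filter recovers the ramp slope exactly in the interior, which is what the LS line fit would report frame by frame.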
Abstract:
This paper considers the problem of identifying the communication footprints of multiple transmitters in a given geographical area. To do this, a number of sensors are deployed at arbitrary but known locations in the area, and their individual decisions regarding the presence or absence of the transmitters' signal are combined at a fusion center to reconstruct the spatial spectral usage map. One straightforward scheme for constructing this map is to query each of the sensors and cluster those that detect the primary transmitter's signal. However, exploiting the fact that a typical transmitter footprint map is a sparse image, two novel compressive sensing based schemes are proposed, which require significantly fewer transmissions than the querying scheme. A key feature of the proposed schemes is that the measurement matrix is constructed from a pseudo-random binary phase shift applied to the decision of each sensor prior to transmission. The measurement matrix is thus a binary ensemble which satisfies the restricted isometry property. The number of measurements needed for accurate footprint reconstruction is determined using compressive sampling theory. The three schemes are compared through simulations in terms of a performance measure that quantifies the accuracy of the reconstructed spatial spectral usage map. It is found that the proposed sparse-reconstruction-based schemes significantly outperform the round-robin querying scheme.
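The measurement model described — a sparse sensor-decision vector observed through pseudo-random ±1 phase flips — can be sketched as follows. The dimensions are illustrative, and orthogonal matching pursuit is used below purely as a generic sparse solver; it is not necessarily the paper's own recovery algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_meas, sparsity = 64, 32, 3

# Sparse footprint: only a few sensors detect the transmitter's signal
x = np.zeros(n_sensors)
x[rng.choice(n_sensors, size=sparsity, replace=False)] = 1.0

# Binary +/-1 ensemble: pseudo-random phase shifts applied to each sensor's decision
Phi = rng.choice([-1.0, 1.0], size=(n_meas, n_sensors)) / np.sqrt(n_meas)
y = Phi @ x  # fusion-center measurements: far fewer than querying every sensor

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily pick atoms, re-fit by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, sparsity)
```

Here 32 measurements recover a 64-sensor map, illustrating the transmission savings over a 64-query round-robin scheme.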
Abstract:
Western blot analysis is an analytical technique used in molecular biology, biochemistry, immunogenetics and related fields to separate proteins by electrophoresis. The procedure results in images containing nearly rectangular-shaped blots. In this paper, we address the problem of quantifying the blots using automated image processing techniques. We formulate a special active contour (or snake), called Oblong, which locks on to rectangular-shaped objects. Oblongs depend on five free parameters, which is also the minimum number of parameters required for a unique characterization. Unlike many snake formulations, Oblongs do not require explicit gradient computations, and therefore the optimization is fast. The performance of Oblongs is assessed on synthesized data and on Western blot images.
Abstract:
We address the problem of speech enhancement in real-world noisy scenarios. We propose to solve the problem in two stages: a generalized spectral subtraction technique, followed by a sequence of perceptually motivated post-processing algorithms. The role of the post-processing algorithms is to compensate for the effects of noise and to suppress any artifacts created by the first-stage processing. The key post-processing mechanisms aim to suppress musical noise, enhance the formant structure of voiced speech, and denoise the linear-prediction residual. The parameter values in the techniques are fixed optimally by experimentally evaluating enhancement performance as a function of the parameters. We used the Carnegie Mellon University Arctic database for our experiments and considered three real-world noise types: fan noise, car noise, and motorbike noise. Enhancement performance was evaluated through listening experiments on 12 subjects. The listeners reported a clear improvement in perceived quality over the noisy signal, with a mean-opinion-score (MOS) improvement of 0.5 on average, for positive signal-to-noise ratios (SNRs). For negative SNRs, however, the improvement was marginal.
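The first-stage generalized spectral subtraction can be sketched on a single analysis frame. The over-subtraction factor, spectral floor, and power exponent below are illustrative defaults, not the values the paper fixes experimentally:

```python
import numpy as np

def spectral_subtract(noisy, noise_est, alpha=2.0, beta=0.01, p=2.0):
    """Generalized spectral subtraction on one analysis frame.

    noisy, noise_est: time-domain frame and a noise-only estimate of equal length.
    alpha: over-subtraction factor; beta: spectral floor; p: power exponent.
    """
    X = np.fft.rfft(noisy)
    N = np.abs(np.fft.rfft(noise_est))
    mag = np.abs(X) ** p - alpha * N ** p          # subtract the noise power estimate
    mag = np.maximum(mag, beta * N ** p) ** (1.0 / p)  # floor to limit musical noise
    return np.fft.irfft(mag * np.exp(1j * np.angle(X)), n=len(noisy))
```

The spectral floor beta is the main lever against musical noise: flooring to a small fraction of the noise spectrum, rather than to zero, avoids the isolated spectral peaks that the post-processing stage would otherwise have to suppress.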
Abstract:
In this paper we propose a postprocessing technique for a spectrogram-diffusion-based harmonic/percussion decomposition algorithm. The proposed technique removes harmonic instrument leakage in the percussion-enhanced outputs of the baseline algorithm. The technique uses median filtering and adaptive detection of percussive segments in subbands, followed by piecewise signal reconstruction using envelope properties, to ensure that percussion is enhanced while harmonic leakage is suppressed. A new binary mask is created for the percussion signal which, when applied to the original signal, improves harmonic-versus-percussion separation. We compare our algorithm with two recent techniques and show that, on a database of polyphonic Indian music, the postprocessing algorithm improves the harmonic-versus-percussion decomposition significantly.
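The median-filtering step rests on the observation that harmonic energy varies smoothly along time while percussive energy varies smoothly along frequency; comparing the two median-filtered spectrograms yields a binary percussion mask. A minimal sketch (the filter size and the pure-NumPy running median are illustrative; the paper adds subband detection and envelope-based reconstruction on top of this):

```python
import numpy as np

def median_filter_1d(a: np.ndarray, size: int, axis: int) -> np.ndarray:
    """Running median along one axis of a 2-D array, with edge padding."""
    half = size // 2
    pad = [(0, 0)] * a.ndim
    pad[axis] = (half, half)
    ap = np.pad(a, pad, mode="edge")
    windows = np.stack([np.take(ap, range(i, i + a.shape[axis]), axis=axis)
                        for i in range(size)])
    return np.median(windows, axis=0)

def percussion_mask(spec: np.ndarray, size: int = 9) -> np.ndarray:
    """Binary mask over a (freq x time) magnitude spectrogram."""
    harm = median_filter_1d(spec, size, axis=1)  # smooth along time -> harmonic part
    perc = median_filter_1d(spec, size, axis=0)  # smooth along frequency -> percussive part
    return perc > harm  # True where percussive energy dominates
```

On a toy spectrogram, a sustained tone (a horizontal ridge) is rejected by the mask while a broadband onset (a vertical ridge) is kept.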
Abstract:
We analyze the spectral zero-crossing rate (SZCR) properties of transient signals and show that the SZCR contains accurate localization information about the transient. For a train of pulses containing transient events, the SZCR computed on a sliding-window basis is useful for locating the impulse positions accurately. We present the properties of the SZCR on standard stylized signal models and then show how it may be used to estimate the epochs in speech signals. We also present comparisons with some state-of-the-art techniques based on the group-delay function. Experiments on real speech show that the proposed SZCR technique is better than other group-delay-based epoch detectors. In the presence of noise, a comparison with the zero-frequency filtering (ZFF) technique and the Dynamic Programming Projected Phase-Slope Algorithm (DYPSA) showed that the SZCR technique performs better than DYPSA but worse than ZFF. For highpass-filtered speech, where ZFF performance suffers drastically, the identification rates of the SZCR are better than those of DYPSA.
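The localization property can be seen on the stylized impulse model: for x[n] = δ[n − n0] in an N-point frame, the real part of the DFT is cos(2πkn0/N), which changes sign about 2·n0 times across the N bins, so half the crossing count recovers n0. A minimal sketch of that single-impulse case:

```python
import numpy as np

def szcr_location(frame: np.ndarray) -> float:
    """Estimate a lone impulse's location from the spectral zero-crossing count."""
    re = np.real(np.fft.fft(frame))
    # Count sign changes of the real spectrum across adjacent DFT bins
    crossings = np.count_nonzero(np.signbit(re[1:]) != np.signbit(re[:-1]))
    return crossings / 2.0

frame = np.zeros(256)
frame[37] = 1.0  # impulse at sample 37
```

In the paper this counting is done on sliding windows over speech, where each glottal epoch plays the role of the impulse.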
Abstract:
Reliable estimates of species density are fundamental to planning conservation strategies for any species; further, it is equally crucial to identify the most appropriate technique for estimating animal density. Nocturnal, small-sized animal species are notoriously difficult to census accurately, and this issue critically affects their conservation status. We carried out a field study in southern India to estimate the density of the slender loris, a small-sized nocturnal primate, using line and strip transects. Actual counts of study individuals yielded a density estimate of 1.61 ha⁻¹; the density estimate from line transects was 1.08 ha⁻¹; and density estimates varied from 1.06 ha⁻¹ to 0.59 ha⁻¹ across different fixed-width strip transects. We conclude that line and strip transects may typically underestimate densities of cryptic, nocturnal primates.
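For reference, fixed-width strip-transect figures like those above come from the elementary estimator D = n / (2wL): animals counted divided by the surveyed strip area. A minimal sketch (metre inputs and a per-hectare output are assumed for illustration):

```python
def strip_density_per_ha(n_detected: int, transect_length_m: float,
                         half_width_m: float) -> float:
    """Strip-transect density D = n / (2 * w * L), converted to animals per hectare."""
    area_ha = (2.0 * half_width_m * transect_length_m) / 10_000.0  # m^2 -> ha
    return n_detected / area_ha
```

Narrowing the strip raises the chance that every animal inside it is detected but shrinks the sample; the spread of estimates across strip widths reported above reflects exactly this trade-off.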