11 results for background noise
at Cochin University of Science
Abstract:
Speech is the most natural means of communication among human beings, and speech processing and recognition have been intensive areas of research over the last five decades. Since speech recognition is a pattern recognition problem, classification is an important part of any speech recognition system. In this work, a speech recognition system is developed for recognizing speaker-independent spoken digits in Malayalam. Voice signals are sampled directly from the microphone. The proposed method is implemented for 1000 speakers uttering 10 digits each. Since the speech signals are affected by background noise, they are first cleaned using a wavelet denoising method based on soft thresholding. Features are then extracted from the signals using the Discrete Wavelet Transform (DWT), which is well suited to processing non-stationary signals like speech owing to its multi-resolution, multi-scale analysis characteristics. Speech recognition is a multiclass classification problem, so the feature vector set obtained is classified using three classifiers capable of handling multiple classes: Artificial Neural Networks (ANN), Support Vector Machines (SVM) and Naive Bayes. During the classification stage, the classifiers are trained using information relating to known patterns and then evaluated on the test data set. The performance of each classifier is assessed in terms of recognition accuracy. All three methods produced good recognition accuracy: the DWT and ANN combination achieved 89%, DWT and SVM achieved 86.6%, and DWT and Naive Bayes achieved 83.5%. ANN is found to be the best of the three methods.
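The denoising step described above can be sketched in a few lines of Python. The abstract does not specify the wavelet family or the threshold rule, so this illustration assumes a single-level Haar transform with the "universal" threshold; it is a sketch of soft-threshold wavelet denoising, not the exact method of the thesis:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation (a) and detail (d) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt: interleave the reconstructed even/odd samples."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def denoise(x):
    """Denoise by thresholding the detail band (illustrative parameters)."""
    a, d = haar_dwt(x)
    sigma = np.median(np.abs(d)) / 0.6745       # robust noise estimate
    t = sigma * np.sqrt(2.0 * np.log(len(x)))   # universal threshold
    return haar_idwt(a, soft_threshold(d, t))
```

Sub-band energies of the thresholded coefficients could then serve as DWT features for the classifiers.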
Abstract:
As technologies for the fabrication of high-quality microarrays advance rapidly, quantification of microarray data becomes a major task. Gridding is the first step in the analysis of microarray images, locating the subarrays and the individual spots within each subarray. For accurate gridding of high-density microarray images in the presence of contamination and background noise, precise calculation of parameters is essential. This paper presents an accurate, fully automatic gridding method for locating subarrays and individual spots using the intensity projection profile of the most suitable subimage. The method processes the image without any user intervention and does not demand any input parameters, as many other commercial and academic packages do. According to the results obtained, the accuracy of our algorithm is between 95% and 100% for microarray images with a coefficient of variation less than two. Experimental results show that the method can grid microarray images with irregular spots, varying surface intensity distribution and more than 50% contamination.
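The core idea of projection-profile gridding can be illustrated as follows: summing intensities along one axis turns spot rows (or columns) into peaks and the gaps between them into valleys, and the valleys give the grid lines. This is a deliberately simplified sketch of the principle, not the paper's full parameter-free algorithm:

```python
import numpy as np

def grid_lines(image, axis=0):
    """Locate grid boundaries from the intensity projection profile.

    Sums pixel intensities along `axis` and returns the indices of local
    minima that fall below the mean profile level, i.e. the dark gaps
    between spot rows/columns (a simplified illustration).
    """
    profile = image.sum(axis=axis)
    return [i for i in range(1, len(profile) - 1)
            if profile[i] < profile[i - 1]
            and profile[i] <= profile[i + 1]
            and profile[i] < profile.mean()]
```

On a toy image with two bright spot columns separated by a dark gap, the gap column is recovered as the grid line.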
Abstract:
The measurement of global precipitation is of great importance in climate modeling, since the release of latent heat associated with tropical convection is one of the principal driving mechanisms of atmospheric circulation. Knowledge of the larger-scale precipitation field also has important potential applications in generating initial conditions for numerical weather prediction models. Knowledge of the relationship between rainfall intensity and kinetic energy, and its variations in time and space, is important for erosion prediction. Vegetation on earth also greatly depends on the total amount of rainfall as well as the drop size distribution (DSD) in rainfall. While methods using visible, infrared and microwave radiometer data have been shown to yield useful estimates of precipitation, validation of these products for the open ocean has been hampered by the limited amount of surface rainfall measurements available for accurate assessment, especially for the tropical oceans. Surface rainfall measurements (often called the ground truth) are carried out by rain gauges working on various principles: weighing type, tipping bucket, capacitive type and so on. The acoustic technique is yet another promising method of rain parameter measurement that has many advantages. Its basic principle is that droplets falling in water produce underwater sound with distinct features, from which the rainfall parameters can be computed. The acoustic technique can also be used to develop a low-cost, accurate device for automatic measurement of rainfall rate and the kinetic energy of rain, especially suitable for telemetry applications. It can likewise be used to develop a low-cost disdrometer with applications in rainfall analysis as well as in the calibration of nozzles and sprinklers. This thesis is divided into seven chapters, which describe the methodology adopted, the results obtained and the conclusions arrived at.
Abstract:
This thesis addresses one of the emerging topics in Sonar Signal Processing.,viz.the implementation of a target classifier for the noise sources in the ocean, as the operator assisted classification turns out to be tedious,laborious and time consuming.In the work reported in this thesis,various judiciously chosen components of the feature vector are used for realizing the newly proposed Hierarchical Target Trimming Model.The performance of the proposed classifier has been compared with the Euclidean distance and Fuzzy K-Nearest Neighbour Model classifiers and is found to have better success rates.The procedures for generating the Target Feature Record or the Feature vector from the spectral,cepstral and bispectral features have also been suggested.The Feature vector ,so generated from the noise data waveform is compared with the feature vectors available in the knowledge base and the most matching pattern is identified,for the purpose of target classification.In an attempt to improve the success rate of the Feature Vector based classifier,the proposed system has been augmented with the HMM based Classifier.Institutions where both the classifier decisions disagree,a contention resolving mechanism built around the DUET algorithm has been suggested.
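The baseline matching scheme the abstract compares against — finding the closest stored pattern by Euclidean distance — can be sketched directly; the knowledge base and labels below are of course hypothetical placeholders:

```python
import numpy as np

def classify_nearest(feature, knowledge_base, labels):
    """Match a feature vector against stored target records by Euclidean
    distance and return the label of the closest pattern. This is the
    simple Euclidean-distance baseline, not the proposed Hierarchical
    Target Trimming model."""
    distances = np.linalg.norm(knowledge_base - feature, axis=1)
    return labels[int(np.argmin(distances))]
```

The Fuzzy K-Nearest Neighbour variant mentioned in the abstract would instead weight the k closest records by distance to produce class membership degrees rather than a hard label.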
Abstract:
Nonlinear time series analysis is employed to study the complex behaviour exhibited by a coupled pair of Rössler systems. Dimensional analysis, with emphasis on the topological correlation dimension and the Kolmogorov entropy of the system, is carried out in the coupling parameter space. The regime of phase synchronization is identified, and the extent of synchronization between the constituent systems is quantified by the phase synchronization index. The effect of noise on the coupling between the systems is also investigated. An exhaustive study of the topological, dynamical and synchronization properties of the nonlinear system in its characteristic parameter space is attempted.
Abstract:
The Andaman-Nicobar Islands in the Bay of Bengal lie in a zone where the Indian plate subducts beneath the Burmese microplate, and therefore form a belt of frequent earthquakes. Notwithstanding the available historical and instrumental data, few efforts were made before the Mw 9.3 Sumatra-Andaman earthquake to draw any inference on the spatial and temporal distribution of large subduction zone earthquakes in this region. An attempt to constrain the active crustal deformation of the Andaman-Nicobar arc against the background of the December 26, 2004 Great Sumatra-Andaman megathrust earthquake is made here, thereby presenting a unique data set representing the pre-seismic convergence and co-seismic displacement. Understanding the mechanisms of subduction zone earthquakes is both scientifically challenging and important for assessing the related earthquake hazards. In many subduction zones, thrust earthquakes may have characteristic patterns in space and time; however, the mechanism of mega events still remains largely unresolved. Large subduction zone earthquakes are usually associated with high-amplitude co-seismic deformation above the plate boundary megathrust and the elastic relaxation of the fore-arc. These are expressed as vertical changes in land level, with the up-dip part of the rupture surface uplifted and the areas above the down-dip edge subsided. One of the most characteristic patterns of the inter-seismic era is that the deformation is in the opposite sense to that of the co-seismic period. This work was started in 2002 to understand the tectonic deformation along the Andaman-Nicobar arc using seismological, geological and geodetic data. The occurrence of the 2004 megathrust earthquake gave a new dimension to this study by providing an opportunity to examine the co-seismic deformation associated with the greatest earthquake to have occurred since the advent of the Global Positioning System (GPS) and broadband seismometry.
The major objectives of this study are to assess the pre-seismic stress regimes, to determine the pre-seismic convergence rate, to analyze and interpret the pattern of co-seismic displacement and slip on various segments, and to look for any possible recurrence interval of megathrust events in the Andaman-Nicobar subduction zone. This thesis is arranged in six chapters, with further subdivisions dealing with all the above aspects.
Abstract:
The understanding of the theory of entrepreneurship depends upon a set of definitions which provide the base for analytical study. The main objective of the study was to understand the distribution of entrepreneurship in the manufacturing sector among different categories of people in Kerala and to differentiate the socio-psychological background of successful entrepreneur-managers from that of unsuccessful entrepreneur-managers. For the purpose of the study, a sample of 150 entrepreneur-managers of SSI units spread over Ernakulam district was surveyed through a specially designed questionnaire.
Abstract:
Cerebral glioma is the most prevalent primary brain tumor; gliomas are classified broadly into low and high grades according to the degree of malignancy. High grade gliomas are highly malignant and carry a poor prognosis, with patients surviving less than eighteen months after diagnosis. Low grade gliomas are slow growing, least malignant and have a better response to therapy. To date, histological grading is used as the standard technique for diagnosis, treatment planning and survival prediction. The main objective of this thesis is to propose novel methods for automatic extraction of low and high grade glioma and other brain tissues, grade detection techniques for glioma using conventional magnetic resonance imaging (MRI) modalities, and 3D modelling of glioma from segmented tumor slices in order to assess the growth rate of tumors. Two new methods are developed for extracting tumor regions, of which the second, named the Adaptive Gray level Algebraic set Segmentation Algorithm (AGASA), can also extract white matter and grey matter from T1 FLAIR and T2 weighted images. The methods were validated against manual ground truth images and showed promising results. They were also compared with the widely used Fuzzy c-means clustering technique, and the robustness of the algorithm with respect to noise was checked at different noise levels. Image texture can provide significant information on the (ab)normality of tissue, and this thesis extends this idea to tumour texture grading and detection. Based on thresholds of discriminant first order and gray level co-occurrence matrix based second order statistical features, three feature sets were formulated and a decision system was developed for grade detection of glioma from the conventional T2 weighted MRI modality. Quantitative performance analysis using the ROC curve showed 99.03% accuracy in distinguishing between advanced (aggressive) and early stage (non-aggressive) malignant glioma.
The developed brain texture analysis techniques can improve the physician's ability to detect and analyse pathologies, leading to more reliable diagnosis and treatment of disease. The segmented tumors were also used for volumetric modelling, which gives an idea of the growth rate of the tumor; this can be used for assessing response to therapy and patient prognosis.
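The second-order texture statistics mentioned above are derived from the gray level co-occurrence matrix (GLCM). The sketch below builds a GLCM for one displacement and computes three common features; the abstract does not list the exact feature set or displacements used, so contrast, energy and homogeneity are chosen here purely for illustration:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for displacement (dx, dy).
    `img` is assumed to be already quantized to integers in [0, levels)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(max(0, -dy), min(h, h - dy)):
        for j in range(max(0, -dx), min(w, w - dx)):
            m[img[i, j], img[i + dy, j + dx]] += 1
    return m / m.sum()

def glcm_features(p):
    """Three second-order texture statistics from a normalized GLCM
    (an illustrative subset of the features used for grade detection)."""
    i, j = np.indices(p.shape)
    return {
        "contrast": float(np.sum(p * (i - j) ** 2)),
        "energy": float(np.sum(p ** 2)),
        "homogeneity": float(np.sum(p / (1.0 + np.abs(i - j)))),
    }
```

A perfectly uniform region yields zero contrast and maximal energy, while heterogeneous tumour texture raises contrast and lowers energy, which is what makes these statistics discriminative for grading.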
Abstract:
Detection of objects in video is a highly demanding area of research, and background subtraction algorithms can yield good results in foreground object detection. This work presents a hybrid codebook-based background subtraction method to extract the foreground ROI from the background. Codebooks store compressed information, demanding less memory and allowing high-speed processing. The hybrid method, which uses block-based and pixel-based codebooks, provides efficient detection results: the high-speed processing capability of block-based background subtraction and the high precision rate of pixel-based background subtraction are both exploited to yield an efficient background subtraction system. The block stage produces a coarse foreground area, which is then refined by the pixel stage. The system's performance is evaluated with different block sizes and with different block descriptors such as the 2D-DCT and FFT. Experimental analysis based on statistical measurements yields precision, recall, similarity and F-measure of the hybrid system as 88.74%, 91.09%, 81.66% and 89.90% respectively, demonstrating the efficiency of the system.
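The pixel-stage idea can be illustrated with a drastically simplified, single-channel codebook: each pixel learns an intensity range from background frames, and samples falling outside that range are flagged as foreground. A real codebook model keeps multiple codewords per pixel with color and brightness bounds; this sketch collapses that to one min/max codeword purely for illustration:

```python
import numpy as np

def train_codebook(frames, tol=10.0):
    """Learn a per-pixel brightness range (one codeword per pixel) from a
    stack of background frames, widened by a tolerance margin. A heavily
    simplified stand-in for the hybrid block/pixel codebook model."""
    lo = frames.min(axis=0) - tol
    hi = frames.max(axis=0) + tol
    return lo, hi

def foreground_mask(frame, codebook):
    """Pixels outside the learned background range are foreground."""
    lo, hi = codebook
    return (frame < lo) | (frame > hi)

def precision_recall_f(mask, truth):
    """The statistical measurements used for evaluation in the abstract."""
    tp = float(np.sum(mask & truth))
    fp = float(np.sum(mask & ~truth))
    fn = float(np.sum(~mask & truth))
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f
```

The block stage would run the same comparison on block descriptors (e.g. 2D-DCT coefficients) to cheaply mask out large background regions before this per-pixel refinement.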
Abstract:
The adaptive filter is a primary method for filtering the electrocardiogram (ECG), because it does not require knowledge of the signal's statistical characteristics. In this paper, an adaptive filtering technique for denoising the ECG based on a Genetic Algorithm (GA) tuned Sign-Data Least Mean Square (SD-LMS) algorithm is proposed. The technique minimizes the mean-squared error between the primary input, which is a noisy ECG, and a reference input, which can be either noise that is correlated in some way with the noise in the primary input or a signal that is correlated only with the ECG in the primary input. Noise is used as the reference signal in this work. The algorithm was applied to records from the MIT-BIH Arrhythmia database to remove baseline wander and 60 Hz power line interference. The proposed algorithm gave an average signal-to-noise ratio improvement of 10.75 dB for baseline wander and 24.26 dB for power line interference, which is better than previously reported work.
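The SD-LMS noise-canceller structure can be sketched as follows: the adaptive filter shapes the noise reference to match the noise component of the primary input, and the error signal is the cleaned ECG. The GA tuning of the step size is omitted here, and the filter order and step size are illustrative values, not those of the paper:

```python
import numpy as np

def sd_lms(primary, reference, order=8, mu=1e-3):
    """Sign-data LMS adaptive noise canceller (illustrative sketch).

    primary:   noisy signal d[n] (e.g. ECG + interference)
    reference: noise-correlated input x[n]
    Returns the error signal e[n] = d[n] - w.x[n], i.e. the denoised output.
    The sign-data variant updates with the sign of the input vector only.
    """
    w = np.zeros(order)
    x = np.zeros(order)
    out = np.zeros(len(primary))
    for n in range(len(primary)):
        x = np.roll(x, 1)
        x[0] = reference[n]
        y = w @ x                   # filter's estimate of the noise
        e = primary[n] - y          # error = denoised sample
        w += mu * np.sign(x) * e    # sign-data update
        out[n] = e
    return out
```

With a sinusoidal interference reference (standing in for 60 Hz power line noise), the canceller converges so that the residual tracks the underlying signal rather than the interference.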
Abstract:
The paper investigates the feasibility of implementing an intelligent classifier for noise sources in the ocean with the help of artificial neural networks, using higher order spectral features. Non-linear interactions between the component frequencies of the noise data can give rise to certain phase relations called Quadratic Phase Coupling (QPC), which cannot be characterized by power spectral analysis. However, bispectral analysis, which is a higher order estimation technique, can reveal the presence of such phase couplings and provide a measure to quantify them. A feed-forward neural network has been trained and validated with higher order spectral features.
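A direct (FFT-based) bispectrum estimate illustrates why QPC is visible here but not in the power spectrum: the triple product B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)] only averages to a large magnitude when the phases at f1, f2 and f1+f2 are locked. The segment length and averaging scheme below are illustrative choices, not those of the paper:

```python
import numpy as np

def bispectrum(x, nfft=64, seg=64):
    """Segment-averaged direct bispectrum estimate |B(f1, f2)|.
    Peaks indicate quadratic phase coupling between frequency bins
    f1 and f2 (a simplified sketch; no windowing or overlap)."""
    segments = [x[i:i + seg] for i in range(0, len(x) - seg + 1, seg)]
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for s in segments:
        X = np.fft.fft(s, nfft)
        for f1 in range(nfft // 2):
            for f2 in range(nfft // 2):
                if f1 + f2 < nfft:
                    B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.abs(B) / len(segments)
```

For a test signal containing components at bins 8, 12 and 20 whose phases satisfy phi3 = phi1 + phi2, the bispectrum peaks at (8, 12), whereas the power spectrum of the same signal is identical whether or not the phases are coupled.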