572 results for wavelet method

in Queensland University of Technology - ePrints Archive


Relevance:

60.00%

Publisher:

Abstract:

Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic sequences and amino acid sequences, and the results of these investigations in turn inspire further research in biological fields. These biological sequences, which may be considered as multiscale sequences, have specific features that require more refined methods to characterise. This project studies some of these biological challenges with multiscale analysis methods and a stochastic modelling approach.

The first part of the thesis aims to cluster some unknown proteins and classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study some unknown proteins from specific families and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies, and link them to simulate some unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins.

The second part of the thesis aims to explore the relationships of proteins based on a layered comparison with their components. Many methods are based on the homology of proteins, because resemblance at the protein sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We consider protein sequences at a detailed level to investigate the problem of comparing proteins. The comparison is based on the empirical mode decomposition (EMD), and protein sequences are characterised by their intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships of proteins.

The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried to understand the mechanisms of the yeast genome for many years, and how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene, and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated with information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering more mechanisms of the yeast cell cycle.
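As a rough illustration of the wavelet characterisation of linked protein sequences and the cross-correlation style similarity measure described in this abstract, the following Python sketch encodes an amino-acid string numerically, builds a multiscale signature from per-level wavelet energies, and compares two signatures with a normalised correlation. The hydrophobicity encoding, the 'db4' wavelet, and the correlation formula are illustrative assumptions, not the thesis's actual definitions.

```python
import numpy as np
import pywt

# Kyte-Doolittle hydrophobicity values for a few residues; the numerical
# encoding actually used in the thesis is not specified here (assumption).
HYDRO = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
         'G': -0.4, 'L': 3.8, 'K': -3.9, 'F': 2.8, 'S': -0.8}

def encode(seq):
    """Map an amino-acid string to a numerical signal."""
    return np.array([HYDRO.get(a, 0.0) for a in seq])

def wavelet_signature(signal, wavelet='db4', level=4):
    """Per-level wavelet energies as a crude multiscale signature."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

def similarity(sig_a, sig_b):
    """Normalised correlation between two signatures (illustrative measure)."""
    a, b = sig_a - sig_a.mean(), sig_b - sig_b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

protein_a = encode("ARNDCGLKFS" * 20)   # toy "linked" sequences
protein_b = encode("GLKFSARNDC" * 20)
print(similarity(wavelet_signature(protein_a), wavelet_signature(protein_b)))
```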

Relevance:

30.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project them onto the lattice outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
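To make the wavelet packet step above concrete, the sketch below decomposes an image with a 2-D wavelet packet and collects per-subband statistics of the kind that drive the subband-dependent coding decisions. It is a minimal sketch assuming the 'bior4.4' wavelet and a three-level tree; the thesis's fixed decomposition structure, generalized Gaussian model fitting, bit allocation and lattice vector quantization are not reproduced.

```python
import numpy as np
import pywt

def packet_subband_stats(image, wavelet='bior4.4', level=3):
    """2-D wavelet packet decomposition with per-subband variance and energy."""
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet,
                              mode='periodization', maxlevel=level)
    stats = {}
    for node in wp.get_level(level, order='natural'):
        coeffs = node.data
        stats[node.path] = {'variance': float(np.var(coeffs)),
                            'energy': float(np.sum(coeffs ** 2))}
    return stats

# Synthetic ridge-like pattern standing in for a fingerprint image.
x = np.linspace(0, 16 * np.pi, 256)
image = np.sin(np.add.outer(x, 0.5 * x))
for path, s in sorted(packet_subband_stats(image).items())[:4]:
    print(path, s)
```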

Relevance:

30.00%

Publisher:

Abstract:

Many industrial processes and systems can be modelled mathematically by a set of Partial Differential Equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation, and process control purposes. However, major difficulties appear when solving PDEs with singularities. Traditional numerical methods, such as finite difference, finite element, and polynomial-based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computational power, due to the large number of elements or mesh points needed to accommodate sharp variations. To tackle this challenging problem, wavelet based approaches and high resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet based approaches and high resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter highlights the successful application of these new methods to complex models of simulated-moving-bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, exhibit wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet based approaches and high resolution methods, a single column chromatographic process modelled by a Transport-Dispersive-Equilibrium linear model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation, and high resolution methods are evaluated by quantitative comparison with the analytical solution for a range of Peclet numbers. After that, the advantages of the wavelet based approaches and high resolution methods are further demonstrated through applications to a dynamic SMB model for an enantiomer separation process. This research has revealed that for a PDE system with a low Peclet number, all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of accuracy of the numerical solution. The high resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet based approaches and high resolution methods are good candidates in terms of computational demand and prediction accuracy on the steep front. The high resolution methods have shown better stability in achieving steady state in the specific case studied in this chapter.
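The upwind-1 finite difference baseline mentioned above can be illustrated with a minimal sketch for a one-dimensional advection-dispersion equation, the transport part of a Transport-Dispersive model. The velocity, dispersion coefficient and boundary conditions below are arbitrary placeholders; the chapter's full Transport-Dispersive-Equilibrium model also includes adsorption terms that are omitted here.

```python
import numpy as np

def upwind_advection_dispersion(nx=200, nt=2000, length=1.0, t_end=0.5,
                                velocity=1.0, dispersion=1e-3):
    """First-order upwind for advection, central differences for dispersion,
    explicit Euler in time: c_t + v c_x = D c_xx."""
    dx = length / (nx - 1)
    dt = t_end / nt
    # Stability of this explicit scheme needs v*dt/dx <= 1 and 2*D*dt/dx**2 <= 1.
    assert velocity * dt / dx <= 1.0 and 2 * dispersion * dt / dx**2 <= 1.0
    c = np.zeros(nx)
    for _ in range(nt):
        c_new = c.copy()
        c_new[1:-1] = (c[1:-1]
                       - velocity * dt / dx * (c[1:-1] - c[:-2])          # upwind-1
                       + dispersion * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2]))
        c_new[0] = 1.0          # constant inlet concentration (step injection)
        c_new[-1] = c_new[-2]   # zero-gradient outlet
        c = c_new
    return c

profile = upwind_advection_dispersion()
print(profile[::40])
```

The first-order upwind term is exactly what makes this scheme cheap but numerically diffusive, which is why the abstract reports it needing the most computation for a given accuracy at low Peclet numbers.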

Relevance:

30.00%

Publisher:

Abstract:

Circuit breaker restrikes are unwanted occurrence, which can ultimately lead to breaker. Before 2008, there was little evidence in the literature of monitoring techniques based on restrike measurement and interpretation produced during switching of capacitor banks and shunt reactor banks. In 2008 a non-intrusive radiometric restrike measurement method, as well a restrike hardware detection algorithm was developed. The limitations of the radiometric measurement method are a band limited frequency response as well as limitations in amplitude determination. Current detection methods and algorithms required the use of wide bandwidth current transformers and voltage dividers. A novel non-intrusive restrike diagnostic algorithm using ATP (Alternative Transient Program) and wavelet transforms is proposed. Wavelet transforms are the most common use in signal processing, which is divided into two tests, i.e. restrike detection and energy level based on deteriorated waveforms in different types of restrike. A ‘db5’ wavelet was selected in the tests as it gave a 97% correct diagnostic rate evaluated using a database of diagnostic signatures. This was also tested using restrike waveforms simulated under different network parameters which gave a 92% correct diagnostic responses. The diagnostic technique and methodology developed in this research can be applied to any power monitoring system with slight modification for restrike detection.
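The 'db5' decomposition and energy-level test described above can be sketched as follows: decompose a measured waveform with the 'db5' wavelet and compute the relative energy per detail level as a feature for detection. The synthetic waveform, level count and thresholding-free output are assumptions for illustration; the thesis's diagnostic signature database and decision rules are not reproduced.

```python
import numpy as np
import pywt

def restrike_energy_features(waveform, wavelet='db5', level=5):
    """Relative energy of each 'db5' detail level, a simple energy-level feature."""
    coeffs = pywt.wavedec(waveform, wavelet, level=level)
    detail_energy = np.array([np.sum(d ** 2) for d in coeffs[1:]])
    return detail_energy / (detail_energy.sum() + 1e-12)

# Synthetic example: a 50 Hz voltage with a short high-frequency burst
# standing in for a restrike transient (100 kHz sampling assumed).
t = np.linspace(0, 0.1, 10_000)
signal = np.sin(2 * np.pi * 50 * t)
signal[5000:5200] += 0.5 * np.sin(2 * np.pi * 20_000 * t[5000:5200])
print(restrike_energy_features(signal))
```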

Relevance:

30.00%

Publisher:

Abstract:

Inspection of solder joints has been a critical process in the electronic manufacturing industry to reduce manufacturing cost, improve yield, and ensure product quality and reliability. This paper proposes the use of the Log-Gabor filter bank, the Discrete Wavelet Transform and the Discrete Cosine Transform for feature extraction of solder joint images on Printed Circuit Boards (PCBs). A distance based on the Mahalanobis Cosine metric is also presented for classification of five different types of solder joints. From the experimental results, this methodology achieved high accuracy and well-generalised performance. This can be an effective method to reduce cost and improve quality in the production of PCBs in the manufacturing industry.
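A minimal sketch of the feature-plus-metric idea follows: it concatenates a few low-frequency 2-D DCT coefficients with level-1 DWT sub-band energies, and evaluates a cosine distance in a whitened (Mahalanobis) space. The reading of "Mahalanobis Cosine metric" as cosine distance after whitening, the omission of the Log-Gabor filter bank, and all parameter choices are assumptions, not the paper's actual pipeline.

```python
import numpy as np
import pywt
from scipy.fft import dctn

def joint_features(patch, n_dct=16):
    """Low-frequency 2-D DCT coefficients plus level-1 DWT sub-band energies."""
    dct_coeffs = dctn(patch, norm='ortho')[:4, :4].ravel()[:n_dct]
    cA, (cH, cV, cD) = pywt.dwt2(patch, 'db2')
    dwt_energy = [np.sum(b ** 2) for b in (cA, cH, cV, cD)]
    return np.concatenate([dct_coeffs, dwt_energy])

def mahalanobis_cosine(x, y, cov):
    """Cosine distance evaluated in the whitened space (assumed interpretation)."""
    w = np.linalg.inv(np.linalg.cholesky(cov))     # whitening transform
    xw, yw = w @ x, w @ y
    return 1.0 - float(xw @ yw / (np.linalg.norm(xw) * np.linalg.norm(yw)))

rng = np.random.default_rng(0)
a, b = rng.random((32, 32)), rng.random((32, 32))   # toy image patches
fa, fb = joint_features(a), joint_features(b)
cov = np.cov(np.stack([joint_features(rng.random((32, 32))) for _ in range(50)]).T)
cov += 1e-6 * np.eye(cov.shape[0])                   # regularise for stability
print(mahalanobis_cosine(fa, fb, cov))
```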

Relevance:

30.00%

Publisher:

Abstract:

A breaker restrike is an abnormal arcing phenomenon leading to possible breaker failure. Eventually, this failure leads to interruption of the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on the measurement and interpretation of restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the limitations of the radiometric measurement method are a band limited frequency response as well as limitations in amplitude determination. Current restrike detection methods and algorithms require the use of wide bandwidth current transformers and high voltage dividers.

A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms which supports diagnostics is proposed. Restrike phenomena thereby become the basis of a new diagnostic process using measurements, ATP and Wavelet Transforms for online interrupter monitoring. This research project investigates the restrike switch model parameter 'A', the dielectric voltage gradient, related to normal and slowed cases of the contact opening velocity and to the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise to a point where it withstands this transient voltage; if it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap, that is, the points at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in the opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope.

To investigate the diagnostic process, an ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with an empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if there are similar restrike waveform signatures for the measured and simulated waveforms. The restrike switch model applications are used for: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high frequency nozzle current magnitude was applied to an equation (derived from the literature) which can calculate the life extension of the interrupter of an SF6 high voltage CB. The restrike waveform signatures for medium and high voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection.

Experimental investigation of a 12 kV vacuum CB diagnostic was carried out for the parameter determination, and a passive antenna calibration was also successfully developed, with applications for field implementation. The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage related to the slow opening velocity of the mechanism gives a measure of the degree of contact degradation. A predictive interpretation technique is a computer modelling approach for assessing switching device performance which allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis is that it provides a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret breaker restrike risk. Measurements on high voltage circuit-breakers can identify degradation that can interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for the monitoring of restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to the interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
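The straight-line fit through the flashover voltage points described above amounts to a simple linear regression on the escalation-voltage envelope; the slope is the quantity the thesis relates to contact opening velocity. The sketch below uses synthetic, hypothetical voltage peaks for a normal and a slowed mechanism purely to show the calculation.

```python
import numpy as np

def escalation_slope(times, peak_voltages):
    """Fit a straight line through the gap-voltage values at each flashover
    and return its slope and intercept."""
    slope, intercept = np.polyfit(times, peak_voltages, 1)
    return slope, intercept

# Hypothetical restrike peaks for a normal and a slowed opening mechanism.
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0]) * 1e-3    # seconds
normal = np.array([10, 18, 27, 35, 44]) * 1e3      # volts
slowed = np.array([10, 14, 19, 23, 28]) * 1e3
print("normal envelope slope [V/s]:", escalation_slope(t, normal)[0])
print("slowed envelope slope [V/s]:", escalation_slope(t, slowed)[0])
```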

Relevance:

30.00%

Publisher:

Abstract:

A novel combined near- and mid-infrared (NIR and MIR) spectroscopic method has been researched and developed for the analysis of complex substances such as the Traditional Chinese Medicine (TCM) Illicium verum Hook. F. (IVHF) and its noxious adulterant, Illicium lanceolatum A.C. Smith (ILACS). Three types of spectral matrix were submitted for classification with the use of the linear discriminant analysis (LDA) method. The data were pretreated with either the successive projections algorithm (SPA) or the discrete wavelet transform (DWT) method. The SPA method performed somewhat better, principally because it required fewer spectral features for its pretreatment model. Thus, the NIR and MIR matrices, as well as the combined NIR/MIR one, were pretreated by the SPA method and then analysed by LDA. This approach enabled the prediction and classification of the IVHF, ILACS and mixed samples. The MIR spectral data produced somewhat better classification rates than the NIR data. However, the best results were obtained from the combined NIR/MIR data matrix, with 95–100% correct classifications for calibration, validation and prediction. Principal component analysis (PCA) of the three types of spectral data supported the results obtained with the LDA classification method.
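The DWT-pretreatment-then-LDA route mentioned above can be sketched as follows: each spectrum is compressed to its coarse wavelet approximation coefficients and an LDA classifier is cross-validated on the result. The synthetic two-class spectra, the 'db4' wavelet and the decomposition depth are stand-ins; the real NIR/MIR data and the SPA pretreatment are not reproduced.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def dwt_features(spectrum, wavelet='db4', level=3):
    """Level-3 approximation coefficients as a compact spectral representation."""
    return pywt.wavedec(spectrum, wavelet, level=level)[0]

# Synthetic stand-ins for two classes of spectra with slightly shifted bands.
rng = np.random.default_rng(1)
axis = np.linspace(0, 1, 512)

def make_class(center, n=30):
    return np.stack([np.exp(-((axis - center) ** 2) / 0.01)
                     + 0.05 * rng.standard_normal(512) for _ in range(n)])

X = np.vstack([make_class(0.40), make_class(0.45)])
y = np.array([0] * 30 + [1] * 30)
X_dwt = np.stack([dwt_features(s) for s in X])

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X_dwt, y, cv=5).mean())
```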

Relevance:

30.00%

Publisher:

Abstract:

Frog protection has become increasingly essential due to the rapid decline in frog biodiversity, so it is valuable to develop new methods for studying this biodiversity. In this paper, a novel feature extraction method based on perceptual wavelet packet decomposition is proposed for classifying frog calls in noisy environments. Pre-processing and syllable segmentation are first applied to the frog call. Then, a spectral peak track is extracted from each syllable where possible, and track duration, dominant frequency and oscillation rate are extracted directly from the track. With the k-means clustering algorithm, the calculated dominant frequencies of all frog species are clustered into k groups, which produces a frequency scale for wavelet packet decomposition. Based on this adaptive frequency scale, wavelet packet decomposition is applied to the frog calls. Using the wavelet packet decomposition coefficients, a new feature set named perceptual wavelet packet decomposition sub-band cepstral coefficients is extracted. Finally, a k-nearest neighbour (k-NN) classifier is used for the classification. The experimental results show that the proposed features achieve an average classification accuracy of 97.45%, which outperforms syllable features (86.87%) and Mel-frequency cepstral coefficient (MFCC) features (90.80%).
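The pipeline above (k-means on dominant frequencies to form an adaptive scale, wavelet packet sub-band energies pooled onto that scale, then a k-NN classifier) can be approximated with the sketch below. The way terminal packet nodes are pooled to the nearest centroid, the 'db4' wavelet, and the toy tone "calls" are assumptions; the paper's perceptual scale construction and segmentation are not reproduced.

```python
import numpy as np
import pywt
from scipy.fft import dct
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def adaptive_scale(dominant_freqs, k=2):
    """Sorted k-means centroids of the dominant frequencies act as the scale."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(
        np.asarray(dominant_freqs).reshape(-1, 1))
    return np.sort(km.cluster_centers_.ravel())

def wpd_cepstral(signal, fs, centers, wavelet='db4', level=6):
    """Sub-band log-energies pooled onto the adaptive scale, decorrelated by DCT."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order='freq')        # frequency-ordered bands
    band_width = (fs / 2) / len(nodes)
    energies = np.zeros(len(centers))
    for i, node in enumerate(nodes):
        band_center = (i + 0.5) * band_width
        j = np.argmin(np.abs(centers - band_center))  # nearest scale bin
        energies[j] += np.sum(node.data ** 2)
    return dct(np.log(energies + 1e-12), norm='ortho')

# Toy example: two "species" as tones at different dominant frequencies.
fs, t = 16_000, np.linspace(0, 0.25, 4000)
calls = [np.sin(2 * np.pi * f * t) for f in (800, 820, 2500, 2600)]
labels = [0, 0, 1, 1]
centers = adaptive_scale([800, 820, 2500, 2600], k=2)
X = np.stack([wpd_cepstral(c, fs, centers) for c in calls])
knn = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(knn.predict(X))
```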

Relevance:

30.00%

Publisher:

Abstract:

Environmental changes have put great pressure on biological systems, leading to a rapid decline in biodiversity. To monitor this change and protect biodiversity, animal vocalizations have been widely explored with the aid of acoustic sensors deployed in the field, and large volumes of acoustic data are consequently collected. However, traditional manual methods that require ecologists to physically visit sites to collect biodiversity data are both costly and time consuming. It is therefore essential to develop new semi-automated and automated methods to identify species in these audio recordings. In this study, a novel feature extraction method based on wavelet packet decomposition is proposed for frog call classification. After syllable segmentation, each syllable of the advertisement call is represented by a spectral peak track, from which track duration, dominant frequency and oscillation rate are calculated. Then, a k-means clustering algorithm is applied to the dominant frequencies, and the centroids of the clustering results are used to generate the frequency scale for wavelet packet decomposition (WPD). Next, a new feature set named adaptive frequency scaled wavelet packet decomposition sub-band cepstral coefficients is extracted by performing WPD on the windowed frog calls. Furthermore, the statistics of all feature vectors over each windowed signal are calculated to produce the final feature set. Finally, two well-known classifiers, a k-nearest neighbour classifier and a support vector machine classifier, are used for classification. In our experiments, we use two different datasets from Queensland, Australia (18 frog species from commercial recordings and 8 frog species from James Cook University field recordings). The weighted classification accuracy with our proposed method is 99.5% and 97.4% for the 18 frog species and 8 frog species respectively, which outperforms all other comparable methods.
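The windowing and statistics step of this study, computing feature vectors per window and then summarising them over the whole call before classification, can be sketched as below. The per-window wavelet packet log-energies, the mean/standard-deviation summary, the linear SVM and the synthetic calls are all illustrative assumptions rather than the paper's exact feature definition.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def window_stats_features(signal, win=512, hop=256, wavelet='db4', level=4):
    """Per-window wavelet packet sub-band log-energies, then mean and standard
    deviation over all windows as a fixed-length call-level feature."""
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    feats = []
    for frame in frames:
        wp = pywt.WaveletPacket(data=frame, wavelet=wavelet, maxlevel=level)
        energies = [np.sum(n.data ** 2) for n in wp.get_level(level, order='freq')]
        feats.append(np.log(np.asarray(energies) + 1e-12))
    feats = np.stack(feats)
    return np.concatenate([feats.mean(axis=0), feats.std(axis=0)])

# Toy calls from two synthetic "species".
fs, t = 16_000, np.linspace(0, 0.5, 8000)
calls = [np.sin(2 * np.pi * f * t)
         + 0.1 * np.random.default_rng(i).standard_normal(8000)
         for i, f in enumerate((900, 950, 3000, 3100))]
labels = [0, 0, 1, 1]
X = np.stack([window_stats_features(c) for c in calls])
clf = SVC(kernel='linear').fit(X, labels)
print(clf.predict(X))
```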

Relevance:

20.00%

Publisher:

Abstract:

Fleck and Johnson (Int. J. Mech. Sci. 29 (1987) 507) and Fleck et al. (Proc. Inst. Mech. Eng. 206 (1992) 119) have developed foil rolling models which allow for large deformations in the roll profile, including the possibility that the rolls flatten completely. However, these models require computationally expensive iterative solution techniques. A new approach to the approximate solution of the Fleck et al. (1992) Influence Function Model has been developed using both analytic and approximation techniques. The numerical difficulties arising from solving an integral equation in the flattened region have been reduced by applying an Inverse Hilbert Transform to obtain an analytic expression for the pressure. The method described in this paper is applicable whether or not a flattened region is present.
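As a generic numerical illustration of the Hilbert-transform machinery invoked above, the short sketch below checks a standard transform pair (the Hilbert transform of cos is sin) using the analytic signal. This is only a background illustration; the paper's inverse (finite-interval) Hilbert transform for the roll pressure is an analytic construction that is not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

# scipy.signal.hilbert returns the analytic signal; its imaginary part is the
# Hilbert transform of the input. For cos(x) this should recover sin(x).
x = np.linspace(0, 8 * np.pi, 4096, endpoint=False)   # whole number of periods
signal = np.cos(x)
h = np.imag(hilbert(signal))
print("max deviation from sin(x):", np.max(np.abs(h - np.sin(x))))
```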

Relevance:

20.00%

Publisher: