990 results for Al-Alaoui Transform


Relevance: 100.00%

Abstract:

Mathematics Subject Classification: 26A33, 93B51, 93C95

Relevance: 30.00%

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation of these images. A specific fixed decomposition structure is designed for the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the most suitable representation for fingerprints according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics; the decision rests on the effect of each subband on the reconstructed image under the mean-square criterion as well as the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model of the distribution of the wavelet coefficients is developed, based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit-allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used; lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and the scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a non-optimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed: no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. To take advantage of the properties of LVQ and its fast implementation, while accounting for the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms yield high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification, so enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
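The generalized Gaussian modelling step described above can be illustrated compactly. The Python sketch below estimates the GGD shape parameter of a wavelet subband by the classical moment-ratio method, solved with a bracketed root finder; this is a simpler stand-in for the thesis's least-squares formulation, and the function name and bracketing interval are choices of this example.

    import numpy as np
    from scipy.special import gamma
    from scipy.optimize import brentq

    def ggd_shape(coeffs, lo=0.1, hi=5.0):
        # For a generalized Gaussian, r(b) = Gamma(1/b)Gamma(3/b)/Gamma(2/b)^2
        # equals E[x^2]/E[|x|]^2 and is monotone decreasing in b, so the
        # shape parameter can be found by bracketed root finding.
        m1 = np.mean(np.abs(coeffs))
        m2 = np.mean(np.square(coeffs))
        r = lambda b: gamma(1.0 / b) * gamma(3.0 / b) / gamma(2.0 / b) ** 2
        return brentq(lambda b: r(b) - m2 / m1 ** 2, lo, hi)

    # Sanity check: Gaussian data should give a shape parameter near 2.
    print(ggd_shape(np.random.default_rng(0).normal(size=100_000)))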

Relevance: 30.00%

Abstract:

In Australia and increasingly worldwide, methamphetamine is one of the most commonly seized drugs analysed by forensic chemists. The well-established GC/MS methods currently used to identify and quantify methamphetamine are lengthy and expensive, yet undercover police often request rapid analysis, motivating the development of a new analytical technique. Ninety-six illicit drug seizures containing methamphetamine (0.1%-78.6%) were analysed using Fourier transform infrared spectroscopy with an attenuated total reflectance attachment, combined with chemometrics. Two partial least squares (PLS) models were developed: one using the principal infrared peaks of methamphetamine and the other a hierarchical PLS model. Both models were refined to select the variables most closely associated with the methamphetamine-percentage vector. Both models performed well: the principal-peaks PLS model had a root mean square error of prediction (RMSEP) of 3.8, an R² of 0.9779, and a lower limit of quantification of 7% methamphetamine, while the hierarchical PLS model had a lower limit of quantification of 0.3% methamphetamine, an RMSEP of 5.2, and an R² of 0.9637. Such models offer rapid and effective methods for screening illicit drug samples to determine the percentage of methamphetamine they contain.
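As a sketch of the chemometric step, the Python snippet below fits a PLS regression to a matrix of ATR-FTIR spectra and estimates an RMSEP by cross-validation. The data arrays, component count, and fold count are hypothetical placeholders; the paper's variable selection and hierarchical modelling are not reproduced here.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical data: rows are ATR-FTIR absorbance spectra of seizures,
    # y is the methamphetamine content in % w/w from a reference method.
    rng = np.random.default_rng(1)
    X = rng.random((96, 1800))           # 96 seizures x 1800 wavenumbers
    y = rng.uniform(0.1, 78.6, 96)       # placeholder reference values

    pls = PLSRegression(n_components=6)  # component count would be tuned
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
    rmsep = float(np.sqrt(np.mean((y - y_cv) ** 2)))
    print(f"RMSEP: {rmsep:.2f} % methamphetamine")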

Relevance: 30.00%

Abstract:

Background: Heatwaves can cause excess deaths ranging from tens to thousands within a couple of weeks in a local area. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under 'normal' conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in mean level (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series in the field of signal processing; however, it has not previously been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data.

Methods: Special R functions were developed to implement the HHT algorithm and decompose the daily mortality time series into trend and non-trend components according to the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series.

Results: Daily mortality time series from Brisbane (Queensland, Australia) and Chicago (United States) were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results for the Brisbane data were used as input to a subsequent regression analysis investigating the association between excess mortality and different risk factors.

Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
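The paper implemented HHT in R; as an illustration only, the Python sketch below uses the PyEMD package (pip install EMD-signal) to split a daily mortality series into fast, non-trend oscillations and a slow trend, then sums the non-trend part over the event window. The split point between 'fast' and 'slow' IMFs is a hypothetical choice, not the paper's mode-selection rule, and no mode-mixing handling is included.

    import numpy as np
    from PyEMD import EMD  # pip install EMD-signal

    def excess_deaths(daily_deaths, event_slice, n_fast_imfs=4):
        # Empirical mode decomposition of the mortality series; the first
        # IMFs carry the fast (non-trend) oscillations, the rest the trend.
        imfs = EMD().emd(np.asarray(daily_deaths, dtype=float))
        non_trend = imfs[:n_fast_imfs].sum(axis=0)
        # Excess mortality: sum of the non-trend component over the event.
        return float(non_trend[event_slice].sum())

    # Hypothetical usage with synthetic data and a two-week event window:
    rng = np.random.default_rng(2)
    t = np.arange(3650)
    deaths = 30 + 5 * np.sin(2 * np.pi * t / 365.25) + rng.poisson(3, t.size)
    print(excess_deaths(deaths, slice(400, 414)))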

Relevance: 30.00%

Abstract:

The notion of the 1-D analytic signal is well understood and has found many applications. At the heart of the analytic signal concept is the Hilbert transform. The problem in extending the concept of the analytic signal to higher dimensions is that there is no unique multidimensional definition of the Hilbert transform; moreover, the notion of analyticity is not so well understood in higher dimensions. Of the several 2-D extensions of the Hilbert transform, the spiral-phase quadrature transform, or Riesz transform, seems to be the natural extension and has attracted a lot of attention, mainly due to its isotropic properties. From the Riesz transform, Larkin et al. constructed a vortex operator, which approximates the quadrature based on asymptotic stationary-phase analysis. In this paper, we give an alternative proof of the quadrature approximation property by invoking the quasi-eigenfunction property of linear, shift-invariant systems, and we show that the vortex operator arises as a natural consequence of this property. We also characterize the quadrature approximation error in terms of its energy as well as the peak spatial-domain error. Such results are available for 1-D signals, but their counterparts for 2-D signals have not been provided. We also provide simulation results to supplement the analytical calculations.
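For concreteness, the operators discussed above can be written as follows, up to the sign and normalization conventions, which vary across the literature. In the frequency domain, the Riesz (spiral-phase, vortex) operator is a pure phase factor,

\[
\widehat{\mathcal{R}f}(\boldsymbol{\omega}) \;=\; e^{i\phi(\boldsymbol{\omega})}\,\hat{f}(\boldsymbol{\omega}),
\qquad \phi(\boldsymbol{\omega}) = \angle(\omega_x + i\,\omega_y),
\]

and for a locally coherent fringe pattern \( f(\mathbf{x}) = a(\mathbf{x})\cos\psi(\mathbf{x}) \) with local orientation \( \beta(\mathbf{x}) \), the stationary-phase (or, equivalently, quasi-eigenfunction) argument yields the quadrature approximation

\[
(\mathcal{R}f)(\mathbf{x}) \;\approx\; i\,e^{i\beta(\mathbf{x})}\,a(\mathbf{x})\sin\psi(\mathbf{x}),
\]

so that removing the orientation phase \( e^{i\beta} \) recovers the quadrature \( a\sin\psi \).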

Relevance: 30.00%

Abstract:

Computer vision algorithms that use color information require color-constant images to operate correctly. Color constancy is usually achieved in two steps: first the illuminant is detected, and then the image is transformed with a chromatic adaptation transform (CAT). Existing CAT methods use a single transformation matrix for all the colors of the input image. The method proposed in this paper requires multiple corresponding color pairs between the source and target illuminants, given by patches of the Macbeth color checker. It uses Delaunay triangulation to divide the color gamut of the input image into small triangles. Each color of the input image is associated with the triangle containing its color point and is transformed with a full linear model associated with that triangle. A full linear model is used because diagonal models are known to be inaccurate when the channel color-matching functions do not have narrow peaks. Objective evaluation showed that the proposed method outperforms existing CAT methods by more than 21%; that is, it performs statistically significantly better than the existing methods.
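A minimal sketch of the piecewise idea, in Python with SciPy: triangulate the source-illuminant patch chromaticities and fit one exact map per triangle onto the corresponding target-illuminant chromaticities. For simplicity the sketch works in a 2-D chromaticity plane with per-triangle affine maps, a simplification of the paper's per-triangle full linear models; all names are illustrative.

    import numpy as np
    from scipy.spatial import Delaunay

    def build_piecewise_cat(src_xy, dst_xy):
        # src_xy, dst_xy: (n, 2) corresponding chromaticities of the
        # colour-checker patches under the source and target illuminants.
        tri = Delaunay(src_xy)
        # One exact affine map per triangle: solve S @ M = D with S the
        # homogeneous source vertices (3x3), D the target vertices (3x2).
        maps = [np.linalg.solve(
                    np.hstack([src_xy[s], np.ones((3, 1))]), dst_xy[s])
                for s in tri.simplices]

        def transform(points):
            idx = tri.find_simplex(points)       # containing triangle
            P = np.hstack([points, np.ones((len(points), 1))])
            out = np.full(points.shape, np.nan)  # NaN: outside the gamut
            for k, M in enumerate(maps):
                out[idx == k] = P[idx == k] @ M
            return out

        return transform

Because each triangle's map reproduces its three vertices exactly, the transform is continuous across shared triangle edges, which is what makes the piecewise approach behave like a smooth, locally adapted CAT.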

Relevance: 30.00%

Abstract:

A new nickel(II) cyanometallate-modified glassy carbon electrode was prepared by a new method and studied by cyclic voltammetry and in situ Fourier transform infrared (FTIR) spectroelectrochemistry. It was found that the nickel hexacyanoferrate (NiHCF) film exists in two forms, Ni2Fe(II)(CN)6 and M2NiFe(II)(CN)6. Whether the Fe(CN)6(3-) codeposited in the NiHCF film exists in a free-cation or bridge-bonded state depends on the cations in the electrolyte: in NaCl and LiCl solutions it is bridge-bonded, but in HCl and KCl it is free.

Relevance: 30.00%

Abstract:

A Prussian blue (PB)/Pt modified electrode was studied in CdCl2 electrolyte solution by cyclic voltammetry and in situ FTIR spectroelectrochemistry. It was found that the cadmium ion is capable of substituting the high-spin iron of PB in an electrochemically induced substitution reaction, so that cadmium hexacyanoferrate (CdHCF) can be formed in the PB film. However, PB and CdHCF in the mixed film showed their own electrochemical properties without serious effect on each other. The mechanism of the substitution reaction is given in detail.

Relevance: 30.00%

Abstract:

We present a new nonlinear integral transform relating the ocean wave spectrum to the along-track interferometric synthetic aperture radar (AT-INSAR) image spectrum. The AT-INSAR, a synthetic aperture radar (SAR) employing two antennas displaced along the platform's flight direction, is considered a better instrument for imaging ocean waves than conventional SAR because it yields the phase spectrum and not only the amplitude spectrum. While the SAR and AT-INSAR amplitude spectra depend strongly on the modulation of the normalized radar cross section (NRCS) by the long ocean waves, which is poorly known, the phase spectrum depends only weakly on this modulation. By measuring the phase difference between the signals received by the two antennas, the AT-INSAR measures the radial component of the orbital velocity associated with the ocean waves, which is related to the ocean wave height field by a well-known transfer function. The nonlinear integral transform derived in this paper differs from the one previously derived by Bao et al. [1999] by an additional term containing the derivative of the radial component of the orbital velocity associated with the long ocean waves. Numerical simulations show that, in general, this additional term cannot be neglected. Furthermore, we present two new quasi-linear approximations to the nonlinear integral transform relating the ocean wave spectrum to the AT-INSAR phase spectrum.
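For reference, the measurement principle behind the phase spectrum can be summarized as follows; this is one commonly quoted form, and the signs and mode-dependent factors vary across the literature. The interferometric phase of a scatterer moving radially at \( v_r \) is approximately

\[
\phi(\mathbf{x}) \;\approx\; \frac{4\pi}{\lambda}\,\tau\,v_r(\mathbf{x}),
\]

where \( \lambda \) is the radar wavelength and \( \tau \) the time lag between the two antenna acquisitions (\( \tau = B/V \) or \( B/2V \) for an along-track baseline \( B \) and platform speed \( V \), depending on the transmit/receive configuration). For linear deep-water waves, the radial orbital velocity relates to the surface elevation \( \hat{\zeta}(\mathbf{k}) \) through a transfer function often written as

\[
\hat{v}_r(\mathbf{k}) \;=\; -\,\omega\left(\sin\theta\,\frac{k_l}{|\mathbf{k}|} + i\cos\theta\right)\hat{\zeta}(\mathbf{k}),
\qquad \omega = \sqrt{g\,|\mathbf{k}|},
\]

with \( \theta \) the incidence angle and \( k_l \) the range component of the wavenumber.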

Relevance: 30.00%

Abstract:

Thermal degradation and the gaseous products evolving from the pyrolysis of sewage sludge intended for agricultural soil amendment were investigated using thermogravimetric analysis coupled with Fourier transform infrared analysis (TG-FTIR). The materials were studied at temperatures ranging from 30 to 800 °C, and infrared spectra of the sewage sludge samples were recorded as a complementary technique. In parallel, the sewage sludge was spiked with ibuprofen to test whether these techniques can detect the drug. Thermal analysis showed the 200-400 °C range to be the most characteristic for weight loss, corresponding to the volatilization of organic matter, while the 500-800 °C range was also characteristic, owing to the volatilization of carbonates. The ibuprofen-spiking tests identified a temperature range (150-250 °C) in which the compound volatilizes completely; in this work, detection of ibuprofen by TGA was therefore established for concentrations above 0.5 g/kg sludge, a concentration about 10² times higher than the concentrations measured by other authors in regular sewage sludge (Martín et al., 2010). A correlation was found between the ibuprofen concentration in the sludge and the intensity of the absorption bands, both for the FT-IR spectra at the maximum emission temperature of ibuprofen (232 °C) and for the FT-IR spectra of the non-pyrolyzed samples.
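The reported correlation is, in effect, a univariate calibration. A minimal Python sketch of such a calibration, with entirely hypothetical numbers standing in for spike levels and band intensities, might look like:

    import numpy as np

    # Hypothetical spike levels (g ibuprofen / kg sludge) and the
    # corresponding FT-IR band intensities at 232 degrees C.
    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    intensity = np.array([0.012, 0.023, 0.046, 0.091, 0.180])

    slope, intercept = np.polyfit(conc, intensity, 1)  # linear calibration
    pred = slope * conc + intercept
    r2 = 1 - np.sum((intensity - pred) ** 2) / np.sum(
        (intensity - intensity.mean()) ** 2)
    print(f"I = {slope:.4f} c + {intercept:.4f}, R^2 = {r2:.3f}")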

Relevance: 30.00%

Abstract:

The proposal in this degree project is to apply to Almaviva S.A. (Almacenes Generales de Depósito), Cartagena branch, the theoretical and practical framework of service management, together with the theoretical foundations established by Universidad del Rosario professor Carlos Eduardo Méndez, namely "Elementos para transformar la cultura de las organizaciones hacia la excelencia en el servicio al cliente" and "Un momento para el cliente en el servicio". These publications undoubtedly constitute a model for orienting the operation of organizations towards the delivery of excellent service, generating constant satisfaction and loyalty and exceeding the expectations of customers who, given the growing dynamism of today's market, are ever more demanding.