984 results for Signal Processing Research Center
Abstract:
Electrocardiogram (ECG) biometrics are a relatively recent trend in biometric recognition, with at least 13 years of development in peer-reviewed literature. Most of the proposed biometric techniques perform classification on features extracted either from heartbeats or from ECG-based transformed signals. The best representation is yet to be decided. This paper studies an alternative representation, a dissimilarity space, based on the pairwise dissimilarity between templates and subjects' signals. Additionally, this representation can make use of ECG signals sourced from multiple leads. Configurations of three leads are tested and contrasted with single-lead experiments. Using the same k-NN classifier, the results proved superior to those obtained with a similar algorithm that does not employ a dissimilarity representation. The best authentication EER went as low as 1.53% for a database of 503 subjects. However, the use of extra leads did not prove advantageous.
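As a rough illustration of the idea (not the authors' implementation), the sketch below builds a dissimilarity-space representation and classifies in it with scikit-learn's k-NN: each signal is described by its distances to a fixed set of templates. All names and data (heartbeat_features, templates) are hypothetical stand-ins.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical per-heartbeat feature vectors: 200 signals from
# 10 subjects, 50 samples each, with per-subject labels.
heartbeat_features = rng.normal(size=(200, 50))
labels = np.repeat(np.arange(10), 20)

# Fixed set of reference templates (here: one heartbeat per subject).
templates = heartbeat_features[::20]

def to_dissimilarity_space(X, templates):
    # Represent each signal by its Euclidean distance to every template.
    return np.linalg.norm(X[:, None, :] - templates[None, :, :], axis=2)

D = to_dissimilarity_space(heartbeat_features, templates)

# Classify in the dissimilarity space with a plain k-NN classifier.
clf = KNeighborsClassifier(n_neighbors=3).fit(D[::2], labels[::2])
print("held-out accuracy:", clf.score(D[1::2], labels[1::2]))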
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the Degree of Master in Electrical and Computer Engineering
Abstract:
A new algorithm is proposed for estimating the velocity vector of moving ships from Single Look Complex (SLC) SAR data acquired in stripmap mode. The algorithm exploits both the amplitude and phase information of the Doppler-decompressed data spectrum, with the aim of estimating both the azimuth antenna pattern and the backscattering coefficient as a function of the look angle. The antenna pattern estimate provides information about the target velocity, while the backscattering coefficient can be used for vessel classification. The range velocity is retrieved in the slow-time frequency domain by estimating the antenna pattern effects induced by the target motion; the azimuth velocity is then calculated from the estimated range velocity and the ship orientation. Finally, the algorithm is tested on simulated SLC SAR data.
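The last geometric step admits a simple reading: once the ship's heading is known, the full velocity vector follows from the range component alone. A minimal sketch of that relation, assuming the heading angle is measured from the ground-range axis (the paper's exact convention may differ):

import numpy as np

def velocity_from_range_component(v_range, heading_rad):
    # Recover the azimuth velocity and speed from the range velocity,
    # assuming the heading is measured from the ground-range axis.
    v_azimuth = v_range * np.tan(heading_rad)  # v_az / v_rg = tan(heading)
    speed = np.hypot(v_range, v_azimuth)
    return v_azimuth, speed

# Example: 4 m/s range velocity, heading 30 degrees off the range axis.
print(velocity_from_range_component(4.0, np.deg2rad(30.0)))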
Abstract:
Endmember extraction (EE) is a fundamental and crucial task in hyperspectral unmixing. Among other methods, vertex component analysis (VCA) has become a very popular and useful tool for unmixing hyperspectral data. VCA is a geometry-based method that extracts endmember signatures from large hyperspectral datasets without using any a priori knowledge about the constituent spectra. Many hyperspectral imagery applications require a response in real time or near-real time. To meet this requirement, this paper proposes a parallel implementation of VCA developed for graphics processing units. The impact of the proposed parallel implementation on the complexity and accuracy of VCA is examined using both simulated and real hyperspectral datasets.
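For reference, a compact CPU sketch of the core VCA iteration (simplified: the published algorithm's SNR-dependent dimensionality reduction is replaced by a plain SVD projection, and numerical safeguards are omitted). A GPU port would run the same array operations on the device, e.g. via CuPy.

import numpy as np

def vca(Y, p, seed=0):
    # Simplified vertex component analysis.
    # Y: (bands, pixels) data matrix; p: number of endmembers.
    rng = np.random.default_rng(seed)
    # Project onto a p-dimensional signal subspace (plain SVD here).
    U, _, _ = np.linalg.svd(Y @ Y.T / Y.shape[1])
    X = U[:, :p].T @ Y                     # (p, pixels)
    # Projective projection so the data lie on a simplex.
    u = X.mean(axis=1, keepdims=True)
    Yp = X / (u.T @ X)
    # Iteratively pick the pixel most extreme along a random direction
    # orthogonal to the span of the endmembers found so far.
    A = np.zeros((p, p)); A[-1, 0] = 1.0
    idx = np.zeros(p, dtype=int)
    for i in range(p):
        w = rng.normal(size=(p, 1))
        f = w - A @ np.linalg.pinv(A) @ w
        f /= np.linalg.norm(f)
        idx[i] = np.argmax(np.abs(f.T @ Yp))
        A[:, i] = Yp[:, idx[i]]
    return Y[:, idx], idx                  # endmember signatures, indices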
Abstract:
Hyperspectral unmixing methods aim at the decomposition of a hyperspectral image into a collection of endmember signatures, i.e., the radiance or reflectance of the materials present in the scene, and the corresponding abundance fractions at each pixel in the image. This paper introduces a new unmixing method termed dependent component analysis (DECA). The method is blind and fully automatic, and it overcomes the limitations of unmixing methods based on independent component analysis (ICA) and on geometry-based approaches. DECA is based on the linear mixture model, i.e., each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the non-negativity and constant-sum constraints imposed by the acquisition process. The endmember signatures are inferred by a generalized expectation-maximization (GEM) type algorithm. The paper illustrates the effectiveness of DECA on synthetic and real hyperspectral images.
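The generative model underlying DECA is easy to state in a few lines. The sketch below is illustrative only (it is the forward model, not the GEM inference); it draws abundances from a single Dirichlet density and mixes hypothetical endmember spectra.

import numpy as np

rng = np.random.default_rng(0)
bands, p, pixels = 100, 3, 1000

# Hypothetical endmember signatures (columns of M), random for brevity.
M = rng.uniform(0.0, 1.0, size=(bands, p))

# Dirichlet abundances: non-negative, summing to one by construction.
A = rng.dirichlet(alpha=[2.0, 5.0, 3.0], size=pixels).T   # (p, pixels)

# Linear mixture model: each pixel is M @ a plus sensor noise.
Y = M @ A + 0.01 * rng.normal(size=(bands, pixels))

assert np.allclose(A.sum(axis=0), 1.0)  # constant-sum constraint holds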
Abstract:
Life-Cycle Civil Engineering – Biondini & Frangopol
Abstract:
The provision of reserves in power systems is of great importance for keeping an adequate and acceptable level of security and reliability. This need for reserves, and the way they are defined and dispatched, gains increasing importance in the present and future context of smart grids and electricity markets, due to their inherently competitive environment. This paper concerns a methodology proposed by the authors, which aims to jointly and optimally dispatch both generation and demand response resources to provide the amounts of reserve required for system operation. Virtual Power Players are especially important for the aggregation of small-size demand response and generation resources. The proposed methodology has been implemented in MASCEM, a multi-agent system for the simulation of electricity markets also developed at the authors' research center.
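As a toy illustration of such a joint dispatch (not the authors' formulation), the sketch below minimizes the cost of meeting a reserve requirement from two hypothetical generators and one demand response resource using scipy's linprog; all capacities and prices are invented.

from scipy.optimize import linprog

# Offer prices (currency/MW): two generators, one DR resource.
cost = [30.0, 45.0, 38.0]

# Reserve requirement: dispatched amounts must sum to at least 120 MW.
# linprog uses A_ub @ x <= b_ub, so encode sum(x) >= 120 as -sum(x) <= -120.
A_ub = [[-1.0, -1.0, -1.0]]
b_ub = [-120.0]

# Per-resource capacity limits (MW).
bounds = [(0, 80), (0, 60), (0, 40)]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, res.fun)  # cheapest resources fill the requirement first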
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Abstract:
This project addresses the application of fractional derivatives and integrals to the implementation of digital filters, from a digital signal processing perspective. In a first phase of the work, a theoretical overview of digital filters and fractional calculus is given; these concepts are then used in the development of the project. In a second phase, a graphical interface is developed in the MATLAB environment using the GUIDE tool, aimed at the implementation of fractional digital filters. In the third phase, the filters developed are implemented experimentally on the ADSP-2181, making it possible to analyse and compare the experimental results with those obtained by simulation in MATLAB. The fourth and final phase reflects on the development of the thesis as a whole and on what it provided me. With this report I intend to present the effort put into this work, as well as some of the knowledge acquired throughout the course.
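One standard digital realization of a fractional differentiator (a generic textbook construction, not necessarily the one used in the project; the order and tap count below are illustrative) is the truncated Grünwald-Letnikov expansion, whose FIR coefficients obey a one-line recurrence:

import numpy as np

def gl_coefficients(alpha, n_taps):
    # FIR coefficients of a truncated Grunwald-Letnikov fractional
    # differentiator of order alpha:
    # c_0 = 1,  c_k = c_{k-1} * (k - 1 - alpha) / k.
    c = np.ones(n_taps)
    for k in range(1, n_taps):
        c[k] = c[k - 1] * (k - 1 - alpha) / k
    return c

# Half-order derivative of a test ramp, sampled at T = 0.01 s.
T, alpha = 0.01, 0.5
t = np.arange(0, 1, T)
x = t.copy()                                   # input signal
h = gl_coefficients(alpha, len(t)) / T**alpha  # scale by T^(-alpha)
y = np.convolve(x, h)[: len(t)]                # fractional derivative estimate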
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to explain a variety of distinct real-world phenomena, often described merely by the signals they produce. In this paper, we study twelve cases, namely worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area in forest fires that occurred in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes in the U.S., and the number of inhabitants of the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to power law behavior. Furthermore, the parametric characterization reveals complex relationships present at a higher level of description.
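A standard way to quantify such tail behavior (a generic estimator; the paper's fitting procedure may differ) is the maximum-likelihood estimate of the Pareto exponent, alpha_hat = 1 + n / sum(ln(x_i / x_min)):

import numpy as np

def powerlaw_mle(x, x_min):
    # Maximum-likelihood (Hill-type) estimate of alpha for a power law
    # p(x) ~ x^(-alpha) over the tail x >= x_min.
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= x_min]
    return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

# Sanity check on synthetic Pareto samples with alpha = 2.5.
rng = np.random.default_rng(0)
samples = (1.0 - rng.random(100_000)) ** (-1.0 / 1.5)  # inverse-CDF sampling
print(powerlaw_mle(samples, 1.0))                      # close to 2.5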
Abstract:
This paper addresses limit cycles and signal propagation in dynamical systems with backlash. The study follows the describing function (DF) method for the approximate analysis of nonlinearities and generalizes it from the perspective of the fractional calculus. The concept of a fractional-order describing function (FDF) is illustrated and the results of several numerical experiments are analysed. The FDF leads to a novel viewpoint on limit-cycle signal propagation as time-space waves within the system structure.
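For context, a sketch of the classical machinery being generalized: for a sinusoidal input $x(t) = A\sin(\omega t)$, the nonlinearity output $y(t)$ is replaced by its first-harmonic complex gain, and a limit cycle is predicted by harmonic balance (the paper's FDF replaces the integer-order linear part $G$ with fractional-order dynamics):

$$N(A,\omega) = \frac{b_1 + j\,a_1}{A}, \qquad a_1 = \frac{1}{\pi}\int_0^{2\pi} y\,\cos(\omega t)\, d(\omega t), \qquad b_1 = \frac{1}{\pi}\int_0^{2\pi} y\,\sin(\omega t)\, d(\omega t),$$

with a limit cycle predicted where $G(j\omega)\,N(A,\omega) = -1$.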
Abstract:
The decomposition of a fractional linear system is discussed in this paper. It is shown that such a system can be decomposed into an integer-order part, corresponding to any existing poles, and a fractional part. The first and second parts are responsible, respectively, for the short- and long-memory behaviors that are characteristic of fractional systems.
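A hedged sketch, in standard fractional-systems notation, of what such a decomposition typically looks like (the paper's exact construction may differ): the transfer function splits into a pole part and a fractional remainder,

$$H(s) = H_i(s) + H_f(s) = \sum_k \frac{r_k}{s - p_k} + H_f(s),$$

so the impulse response pairs exponentially decaying pole contributions (short memory) with a fractional term whose tail decays only algebraically (long memory),

$$h(t) = \sum_k r_k\, e^{p_k t} + h_f(t), \qquad h_f(t) \sim C\, t^{-\alpha-1} \ \text{as } t \to \infty.$$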
Abstract:
This study addresses deoxyribonucleic acid (DNA) and proposes a procedure based on the association of statistics, information theory, signal processing, Fourier analysis, and fractional calculus for describing fundamental characteristics of DNA. In a first phase, the 24 human chromosomes are evaluated. In a second phase, 10 chromosomes from different species are also processed and the results compared. The results reveal invariance in the description and close resemblances to fractional Brownian motion.
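One common way to test a fractional-Brownian-motion resemblance (a generic recipe; the paper's pipeline involves more ingredients) is to map the sequence to a numeric signal, estimate its power spectral density, and fit the 1/f^beta slope, which for fBm relates to the Hurst exponent as beta = 2H + 1:

import numpy as np
from scipy.signal import welch

# Hypothetical numeric mapping of a DNA sequence (purines vs pyrimidines);
# a random sequence is used here as a stand-in for real chromosome data.
rng = np.random.default_rng(0)
seq = rng.choice(list("ACGT"), size=2**14)
x = np.where(np.isin(seq, ["A", "G"]), 1.0, -1.0)
walk = np.cumsum(x)                       # cumulative "DNA walk"

# Welch PSD estimate and a log-log slope fit over low frequencies.
f, Pxx = welch(walk, nperseg=1024)
mask = (f > 0) & (f < 0.1)
beta = -np.polyfit(np.log(f[mask]), np.log(Pxx[mask]), 1)[0]
H = (beta - 1.0) / 2.0                    # fBm relation: beta = 2H + 1
print(f"beta = {beta:.2f}, Hurst H = {H:.2f}")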
Abstract:
Proceedings of the 12th Conference on 'Dynamical Systems – Theory and Applications'