15 results for new method
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The construction industry continues to demand huge quantities of natural resources, mainly minerals for mortar and concrete production. The depletion of many quarries and environmental concerns about reducing the dumping of construction and demolition waste in quarries have led to an increase in the sourcing and use of recycled aggregates from this type of waste. If they are to be incorporated in concrete and mortars, it is essential to know their properties in order to guarantee the adequate performance of the end products, in both mechanical and durability-related terms. Existing standard tests, however, were developed for natural aggregates, and several problems arise when they are applied to recycled aggregates, especially fine recycled aggregates (FRA). This paper describes the main problems encountered with these tests and proposes an alternative method for determining the density and water absorption of FRA that eliminates these problems. The use of sodium hexametaphosphate solutions in the water absorption test has proven to improve its efficiency, minimizing cohesion between particles and helping to release entrained air.
Abstract:
Reinforcement learning is an area of machine learning that deals with how an agent should take actions in an environment so as to maximize an accumulated reward. This type of learning is inspired by the way humans learn and has led to the creation of various reinforcement learning algorithms. These algorithms focus on how an agent's behaviour can be improved, assuming independence from its surroundings. The current work studies the application of reinforcement learning methods to the inverted pendulum problem. The importance of the variability of the environment (factors external to the agent) to the execution of reinforcement learning agents is studied using a model that seeks to obtain equilibrium (stability) through dynamism: a Cart-Pole system, or inverted pendulum. We sought to improve the behaviour of the autonomous agents by changing the information passed to them while keeping the agents' internal parameters (learning rate, discount factor, decay rate, etc.) constant, instead of the classical approach of tuning those internal parameters. The influence of changes to the state set and the action set on an agent's ability to solve the Cart-Pole problem was studied. We studied the typical behaviour of reinforcement learning agents applied to the classic BOXES model, and a new way of characterizing the environment was proposed using the notion of convergence towards a reference value. We demonstrate the performance gain of this new method applied to a Q-Learning agent.
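The Q-Learning update at the heart of such an agent can be sketched as follows. The 5-state / 2-action sizes, parameter values and toy reward below are hypothetical; the paper's BOXES discretisation and reference-value characterization are not reproduced here.

```python
import numpy as np

# Minimal tabular Q-Learning sketch (hypothetical sizes and parameters).
def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """One Q-Learning step: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    return Q

Q = np.zeros((5, 2))                         # discretised states x actions
Q = q_update(Q, s=2, a=1, r=1.0, s_next=2)   # toy reward 1.0 for staying balanced
print(Q[2, 1])
```

Changing the state set (the discretisation) or the action set changes only the shape of `Q` and the indices passed in, while `alpha` and `gamma` stay fixed, which mirrors the experimental approach described above.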
Abstract:
Introduction – Diabetes is one of the greatest epidemics of the last century. More than 250 million people worldwide suffer from diabetes. Its complications are among the leading causes of blindness, renal failure and lower-limb amputation, deriving predominantly from vascular dysfunction. When pericytes are lost from the vascular wall, a series of microcirculatory changes occurs, leading to the appearance of microaneurysms and other vascular alterations that allow blood components to pass into the adjacent retinal tissue, which would not occur under normal conditions; this is one of the causes of diabetic exudative macular edema. Preferential hyperacuity perimetry (PHP) is a psychophysical test intended to detect metamorphopsia in age-related macular degeneration (AMD). Since diabetic macular edema (DME) stands out as one of the main causes of visual impairment and low vision, this study aims to verify the effectiveness of PHP in the study of diabetic macular edema, answering the following question: "How capable is the preferential hyperacuity perimeter of detecting metamorphopsia in patients with diabetic macular edema?" Methodology – A quantitative, descriptive and correlational study. A sample of 33 patients was selected, and a total of 60 eyes was analysed. Results – The sensitivity of PHP in detecting metamorphopsia associated with DME on optical coherence tomography (OCT) was 70.6%, the specificity was 11.5% and the overall efficiency of the test was 45%. Comparing the PHP and OCT results, a weak inverse correlation was found (Phi = -0.215). Conclusions – This new diagnostic method proves sensitive, but not very specific or effective in detecting metamorphopsia resulting from DME.
- ABSTRACT - Introduction – Preferential hyperacuity perimetry (PHP) is a new psychophysical test whose principle is based on the detection of metamorphopsia in age-related macular degeneration (AMD). This study aims to verify its effectiveness in the study of diabetic macular edema (DME). When pericytes are lost from the vascular wall, a number of microcirculatory changes occur that lead to the appearance of microaneurysms and other vascular changes allowing blood components to pass into the surrounding retinal tissue, which does not occur under normal conditions; this is one of the causes of exudative diabetic macular edema. Methodology – A quantitative study was performed, using descriptive and correlational analysis. A sample of 33 patients was selected, and 60 eyes were analyzed. Results – The sensitivity of PHP in the detection of metamorphopsia associated with DME was 70.6%, the specificity was 11.5% and the global efficiency of the test was 45%. A weak negative correlation (Phi = -0.215) was found between PHP and optical coherence tomography (OCT). Conclusions – This new diagnostic method was sensitive, but not very specific or effective in the detection of metamorphopsia due to DME.
Abstract:
Master's degree in Cardiovascular Diagnostic and Intervention Technology. Area of specialization: Cardiovascular Ultrasonography.
Abstract:
Master's degree in Accounting
Abstract:
Chapter in Book Proceedings with Peer Review. First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
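The geometry VCA exploits can be illustrated with a minimal sketch: a linear functional attains its maximum absolute value over a simplex at a vertex, so when pure pixels are present, the pixel with the most extreme projection is an endmember candidate. The identity endmember matrix and Dirichlet abundances below are illustrative assumptions, not the paper's data or full algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: 3 endmembers in a 3-band space, pixels as convex
# combinations with Dirichlet abundance fractions.
E = np.eye(3)                              # endmember signatures (simplex vertices)
A = rng.dirichlet(np.ones(3), size=500)    # abundance fractions summing to one
X = np.vstack([E, A @ E])                  # include pure pixels so vertices exist

# Core projection step: the pixel with the largest absolute projection onto a
# random direction is a vertex of the simplex, i.e. an endmember.
f = rng.normal(size=3)
idx = int(np.argmax(np.abs(X @ f)))
print(X[idx])                              # one of the rows of E
```

The full algorithm iterates this step with directions chosen orthogonal to the endmembers already found, which is what keeps its complexity low.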
Abstract:
The automation of line differential and directional comparison protection deserves a fresh approach in light of the technological innovations that have followed the advent of digital relays and the resulting communication between protection systems, in particular between line differential protections. Line differential protection offers clear advantages over the protections currently used for transmission and distribution lines, such as phase overcurrent protection, directional residual (zero-sequence) overcurrent protection and distance protection. There are, however, problems associated with this type of protection, notably in the communication between relays. For the automation and communication of transmission-line differential protections, an innovative method for this type of system was employed for faults occurring in the zone protected by the differential protection system. Since the effectiveness of differential protection depends on the accuracy of the variables that must be exchanged between protections located in different substations, an automatic relay-communication mechanism was used, supported by the development of new algorithms to detect, almost instantaneously, a fault in any protection zone of a transmission line. These algorithms are based on the Park Transform, thereby introducing a new concept to this type of protection, and they make it possible to mitigate the problems associated with line differential protection. Several case studies are presented to verify the applicability of these algorithms to line differential protection, and the results obtained also confirm the advantages of using the proposed algorithms.
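For reference, the Park (dq0) transform that these algorithms build on maps a balanced three-phase set to constant d- and q-axis components, so a fault shows up as a deviation from that constancy. A minimal sketch follows; the 50 Hz balanced signals are illustrative, and the thesis's detection algorithms are not reproduced here.

```python
import numpy as np

# Park (dq0) transform sketch with illustrative balanced 50 Hz signals.
def park(ia, ib, ic, theta):
    """Transform three-phase quantities into the rotating dq0 reference frame."""
    k = 2.0 / 3.0
    d = k * (ia * np.cos(theta)
             + ib * np.cos(theta - 2 * np.pi / 3)
             + ic * np.cos(theta + 2 * np.pi / 3))
    q = -k * (ia * np.sin(theta)
              + ib * np.sin(theta - 2 * np.pi / 3)
              + ic * np.sin(theta + 2 * np.pi / 3))
    z = (ia + ib + ic) / 3.0
    return d, q, z

t = np.linspace(0.0, 0.04, 200)            # two cycles at 50 Hz
theta = 2 * np.pi * 50 * t
ia = np.cos(theta)
ib = np.cos(theta - 2 * np.pi / 3)
ic = np.cos(theta + 2 * np.pi / 3)
d, q, z = park(ia, ib, ic, theta)
print(d.mean(), q.mean())                  # constant d, zero q for a balanced set
```

A fault unbalances the phases and makes d and q time-varying, which is one way a Park-based scheme can flag a disturbance quickly.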
Abstract:
A new method is proposed to control delayed transitions towards extinction in single-population theoretical models with discrete time undergoing saddle-node bifurcations. The control method takes advantage of the delaying properties of the saddle remnant arising after the bifurcation and makes it possible to sustain populations indefinitely. Our method, which is shown to work for deterministic and stochastic systems, could generally be applied to avoid transitions tied to one-dimensional maps after saddle-node bifurcations.
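The delaying effect of the saddle remnant (the "ghost" of the vanished fixed points) is visible in the normal form of a discrete-time saddle-node bifurcation. The sketch below shows the lengthening bottleneck and a naive reset-style intervention; it is not the authors' control law, and all values are illustrative.

```python
# Saddle-node normal form x_{n+1} = x_n + mu + x_n**2. For mu > 0 there are no
# fixed points, but a slow "bottleneck" (the saddle remnant) persists near x = 0.
def step(x, mu):
    return x + mu + x * x

def escape_time(mu, x0=-1.0, x_escape=1.0, max_iter=10**6):
    """Iterations needed to cross the bottleneck and escape past x_escape."""
    x, n = x0, 0
    while x < x_escape and n < max_iter:
        x = step(x, mu)
        n += 1
    return n

def controlled(mu, n_steps, x0=-1.0, thresh=0.5):
    """Naive intervention: reset the state into the slow region near the transition."""
    x = x0
    for _ in range(n_steps):
        x = step(x, mu)
        if x > thresh:
            x = -mu ** 0.5          # back into the saddle remnant's bottleneck
    return x

print(escape_time(0.01), escape_time(0.0001))   # the delay grows as mu -> 0+
print(controlled(0.01, 10_000))                 # the controlled state stays bounded
```

The growing escape time as mu approaches zero is the delaying property the proposed control exploits to sustain the population indefinitely.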
Abstract:
Master's degree in Management and Evaluation of Health Technologies
Abstract:
In this paper a new method for the self-localization of mobile robots, based on a PCA positioning sensor able to operate in unstructured environments, is proposed and experimentally validated. The proposed PCA extension is able to compute the eigenvectors from a set of signals corrupted by missing data. The sensor package considered in this work contains a 2D depth sensor pointed upwards at the ceiling, providing depth images with missing data. The resulting positioning sensor is then integrated in a Linear Parameter Varying mobile robot model to obtain a self-localization system, based on linear Kalman filters, with globally stable position error estimates. A study consisting of adding synthetic random corrupted data to the captured depth images revealed that this extended PCA technique is able to reconstruct the signals with improved accuracy. The self-localization system obtained is assessed in unstructured environments, and the methodologies are validated even under varying illumination conditions.
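One simple way to obtain eigenvectors from signals with missing entries, in the spirit of (but not identical to) the proposed PCA extension, is a pairwise-complete covariance estimate followed by an eigendecomposition. The synthetic data, dimensions and missing rate below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: eigenvector computation from signals corrupted by missing data,
# via a pairwise-complete covariance estimate.
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))   # correlated signals
Xm = np.where(rng.random(X.shape) < 0.1, np.nan, X)       # ~10% entries missing

Xc = Xm - np.nanmean(Xm, axis=0)          # centre, ignoring missing entries
Z = np.nan_to_num(Xc)                     # zeros where data are missing
obs = (~np.isnan(Xc)).astype(float)       # observation mask
counts = obs.T @ obs                      # pairwise-complete sample counts
C = (Z.T @ Z) / counts                    # covariance from observed pairs only
eigvals, eigvecs = np.linalg.eigh(C)      # principal directions of the signals
print(eigvals)
```

The same idea scales to depth images by treating each pixel as one signal dimension and the invalid depth readings as the missing entries.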
Abstract:
One of the most challenging tasks underlying many hyperspectral imagery applications is linear unmixing. The key to linear unmixing is to find the set of reference substances, also called endmembers, that are representative of a given scene. This paper presents vertex component analysis (VCA), a new method to unmix linear mixtures of hyperspectral sources. The algorithm is unsupervised and exploits a simple geometric fact: endmembers are the vertices of a simplex. The algorithm complexity, measured in floating-point operations, is O(n), where n is the sample size. The effectiveness of the proposed scheme is illustrated using simulated data.
Abstract:
This paper introduces a new method to blindly unmix hyperspectral data, termed dependent component analysis (DECA). This method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA assumes that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. This method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometry-based approaches. The effectiveness of the proposed method is illustrated using simulated data based on U.S.G.S. laboratory spectra and real hyperspectral data collected by the AVIRIS sensor over Cuprite, Nevada.
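The generative model DECA assumes can be sketched directly; the numbers of endmembers, bands and pixels, and the uniform Dirichlet parameters below, are illustrative, and the GEM inference itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch of the linear mixing model assumed by DECA (illustrative dimensions):
# each pixel x = M @ a, with abundances a >= 0 summing to one.
p, bands, n_pix = 3, 20, 1000
M = rng.random((bands, p))                   # endmember signatures (columns)
A = rng.dirichlet(np.ones(p), size=n_pix).T  # Dirichlet abundances, p x n_pix
X = M @ A                                    # mixed pixels, bands x n_pix

# The acquisition constraints enforced by the Dirichlet model hold here:
print(np.allclose(A.sum(axis=0), 1.0), (A >= 0).all())
```

Because Dirichlet samples are non-negative and sum to one by construction, modeling abundances this way builds the physical constraints into the prior rather than imposing them afterwards.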
Abstract:
Biometric recognition has recently emerged in applications where the privacy of the information is crucial, as in the health care field. This paper presents a biometric recognition system based on the electrocardiographic (ECG) signal. The proposed system builds on a state-of-the-art recognition method which extracts information from the frequency domain. In this paper we propose a new method to increase the spectral resolution of low-bandwidth ECG signals, a limitation that stems from the acquisition sensor. Preliminary results show that the proposed scheme achieves a higher identification rate and a lower equal error rate when compared to previous approaches.
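The abstract does not detail the proposed method. As a generic illustration of the underlying issue only, zero-padding the FFT samples the spectrum on a denser grid, which locates a spectral peak more precisely than the native grid of a short record; the 100 Hz rate and 1.2 Hz tone below are assumed toy values, not the paper's technique or data.

```python
import numpy as np

# Generic illustration (not the paper's method): zero-padding densifies the
# frequency grid of a short record, sharpening peak localisation.
fs = 100.0
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 1.2 * t)             # toy narrowband component

f_raw = np.fft.rfftfreq(len(x), 1 / fs)
f_fine = np.fft.rfftfreq(8 * len(x), 1 / fs)
peak_raw = f_raw[np.argmax(np.abs(np.fft.rfft(x)))]
peak_fine = f_fine[np.argmax(np.abs(np.fft.rfft(x, n=8 * len(x))))]
print(peak_raw, peak_fine)                  # the finer grid lands closer to 1.2 Hz
```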
Abstract:
In the present paper we compare clustering solutions using indices of paired agreement. We propose a new method, IADJUST, to correct indices of paired agreement by excluding agreement by chance. This new method overcomes previous limitations known in the literature, as it permits the correction of any index. We illustrate its use in external clustering validation, to measure the accordance between clusters and an a priori known structure. The adjusted indices are intended to provide a realistic measure of clustering performance that excludes agreement by chance with the ground truth. We use simulated data sets, under a range of scenarios considering diverse numbers of clusters, cluster overlaps and balances, to discuss the pertinence and the precision of our proposal. Precision is established through comparisons with the analytical approach to correction; specific indices that can be corrected analytically are used for this purpose. The pertinence of the proposed correction is discussed in a detailed comparison between the performance of two classical clustering approaches, namely the Expectation-Maximization (EM) and K-Means (KM) algorithms. Eight indices of paired agreement are studied and new corrected indices are obtained.
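A chance-correction of this kind can be sketched generically: estimate an index's expected value under random relabelling and rescale it as (observed - expected) / (1 - expected). The permutation scheme and the Rand-type paired-agreement index below are illustrative stand-ins, not necessarily the IADJUST procedure itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Generic chance-correction sketch for an agreement index between partitions.
def paired_agreement(u, v):
    """Proportion of object pairs on which the two partitions agree."""
    iu = np.triu_indices(len(u), k=1)
    same_u = (u[:, None] == u[None, :])[iu]
    same_v = (v[:, None] == v[None, :])[iu]
    return np.mean(same_u == same_v)

def adjusted(u, v, n_perm=200):
    """Rescale the index by its Monte-Carlo expectation under relabelling."""
    obs = paired_agreement(u, v)
    exp = np.mean([paired_agreement(u, rng.permutation(v)) for _ in range(n_perm)])
    return (obs - exp) / (1.0 - exp)

labels = rng.integers(0, 3, size=60)
print(adjusted(labels, labels))             # identical partitions score 1.0
```

Because the expectation is estimated by simulation rather than derived analytically, the same recipe applies to any index, which is the generality the proposed method provides.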