839 results for "Real-world problem"


Relevance: 80.00%

Abstract:

Thesis to obtain the Master of Science Degree in Computer Science and Engineering

Relevance: 80.00%

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Auditing, under the supervision of Luís Silva Rodrigues.

Relevance: 80.00%

Abstract:

In the present paper we assess the performance of information-theoretic inspired risk functionals in multilayer perceptrons, with reference to the two most popular ones, Mean Square Error and Cross-Entropy. The recently proposed information-theoretic inspired risks are: HS and HR2, respectively the Shannon and quadratic Rényi entropies of the error; ZED, a risk reflecting the error density at zero error; and EXP, a generalized exponential risk able to mimic a wide variety of risk functionals, including the information-theoretic ones. The experiments were carried out with multilayer perceptrons on 35 public real-world datasets, all performed according to the same protocol. The statistical tests applied to the experimental results showed that the ubiquitous mean square error was the least interesting risk functional to be used by multilayer perceptrons; namely, mean square error never achieved significantly better classification performance than the competing risks. Cross-entropy and EXP were the risks found by several tests to be significantly better than their competitors. Counts of significantly better and worse risks also showed the usefulness of HS and HR2 for some datasets.
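As an illustration of the kind of error-entropy risk described above, the sketch below estimates the quadratic Rényi entropy of the error (HR2) with the usual Gaussian Parzen-window estimator, alongside the mean square error; the bandwidth value and the exact estimator used in the paper are assumptions.

```python
import numpy as np

def mse_risk(errors):
    """Mean square error over the error samples e_i = t_i - y_i."""
    return np.mean(np.asarray(errors, dtype=float) ** 2)

def renyi2_entropy_risk(errors, sigma=0.5):
    """Quadratic Renyi entropy of the error, HR2 = -log V, where V is the
    information potential estimated with a Gaussian Parzen window.
    Pairwise kernel evaluations use an effective bandwidth sigma*sqrt(2)."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]              # all pairwise differences
    s2 = 2.0 * sigma ** 2                        # variance of the convolved kernel
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    v = kernel.mean()                            # information potential
    return -np.log(v)

# toy usage: errors of a hypothetical classifier on 200 samples
rng = np.random.default_rng(0)
errors = rng.normal(scale=0.3, size=200)
print(mse_risk(errors), renyi2_entropy_risk(errors, sigma=0.5))
```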

Relevance: 80.00%

Abstract:

This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several data sets, confirming the soundness of the generalization.
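The paper's fractional-entropy expression is not reproduced in this record. As a rough illustration of how an order parameter changes an entropy's sensitivity to the shape of a distribution, the sketch below sweeps the order of the standard Rényi entropy (a different, well-known generalization, not the fractional entropy proposed here) over two toy distributions.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha for a discrete distribution p.
    As alpha -> 1 it converges to the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))            # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# compare a near-uniform and a highly skewed distribution over a sweep of orders
uniformish = np.full(8, 1 / 8)
skewed = np.array([0.65, 0.2, 0.05, 0.04, 0.03, 0.01, 0.01, 0.01])
for a in (0.5, 1.0, 2.0, 4.0):
    print(a, renyi_entropy(uniformish, a), renyi_entropy(skewed, a))
```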

Relevance: 80.00%

Abstract:

Previous work by our group introduced a novel concept and sensor design for “off-the-person” ECG, for which evidence on how it compares against standard clinical-grade equipment has been largely missing. Our objectives with this work are to characterise the off-the-person approach in light of the current ECG systems landscape, and assess how the signals acquired using this simplified setup compare with clinical-grade recordings. Empirical tests have been performed with real-world data collected from a population of 38 control subjects, to analyze the correlation between both approaches. Results show off-the-person data to be correlated with clinical-grade data, demonstrating the viability of this approach to potentially extend preventive medicine practices by enabling the integration of ECG monitoring into multiple dimensions of people’s everyday lives. © 2015, IUPESM and Springer-Verlag Berlin Heidelberg.
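A minimal sketch of the kind of correlation analysis described above, assuming two time-aligned, equal-length recordings; the preprocessing, alignment, and exact metric used in the study are assumptions, and the signals below are synthetic.

```python
import numpy as np

def pearson_correlation(x, y):
    """Pearson correlation between two time-aligned signals of equal length."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

# toy usage with synthetic, roughly periodic signals sampled at 1 kHz
fs = 1000
t = np.arange(0, 5, 1 / fs)
clinical = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
off_person = clinical + 0.2 * np.random.default_rng(2).normal(size=t.size)  # noisier copy
print(pearson_correlation(clinical, off_person))
```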

Relevance: 80.00%

Abstract:

Master's in Informatics Engineering - Specialization Area in Graphics Systems and Multimedia

Relevance: 80.00%

Abstract:

Biometric recognition is emerging as an alternative solution for applications where the privacy of information is crucial. This paper presents an embedded biometric recognition system based on electrocardiographic (ECG) signals for individual identification and authentication. The proposed system implements a real-time, state-of-the-art recognition algorithm that extracts information from the frequency domain. The system is based on an ARM Cortex 4. Preliminary results show that embedded platforms are a promising path for the implementation of ECG-based applications in real-world scenarios.
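A minimal sketch of frequency-domain feature extraction of the sort the abstract mentions, assuming fixed-length windows and a plain FFT magnitude feature; the actual segmentation, windowing, and classifier of the proposed system are not specified here.

```python
import numpy as np

def spectral_features(ecg, fs, win_s=2.0, n_bins=64):
    """Split a signal into fixed-length windows and return, for each window,
    the magnitudes of its first n_bins FFT coefficients as a simple
    frequency-domain feature vector."""
    win = int(win_s * fs)
    n_windows = len(ecg) // win
    feats = []
    for k in range(n_windows):
        seg = np.asarray(ecg[k * win:(k + 1) * win], dtype=float)
        seg = seg - seg.mean()                      # remove baseline offset
        spectrum = np.abs(np.fft.rfft(seg * np.hanning(win)))
        feats.append(spectrum[:n_bins])
    return np.array(feats)

# toy usage on a synthetic 10 s signal sampled at 250 Hz
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.1 * t) + 0.05 * np.random.default_rng(3).normal(size=t.size)
print(spectral_features(ecg, fs).shape)   # (5, 64): 5 windows, 64 spectral bins
```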

Relevance: 80.00%

Abstract:

The management of computer networks has become a vital factor for a network to operate efficiently, productively and profitably. Management involves monitoring and controlling systems so that they function as intended; configuration, monitoring and reconfiguration of components are essential to the goals of improving performance, reducing downtime, improving security and performing accounting. In parallel, traffic classification is a highly relevant topic in several network-related activities, such as QoS prediction, security, monitoring, accounting, backbone capacity planning and intrusion detection. The variation of certain types of traffic can influence technical decisions in the area of network management, as well as political and social decisions. This work develops a study of the various protocols, management tools and traffic classification tools available to support the management activity. The study concludes with the proposal and implementation of a management solution suited to a real scenario, one that is very rich in the diversity of its technologies and systems.

Relevance: 80.00%

Abstract:

Mobile applications for location-based services (LBS) provide services to the user based on his or her geographic location. This type of service began to appear in the 1990s and, as the number of mobile devices grew exponentially, the offer increased considerably. There are several areas of practical applicability, but the focus of this thesis is the search for and location of points of interest (POIs). Through the sensors currently available in mobile devices, it becomes possible to locate the user's position and present the points of interest situated nearby. However, this isolated information sometimes proves insufficient, since those points of interest are, at the outset, unknown to the user. Through the coolplaces service, a project dedicated to the search for and sharing of POIs, we can build our own network of friends and places, thus benefiting from the contextual information associated with a given POI. Technological innovations have also enabled the appearance of Augmented Reality applications on mobile devices, that is, applications capable of superimposing virtual images on views of the real world. Considering the visualization of POIs in a given environment, if we regard Augmented Reality as an enhancer of the user's interaction with the real world, we quickly identify the potential of joining these concepts in a single application. Thus, the work developed in this thesis constitutes a study of the implementation and development of an Augmented Reality module for the coolplaces mobile application, making use of the technology available on the market in order to provide an innovative experience and add value to the application.
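A minimal sketch of the nearby-POI lookup described above, assuming plain great-circle (haversine) distance; the POI data, the search radius, and the coolplaces API itself are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS-84 coordinates."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_pois(user_lat, user_lon, pois, radius_km=1.0):
    """Return the POIs within radius_km of the user's position, nearest first."""
    hits = [(haversine_km(user_lat, user_lon, p["lat"], p["lon"]), p) for p in pois]
    return [p for d, p in sorted(hits, key=lambda x: x[0]) if d <= radius_km]

# hypothetical POI list; in the real application these would come from the coolplaces service
pois = [{"name": "Cafe A", "lat": 41.1581, "lon": -8.6291},
        {"name": "Museum B", "lat": 41.1466, "lon": -8.6110}]
print([p["name"] for p in nearby_pois(41.1579, -8.6291, pois, radius_km=1.0)])
```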

Relevance: 80.00%

Abstract:

Deegan and Packel (1979) and Holler (1982) proposed two power indices for simple games: the Deegan–Packel index and the Public Good Index. In the definition of these indices, only minimal winning coalitions are taken into account. Using similar arguments, we define two new power indices. These new indices are defined taking into account only those winning coalitions that do not contain null players. The results obtained with the different power indices are compared by means of two real-world examples taken from the political field.
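For reference, a minimal sketch of the two classical indices the abstract starts from (not the new indices it proposes), computed by brute-force enumeration of minimal winning coalitions in a small, hypothetical weighted voting game.

```python
from itertools import combinations

def minimal_winning_coalitions(weights, quota):
    """All winning coalitions (sum of weights >= quota) whose proper subsets all lose."""
    players = range(len(weights))
    mwc = []
    for r in range(1, len(weights) + 1):
        for coalition in combinations(players, r):
            if sum(weights[i] for i in coalition) >= quota and all(
                sum(weights[i] for i in coalition if i != j) < quota for j in coalition
            ):
                mwc.append(coalition)
    return mwc

def deegan_packel(weights, quota):
    """Deegan-Packel index: each minimal winning coalition shares 1/|M| equally."""
    mwc = minimal_winning_coalitions(weights, quota)
    dp = [0.0] * len(weights)
    for s in mwc:
        for i in s:
            dp[i] += 1.0 / (len(mwc) * len(s))
    return dp

def public_good_index(weights, quota):
    """Holler's Public Good Index: normalized counts of minimal winning coalitions per player."""
    mwc = minimal_winning_coalitions(weights, quota)
    counts = [sum(1 for s in mwc if i in s) for i in range(len(weights))]
    total = sum(counts)
    return [c / total for c in counts]

# toy weighted majority game: quota 51, weights 40, 30, 20, 10
print(deegan_packel([40, 30, 20, 10], 51))
print(public_good_index([40, 30, 20, 10], 51))
```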

Relevance: 80.00%

Abstract:

This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several data sets, confirming the soundness of the generalization.

Relevance: 80.00%

Abstract:

Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to explain a variety of distinct real-world phenomena, often described merely by the signals they produce. In this paper, we study twelve cases, namely worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area in forest fires that occurred in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes that occurred in the U.S., and the number of inhabitants in the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to power law behavior. Furthermore, the parametric characterization reveals complex relationships present at a higher level of description.
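The abstract does not specify the fitting procedure; as a common baseline, the sketch below uses the continuous maximum-likelihood estimate of the tail exponent (the Clauset–Shalizi–Newman form) for data above a chosen x_min, applied to a synthetic Pareto sample.

```python
import numpy as np

def powerlaw_mle_alpha(data, x_min):
    """Continuous maximum-likelihood estimate of the power-law exponent alpha
    for the tail x >= x_min:  alpha = 1 + n / sum(ln(x_i / x_min))."""
    x = np.asarray(data, dtype=float)
    tail = x[x >= x_min]
    return 1.0 + tail.size / np.sum(np.log(tail / x_min))

# toy usage: synthetic Pareto-distributed sample with true alpha = 2.5
rng = np.random.default_rng(0)
sample = (1 - rng.random(10_000)) ** (-1.0 / 1.5)   # inverse-CDF sampling, x_min = 1
print(powerlaw_mle_alpha(sample, x_min=1.0))
```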

Relevance: 80.00%

Abstract:

Gottfried Leibniz generalized the derivation and integration, extending the operators from integer up to real, or even complex, orders. It is presently recognized that the resulting models capture long term memory effects difficult to describe by classical tools. Leon Chua generalized the set of lumped electrical elements that provide the building blocks in mathematical models. His proposal of the memristor and of higher order elements broadened the scope of variables and relationships embedded in the development of models. This paper follows the two directions and proposes a new logical step, by generalizing the concept of junction. Classical junctions interconnect system elements using simple algebraic restrictions. Nevertheless, this simplistic approach may be misleading in the presence of unexpected dynamical phenomena and requires including additional “parasitic” elements. The novel γ-junction includes, as special cases, the standard series and parallel connections and allows a new degree of freedom when building models. The proposal motivates the search for experimental and real world manifestations of the abstract conjectures.

Relevance: 80.00%

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from year 1963 up to year 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to better understand earthquake distributions.
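A minimal sketch of the Jensen–Shannon divergence between two event-count distributions, assuming discrete histograms over a common binning; the paper's Cartesian grid construction and its fractional generalizations are not reproduced.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q)) / 2."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# toy usage: event-count histograms of two hypothetical grid cells over magnitude bins
cell_a = [120, 60, 25, 8, 2]
cell_b = [200, 40, 10, 3, 1]
print(jensen_shannon(cell_a, cell_b))
```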

Relevance: 80.00%

Abstract:

Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double PL behavior, while for more recent time periods a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, and multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
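A minimal sketch of comparing period-wise distributions with a symmetrized Kullback–Leibler divergence and embedding the resulting dissimilarity matrix with MDS; the binning, the symmetrization, and the use of scikit-learn's MDS are assumptions.

```python
import numpy as np
from sklearn.manifold import MDS

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions,
    with a small epsilon to avoid division by zero."""
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# toy data: fatality-count histograms (over accident-size bins) for four hypothetical periods
periods = np.array([[50, 20, 8, 2],
                    [40, 25, 10, 3],
                    [30, 30, 15, 5],
                    [20, 35, 20, 8]], dtype=float)

# symmetrized KL divergence as a pairwise dissimilarity matrix
n = len(periods)
d = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        d[i, j] = 0.5 * (kl_divergence(periods[i], periods[j]) +
                         kl_divergence(periods[j], periods[i]))

# 2-D MDS embedding of the periods for visualization
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(d)
print(coords)
```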