869 results for Error-correcting codes (Information theory)
Abstract:
This study developed software routines for a system built around a digital signal processor (DSP) board and a supervisory application, whose main function was to correct the measurements produced by a turbine gas meter. The correction is based on an intelligent algorithm built from an artificial neural network. The routines were implemented both in the supervisory environment and in the DSP environment and cover three main items: processing, communication, and supervision.
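The correction step described above can be pictured as a small regression network that maps raw meter readings and operating conditions to corrected flow values. Below is a minimal sketch, assuming a scikit-learn MLPRegressor and hypothetical inputs (raw flow, pressure, temperature); the thesis' actual network topology and DSP implementation are not reproduced here.

```python
# Minimal sketch: neural-network correction of turbine gas meter readings.
# The calibration data below is synthetic and purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical calibration data: raw flow, pressure, temperature -> reference flow.
X_raw = rng.uniform([10.0, 1.0, 280.0], [100.0, 10.0, 320.0], size=(500, 3))
y_ref = X_raw[:, 0] * (1.0 + 0.02 * (X_raw[:, 1] - 5.0))   # synthetic "true" flow

scaler = StandardScaler().fit(X_raw)
net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
net.fit(scaler.transform(X_raw), y_ref)

# At run time the supervisory/DSP routine would apply the trained network
# to each new measurement to obtain the corrected value.
sample = np.array([[55.0, 7.5, 300.0]])
corrected = net.predict(scaler.transform(sample))
print(f"corrected flow: {corrected[0]:.2f}")
```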
Abstract:
Currently, one of the biggest challenges in data mining is performing cluster analysis on complex data. Several techniques have been proposed but, in general, they only achieve good results within specific domains, and there is no consensus on the best way to group this kind of data. These techniques generally fail due to unrealistic assumptions about the true probability distribution of the data. Motivated by this, this thesis proposes a new measure, based on the Cross Information Potential, that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach retains all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From this measure, two cost functions and three algorithms are proposed to perform cluster analysis. Because the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data drawn from problems in different fields.
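In the information-theoretic-learning literature, the Cross Information Potential (CIP) between two groups of points is the kernel (Parzen) estimate of the inner product between their densities. A minimal sketch of that standard estimator follows; the thesis' modified measure based on representative points is not reproduced here.

```python
# Minimal sketch: kernel estimator of the Cross Information Potential (CIP)
# between two groups of points, using Gaussian kernels.
import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """CIP(X, Y) = (1 / (Nx * Ny)) * sum_ij G_{sigma*sqrt(2)}(x_i - y_j)."""
    diff = X[:, None, :] - Y[None, :, :]      # (Nx, Ny, d) pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)      # squared Euclidean distances
    s2 = 2.0 * sigma ** 2                     # variance of the convolved kernel
    d = X.shape[1]
    gauss = np.exp(-sq_dist / (2.0 * s2)) / ((2.0 * np.pi * s2) ** (d / 2))
    return gauss.mean()

# Two well-separated Gaussian clusters interact weakly -> small CIP.
rng = np.random.default_rng(1)
A = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
B = rng.normal(loc=6.0, scale=1.0, size=(200, 2))
print(cross_information_potential(A, B), cross_information_potential(A, A))
```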
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, all based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one, based on the concept of negentropy, is proposed. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes in order to obtain an expression for the probability that a point belongs to each class. Experiments with several values of Na and dt are carried out on test sets and the results are analyzed in order to study the robustness of the method and to derive heuristics for choosing the correct threshold. Throughout the work, aspects of information theory applied to the calculation of the divergences are explored, in particular different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
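The linking idea described above can be sketched as follows: quantize the data into Na auxiliary clusters and then merge auxiliary clusters whose centers are closer than the threshold dt. In this hypothetical sketch, k-means stands in for the vector quantizer and plain Euclidean distance between centroids stands in for the thesis' divergence-based dissimilarities.

```python
# Minimal sketch: link auxiliary clusters from a vector quantizer into final classes.
import numpy as np
from sklearn.cluster import KMeans

def link_auxiliary_clusters(X, n_aux=20, dt=1.5):
    km = KMeans(n_clusters=n_aux, n_init=10, random_state=0).fit(X)
    centers = km.cluster_centers_

    # Union-find over auxiliary clusters: link pairs closer than dt.
    parent = list(range(n_aux))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n_aux):
        for j in range(i + 1, n_aux):
            if np.linalg.norm(centers[i] - centers[j]) < dt:
                parent[find(i)] = find(j)

    roots = {find(i) for i in range(n_aux)}
    relabel = {r: k for k, r in enumerate(sorted(roots))}
    labels = np.array([relabel[find(a)] for a in km.labels_])
    return labels  # number of classes = len(roots), found automatically

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.5, (150, 2)), rng.normal(5, 0.5, (150, 2))])
print(np.unique(link_auxiliary_clusters(X)))
```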
Abstract:
Conventional methods for solving the nonlinear blind source separation problem generally impose a series of restrictions in order to obtain the solution, often leading to an imperfect separation of the original sources and a high computational cost. In this work, we propose an alternative independence measure based on information theory and use artificial-intelligence tools to solve linear and, later, nonlinear blind source separation problems. In the linear case, genetic algorithms and the Rényi negentropy are applied as the independence measure to find a separation matrix from linear mixtures of waveforms, audio, and images. A comparison is made with two types of Independent Component Analysis algorithms that are widespread in the literature. Subsequently, the same independence measure is used as the cost function of the genetic algorithm to recover source signals mixed by nonlinear functions, using an artificial neural network of the radial basis function type. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
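For the linear case described above, the search for a separation matrix can be sketched as an evolutionary search over rotations of the whitened mixtures, scored by a non-Gaussianity measure. In the sketch below, a log-cosh negentropy approximation stands in for the Rényi negentropy used in the thesis, and a tiny elitist evolutionary loop stands in for the full genetic algorithm.

```python
# Minimal sketch: evolutionary search for a separation matrix on whitened mixtures.
# Assumptions: 2 sources, separation parametrized as a rotation angle.
import numpy as np

rng = np.random.default_rng(3)

# Two independent sources mixed linearly.
t = np.linspace(0, 1, 4000)
S = np.vstack([np.sign(np.sin(2 * np.pi * 13 * t)), rng.uniform(-1, 1, t.size)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whitening.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X

gauss_ref = np.mean(np.log(np.cosh(rng.normal(size=100_000))))

def fitness(theta):
    # Negentropy proxy: squared deviation of E[log cosh(y)] from its Gaussian value.
    W = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    Y = W @ Z
    return sum((np.mean(np.log(np.cosh(y))) - gauss_ref) ** 2 for y in Y)

# Tiny (mu + lambda) evolutionary loop over the rotation angle.
pop = rng.uniform(0, np.pi, size=20)
for _ in range(50):
    children = pop + rng.normal(0, 0.1, size=pop.size)
    both = np.concatenate([pop, children])
    pop = both[np.argsort([-fitness(th) for th in both])][:20]

best = pop[0]
W = np.array([[np.cos(best), -np.sin(best)], [np.sin(best), np.cos(best)]])
recovered = W @ Z   # estimated sources (up to permutation and scaling)
```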
Abstract:
I thank my advisor, João Marcos, for the intellectual support and patience he devoted to me throughout my graduate years. With his friendship, his ability to see problems from the best point of view, and his love of doing Logic, he became a great inspiration to me. I thank my committee members, Claudia Nalon, Elaine Pimentel, and Benjamin Bedregal, who gave my work a rigorous reading and offered valuable suggestions to improve it. I am grateful to the Post-Graduate Program in Systems and Computation, which accepted me as a student and provided a propitious environment for developing my research. I also thank CAPES for a 21-month fellowship. Thanks to my research group, LoLITA (Logic, Language, Information, Theory and Applications), where I had the opportunity to make some friends. Some of them I met in my early classes: Sanderson, Haniel, and Carol Blasio. Others I met during the course, among them Patrick, Claudio, Flaulles, and Ronildo. I thank Severino Linhares and Maria Linhares, who kindly hosted me at their home during my first months in Natal. This couple, together with my flatmates Fernado, Donátila, and Aline, were my nuclear family in Natal. I thank my fiancée Luclécia for her precious affective support and for understanding my absence from home during my master's. I also thank my parents, Manoel and Zenilda, and my siblings Alexandre, Paulo, and Paula; without their confidence and encouragement I would not have succeeded in this journey. "If you want the hits, be prepared for the misses." (Carl Yastrzemski)
Abstract:
We show how discrete squeezed states in an N²-dimensional phase space can be properly constructed within the finite-dimensional context. Such discrete extensions are then applied to the framework of quantum tomography and quantum information theory, with the aim of establishing an initial study of the interference effects between discrete variables in a finite phase space. Moreover, the interpretation of the squeezing effects is direct in the present approach, which has some potential applications in different branches of physics.
Abstract:
We propose that the evolution of scientific ideas be used as an instrument for learning specific content and, in particular, for highlighting how that content is articulated across disciplines. As an example, we present a study of Maxwell's demon and of the discussions about its exorcism, that is, a study of the understanding of the nature of an intelligent being acting inside a physical system and of how that action would take place. This problem involves phenomena related to several theories (Thermodynamics, Molecular Physics, Statistical Mechanics, Information Theory) within the disciplines of Physics, Chemistry, Biology, and Computer Science. Among the various epistemological and conceptual questions it raises, we emphasize the question of the limited object of a scientific theory, that is, the limitation of its meaning to the phenomena it encompasses. The delimitation of the phenomena under study, together with the theories and techniques, characterizes the understanding that takes concrete shape in the laboratory. This understanding also opens the possibility of interdisciplinary work.
Abstract:
This article describes a technique for partitioning Large Scale Virtual Environments (LSVEs) into hexagonal cells, using portals at the cell interfaces to reduce the number of messages on the network and the complexity of the virtual world. These environments usually demand a high volume of data, which must be sent only to those users who need the information [Greenhalgh, Benford 1997].
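The cell-based interest management described above reduces to mapping each user's position to a hexagonal cell and relaying updates only to users in the same (or adjacent) cells. Below is a minimal sketch, assuming pointy-top hexagons in axial coordinates and a hypothetical cell size; the paper's portal mechanics are not reproduced.

```python
# Minimal sketch: map a 2D position to a hexagonal cell (pointy-top, axial coords),
# so that updates are only relayed to users sharing the same cell.
import math
from collections import defaultdict

CELL_SIZE = 50.0  # hypothetical hexagon "radius" in world units

def position_to_hex(x, y, size=CELL_SIZE):
    # Fractional axial coordinates.
    q = (math.sqrt(3) / 3 * x - 1 / 3 * y) / size
    r = (2 / 3 * y) / size
    # Cube rounding to the nearest hex center.
    xf, zf = q, r
    yf = -xf - zf
    rx, ry, rz = round(xf), round(yf), round(zf)
    dx, dy, dz = abs(rx - xf), abs(ry - yf), abs(rz - zf)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return (int(rx), int(rz))  # axial (q, r) cell id

# Group users by cell: a message from a user only goes to peers in the same cell.
users = {"ana": (10.0, 20.0), "bob": (12.0, 22.0), "eve": (400.0, 380.0)}
cells = defaultdict(list)
for name, (x, y) in users.items():
    cells[position_to_hex(x, y)].append(name)
print(dict(cells))
```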
Abstract:
This paper presents the analysis and implementation of a new drive system applied to refrigeration systems, complying with the restrictions imposed by IEC standards (harmonics, flicker, and EMI, electromagnetic interference), in order to obtain high efficiency, high power factor, reduced harmonic distortion in the input current, and reduced electromagnetic interference, with excellent performance in the temperature control of a refrigeration prototype (automatic control, precision, and high dynamic response). The proposal is to replace the single-phase motor of the conventional refrigeration system with a three-phase motor. In this way, a proper closed-loop (feedback) control technique can be applied, allowing an accurate adjustment of the desired temperature. The proposed refrigeration prototype uses a 0.5 hp three-phase motor and an open (belt-drive) Bitzer IY compressor. The input rectifier stage features reduced input current ripple, reduced output voltage ripple, the use of low-stress devices, a low-volume EMI input filter, high input power factor (PF), and low total harmonic distortion (THD) in the input current, in compliance with the IEC 61000-3-2 standard. The digital controller for the output three-phase inverter stage was developed using a conventional voltage-frequency control (scalar V/f control) and a simplified stator-oriented vector control, in order to verify the feasibility and performance of the proposed digital controls for continuous temperature control applied to the refrigerator prototype. ©2008 IEEE.
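Scalar V/f control, mentioned above, keeps the stator voltage roughly proportional to the commanded frequency (with a low-frequency boost) so that the machine flux stays approximately constant. A minimal sketch follows, with hypothetical rated values; the paper's actual controller gains and hardware are not reproduced.

```python
# Minimal sketch: scalar V/f (volts-per-hertz) command generation for a
# three-phase inverter. Rated values below are hypothetical.
V_RATED = 220.0   # voltage at rated frequency (V)
F_RATED = 60.0    # rated frequency (Hz)
V_BOOST = 10.0    # low-frequency boost to compensate the stator resistance drop (V)

def vf_command(f_ref_hz):
    """Return the voltage amplitude to apply for a given frequency reference."""
    f = max(0.0, min(f_ref_hz, F_RATED))          # clamp to the rated range
    v = V_BOOST + (V_RATED - V_BOOST) * f / F_RATED
    return v, f

for f in (5.0, 30.0, 60.0):
    v, f_out = vf_command(f)
    print(f"f = {f_out:4.1f} Hz  ->  V = {v:6.1f} V")
```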
Abstract:
The applications of Automatic Vowel Recognition (AVR), a sub-task of fundamental importance in most speech processing systems, range from automatic interpretation of spoken language to biometrics. State-of-the-art systems for AVR are based on traditional machine learning models such as Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs); however, such classifiers cannot deliver efficiency and effectiveness at the same time, leaving a gap to be explored when real-time processing is required. In this work, we present an algorithm for AVR based on the Optimum-Path Forest (OPF), an emergent pattern recognition technique recently introduced in the literature. Adopting a supervised training procedure and using speech tags from two public datasets, we observed that OPF outperformed ANNs, SVMs, and other classifiers in terms of training time and accuracy. ©2010 IEEE.
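The Optimum-Path Forest classifier referenced above treats training samples as nodes of a complete graph, elects prototypes from the minimum spanning tree, conquers the remaining nodes by minimum f_max path cost, and labels a test sample with the class of the training node that offers the cheapest path. Below is a minimal didactic sketch under those standard definitions; it is not the authors' LibOPF implementation, and the synthetic "vowel features" are hypothetical.

```python
# Minimal sketch of a supervised Optimum-Path Forest (OPF) classifier.
import heapq
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def opf_fit(X, y):
    D = cdist(X, X)                                   # pairwise distances
    mst = minimum_spanning_tree(D).toarray()
    # Prototypes: endpoints of MST edges joining different classes.
    proto = set()
    for i, j in zip(*np.nonzero(mst)):
        if y[i] != y[j]:
            proto.update((i, j))
    n = len(y)
    cost = np.full(n, np.inf)
    label = y.copy()
    heap = []
    for p in proto:
        cost[p] = 0.0
        heapq.heappush(heap, (0.0, p))
    done = np.zeros(n, dtype=bool)
    while heap:                                       # Dijkstra-like propagation
        c, u = heapq.heappop(heap)
        if done[u]:
            continue
        done[u] = True
        for v in range(n):
            if done[v]:
                continue
            new_cost = max(c, D[u, v])                # f_max path cost
            if new_cost < cost[v]:
                cost[v] = new_cost
                label[v] = label[u]
                heapq.heappush(heap, (new_cost, v))
    return X, cost, label

def opf_predict(model, X_test):
    X_train, cost, label = model
    D = cdist(X_test, X_train)
    path_cost = np.maximum(D, cost[None, :])          # max(C(t), d(s, t))
    return label[np.argmin(path_cost, axis=1)]

# Tiny usage example with two synthetic "vowel feature" clusters.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = opf_fit(X, y)
print(opf_predict(model, np.array([[0.0, 0.0], [4.0, 4.0]])))
```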
Abstract:
The results of the histopathological analyses performed after the implantation of highly crystalline PVA microspheres in the subcutaneous tissue of Wistar rats are herein reported. Three different groups of PVA microparticles were systematically studied: highly crystalline, amorphous, and commercial ones. In addition to these experiments, complementary analyses of architectural complexity were performed using the fractal dimension (FD) and Shannon entropy (SE) concepts. The highly crystalline microspheres induced inflammatory reactions similar to those observed for the commercial ones, while the inflammatory reactions caused by the amorphous ones were less intense. Statistical analyses of the subcutaneous tissues of Wistar rats implanted with the highly crystalline microspheres resulted in FD and SE values significantly higher than those observed for the amorphous ones. The FD and SE parameters obtained for the subcutaneous tissues of Wistar rats implanted with crystalline and commercial microparticles were statistically similar. Briefly, the results indicated that the new highly crystalline microspheres have a biocompatibility comparable to that of the commercial ones. In addition, statistical tools such as FD and SE analyses, when combined with histopathological analyses, can be useful for investigating the architectural complexity of tissues caused by complex inflammatory reactions. © 2012 WILEY PERIODICALS, INC.
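The two descriptors used above have standard estimators on digitized histology images: a box-counting fractal dimension on a binarized image and the Shannon entropy of the gray-level histogram. A minimal sketch of both follows, assuming NumPy image arrays; the paper's actual image-processing pipeline is not specified here.

```python
# Minimal sketch: box-counting fractal dimension (FD) of a binary image and
# Shannon entropy (SE) of a grayscale histogram.
import numpy as np

def box_counting_dimension(binary_img):
    """Estimate FD as the slope of log(count) vs log(1 / box size)."""
    n = min(binary_img.shape)
    sizes = [2 ** k for k in range(1, int(np.log2(n)))]
    counts = []
    for s in sizes:
        h, w = (binary_img.shape[0] // s) * s, (binary_img.shape[1] // s) * s
        blocks = binary_img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def shannon_entropy(gray_img, bins=256):
    """Entropy (bits) of the gray-level distribution."""
    hist, _ = np.histogram(gray_img, bins=bins, range=(0, 255))
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(5)
gray = rng.integers(0, 256, size=(256, 256))
binary = gray > 128
print(box_counting_dimension(binary), shannon_entropy(gray))
```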
Abstract:
Unlike correlation in classical systems, entanglement of quantum systems cannot be distributed at will: if one system A is maximally entangled with another system B, it cannot be entangled at all with a third system C. This concept, known as the monogamy of entanglement, is manifest when the entanglement of A with a pair BC can be divided into contributions from the entanglement between A and B and between A and C, plus a term τABC involving genuine tripartite entanglement, which is therefore expected to always be positive. A very important measure in quantum information theory, the entanglement of formation (EOF), fails to satisfy this last requirement. Here we present the reasons for this and show a set of conditions that an arbitrary pure tripartite state must satisfy for the EOF to become a monogamous measure, i.e., for τABC ≥ 0. The relation derived is connected to the discrepancy between quantum and classical correlations, τABC being negative whenever the quantum correlation prevails over the classical one. This result is employed to elucidate features of the distribution of entanglement during a dynamical evolution. It also helps to relate all monogamous instances of the EOF to the squashed entanglement, an entanglement measure that is always monogamous. © 2013 American Physical Society.
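In the notation used above, the decomposition discussed in the abstract can be written schematically as follows (with E denoting the entanglement of formation); monogamy of the measure corresponds to the non-negativity of the tripartite term.

```latex
% Schematic decomposition of the entanglement of A with the pair BC,
% as described in the abstract (E = entanglement of formation):
E_{A|BC} \;=\; E_{A|B} + E_{A|C} + \tau_{ABC},
\qquad
\text{monogamy of the EOF} \;\Longleftrightarrow\; \tau_{ABC} \ge 0 .
```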
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Music - IA
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)