15 results for Information Theory
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Currently, one of the biggest challenges in the field of data mining is performing cluster analysis on complex data. Several techniques have been proposed, but in general each achieves good results only within specific domains, and there is no consensus on the best way to group this kind of data. These techniques usually fail because of unrealistic assumptions about the true probability distribution of the data. Based on this, this thesis proposes a new measure, built on the Cross Information Potential, that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach retains the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From it, two cost functions and three clustering algorithms are derived. Because Information Theory captures the relationship between different patterns regardless of assumptions about the nature of this relationship, the proposed approach achieved better performance than the main algorithms in the literature, both on synthetic data designed to test the algorithms in specific situations and on real data drawn from problems in different fields.
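The abstract does not give the estimator actually derived in the thesis, so the sketch below only illustrates the standard Information-Theoretic Learning quantity it builds on: the Cross Information Potential between two groups of points, estimated with Gaussian (Parzen) kernels. The function names and the kernel width `sigma` are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(diff, sigma):
    """Multivariate Gaussian kernel evaluated on pairwise difference vectors."""
    sq = np.sum(diff ** 2, axis=-1)
    d = diff.shape[-1]
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-sq / (2.0 * sigma ** 2)) / norm

def cross_information_potential(X, Y, sigma=1.0):
    """Standard Parzen estimate of the Cross Information Potential between two
    groups X (N1 x d) and Y (N2 x d):
        CIP = (1 / (N1 * N2)) * sum_i sum_j G_sigma(x_i - y_j)
    Larger values indicate stronger interaction (overlap) between the groups."""
    diff = X[:, None, :] - Y[None, :, :]   # all pairwise differences
    return gaussian_kernel(diff, sigma).mean()

# Toy usage: two well-separated blobs interact weakly compared to a blob with itself.
rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0, size=(200, 2))
B = rng.normal(6.0, 1.0, size=(200, 2))
print(cross_information_potential(A, B))   # small (little overlap)
print(cross_information_potential(A, A))   # larger (self-interaction)
```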
Abstract:
The coding process is a fundamental aspect of brain function. The transformation of sensory stimuli into neurophysiological responses has been a research theme in several areas of Neuroscience. One of the most common ways to measure the efficiency of a neural code is through Information Theory measures, such as mutual information. Using these tools, recent studies show that in the auditory cortex both local field potentials (LFPs) and action potential spiking times code information about sound stimuli. However, there are no studies applying Information Theory tools to investigate the efficiency of codes that use postsynaptic potentials (PSPs), alone or in association with LFP analysis. These signals are related in the sense that LFPs are partly created by the joint action of several PSPs. The present dissertation reports information measures between PSP and LFP responses obtained in the primary auditory cortex of anaesthetized rats and auditory stimuli of distinct frequencies. Our results show that PSP responses carry information about sound stimuli at levels comparable to, and even greater than, LFP responses. We also found that PSPs and LFPs code sound information independently, since the joint analysis of these signals showed neither synergy nor redundancy.
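The dissertation's exact estimator is not given in the abstract; the sketch below shows a common plug-in (histogram) estimate of the mutual information between a discretized stimulus and a response feature, which is the kind of measure referred to. Variable names and the bin count are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (joint-histogram) estimate of I(X;Y) in bits for paired samples x, y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of X
    py = pxy.sum(axis=0, keepdims=True)       # marginal of Y
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Toy usage: a response that follows the stimulus carries clearly nonzero information.
rng = np.random.default_rng(1)
stim = rng.integers(0, 4, size=5000).astype(float)   # 4 stimulus "frequencies"
resp = stim + rng.normal(0.0, 0.5, size=5000)        # noisy response
print(mutual_information(stim, resp))
```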
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method requires only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters, as in the sketch below. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one is proposed, based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes, aiming to obtain an expression for the probability that a point belongs to one of the classes. Experiments with several values of Na and dt are performed on test sets and the results are analyzed in order to study the robustness of the method and to derive heuristics for the choice of the correct threshold. Throughout the work, aspects of information theory applied to the calculation of the divergences are explored, in particular the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also includes appendices presenting real applications of the proposed method.
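A minimal sketch of the two-stage scheme the abstract describes: vector-quantize the data into Na auxiliary clusters and then link auxiliary clusters whose dissimilarity falls below the threshold dt, so that linked groups become the final classes. The negentropy-based and divergence-based dissimilarities studied in the work are not specified in the abstract, so plain Euclidean distance between centroids stands in for them here; `kmeans2` is used only as a convenient vector quantizer.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def link_auxiliary_clusters(X, Na=20, dt=1.5, seed=0):
    """Cluster X (n_points x d) by linking auxiliary clusters.
    Returns (class label of each point, number of classes found)."""
    # Stage 1: vector quantization into Na auxiliary clusters.
    centroids, labels = kmeans2(X, Na, seed=seed, minit='++')

    # Stage 2: link auxiliary clusters whose centroid dissimilarity is below dt.
    d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
    adjacency = d < dt

    # Connected components of the linkage graph give the final classes.
    cluster_class = -np.ones(Na, dtype=int)
    n_classes = 0
    for start in range(Na):
        if cluster_class[start] >= 0:
            continue
        stack = [start]
        while stack:
            c = stack.pop()
            if cluster_class[c] >= 0:
                continue
            cluster_class[c] = n_classes
            stack.extend(np.flatnonzero(adjacency[c]))
        n_classes += 1

    return cluster_class[labels], n_classes
```

With this structure, the number of classes follows automatically from the chosen dt, which mirrors the behaviour described in the abstract.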
Abstract:
Following the interdisciplinary tendency of modern science, a new field called neuroengineering has emerged in recent decades. Since 2000, scientific journals and conferences on this theme have been created around the world. The present work comprises three subareas related to neuroengineering and electrical engineering (neural stimulation, theoretical and computational neuroscience, and neuronal signal processing), as well as biomedical engineering. The research can be divided into three parts: (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marchenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. The application of the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heartbeats was developed, which does not rely on a training database and is not specialized for specific pathologies. The method is based on wavelet decomposition and normality measures of random variables. Together, the results presented in these three fields of knowledge represent a qualification in neural and biomedical engineering.
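For part (ii), the abstract names only the Marchenko-Pastur law; the sketch below follows the usual way that law is applied to assembly detection (eigenvalues of the correlation matrix of z-scored, binned spike counts compared against the Marchenko-Pastur upper bound). Everything beyond the abstract, including the binned-count input format, is an assumption.

```python
import numpy as np

def significant_assembly_patterns(spike_counts):
    """spike_counts: array (n_neurons, n_bins) of binned spike counts.
    Returns the eigenvectors ("assembly patterns") whose eigenvalues exceed the
    Marchenko-Pastur upper bound expected for uncorrelated neurons."""
    n_neurons, n_bins = spike_counts.shape

    # z-score each neuron so that uncorrelated data follow the Marchenko-Pastur law
    z = spike_counts - spike_counts.mean(axis=1, keepdims=True)
    z /= spike_counts.std(axis=1, keepdims=True) + 1e-12

    corr = (z @ z.T) / n_bins                        # neuron-by-neuron correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)

    lambda_max = (1.0 + np.sqrt(n_neurons / n_bins)) ** 2   # MP upper bound
    keep = eigvals > lambda_max
    return eigvecs[:, keep], eigvals[keep]
```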
Abstract:
Conventional methods for solving the nonlinear blind source separation problem generally impose a series of restrictions to obtain a solution, often leading to imperfect separation of the original sources and a high computational cost. In this work, we propose an alternative independence measure based on information theory and use artificial intelligence tools to solve linear and, later, nonlinear blind source separation problems. In the linear model, genetic algorithms and the Rényi negentropy are applied as the independence measure to find a separation matrix from linear mixtures of waveform, audio and image signals. A comparison is made with two Independent Component Analysis algorithms that are widespread in the literature. Subsequently, the same independence measure is used as the cost function of the genetic algorithm to recover source signals that were mixed by nonlinear functions, using a radial basis function artificial neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
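The abstract names the Rényi negentropy as the independence measure and a genetic algorithm as the optimizer, without giving either in detail. The sketch below shows a Parzen-window estimate of the quadratic Rényi entropy and a negentropy-style cost built from it, with a crude random search standing in for the genetic algorithm; the kernel width, the search loop and all names are assumptions.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=0.5):
    """Parzen estimate of the quadratic Rényi entropy of a 1-D signal:
    H2(x) = -log( (1/N^2) * sum_ij G_{sigma*sqrt(2)}(x_i - x_j) )."""
    diff = x[:, None] - x[None, :]
    ip = np.mean(np.exp(-diff ** 2 / (4.0 * sigma ** 2))) / np.sqrt(4.0 * np.pi * sigma ** 2)
    return -np.log(ip)

def negentropy_cost(W, mixtures, sigma=0.5):
    """Independence cost of a separation matrix W: sum, over the unit-variance
    outputs, of (Rényi-2 entropy of a unit-variance Gaussian) minus (estimated
    Rényi-2 entropy). Larger values mean more non-Gaussian outputs."""
    y = W @ mixtures
    y = y / (y.std(axis=1, keepdims=True) + 1e-12)
    h_gauss = 0.5 * np.log(4.0 * np.pi)   # Rényi-2 entropy of a unit-variance Gaussian
    return sum(h_gauss - renyi_quadratic_entropy(yi, sigma) for yi in y)

def separate(mixtures, n_iter=300, seed=0):
    """Crude random-search stand-in for the genetic algorithm described above."""
    rng = np.random.default_rng(seed)
    n = mixtures.shape[0]
    best_W, best_cost = np.eye(n), -np.inf
    for _ in range(n_iter):
        W = best_W + 0.1 * rng.normal(size=(n, n))
        cost = negentropy_cost(W, mixtures)
        if cost > best_cost:
            best_W, best_cost = W, cost
    return best_W
```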
Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while meeting desired coverage, Quality of Service (QoS) and capacity targets. One alternative to further enhance the data rate is to apply cognitive radio concepts, where a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques such as Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework, known as the correntropy coefficient. It extracts similarity measurements between a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
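The classification pipeline itself (templates, features, decision rule) is not described in the abstract; the sketch below only shows the correntropy coefficient, i.e. the centered correntropy between two signals normalized by their self terms. The kernel width and the toy matching example are assumptions.

```python
import numpy as np

def centered_correntropy(x, y, sigma=1.0):
    """Sample estimate of the centered correntropy between two equal-length signals:
    U(X,Y) = mean_i G(x_i - y_i) - mean_ij G(x_i - y_j), with a Gaussian kernel G."""
    g = lambda u: np.exp(-u ** 2 / (2.0 * sigma ** 2))
    return g(x - y).mean() - g(x[:, None] - y[None, :]).mean()

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy normalized to lie in [-1, 1]."""
    uxy = centered_correntropy(x, y, sigma)
    uxx = centered_correntropy(x, x, sigma)
    uyy = centered_correntropy(y, y, sigma)
    return uxy / np.sqrt(uxx * uyy)

# Toy usage: a noisy received signal against a candidate modulation template.
rng = np.random.default_rng(2)
t = np.arange(1000)
template = np.cos(2 * np.pi * 0.01 * t)
received = template + 0.5 * rng.normal(size=t.size)
print(correntropy_coefficient(received, template))   # close to 1 for a good match
```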
Abstract:
The position that the renowned Boltzmann-Gibbs (BG) statistics occupies in the scientific scenario is incontestable, with a very broad range of applicability. However, many physical phenomena cannot be described by this formalism. This is due, in part, to the fact that BG statistics deals with phenomena in thermodynamic equilibrium. In regions where thermal equilibrium does not prevail, other statistical formalisms must be used. Two such formalisms emerged in the last two decades and are commonly called q-statistics and κ-statistics; the former was conceived by Constantino Tsallis at the end of the 1980s and the latter by Giorgio Kaniadakis in 2001. These formalisms are generalizations and therefore contain BG statistics as a particular case for a suitable choice of certain parameters. These two formalisms, in particular that of Tsallis, also lead us to reflect critically on concepts deeply rooted in BG statistics, such as the additivity and extensivity of certain physical quantities. The scope of this work is centered on the second of these formalisms. κ-statistics constitutes not only a generalization of BG statistics but, through the Kinetic Interaction Principle (KIP), also encompasses the celebrated Fermi-Dirac and Bose-Einstein quantum statistics, as well as q-statistics itself. In this work, we present some conceptual aspects of q-statistics and, mainly, of κ-statistics. We use these concepts, together with the concept of block information, to present an entropic functional modeled on the Kaniadakis formalism, which is later used to describe informational aspects of Cantor-like fractals. In particular, we are interested in the relations between fractal parameters, such as the fractal dimension, and the deformation parameter. Despite its simplicity, this will allow us, in future work, to statistically describe more complex structures such as DNA, superlattices and complex systems.
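The specific block-information entropic functional constructed in the work is not reproduced in the abstract. For reference, here is a minimal LaTeX statement of the two generalized entropies it builds on, in their standard forms (Boltzmann constant set to 1), both of which recover the Boltzmann-Gibbs entropy in the appropriate limit:

```latex
\begin{aligned}
S_q &= \frac{1-\sum_i p_i^{\,q}}{q-1}
  &&\xrightarrow{\;q\to 1\;}\; S_{BG} = -\sum_i p_i \ln p_i ,\\[4pt]
S_\kappa &= -\sum_i p_i \ln_\kappa p_i ,
\qquad
\ln_\kappa(x) = \frac{x^{\kappa}-x^{-\kappa}}{2\kappa}
  &&\xrightarrow{\;\kappa\to 0\;}\; S_{BG} .
\end{aligned}
```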
Abstract:
The great interest in nonlinear system identification is mainly due to the fact that a large number of real systems are complex and need to have their nonlinearities considered so that their models can be successfully used in applications of control, prediction, inference, and others. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to the identification of nonlinear dynamical systems subject to noise and outliers. Generally, these elements have negative effects on the identification procedure, resulting in erroneous interpretations of the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainties, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. Usually, the learning procedure of these neural networks is carried out by a gradient-based method that uses the mean squared error as its cost function. This work proposes replacing this traditional function with an Information Theoretic Learning similarity measure called correntropy. With this similarity measure, higher-order statistics can be taken into account during the FWNN training process, which makes the measure more suitable for non-Gaussian error distributions and the training less sensitive to the presence of outliers. To evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that using correntropy as the cost function of the error backpropagation algorithm makes the identification procedure with FWNN models more robust to outliers. However, this is only achieved if the Gaussian kernel width of the correntropy is properly adjusted.
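The FWNN architecture and its full training loop are beyond the abstract; the sketch below shows only the correntropy-based cost (the maximum correntropy criterion) that replaces the MSE in gradient training, and its gradient with respect to the predictions. The names and the default kernel width are assumptions.

```python
import numpy as np

def mcc_loss(y_true, y_pred, sigma=1.0):
    """Maximum Correntropy Criterion loss (to be minimized), a drop-in for MSE:
    L = 1 - (1/N) * sum_i exp(-e_i^2 / (2*sigma^2)),  e_i = y_true_i - y_pred_i.
    Large errors saturate the Gaussian kernel and contribute almost nothing to the
    gradient, which is what makes training robust to outliers."""
    e = y_true - y_pred
    return 1.0 - np.mean(np.exp(-e ** 2 / (2.0 * sigma ** 2)))

def mcc_grad(y_true, y_pred, sigma=1.0):
    """Gradient of the MCC loss with respect to the predictions (for backpropagation)."""
    e = y_true - y_pred
    return -(e / (len(e) * sigma ** 2)) * np.exp(-e ** 2 / (2.0 * sigma ** 2))
```

As the abstract notes, the behaviour of this cost depends strongly on the kernel width: too small and all errors saturate, too large and it degenerates toward the MSE.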
Abstract:
Information extraction is a frequent and relevant problem in digital signal processing. In the past few years, different methods have been used to parameterize signals and to obtain efficient descriptors. When the signals possess cyclostationary statistical properties, the Cyclic Autocorrelation Function (CAF) and the Spectral Cyclic Density (SCD) can be used to extract second-order cyclostationary information. However, second-order cyclostationary information is poor for non-Gaussian signals, since cyclostationary analysis in this case should also comprise higher-order statistical information. This work proposes a new mathematical tool for higher-order cyclostationary analysis based on the correntropy function. Specifically, cyclostationary analysis is revisited from an information-theoretic perspective, and the Cyclic Correntropy Function (CCF) and the Cyclic Correntropy Spectral Density (CCSD) are defined. Furthermore, it is analytically proven that the CCF contains information about second- and higher-order cyclostationary moments, being a generalization of the CAF. The performance of these new functions in the extraction of higher-order cyclostationary characteristics is analyzed in a wireless communication system in the presence of non-Gaussian noise.
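The abstract defines the CCF only by name; as a reference, the block below states discrete-time estimators of the cyclic autocorrelation and of a cyclic correntropy along the lines described, with a Gaussian kernel of width σ. The exact conventions (lag symmetrization, normalization) adopted in the work may differ.

```latex
\hat{R}_x^{\alpha}(\tau) = \frac{1}{N}\sum_{n=0}^{N-1}
      x[n+\tau]\,x^{*}[n]\, e^{-j2\pi\alpha n}
\quad\text{(cyclic autocorrelation, CAF)}
\qquad
\hat{V}_x^{\alpha}(\tau) = \frac{1}{N}\sum_{n=0}^{N-1}
      \exp\!\left(-\frac{\lvert x[n+\tau]-x[n]\rvert^{2}}{2\sigma^{2}}\right) e^{-j2\pi\alpha n}
\quad\text{(cyclic correntropy, CCF)}
```

The CCSD is then obtained as the Fourier transform of the CCF over the lag τ, in the same way the SCD follows from the CAF.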
Abstract:
At the present time, public organizations are increasingly employing information technology solutions in order to offer more transparency and better services to all citizens. Integrated Systems are IT solutions whose core features are integration and the use of a single database. These systems bring several benefits but also face obstacles that make their adoption difficult. The conversion to an integrated system may take years; thus, the study of the adoption of this IT in public sector organizations becomes very stimulating because of the peculiarities of this sector and the features of this technology. First, information about the particular integrated system under study and its conversion process is provided. Then, the researcher describes the configuration of the conversion process that is the aim of this study, namely the agents involved, the moments, and the tools used to support the process, in order to outline the methodology of the conversion process, understood as the set of procedures and tools used throughout the conversion. After that, the researcher identifies, together with the members of the conversion team, the negative and positive factors observed during the project. Finally, these factors were analyzed through the lens of Hospitality Theory which, in the researcher's opinion, was very useful for understanding the elements, events and moments that interfered with the project. The results empirically corroborated the presumptions of Hospitality Theory, while also revealing a limitation of this theory in the case under study.
Abstract:
The right against self-incrimination is a fundamental right that operates within criminal prosecution and therefore deserves a study grounded in the general theory of criminal procedure. The right has an uncertain origin and, despite the various historical accounts, only arises when there is a structured criminal procedure that aims to limit the State's duty-power to punish. The only system of criminal procedure yet experienced that is compatible with the prohibition of self-incrimination is the accusatory model. The inquisitorial model is based on the construction of a truth and on obtaining a confession at any cost, and is therefore incompatible with the right under study. The consecration of the right comes with the importance that fundamental rights have come to occupy in democratic constitutional states. In the Brazilian experience before 1988, it was only possible to recognize that self-incrimination represented a procedural burden for accused persons. Despite thorough debate in the Constituent Assembly, the right remained enshrined in a textual formula closer to the implementation made by the Supreme Court of the United States, known as the "Miranda warnings", than to the text of the Fifth Amendment to the U.S. Constitution, which originally established the right against self-incrimination with constitutional status. However, the imprecise text does not prevent the consecration of the principle as a fundamental right in Brazilian law. The right against self-incrimination must be observed in criminal procedure and relates to several of its canons, such as the presumption of innocence, the accusatory model, the distribution of the burden of proof, and especially the right of defense. Because it is a fundamental right, the prohibition of self-incrimination deserves a study appropriate to its constitutional nature. For the definition of protected persons, it is important to build a material concept of the accused, which differs from the formal concept of the person named in the prosecution. In the objective area of protection, there are two objects protected by the norm: the subject's instinct of self-preservation and the capacity for self-determination. Since it essentially configures an evidence rule in criminal procedure, the analysis of each case should be based on standards set in advance that indicate respect for the right. These standards include the accused's right to information, the right to counsel, and respect for voluntary participation. The study of cases of violation concentrates on the element of voluntariness, starting from the definition of what is, or is not, coercion that violates self-determination. The right faces new challenges that deserve attention, especially the fight against terrorism and organized crime, which drives the development of increasingly invasive and covert evidentiary tools, resources and technologies, and allows the use of information not only for criminal prosecution but also for the establishment of an intelligence strategy in the development of national and public security.
Abstract:
Infographics have historically accompanied the evolution of journalism, from the incipient handmade models of the eighteenth century to the use of computers and sophisticated software today. Whether to face the advent of TV and the consequent loss of readers of the printed newspaper, or to depict the Gulf War, in which photography was not allowed, infographics reached modern levels of production and publication. Technical devices enabled infographics to evolve in the environment of the internet, with possibilities for manipulation by the reader and the incorporation of video, audio and animation, in what came to be called interactive infographics. These digital models of information visualization recently arrived at daily newspapers in the Northeast and on their respective websites, with regionalized features. This work therefore proposes to explore and describe the processes of producing interactive infographics, taking as an example the Diário do Nordeste, in Fortaleza, Ceará, whose infographics department was created one year ago. It is based on aspects that guide the theory of journalism, such as newsmaking, the filters that shape the productive routine (gatekeeping), and the construction stages of the news. The research also draws on the theoretical framework on the subject, on essential concepts and characteristics of infographics, as well as on methodological procedures of systematic empirical observation of the newsroom's production routines, which can attest to limitations and/or advances.
Abstract:
The present work interprets and analyzes the problem of induction under a view founded on set theory and probability theory, as a basis for addressing its negative philosophical implications for systems of inductive logic in general. Owing to the importance of the problem and the relatively recent developments in these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a relatively unexplored and promising field of possibilities has been opened. The key point of the study consists in modeling the information acquisition process using concepts of set theory, followed by a treatment using probability theory. Throughout the study, a major obstacle to the probabilistic justification was identified in both the problem of defining the concept of probability and that of rationality, as well as in the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, in order to make the problem tractable in specific situations without losing its original characteristics, so that the conclusions can be extended to classic cases such as the question of whether the sun will continue to rise.
Abstract:
Information constitutes one of the most valuable strategic assets of an organization. However, the organizational environment in which it is embedded is complex and heterogeneous, which makes issues of Information Technology (IT) Governance and Information Security increasingly relevant. Academic studies and market surveys indicate that most incidents involving information assets originate in the behavior of people within the organization itself rather than in external attacks. With the aim of promoting a security culture among users and ensuring the protection of information in its properties of confidentiality, integrity and availability, organizations must establish an Information Security Policy (PSI). This policy formalizes the guidelines concerning the security of corporate information resources, so that asset vulnerabilities are not exploited by threats and do not bring negative consequences to the business. For the PSI to be effective, however, users must be willing to accept and follow its procedures and security standards. In this context, the present study investigates which extrinsic and intrinsic motivators affect the willingness of users to comply with the organization's security policies. The theoretical framework addresses issues related to IT Governance, Information Security, Deterrence Theory, Motivation, and Prosocial Behavior. A theoretical model was created based on the studies of Herath and Rao (2009) and D'Arcy, Hovav and Galletta (2009), which are grounded in General Deterrence Theory and propose the following factors influencing compliance with the policy: Severity of Punishment, Certainty of Detection, Peer Behaviour, Normative Beliefs, Perceived Effectiveness and Moral Commitment. The research used a quantitative, descriptive approach. Data were collected through a questionnaire with 18 variables on a five-point Likert scale representing the influencing factors proposed by the theory. The sample was composed of 391 incoming students of the courses of the Center for Applied Social Sciences of the Universidade Federal do Rio Grande do Norte. For data analysis, Exploratory Factor Analysis, hierarchical and non-hierarchical Cluster Analysis, Logistic Regression and Multiple Linear Regression were employed. As a main result, the factor Severity of Punishment contributes the most to the theoretical model and also drives the division of the sample into users who are more and less predisposed to comply. As a practical implication, the applied research model allows organizations to identify the less predisposed users and, with them, carry out targeted awareness and training actions and write more effective Security Policies.