19 results for "Teoria da informação em finanças" (information theory in finance)

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

Currently, one of the biggest challenges in data mining is performing cluster analysis on complex data. Several techniques have been proposed, but in general they achieve good results only within specific domains, and there is no consensus on the best way to group this kind of data. These techniques usually fail because of unrealistic assumptions about the true probability distribution of the data. Motivated by this, this thesis proposes a new measure based on the Cross Information Potential, which uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach retains the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From it, two cost functions and three clustering algorithms were derived. Because Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach outperformed the main algorithms in the literature, both on synthetic data designed to test the algorithms in specific situations and on real data drawn from problems in different fields.
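As a rough illustration of the kind of descriptor involved, a Cross Information Potential between two groups can be estimated with a Parzen window over all cross pairs; the kernel bandwidth, the data, and the function name below are illustrative assumptions, not the thesis's actual algorithm:

```python
import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """Parzen estimate of the Cross Information Potential between two
    sets of points: the mean Gaussian kernel over all cross pairs."""
    # pairwise squared distances between every x_i and y_j
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel with bandwidth sigma, averaged over all pairs
    return np.exp(-d2 / (2.0 * sigma ** 2)).mean()

rng = np.random.default_rng(0)
a = rng.normal(0.0, 0.5, size=(50, 2))   # cluster around the origin
b = rng.normal(5.0, 0.5, size=(50, 2))   # well-separated cluster
# interaction between distant clusters is much weaker than within one
assert cross_information_potential(a, b) < cross_information_potential(a, a)
```

In ITL-style clustering, quantities like this serve as the interaction term in a cost function, so that merging or splitting decisions depend on the estimated densities rather than on a parametric model.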

Relevance:

100.00%

Publisher:

Abstract:

Coding is a fundamental aspect of cerebral functioning. The transformation of sensory stimuli into neurophysiological responses has been a research theme across several areas of Neuroscience. One of the most common ways to measure the efficiency of a neural code is through Information Theory measures, such as mutual information. Using these tools, recent studies show that in the auditory cortex both local field potentials (LFPs) and action potential spiking times code information about sound stimuli. However, no studies have applied Information Theory tools to investigate the efficiency of codes based on postsynaptic potentials (PSPs), alone or in combination with LFP analysis. These signals are related, since LFPs are partly generated by the joint action of many PSPs. The present dissertation reports information measures between PSP and LFP responses obtained in the primary auditory cortex of anaesthetized rats and auditory stimuli of distinct frequencies. Our results show that PSP responses hold information about sound stimuli at levels comparable to, and even greater than, LFP responses. We also found that PSPs and LFPs code sound information independently, since the joint analysis of these signals showed neither synergy nor redundancy.
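The mutual information measure mentioned above can be sketched with a simple plug-in estimator over discretized stimulus/response pairs; the toy data and binning here are hypothetical, not the dissertation's recordings:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in mutual information (in bits) between two discrete sequences."""
    xs, ys = np.unique(x), np.unique(y)
    # joint probability table from co-occurrence frequencies
    joint = np.array([[np.mean((x == a) & (y == b)) for b in ys] for a in xs])
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

stim = np.array([0, 0, 1, 1] * 25)   # two stimulus frequencies, equiprobable
resp = stim.copy()                   # a perfectly informative (toy) response
assert abs(mutual_information(stim, resp) - 1.0) < 1e-9
```

Synergy or redundancy between two response signals is then assessed by comparing the information in the joint response against the sum of the individual informations; independence means the two are equal.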

Relevance:

80.00%

Publisher:

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help choose the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed to model the classes statistically, in order to obtain an expression for the probability that a point belongs to one of them. Experiments with several values of Na and dt are performed on test sets, and the results are analyzed to study the robustness of the method and to derive heuristics for choosing the correct threshold. Throughout the work, aspects of information theory applied to the computation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. An appendix presents real applications of the proposed method.
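The linkage step described above can be sketched with a union-find over prototype distances: auxiliary clusters closer than dt end up in the same class. The prototypes and the value of dt below are toy assumptions, not the thesis's vector-quantization output:

```python
import numpy as np

def link_clusters(prototypes, dt):
    """Merge auxiliary clusters (given by their prototypes) whose pairwise
    distance is below the threshold dt, via union-find; returns class labels."""
    n = len(prototypes)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(prototypes[i] - prototypes[j]) < dt:
                parent[find(i)] = find(j)   # link the two auxiliary clusters

    roots = [find(i) for i in range(n)]
    ids = {r: k for k, r in enumerate(dict.fromkeys(roots))}
    return [ids[r] for r in roots]          # consecutive class ids

protos = np.array([[0.0, 0.0], [0.4, 0.0], [5.0, 5.0], [5.3, 5.0]])
labels = link_clusters(protos, dt=1.0)
assert labels[0] == labels[1] and labels[2] == labels[3]
assert labels[0] != labels[2]               # two classes found automatically
```

The dissimilarity used in the comparison need not be Euclidean; the same skeleton works with any divergence between the auxiliary clusters, which is where the negentropy-based metric would plug in.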

Relevance:

80.00%

Publisher:

Abstract:

Following the interdisciplinary tendency of modern science, a new field called neuroengineering has emerged in recent decades; since 2000, scientific journals and conferences around the world have been created on this theme. The present work comprises three subareas related to neuroengineering and electrical engineering: neural stimulation, theoretical and computational neuroscience, and neuronal signal processing, as well as biomedical engineering. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can take part in the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method for the automatic classification of heart beats was developed, which does not rely on a training database and is not specialized in specific pathologies. The method is based on wavelet decomposition and normality measures of random variables. Together, the results presented in these three fields of knowledge represent qualification in neural and biomedical engineering.
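Part (ii)'s use of the Marcenko-Pastur law can be sketched as follows: eigenvalues of the neurons' correlation matrix that exceed the pure-noise upper bound signal a putative assembly. The simulated spike counts and all parameters here are illustrative assumptions, not the work's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_bins = 20, 2000
# independent background spiking (z-scored binned counts)
Z = rng.normal(size=(n_neurons, n_bins))
# embed one assembly: neurons 0-4 share a common activation signal
common = rng.normal(size=n_bins)
Z[:5] += 1.5 * common
Z = (Z - Z.mean(axis=1, keepdims=True)) / Z.std(axis=1, keepdims=True)

corr = Z @ Z.T / n_bins                 # neuron-by-neuron correlation matrix
eigvals = np.linalg.eigvalsh(corr)
# Marcenko-Pastur upper bound for a pure-noise correlation matrix
q = n_neurons / n_bins
lambda_max = (1 + np.sqrt(q)) ** 2
n_assemblies = int((eigvals > lambda_max).sum())
assert n_assemblies == 1                # one eigenvalue escapes the noise bulk
```

Projecting the data onto the significant eigenvectors then gives the assembly activity over time, which is what allows the high temporal resolution mentioned in the abstract.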

Relevance:

80.00%

Publisher:

Abstract:

Conventional methods for nonlinear blind source separation generally impose a series of restrictions to obtain a solution, often leading to imperfect separation of the original sources and a high computational cost. In this work, we propose an alternative independence measure based on information theory and use artificial intelligence tools to solve blind source separation problems, first linear and then nonlinear. In the linear model, genetic algorithms and the Rényi negentropy are applied as the independence measure to find a separation matrix from linear mixtures of waveform, audio, and image signals; the results are compared with two Independent Component Analysis algorithms widespread in the literature. Subsequently, the same independence measure is used as the cost function of a genetic algorithm to recover source signals mixed by nonlinear functions, through a radial basis function neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
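The Rényi negentropy contrast behind such a fitness function can be sketched via the Parzen information potential: among equal-variance signals the Gaussian has the largest quadratic Rényi entropy, so a super-Gaussian (e.g. Laplacian) source scores lower. The bandwidth and data below are illustrative, not the thesis's setup:

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=0.5):
    """Parzen estimate of Renyi's quadratic entropy H2 = -log(information
    potential), the kind of quantity optimized in ITL-based separation."""
    d2 = (x[:, None] - x[None, :]) ** 2
    ip = np.exp(-d2 / (4.0 * sigma ** 2)).mean()   # information potential
    return -np.log(ip)

rng = np.random.default_rng(2)
gauss = rng.normal(size=2000)
lapl = rng.laplace(scale=1 / np.sqrt(2), size=2000)  # same variance, super-Gaussian
# a negentropy-style contrast (H2 of a Gaussian minus H2 of the signal)
# is positive for the non-Gaussian source
assert renyi_quadratic_entropy(gauss) > renyi_quadratic_entropy(lapl)
```

A genetic algorithm would evaluate this contrast on the outputs of each candidate separation matrix and evolve toward the most non-Gaussian (hence most independent) components.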

Relevance:

80.00%

Publisher:

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while observing the desired coverage, Quality of Service (QoS), and capacity. An alternative for further enhancing data rates is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. AMC implementations usually rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g. Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, e.g., the correlation coefficient. Experiments carried out through computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
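A minimal sketch of a correntropy coefficient, here the centered version normalized to [-1, 1]; the template signal, noise level, and kernel width are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy coefficient: a kernel similarity between two
    signals that captures higher-order statistics, normalized to [-1, 1]."""
    k = lambda d: np.exp(-d ** 2 / (2 * sigma ** 2))

    def centered(u, v):
        # correntropy minus its value under independence (all cross pairs)
        return k(u - v).mean() - k(u[:, None] - v[None, :]).mean()

    return centered(x, y) / np.sqrt(centered(x, x) * centered(y, y))

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 400)
bpsk = np.sign(np.sin(2 * np.pi * 20 * t))       # a crude template waveform
noisy = bpsk + 0.3 * rng.normal(size=t.size)     # received copy in AWGN
other = rng.normal(size=t.size)                  # unrelated signal
assert correntropy_coefficient(bpsk, noisy) > 0.5
assert abs(correntropy_coefficient(bpsk, other)) < 0.2
```

In an AMC setting, the received signal would be compared against a bank of modulation templates and assigned to the one with the highest coefficient.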

Relevance:

80.00%

Publisher:

Abstract:

The position that the renowned Boltzmann-Gibbs (BG) statistics occupies in the scientific scenario is uncontested, with a very broad scope of applicability. However, many physical phenomena cannot be described by this formalism. This is due, in part, to the fact that BG statistics deals with phenomena in thermodynamic equilibrium. In regions where thermal equilibrium does not prevail, other statistical formalisms must be used. Two such formalisms emerged in the last two decades and are commonly called q-statistics and κ-statistics; the first was conceived by Constantino Tsallis at the end of the 1980s and the latter by Giorgio Kaniadakis in 2001. These formalisms are generalizations and therefore contain BG statistics as a particular case for a suitable choice of certain parameters. They also lead us, particularly Tsallis's, to reflect critically on concepts deeply rooted in BG statistics, such as the additivity and extensivity of certain physical quantities. The scope of this work is centered on the second of these formalisms. The κ-statistics constitutes not only a generalization of BG statistics but, through the foundation of the Kinetic Interaction Principle (KIP), encompasses at its core the celebrated Fermi-Dirac and Bose-Einstein quantum statistics, in addition to q-statistics itself. In this work, we present some conceptual aspects of q-statistics and, mainly, of κ-statistics. We use these concepts, together with the concept of block information, to introduce an entropic functional mirrored on the Kaniadakis formalism, which is later used to describe informational aspects of Cantor-like fractals. In particular, we are interested in the relations between fractal parameters, such as the fractal dimension, and the deformation parameter.
Despite its simplicity, this will allow us, in future work, to describe statistically more complex structures such as DNA, superlattices, and complex systems.
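For reference, the standard κ-deformed functions underlying the Kaniadakis formalism, which recover the usual Boltzmann-Gibbs expressions in the κ → 0 limit:

```latex
\exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa},
\qquad
\ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},

S_\kappa = -\sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa}
\;\xrightarrow[\kappa \to 0]{}\; -\sum_i p_i \ln p_i .
```

The deformation parameter κ controls the power-law tails of exp_κ, which is what makes the formalism suitable for systems far from equilibrium.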

Relevance:

80.00%

Publisher:

Abstract:

The great interest in nonlinear system identification is mainly due to the fact that many real systems are complex and need their nonlinearities taken into account so that their models can be successfully used in applications of control, prediction, inference, and others. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to the identification of nonlinear dynamical systems subject to noise and outliers. These elements generally have negative effects on the identification procedure, leading to erroneous interpretations of the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainty, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. The learning procedure of these networks is usually a gradient-based method with the mean squared error as its cost function. This work proposes replacing this traditional function with an Information Theoretic Learning similarity measure called correntropy. With this similarity measure, higher-order statistics can be taken into account during FWNN training. For this reason, the measure is better suited to non-Gaussian error distributions and makes the training less sensitive to outliers. To evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that using correntropy as the cost function of the error backpropagation algorithm makes identification with FWNN models more robust to outliers. However, this is achieved only if the Gaussian kernel width of the correntropy is properly adjusted.
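The robustness argument can be illustrated directly: the mean squared error grows without bound under a single gross outlier, while a correntropy-based loss saturates. The residuals and kernel width below are illustrative, not the FWNN case studies:

```python
import numpy as np

def mse_loss(err):
    return np.mean(err ** 2)

def correntropy_loss(err, sigma=1.0):
    """Maximum-correntropy criterion written as a loss: 1 minus the mean
    Gaussian kernel of the errors. Large errors saturate instead of dominating."""
    return 1.0 - np.mean(np.exp(-err ** 2 / (2 * sigma ** 2)))

clean = np.full(100, 0.1)                 # small residuals everywhere
spiked = clean.copy()
spiked[0] = 50.0                          # a single gross outlier
# the outlier inflates MSE by orders of magnitude...
assert mse_loss(spiked) / mse_loss(clean) > 1000
# ...but barely moves the correntropy loss (the kernel saturates)
assert correntropy_loss(spiked) - correntropy_loss(clean) < 0.02
```

This also shows why the kernel width matters: with sigma much larger than the outlier magnitude the kernel stops saturating and the loss degenerates toward scaled MSE, losing the robustness.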

Relevance:

80.00%

Publisher:

Abstract:

Information extraction is a frequent and relevant problem in digital signal processing. In the past few years, different methods have been used to parameterize signals and obtain efficient descriptors. When the signals possess cyclostationary statistical properties, the Cyclic Autocorrelation Function (CAF) and the Spectral Cyclic Density (SCD) can be used to extract second-order cyclostationary information. However, second-order cyclostationary information is poor for non-Gaussian signals, since in that case the cyclostationary analysis should comprise higher-order statistical information. This work proposes a new mathematical tool for higher-order cyclostationary analysis based on the correntropy function. Specifically, the cyclostationary analysis is revisited from an information-theoretic perspective, and the Cyclic Correntropy Function (CCF) and the Cyclic Correntropy Spectral Density (CCSD) are defined. It is also proven analytically that the CCF contains information about second- and higher-order cyclostationary moments, being a generalization of the CAF. The performance of these new functions in the extraction of higher-order cyclostationary features is analyzed in a wireless communication system subject to non-Gaussian noise.
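A minimal sketch of the Cyclic Correntropy Function idea: the lag-domain correntropy of a cyclostationary signal exhibits spectral lines at the cycle frequencies, even under non-Gaussian noise. The signal, lag, and kernel width below are illustrative assumptions, not the paper's full definitions:

```python
import numpy as np

def cyclic_correntropy(x, tau, alphas, sigma=1.0):
    """CCF sketch: Fourier coefficients, over candidate cycle frequencies
    `alphas`, of the instantaneous correntropy at lag `tau`."""
    n = np.arange(len(x) - tau)
    # Gaussian kernel of the lag-tau increments (the correntropy integrand)
    v = np.exp(-(x[n + tau] - x[n]) ** 2 / (2 * sigma ** 2))
    return np.array([abs(np.mean(v * np.exp(-2j * np.pi * a * n)))
                     for a in alphas])

rng = np.random.default_rng(4)
n = np.arange(4000)
f0 = 0.05                                      # carrier frequency, cycles/sample
# carrier buried in heavy-tailed (non-Gaussian) noise
x = np.cos(2 * np.pi * f0 * n) + 0.5 * rng.standard_t(df=3, size=n.size)
ccf = cyclic_correntropy(x, tau=5, alphas=[2 * f0, 2 * f0 + 0.013])
assert ccf[0] > 3 * ccf[1]    # spectral line only at the true cycle frequency
```

Scanning alphas over a grid yields a cyclic-frequency profile whose peaks identify the signal's periodic structure; this is the feature a classifier or detector would operate on.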

Relevance:

30.00%

Publisher:

Abstract:

At present, public organizations increasingly employ information technology solutions in order to offer more transparency and better services to all citizens. Integrated systems are IT solutions whose kernel features integration and the use of a single database. These systems bring several benefits, but also face obstacles that make their adoption difficult. The conversion to an integrated system may take years, which makes the study of the adoption of this technology in public sector organizations quite compelling, given the peculiarities of this sector and the features of the technology. First, information is provided about the particular integrated system under study and about its conversion process. The researcher then outlines the configuration of the conversion process that is the aim of this study (the agents involved and the moments and tools used to support the process) in order to elaborate the methodology of the conversion process, understood as the set of procedures and tools used throughout the conversion. After this, the researcher identifies, together with all the members of the conversion team, the negative and positive factors observed during the project. Finally, these factors are analyzed through the lens of Hospitality Theory, which, in the researcher's opinion, was very useful for understanding the elements, events, and moments that interfered with the project. The results empirically corroborate the presumptions of Hospitality Theory, while also showing a limitation of the theory in the case under study.

Relevance:

30.00%

Publisher:

Abstract:

Brazilian law is going through a crisis of effectiveness, commonly attributed to the extravagance of fundamental rights and to public scarcity. However, public finances are not dogmatically structured to solve the conflicts surrounding the limits of public spending. There are ethical conditioning factors, such as morality, proportionality, and impartiality, but these principles act separately, while the problem of public scarcity is holistic. Moreover, the subjectivity of political discretion in defining public spending, supported by an indeterminate concept of public interest, needs material guidance regarding the destination of public funds; without it, spending becomes vulnerable to ideological manipulation, resulting in a real process of capture of rights. Not even judicial activism (as an influx of constitutionalism) proves legally adequate, and the reserve of the possible also suffers from a basic ethical failure. Understanding the formation of public scarcity is therefore essential to understanding the crisis of effectiveness of state responsibilities, given the significant expansion of the state's duty of protection, which finds no legal technique for the defense of the established interests. The argument thus starts from the possibility of deducing a minimal ethical model of the will to spend (the public interest) according to objective parameters of the normative system. Public spending has always been treated disdainfully by Brazilian doctrine, given the accessory legal character assigned to the monetary cost. Nonetheless, it is the meeting point between economics and law, at the very marrow of the problem of public scarcity. Subjects dear to modernity, such as the effectiveness of fundamental rights, necessarily pass through an ethical legal system of public spending. Among the ethical principles deduced from planning, only the democratic principle guides public spending, through the approval of expenditures in the complex budget process.
In other words, there is an ethical distancing of economic reality from state responsibilities. Starting from the dogmatic belief of insufficiency, public spending is evaluated ethically, according to the foundations of modern constitutionalism, in search of the possibilities of the financial reserve, in the certainty that the ethics of the public economy is a sine qua non condition for legal ethics.

Relevance:

30.00%

Publisher:

Abstract:

In the Brazilian legal scenario, the study of taxation has traditionally been restricted to positivist analysis, concerned with investigating the formal aspects of the tax legal rule. Despite its relevance to the formation of the national tax doctrine, this formalist tradition limits the discipline, separating it from reality and from the socioeconomic context in which Tax Law is inserted. The proposal of this dissertation is therefore to examine the fundamentals and nature of taxation and of tax legal rules from the perspective of Law and Economics (Economic Analysis of Law). For this purpose, the work first reconnects Tax Law with the Science of Finance (or Public Finance) and Fiscal Policy, undertaking not only a legal but also an economic and financial analysis of the theme. The Economics of the Public Sector (or Modern Public Finance) contributes to the research through topics such as market failures and the economic theory of taxation, which are essential to an economic approach to Tax Law. The core of the work lies in applying Law and Economics instruments to the study of taxation, analyzing the effects of tax rules on the economic system. Accordingly, the dissertation examines the fundamental assumptions that make up the Economic Analysis of Law (such as the concept of economic efficiency and its relation to equity), relating them to the tax phenomenon. Given the nature of the Brazilian legal system, no worthwhile investigation or approach, including Law and Economics, could bypass the Constitution. Thus, constitutional rules serve as both a limit and a prerequisite for applying Law and Economics to taxation, particularly the rules related to property rights, freedom, equality, and legal certainty. The relationship between taxation and market failures receives a prominent role, particularly because of its importance to Law and Economics and the role that taxation plays in correcting these failures.
Besides reviewing taxation under the approach of the Economic Analysis of Law, the research also investigates the reality of the Brazilian tax system, applying the concepts developed to cases and issues relevant to the national scene, such as the relationship between taxation and development, the compliance costs of taxation, tax evasion, and the tax enforcement procedure. With this, it intends to lay the groundwork for a general theory of the Economic Analysis of Tax Law, contextualized within the Brazilian tax system.

Relevance:

30.00%

Publisher:

Abstract:

The right against self-incrimination is a fundamental right that operates within criminal prosecution and therefore deserves a study supported by the general theory of criminal procedure. The right has a vague origin and, despite the various historical accounts, only arises when there is a structured criminal procedure that aims to limit the State's duty-power to punish. The only system of criminal procedure compatible with the prohibition of self-incrimination is the accusatory model. The inquisitorial model is based on the construction of a truth and on obtaining a confession at any cost, and is therefore incompatible with the right under study. The consecration of the right arises with the importance that fundamental rights have come to occupy in democratic constitutional states. In the Brazilian experience before 1988, self-incrimination could only be recognized as a procedural burden on accused persons. Despite thorough debate in the Constituent Assembly, the right was consecrated in a textual formula closer to the implementation made by the Supreme Court of the United States, known as the "Miranda warnings", than to the text of the Fifth Amendment to the U.S. Constitution, which originally established the right against self-incrimination with constitutional status. The imprecise text, however, does not prevent the consecration of the principle as a fundamental right in Brazilian law. The right against self-incrimination must be observed in criminal procedure and relates to several of its canons, such as the presumption of innocence, the accusatory model, the distribution of the burden of proof and, especially, the right of defense. Because it is a fundamental right, the prohibition of self-incrimination deserves a study proper to its constitutional nature.
For the definition of the protected persons, it is important to build a material concept of the accused, distinct from the formal concept of whoever is named in the indictment. In the objective area of protection, the norm protects two objects: the subject's instinct of self-preservation and the capacity for self-determination. Since it essentially configures an evidentiary rule in criminal procedure, the analysis of each case should be based on standards set in advance to indicate respect for the right. These standards include the accused's right to information, the right to counsel, and respect for voluntary participation. The study of violation cases concentrates on the element of voluntariness, starting from the definition of what is, or is not, coercion violating self-determination. The right faces new challenges that deserve attention, especially the fight against terrorism and organized crime, which forces the development of increasingly invasive and hidden evidentiary tools, resources, and technologies, and allows the use of information not only for criminal prosecution but also for the establishment of an intelligence strategy in the development of national and public security.

Relevance:

30.00%

Publisher:

Abstract:

This work analyzes the magnitude, nature, and direction of public revenues and expenses in the oil and natural gas producing municipalities of the state of Rio Grande do Norte in the post-constituent period and, more precisely, since the approval of Law 9.478/97, known as the Oil Law. It discusses the normative theory of fiscal federalism and the typology and role of intergovernmental transfers in the performance of the public finances of local governments. It shows that the economy of Rio Grande do Norte went through deep socioeconomic changes in the last few decades, among them the discovery of oil and natural gas and its importance for the growth of the industrial and service sectors. It points out that the increase in production and in the international price of oil contributed to the growth of revenues from royalties and special participations in the beneficiary municipalities, which did not mean an automatic increase in the resources destined to investment or in the quality of the provision of goods and services geared toward local development. On the contrary, the main conclusion of the work is that the trajectory of the oil-producing municipalities is marked by paths and detours in the performance of public finances and in the provision of public goods and services: paths that lead to improvements in the performance of public finances and in the quality of public goods and services, and detours that lead to inefficiency in the provision of goods and services and to the capture of public resources. That is, fiscal decentralization is a necessary, but not sufficient, condition for improving the quantity and quality of the public goods and services provided by these municipalities.
To that end, it is necessary to advance the normative theories of fiscal federalism, in search of the best model of federalism for local governments where patrimonialism, clientelism, fiscal illusion, and the capture of public resources for private interests still predominate.

Relevance:

30.00%

Publisher:

Abstract:

The Brazilian tax structure has characteristics specific to each level of government. The autonomy granted to municipalities to manage their activities after the 1988 Constitution made them highly dependent on intergovernmental transfers of resources, revealing the fragility of the administrative capacity of these entities. The vertical gap revealed by the constitutional structure of the Brazilian fiscal federalism model contributes to this specific feature, which ends up eroding the tax base and the capacity of municipalities to raise their own revenues. Although these transfers became better regulated after the enactment of the Fiscal Responsibility Law, the amount of resources transferred to the municipalities of Rio Grande do Norte is very high and constitutes their main source of revenue. In light of the theory of federalism and fiscal decentralization, and in particular the theories related to intergovernmental transfers, we seek to diagnose these transfers by systematizing information on their origin, destination, and value. A dynamic panel GMM econometric model is used to make the diagnosis and verify the impact of transfers on the public finances of the municipalities of Rio Grande do Norte, combined with a review in light of the theory of fiscal federalism and intergovernmental transfers. The work presents some proposals for the transfer system and for the composition of spending, in order to contribute to greater tax efficiency.