951 results for Information theory in finance


Relevance:

80.00%

Publisher:

Abstract:

Cue Utilization Theory establishes that every product is made up of multiple cues that serve as surrogates for the intangible attributes of the product. However, many years of research have yielded little consensus on the impact of such cues. This research contributes to the discussion about the importance of intrinsic cues by investigating the effects that product cues confirming a product claim may have on Claim Credibility (measured through Ad Credibility), as well as on consumers' Purchase Intention and Perceived Risk toward the product. An experiment was designed to test these effects, and the results suggest that the effects of Claim-Confirming Product Cues depend on consumers' level of awareness of the cue: when consumers are aware of it, Ad Credibility and Purchase Intention increase while Perceived Risk decreases. These results have implications for academics and practitioners, and may provide insights for future research.

Relevance:

80.00%

Publisher:

Abstract:

The area we focus on is data processing. What interests us in this work is control in the form of redundancy that accompanies a code. This covers ERROR-DETECTING CODES, already widespread in the form of CHECK (OR CONTROL) DIGITS for decimal numeric codes.
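
To make the idea concrete, here is a minimal sketch of one common check-digit scheme for decimal codes, the modulus-11 digit; the abstract does not commit to a particular scheme, so the weights and the convention for remainders of 10 and 11 below are illustrative assumptions.

```python
def mod11_check_digit(digits: str) -> str:
    """Compute a modulus-11 check digit for a decimal code.

    Digits are weighted 2, 3, 4, ... from right to left, and the check
    digit is chosen so the weighted sum plus the digit is divisible by 11.
    This is one common scheme, not necessarily the one the work covers.
    """
    weights = range(2, 2 + len(digits))
    total = sum(int(d) * w for d, w in zip(reversed(digits), weights))
    dv = 11 - (total % 11)
    return "0" if dv >= 10 else str(dv)  # assumed convention: 10 and 11 map to 0

# A receiver recomputes the digit and flags any mismatch as a detected error.
code = "123456"
print(code + mod11_check_digit(code))
```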

Relevance:

80.00%

Publisher:

Abstract:

After applying empirical tests to stock portfolios on Bovespa over the period from 1995 to 1998, the authors find the anomaly known as the Size Effect in the Brazilian capital market: there is evidence that the mean returns of portfolios composed of low-capitalization companies are lower than those of portfolios composed of high-capitalization companies. The authors caution that the results reflect only the period studied and should not be generalized, and that they point to the need for further research.
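
As a rough illustration of the kind of test involved, the sketch below forms size-sorted portfolios from hypothetical stock-month data and compares their mean returns; the column names, the median split, and equal weighting are assumptions, since the abstract does not describe the paper's exact portfolio construction.

```python
import pandas as pd

# Hypothetical data: one row per stock-month with market cap and return.
df = pd.DataFrame({
    "stock":  ["A", "B", "C", "D"] * 2,
    "month":  ["1995-01"] * 4 + ["1995-02"] * 4,
    "mktcap": [10, 500, 20, 800, 12, 520, 25, 790],
    "ret":    [0.02, 0.01, -0.01, 0.03, 0.00, 0.02, 0.01, 0.01],
})

# Each month, split stocks at the median market cap, then average
# returns within each size group (equal-weighted portfolios).
df["size"] = df.groupby("month")["mktcap"].transform(
    lambda x: pd.qcut(x, 2, labels=["small", "large"])
)
portfolio = df.groupby(["month", "size"], observed=True)["ret"].mean().unstack()
print(portfolio)
print("mean small:", portfolio["small"].mean(), "mean large:", portfolio["large"].mean())
```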

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates the expectations formation process of economic agents regarding the inflation rate. Using the Market Expectations System of the Central Bank of Brazil, we observe that agents do not update their forecasts every period and that even agents who do update disagree in their predictions. We then focus on the two most popular types of inattention models discussed in the recent literature: sticky-information and noisy-information models. Estimating a hybrid model, we find that, although it formally fits the Brazilian data, it does so at the cost of a much higher degree of information rigidity than is observed.
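
A minimal simulation of the sticky-information mechanism may help fix ideas: each period an agent re-optimizes its forecast with some probability and otherwise keeps a stale one, which produces both infrequent updating and disagreement across agents. The parameters and the stylized inflation path are illustrative assumptions; this is not the hybrid model the paper estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sticky-information sketch: each period an agent re-optimizes its
# inflation forecast with probability lam; otherwise it keeps a stale one.
n_agents, n_periods, lam = 500, 60, 0.3
inflation = 5 + np.cumsum(rng.normal(0, 0.2, n_periods))  # stylized path
forecasts = np.full(n_agents, inflation[0])

disagreement = []
for t in range(n_periods):
    updates = rng.random(n_agents) < lam   # which agents pay attention at t
    forecasts[updates] = inflation[t]      # updaters adopt current information
    disagreement.append(forecasts.std())   # dispersion across agents

print("mean cross-section disagreement:", np.mean(disagreement))
```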

Relevance:

80.00%

Publisher:

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed in the course of the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The method requires only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be supplied as additional information to help choose the correct threshold. Analyses are carried out and the results are compared with those of traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes, aiming to obtain an expression for the probability that a point belongs to a given class. Experiments with several values of Na and dt are run on test sets, and the results are analyzed to study the robustness of the method and to devise heuristics for choosing the correct threshold. Throughout the work, aspects of information theory applied to the computation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also includes an appendix presenting real applications of the proposed method.
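
A hedged sketch of the core idea, assuming k-means as the vector quantizer and plain Euclidean distance between centroids in place of the Rényi/negentropy divergences the work studies: auxiliary clusters whose centroids lie within dt of each other are linked, and the connected groups become the classes, so the number of classes falls out of the chosen dt.

```python
import numpy as np
from sklearn.cluster import KMeans

def link_auxiliary_clusters(X, Na=10, dt=1.0):
    # Step 1: vector quantization into Na auxiliary clusters.
    km = KMeans(n_clusters=Na, n_init=10, random_state=0).fit(X)
    centers = km.cluster_centers_

    # Step 2: union-find to link auxiliary clusters closer than dt.
    parent = list(range(Na))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(Na):
        for j in range(i + 1, Na):
            if np.linalg.norm(centers[i] - centers[j]) < dt:
                parent[find(i)] = find(j)

    # Step 3: connected groups of auxiliary clusters become the classes.
    roots = {find(i) for i in range(Na)}
    relabel = {r: k for k, r in enumerate(sorted(roots))}
    labels = np.array([relabel[find(c)] for c in km.labels_])
    return labels, len(roots)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(3, 0.3, (100, 2))])
labels, n_classes = link_auxiliary_clusters(X, Na=8, dt=1.0)
print("classes found:", n_classes)
```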

Relevance:

80.00%

Publisher:

Abstract:

Following the interdisciplinary tendency of modern science, a new field called neuroengineering has emerged in recent decades; since 2000, scientific journals and conferences around the world have been created on this theme. The present work comprises three subareas related to neuroengineering, electrical engineering, and biomedical engineering: neural stimulation, theoretical and computational neuroscience, and neuronal signal processing. The research is divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds: using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can take part in the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method for automatic classification of heartbeats was developed that does not rely on a training database and is not specialized to specific pathologies; it is based on wavelet decomposition and normality measures of random variables. Taken together, the results in these three fields represent a contribution to neural and biomedical engineering.
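
For part (ii), the sketch below illustrates how the Marcenko-Pastur law can flag assemblies on synthetic data: eigenvalues of the correlation matrix of z-scored spike counts that exceed the MP upper bound indicate correlated groups of neurons. The data and all parameters are synthetic stand-ins, not the thesis' electrophysiological recordings.

```python
import numpy as np

rng = np.random.default_rng(2)

# For an uncorrelated (n_bins x n_neurons) z-scored count matrix, the
# Marcenko-Pastur law bounds the correlation-matrix eigenvalues by
# (1 + sqrt(n_neurons / n_bins))**2; eigenvalues above it signal assemblies.
n_neurons, n_bins = 50, 5000
counts = rng.poisson(2.0, (n_bins, n_neurons)).astype(float)
assembly = rng.poisson(1.0, n_bins)        # shared drive to neurons 0-4
counts[:, :5] += assembly[:, None]

z = (counts - counts.mean(0)) / counts.std(0)
corr = (z.T @ z) / n_bins                  # correlation matrix
eigvals = np.linalg.eigvalsh(corr)

mp_upper = (1 + np.sqrt(n_neurons / n_bins)) ** 2
print("eigenvalues above MP bound:", int(np.sum(eigvals > mp_upper)))
```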

Relevance:

80.00%

Publisher:

Abstract:

Conventional methods for nonlinear blind source separation generally impose a series of restrictions to obtain a solution, often leading to imperfect separation of the original sources and high computational cost. In this work, we propose an alternative independence measure based on information theory and use artificial intelligence tools to solve first linear and then nonlinear blind source separation problems. In the linear model, we apply genetic algorithms with Rényi negentropy as the independence measure to find a separating matrix from linear mixtures of waveform, audio, and image signals, and we compare the results with two types of Independent Component Analysis algorithms widespread in the literature. Subsequently, we use the same independence measure as the cost function in the genetic algorithm to recover source signals mixed by nonlinear functions, modeled with a radial basis function artificial neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
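
A minimal sketch of the linear case under stated assumptions: after whitening, a 2x2 unmixing matrix reduces to a rotation, and a small genetic algorithm searches for the angle that maximizes the outputs' quadratic Rényi negentropy, estimated with a Parzen window. The kernel size and GA settings are illustrative choices, not the work's configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

def renyi_quadratic_entropy(y, sigma=0.5):
    # Parzen-window estimate of the quadratic Renyi entropy via the
    # information potential (constant factors dropped: they only shift
    # the entropy, which does not affect the search).
    d = y[:, None] - y[None, :]
    return -np.log(np.mean(np.exp(-d**2 / (4 * sigma**2))))

def fitness(theta, X):
    # Lower output entropies at fixed (unit) variance = higher negentropy.
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ X
    return -(renyi_quadratic_entropy(Y[0]) + renyi_quadratic_entropy(Y[1]))

# Two uniform (sub-Gaussian) sources, linearly mixed, then whitened,
# so the remaining unmixing step is a pure rotation.
S = rng.uniform(-1, 1, (2, 250))
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
X = (E / np.sqrt(d)) @ E.T @ X

# Tiny genetic algorithm over the rotation angle.
pop = rng.uniform(0, np.pi / 2, 20)
for _ in range(30):
    fit = np.array([fitness(t, X) for t in pop])
    parents = pop[np.argsort(fit)[-8:]]                           # selection
    children = rng.choice(parents, 12) + rng.normal(0, 0.05, 12)  # mutation
    pop = np.concatenate([parents, children])
best = pop[np.argmax([fitness(t, X) for t in pop])]
print("recovered rotation angle (rad):", best)
```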

Relevance:

80.00%

Publisher:

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while meeting coverage, Quality of Service (QoS), and capacity requirements. An alternative for further enhancing data rates is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. AMC implementations usually rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g. Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, e.g., the correlation coefficient. Experiments carried out by computer simulation show that the proposed technique achieves a high success rate in classifying digital modulations, even in the presence of additive white Gaussian noise (AWGN).
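
The correntropy coefficient itself is easy to state; the sketch below estimates the centered version with a Gaussian kernel on a toy noisy signal. The kernel size and the toy signal are assumptions, and the surrounding AMC pipeline (per-modulation templates and the decision rule) is not reproduced here.

```python
import numpy as np

def correntropy_coefficient(x, y, sigma=1.0):
    # Centered correntropy with a Gaussian kernel, normalized to [-1, 1]
    # like a correlation coefficient but sensitive to higher-order statistics.
    def kernel(a, b):
        return np.exp(-(a - b) ** 2 / (2 * sigma**2))
    def centered(a, b):
        return kernel(a, b).mean() - kernel(a[:, None], b[None, :]).mean()
    return centered(x, y) / np.sqrt(centered(x, x) * centered(y, y))

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 500)
template = np.cos(2 * np.pi * 10 * t + np.pi * rng.integers(0, 2, 500))  # toy signal
received = template + rng.normal(0, 0.5, 500)                            # AWGN channel
print("correntropy coefficient:", correntropy_coefficient(received, template))
```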

Relevance:

80.00%

Publisher:

Abstract:

The position that the renowned Boltzmann-Gibbs (BG) statistics occupies in the scientific landscape is indisputable, with a very broad scope of applicability. However, many physical phenomena cannot be described by this formalism. This is due, in part, to the fact that BG statistics deals with phenomena at thermodynamic equilibrium. In regimes where thermal equilibrium does not prevail, other statistical formalisms must be used. Two such formalisms emerged in the last two decades and are commonly called q-statistics and κ-statistics; the former was conceived by Constantino Tsallis at the end of the 1980s and the latter by Giorgio Kaniadakis in 2001. These formalisms are generalizations and therefore contain BG statistics as a particular case for a suitable choice of certain parameters. They also lead us, in particular that of Tsallis, to reflect critically on concepts deeply rooted in BG statistics, such as the additivity and extensivity of certain physical quantities. The scope of this work centers on the second of these formalisms. κ-statistics constitutes not only a generalization of BG statistics but, through the foundation of the Kinetic Interaction Principle (KIP), encompasses the celebrated Fermi-Dirac and Bose-Einstein quantum statistics, as well as q-statistics itself. In this work we present some conceptual aspects of q-statistics and, mainly, of κ-statistics. We use these concepts, together with the concept of block information, to present an entropic functional modeled on the Kaniadakis formalism, which is later used to describe informational aspects of Cantor-like fractals. In particular, we are interested in the relations between fractal parameters, such as the fractal dimension, and the deformation parameter. Despite its simplicity, this will allow us, in future work, to statistically describe more complex structures such as DNA, superlattices, and complex systems.
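
For reference, the two deformed entropies mentioned can be stated compactly (with k_B = 1); both recover the Boltzmann-Gibbs form S = -Σ p_i ln p_i in the appropriate limit.

```latex
% Tsallis q-entropy: reduces to Boltzmann-Gibbs as q -> 1.
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1}

% Kaniadakis kappa-entropy, written with the kappa-deformed logarithm
% \ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa};
% reduces to Boltzmann-Gibbs as kappa -> 0.
S_\kappa = -\sum_i p_i \ln_\kappa p_i
```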

Relevance:

80.00%

Publisher:

Abstract:

We propose that the evolution of scientific ideas be used as an instrument for learning specific content and, in particular, for highlighting how content connects across disciplines. As an example, we present a study of Maxwell's demon and the discussions of its exorcism, that is, a study of the nature of an intelligent being acting inside a physical system and of how such action would take place. This problem involves phenomena related to several theories (Thermodynamics, Molecular Physics, Statistical Mechanics, Information Theory) within the disciplines of Physics, Chemistry, Biology, and Computer Science. Among the various epistemological and conceptual questions involved, we emphasize the limited object of a scientific theory, that is, the restriction of its meaning to the phenomena it encompasses. The delimitation of the phenomena studied, together with the theories and techniques, characterizes the understanding that takes concrete form in laboratories. This understanding also opens the possibility of interdisciplinary work.

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
