951 results for One-shot information theory
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification, and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
This paper proposes a convenient signaling scheme, orthogonal on-off BPSK (O3BPSK), for near-far (NF) resistant detection in asynchronous direct-sequence code-division multiple-access (DS/CDMA) systems (uplink). The temporally adjacent bits from different users in the received signals are decoupled by the on-off signaling, and the original data rate is maintained with no increase in transmission rate by adopting an orthogonal structure. The detector at the receiver is a one-shot linear decorrelating detector, which depends upon neither hard decisions nor specific channel coding. The application of the O3 strategy to differentially encoded BPSK (D-BPSK) sequences is also presented. Finally, computer simulations are presented that confirm the theoretical analysis.
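As background for the detector class used here, the sketch below implements a generic one-shot linear decorrelating detector in Python (signatures, amplitudes and noise level are illustrative; the O3BPSK signal structure itself is not reproduced). The decision sign(R⁻¹y) removes multiple-access interference regardless of the received amplitudes, which is the source of its near-far resistance.

```python
import numpy as np

rng = np.random.default_rng(0)

K, N = 3, 31                                   # users, chips per bit
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)  # unit-energy signatures
b = rng.choice([-1.0, 1.0], size=K)                    # transmitted bits
A = np.diag([1.0, 0.5, 2.0])                   # unequal received amplitudes

r = S @ (A @ b) + 0.1 * rng.standard_normal(N)  # received chip vector
y = S.T @ r                                     # matched-filter bank outputs
R = S.T @ S                                     # signature cross-correlations

b_hat = np.sign(np.linalg.solve(R, y))         # decorrelate, then hard decision
print("detected:", b_hat, " sent:", b)         # agrees despite the 12 dB spread
```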
Abstract:
This paper addresses the effects of synchronisation errors (time delay, carrier phase, and carrier frequency) on the performance of linear decorrelating detectors (LDDs). A major effect is that all LDDs require a certain degree of power control in the presence of synchronisation errors. The multi-shot sliding window algorithm (SLWA) and the hard decision method (HDM) are analysed and their power control requirements are examined. Also, a more efficient one-shot detection scheme, called “hard-decision based coupling cancellation”, is proposed and analysed. These schemes are then compared with the isolation bit insertion (IBI) approach in terms of power control requirements.
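The abstract does not detail the proposed scheme, so the fragment below is only a guess at its general shape (the names R_center, R_edge and b_edge_hard, and the split into a centre bit and edge bits, are all hypothetical): hard decisions on the temporally adjacent bits are used to cancel their coupling before a one-shot decorrelating decision on the bit of interest.

```python
import numpy as np

# Illustrative sketch, not the paper's exact algorithm.
def hd_coupling_cancellation(y_center, R_center, R_edge, b_edge_hard):
    """y_center:    matched-filter outputs for the bit interval of interest.
    R_center:    K x K correlation matrix of the in-interval signatures.
    R_edge:      K x K cross-correlations with temporally adjacent bits.
    b_edge_hard: hard (+/-1) decisions already made on those adjacent bits."""
    y_clean = y_center - R_edge @ b_edge_hard        # cancel estimated coupling
    return np.sign(np.linalg.solve(R_center, y_clean))  # one-shot decorrelation
```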
Abstract:
It has been years since the introduction of the Dynamic Network Optimization (DNO) concept, yet DNO development is still in its infancy, largely due to a lack of breakthroughs in reducing the lengthy optimization runtime. Our previous work, a distributed parallel solution, achieved a significant speed gain. To cater for the increased optimization complexity brought by the uptake of smartphones and tablets, however, this paper examines the potential areas for further improvement and presents a novel asynchronous distributed parallel design that minimizes inter-process communication. The new approach is implemented and applied to real-life projects, whose results demonstrate a speedup of 7.5 times on a 16-core distributed system, compared with 6.1 times for our previous solution. Moreover, there is no degradation in the optimization outcome. This is a solid step towards the realization of DNO.
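Read as parallel efficiency, the two reported speedups can be compared directly; the snippet below simply restates the paper's figures (illustrative only):

```python
# Parallel efficiency implied by the reported speedups on 16 cores.
cores = 16
for name, speedup in [("previous design", 6.1),
                      ("asynchronous design", 7.5)]:
    print(f"{name}: {speedup}x speedup, {speedup / cores:.0%} efficiency")
# previous design: 6.1x speedup, 38% efficiency
# asynchronous design: 7.5x speedup, 47% efficiency
```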
Abstract:
An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way to automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate change and general surveillance. Regions representing aquatic, rural and urban areas are identified and the accuracy of the proposed segmentation methodology is evaluated. The comparison with gray level images revealed that the color information is fundamental for obtaining an accurate segmentation.
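The abstract does not spell out the algorithm; the following is a minimal sketch of one common entropy-based variant (per-block Shannon entropy of a colour histogram, with block size, bin count and the thresholding step all assumptions):

```python
import numpy as np

def block_entropy_map(img, block=16, bins=8):
    """img: H x W x 3 uint8 array; returns per-block Shannon entropy in bits."""
    rows, cols = img.shape[0] // block, img.shape[1] // block
    out = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = img[i*block:(i+1)*block, j*block:(j+1)*block]
            hist, _ = np.histogramdd(patch.reshape(-1, 3).astype(float),
                                     bins=(bins,) * 3, range=((0, 256),) * 3)
            p = hist[hist > 0] / hist.sum()
            out[i, j] = -(p * np.log2(p)).sum()
    return out

# Visually uniform blocks (e.g. open water) score low entropy, while textured
# blocks (e.g. urban areas) score high; thresholding the map then yields a
# coarse aquatic / rural / urban partition.
```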
Abstract:
In a natural experiment, this paper studies the impact of an informal sanctioning mechanism on individuals’ voluntary contribution to a public good. Cross-country skiers’ actual cash contributions in two ski resorts, one with and one without an informal sanctioning system, are used. I find the contributing share to be higher under the informal sanctioning system (79 percent) than under the non-sanctioning system (36 percent). Previous studies in one-shot public good situations have found an increasing conditional contribution (CC) function, i.e. the relationship between the expected average contribution of other group members and the individual’s own contribution. In contrast, the present results suggest that the CC-function in the non-sanctioning system is non-increasing at high perceived levels of others’ contribution. This relationship deserves further testing in the lab.
Abstract:
Basic information theory is used to analyse the amount of confidential information which may be leaked by programs written in a very simple imperative language. In particular, a detailed analysis is given of the possible leakage due to equality tests and if statements. The analysis is presented as a set of syntax-directed inference rules and can readily be automated.
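As a concrete instance of the kind of quantity such an analysis produces, consider a single equality test `if h == x` on a uniformly distributed n-bit secret h, where the attacker observes only which branch is taken. The calculation below is illustrative and is not taken from the paper's inference rules:

```python
from math import log2

def equality_test_leakage(n_bits):
    """Shannon leakage (bits) of `if h == x` on a uniform n-bit secret h,
    when only the branch outcome is observed."""
    p = 2.0 ** -n_bits                  # probability the test succeeds
    # Mutual information between the secret and the observed branch outcome
    # equals the entropy of the outcome, since the outcome is a function of h.
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(f"{equality_test_leakage(8):.4f}")   # ~0.0369 bits per test
```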
Abstract:
This study investigates whether the adoption of electronic documents, an increasingly mandatory reality in Brazil, is accompanied by a reduction in companies’ compliance costs. The author drew on theoretical references from several fields: tax law, civil law, applied mathematics, information technology and, finally, accounting. From civil law came the concept of a document, which, together with concepts from mathematics and information theory, allows the notion of an electronic document to be constructed. From tax law came the notions concerning taxes in the Brazilian legal system and their associated obligations (principal and accessory). From accounting came the definitions of compliance costs and transaction costs, so that it could be assessed how much it costs a company to comply with Brazilian accessory tax obligations, especially regarding the use of electronic fiscal documents. The study was restricted to the Nota Fiscal Eletrônica, which in Brazil must be used in transactions involving the circulation of goods in place of the Nota Fiscal Modelo 1 or 1-A, a traditional document that has existed for decades in Brazil. Quantitative information was gathered from Brazilian companies, and the final conclusion is that there is evidence supporting the claim that using electronic documents is cheaper than using paper documents, based on a comparison of the transaction costs associated with the Nota Fiscal Modelo 1 or 1-A and with the Nota Fiscal Eletrônica.
Abstract:
The topic of this dissertation is a comparison between the subject of psychoanalytic theory and the subject as conceived by school psychology. It begins by describing the subject of psychoanalysis through a reading of selected Freudian texts, which show that, psychoanalytically speaking, the subject is the unconscious. It is argued that, from the standpoint of psychoanalytic theory, since the subject is the unconscious, it cannot be reached through logic or common sense. The next stage of the work, based on field research and a bibliographic survey, was the attempt to discover who the subject is for school psychology. What became clear from this research is that, with very few exceptions, this subject bears no resemblance to the subject of psychoanalytic theory: school psychology addresses itself essentially to the individual as a conscious subject. Finally, some remarks are offered on the confrontation between the psychoanalytic view of the subject and that of school psychology. According to some authors, the role of the school psychologist would be to help the individuals with whom they work become more critical of the social and political system in which they are inserted, whereas the field research and other authors portray school psychology as a vehicle for better adapting the individual to what is considered ideal.
Abstract:
The Cue Utilization Theory establishes that all products are made up of multiple cues that may be seen as surrogates for the intangible attributes that make up any given product. However, many years of research have so far yielded little consensus as to the impact generated by the use of such cues. This research aims to contribute to the discussion about the importance of intrinsic cues by investigating the effects that the use of product cues that confirm the product claim may have on Claim Credibility (measured through Ad Credibility), as well as on consumers’ Purchase Intention and Perceived Risk toward the product. An experiment was designed to test such effects, and the results suggest that the effects of Claim Confirming Product Cues depend on the consumer’s level of awareness of such cues: when consumers are aware of them, Ad Credibility and Purchase Intention increase, while Perceived Risk decreases. These results may have implications for academics and practitioners, and may provide insights for future research.
Abstract:
This paper studies how constraints on the timing of actions affect equilibrium in intertemporal coordination problems. The model exhibits a unique symmetric equilibrium in cut-off strategies. The risk-dominant action of the underlying one-shot game is selected when the option to delay effort is commensurate with the option to wait longer for others' actions. The possibility of waiting longer for the actions of others enhances coordination, but the option of delaying one's actions can induce severe coordination failures: if agents are very patient, they might get arbitrarily low expected payoffs even in cases where coordination would yield arbitrarily large returns.
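For readers who want the selection criterion made concrete: in a 2x2 symmetric game, the risk-dominant action is the best response to a uniform (50/50) belief about the opponent's action (Harsanyi and Selten). The payoff numbers below are purely illustrative:

```python
import numpy as np

# Own payoffs in a symmetric 2x2 stag-hunt coordination game (illustrative):
#                 opponent: Invest   Wait
U = np.array([[9.0, 0.0],    # own action: Invest (pays off only jointly)
              [7.0, 7.0]])   # own action: Wait   (safe payoff)

expected = U @ np.array([0.5, 0.5])      # payoffs under the uniform belief
actions = ["Invest", "Wait"]
print("risk-dominant action:", actions[int(np.argmax(expected))])
# -> Wait, even though (Invest, Invest) is the payoff-dominant equilibrium.
```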
Abstract:
Conventional methods for solving the nonlinear blind source separation problem generally rely on a series of restrictions to obtain a solution, often leading to imperfect separation of the original sources and high computational cost. In this work, we propose an alternative measure of independence based on information theory and use artificial intelligence tools to solve first linear and then nonlinear blind source separation problems. In the linear model, genetic algorithms are applied with Rényi negentropy as the independence measure to find a separating matrix from linear mixtures of waveform, audio and image signals, and the results are compared with two Independent Component Analysis algorithms that are widespread in the literature. Subsequently, the same independence measure is used as the cost function in the genetic algorithm to recover source signals mixed by nonlinear functions modelled with a radial basis function artificial neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
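The sketch below illustrates the flavour of the approach on a linear 2-D toy problem: a small genetic algorithm searches for an unmixing rotation of whitened mixtures, scored by Parzen-window estimates of marginal Rényi quadratic entropy (population size, mutation scheme, kernel width and the mutation-only reproduction step are assumptions, not the dissertation's settings):

```python
import numpy as np

rng = np.random.default_rng(1)

def renyi_quadratic_entropy(y, sigma=0.25):
    # Parzen-window estimate of H2(y) = -log(integral of p(y)^2),
    # up to an additive constant that does not affect the search.
    d = y[:, None] - y[None, :]
    return -np.log(np.exp(-d**2 / (4 * sigma**2)).mean())

def fitness(theta, X):
    # Higher fitness = lower summed marginal entropy = more independent
    # outputs (negentropy grows as marginal entropy shrinks at fixed variance).
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ X
    return -(renyi_quadratic_entropy(Y[0]) + renyi_quadratic_entropy(Y[1]))

# Toy data: two uniform sources, a linear mixture, then whitening.
S = rng.uniform(-1, 1, size=(2, 250))
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
X -= X.mean(axis=1, keepdims=True)
E, V = np.linalg.eigh(np.cov(X))
X = np.diag(E ** -0.5) @ V.T @ X

pop = rng.uniform(0.0, np.pi / 2, size=20)          # population of angles
for _ in range(30):
    scores = np.array([fitness(t, X) for t in pop])
    parents = pop[np.argsort(scores)[-10:]]         # selection: keep best half
    children = parents + rng.normal(0.0, 0.05, 10)  # mutation-only offspring
    pop = np.concatenate([parents, children])

best = max(pop, key=lambda t: fitness(t, X))
print(f"estimated unmixing rotation: {best:.3f} rad")
```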