901 results for Algorithmic information theory
Abstract:
In the context of a hostile funding environment, universities are increasingly asked to justify their output in narrowly defined economic terms, which can be difficult in Humanities or Arts faculties, where productivity is rarely reducible to a simple financial indicator. This can lead to a number of immediate consequences that I need not rehearse here, but it can also produce some interesting tensions within the academic community itself. The first is what has become known as the ‘Science Wars’: the increasingly acrimonious exchanges between scientists and scientifically minded academics, on one side, and cultural critics or theorists, on the other, about who has the right to describe the world. Much has already been said, and much remains to be said, about this issue, but it is not my intention to discuss it here. Rather, I will look at a second area of contestation: the incorporation of scientific theory into literary or cultural criticism. Much of this work comes from a genuine commitment to interdisciplinarity and an appreciation of the insights that a fresh perspective can bring to a familiar object. However, some of it can be seen as a cynical attempt to lend literary studies the sort of empirical legitimacy enjoyed by the sciences. In particular, I want to look at a number of critics who have applied information theory to the literary work. In this paper, I examine several instances of this sort of criticism and then, through an analysis of a novel by the American author Richard Powers, Three Farmers on Their Way to a Dance, show how this sort of criticism merely reduces, rather than enriches, the meaningful analysis of a complex literary text.
Abstract:
We present an information-theoretic analysis of the tradeoff between bit-error-rate improvement and data-rate loss when skewed channel coding is used to suppress pattern-dependent errors in digital communications. Without loss of generality, we apply the developed general theory to the particular example of a high-speed fiber communication system with a strong patterning effect. © 2007 IEEE.
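As a minimal illustration of the rate side of this tradeoff only (not the paper's coding scheme or channel model), the sketch below counts run-length-limited binary strings to quantify the data-rate cost of forbidding one simple class of problematic patterns, namely runs of identical bits longer than a chosen limit.

from math import log2

# Illustrative sketch only: the data-rate cost of suppressing one simple class
# of problematic bit patterns (runs longer than max_run). This is a toy
# stand-in for pattern-suppressing ("skewed") coding, not the paper's scheme.
def count_run_limited(n: int, max_run: int) -> int:
    """Number of binary strings of length n with no run longer than max_run."""
    # f[r] = number of strings of the current length whose final run has length r
    f = [0] * (max_run + 1)
    f[1] = 2  # the length-1 strings '0' and '1'
    for _ in range(n - 1):
        g = [0] * (max_run + 1)
        g[1] = sum(f)            # append the other symbol: the run restarts at 1
        for r in range(1, max_run):
            g[r + 1] = f[r]      # append the same symbol: the run grows by one
        f = g
    return sum(f)

n, max_run = 20, 3
rate = log2(count_run_limited(n, max_run)) / n
print(f"achievable rate ~ {rate:.3f} bit/use, rate loss ~ {1 - rate:.3f}")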
Abstract:
The paper presents the basic structure of the General Information Theory (GIT) and outlines its main divisions. Some new results are pointed out.
Abstract:
In this paper we summarize our recently proposed work on the information-theoretic analysis of regenerative channels. We discuss how the design and the transfer-function properties of the regenerator affect the noise statistics and enable Shannon capacities higher than those of the corresponding linear channels (in the absence of regeneration).
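For context, here is a minimal sketch of the linear baseline against which regenerative channels are compared, assuming the standard AWGN capacity formula C = log2(1 + SNR); the regenerator model itself is not reproduced.

from math import log2

# Minimal sketch of the linear (non-regenerative) baseline only: the Shannon
# capacity of an AWGN channel in bits per channel use. The paper's
# regenerative channel model is not reproduced here.
def awgn_capacity(snr_db: float) -> float:
    """C = log2(1 + SNR) for a linear AWGN channel, with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return log2(1 + snr_linear)

for snr_db in (0, 10, 20):
    print(f"SNR = {snr_db:2d} dB -> C = {awgn_capacity(snr_db):.2f} bit/use")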
Abstract:
This paper proposes a principal-agent model between banks and firms with risk and asymmetric information. A mixed form of financing for firms is assumed. The capital structure of firms is a relevant determinant of the final aggregate level of investment in the economy. In the model analyzed there may be a separating equilibrium which is not economically efficient, because aggregate investment falls short of the first-best level. Based on European firm-level data, an empirical model is presented which validates the relevance of firms' capital structure. The relative magnitude of equity in the capital structure makes a real difference to the profits obtained by firms in the economy.
Abstract:
Understanding how stem and progenitor cells choose between alternative cell fates is a major challenge in developmental biology. Efforts to tackle this problem have been hampered by the scarcity of markers that can be used to predict cell division outcomes. Here we present a computational method, based on algorithmic information theory, to analyze dynamic features of living cells over time. Using this method, we asked whether rat retinal progenitor cells (RPCs) display characteristic phenotypes before undergoing mitosis that could foretell their fate. We predicted whether RPCs will undergo a self-renewing or terminal division with 99% accuracy, or whether they will produce two photoreceptors or another combination of offspring with 87% accuracy. Our implementation can segment, track and generate predictions for 40 cells simultaneously on a standard computer at 5 min per frame. This method could be used to isolate cell populations with specific developmental potential, enabling previously impossible investigations.
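As an illustrative sketch of the kind of compression-based proxy for algorithmic information that such a pipeline might use (this is not the authors' implementation, and the frame data below are hypothetical placeholders), the normalized compression distance between two byte strings can be computed as follows.

import zlib

# Illustrative sketch only: normalized compression distance (NCD), a standard
# compression-based proxy for algorithmic information distance. Not the
# authors' segmentation/tracking/prediction pipeline.
def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data, a crude stand-in for K(x)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical stand-ins for pixel patches cropped around a tracked cell
# in two consecutive frames.
frame_a = bytes(range(256)) * 4
frame_b = bytes(reversed(range(256))) * 4
print(f"NCD(frame_a, frame_b) = {ncd(frame_a, frame_b):.3f}")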
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
A line of information and information literacy research with a strong focus on information experience has emerged. A strengthened understanding, profiling and theorising of information experience as a specific domain of interest to information researchers is required. A focus on information experience is likely to have a major influence on the field, drawing attention to interpretive and experiential forms of research.
Abstract:
The design and construction community has shown increasing interest in adopting building information models (BIMs). The richness of information provided by BIMs has the potential to streamline the design and construction processes by enabling enhanced communication, coordination, automation and analysis. However, there are many challenges in extracting construction-specific information out of BIMs. In most cases, construction practitioners have to manually identify the required information, which is inefficient and prone to error, particularly for complex, large-scale projects. This paper describes the process and methods we have formalized to partially automate the extraction and querying of construction-specific information from a BIM. We describe methods for analyzing a BIM to query for spatial information that is relevant for construction practitioners, and that is typically represented implicitly in a BIM. Our approach integrates ifcXML data and other spatial data to develop a richer model for construction users. We employ custom 2D topological XQuery predicates to answer a variety of spatial queries. The validation results demonstrate that this approach provides a richer representation of construction-specific information compared to existing BIM tools.
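As an illustrative sketch of the flavor of a 2D topological predicate behind such spatial queries (the element names below are simplified placeholders, not the actual ifcXML schema, and this is not the authors' XQuery implementation):

import xml.etree.ElementTree as ET

# Illustrative sketch only: an axis-aligned 2D "touches" predicate over a
# simplified, hypothetical XML model. Real ifcXML geometry is far richer.
SAMPLE = """
<model>
  <space id="S1" xmin="0" ymin="0" xmax="5" ymax="4"/>
  <space id="S2" xmin="5" ymin="0" xmax="9" ymax="4"/>
  <space id="S3" xmin="20" ymin="20" xmax="25" ymax="24"/>
</model>
"""

def bbox(el):
    """Return (xmin, ymin, xmax, ymax) of a <space> element as floats."""
    return tuple(float(el.get(k)) for k in ("xmin", "ymin", "xmax", "ymax"))

def touches_2d(a, b):
    """True if two axis-aligned boxes share a boundary but do not overlap."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    overlap_x = min(ax1, bx1) - max(ax0, bx0)
    overlap_y = min(ay1, by1) - max(ay0, by0)
    return overlap_x >= 0 and overlap_y >= 0 and (overlap_x == 0 or overlap_y == 0)

spaces = ET.fromstring(SAMPLE).findall("space")
for i, s1 in enumerate(spaces):
    for s2 in spaces[i + 1:]:
        if touches_2d(bbox(s1), bbox(s2)):
            print(s1.get("id"), "is adjacent to", s2.get("id"))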
Abstract:
We propose a quantity called information ambiguity that plays the same role in worst-case information-theoretic analyses as the well-known notion of information entropy plays in the corresponding average-case analyses. We prove various properties of information ambiguity and illustrate its usefulness by performing a worst-case analysis of a variant of the distributed source coding problem.
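For orientation only, the average-case quantity referred to above is Shannon entropy, while a classical worst-case counterpart for describing an outcome of X is the log of its support size; this is context, not the paper's definition of information ambiguity.

% Context only: Shannon entropy (average-case) versus the classical
% worst-case description length; NOT the paper's definition of
% information ambiguity.
\[
  H(X) = -\sum_{x \in \mathcal{X}} p(x)\,\log_2 p(x)
  \qquad \text{vs.} \qquad
  \log_2 \lvert \operatorname{supp}(X) \rvert .
\]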