26 results for information theoretic measures

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 100.00%

Abstract:

We introduce a novel graph class we call universal hierarchical graphs (UHG), whose topology arises frequently in problems representing, e.g., temporal, spatial or general process structures of systems. For this graph class we show that we can naturally assign two probability distributions, one for nodes and one for edges, which lead directly to the definitions of entropy and joint entropy and, hence, mutual information, establishing an information theory for this graph class. Furthermore, we provide results on the conditions under which these constrained probability distributions maximize the corresponding entropy. We also demonstrate that these entropic measures can be computed efficiently, which is a prerequisite for any large-scale practical application, and show some numerical examples. (c) 2007 Elsevier Inc. All rights reserved.
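
The entropic quantities named in this abstract follow the standard Shannon definitions once the node and edge probability distributions are fixed. Below is a minimal sketch of those definitions in Python; the particular UHG-derived distributions from the paper are not reproduced, and the joint distribution `p_xy` is purely illustrative.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array; zero entries are ignored."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(p_joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_x = p_joint.sum(axis=1)  # marginal of the first variable
    p_y = p_joint.sum(axis=0)  # marginal of the second variable
    return entropy(p_x) + entropy(p_y) - entropy(p_joint)

# Illustrative joint distribution, not the UHG node/edge distributions of the paper.
p_xy = np.array([[0.25, 0.10],
                 [0.05, 0.60]])
print(mutual_information(p_xy))
```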

Relevance: 90.00%

Abstract:

This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) that demonstrates that these systems work efficiently and reliably in practice.
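
Information-theoretic structure learners of this kind typically decide conditional independence by comparing an empirical conditional mutual information against a small threshold. The sketch below shows that single building block only; it is not the authors' three-phase algorithm, and the threshold `eps` is an illustrative choice.

```python
from collections import Counter
from math import log2

def cond_mutual_information(xs, ys, zs):
    """Empirical I(X;Y|Z) in bits from parallel lists of discrete samples."""
    n = len(xs)
    c_xyz = Counter(zip(xs, ys, zs))
    c_xz = Counter(zip(xs, zs))
    c_yz = Counter(zip(ys, zs))
    c_z = Counter(zs)
    cmi = 0.0
    for (x, y, z), n_xyz in c_xyz.items():
        # p(x,y,z) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ], expressed via counts
        cmi += (n_xyz / n) * log2(n_xyz * c_z[z] / (c_xz[(x, z)] * c_yz[(y, z)]))
    return cmi

def ci_test(xs, ys, zs, eps=0.01):
    """Declare X independent of Y given Z if the empirical CMI is small."""
    return cond_mutual_information(xs, ys, zs) < eps

# Tiny illustrative sample.
xs = [0, 0, 1, 1, 0, 1, 0, 1]
ys = [0, 1, 0, 1, 0, 1, 1, 0]
zs = [0, 0, 0, 0, 1, 1, 1, 1]
print(ci_test(xs, ys, zs))
```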

Relevance: 90.00%

Abstract:

We present an information-theoretic method to measure the structural information content of networks and apply it to chemical graphs. As a result, we find that our entropy measure is more general than classical information indices known in mathematical and computational chemistry. Further, we demonstrate that our measure reflects the essence of molecular branching meaningfully by determining the structural information content of some chemical graphs numerically.
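
A classical way to quantify the structural information content of a graph is the Shannon entropy of a vertex partition; the paper's measure is more general, but the sketch below illustrates the basic idea using the partition induced by vertex degree (a simplifying choice made here, not the authors' definition).

```python
import networkx as nx
from collections import Counter
from math import log2

def degree_partition_entropy(G):
    """Shannon-type structural information content: entropy of the
    vertex partition induced by vertex degree (illustrative choice)."""
    n = G.number_of_nodes()
    counts = Counter(d for _, d in G.degree())
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two small molecular-skeleton-like graphs: a path (linear) vs. a star (branched).
print(degree_partition_entropy(nx.path_graph(6)))
print(degree_partition_entropy(nx.star_graph(5)))
```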

Relevance: 90.00%

Abstract:

We consider two celebrated criteria for defining the nonclassicality of bipartite bosonic quantum systems, the first stemming from information theoretic concepts and the second from physical constraints on the quantum phase space. Consequently, two sets of allegedly classical states are singled out: (i) the set C composed of the so-called classical-classical (CC) states—separable states that are locally distinguishable and do not possess quantum discord; (ii) the set P of states endowed with a positive P representation (P-classical states)—mixtures of Glauber coherent states that, e.g., fail to show negativity of their Wigner function. By showing that C and P are almost disjoint, we prove that the two defining criteria are maximally inequivalent. Thus, the notions of classicality that they put forward are radically different. In particular, generic CC states show quantumness in their P representation, and vice versa, almost all P-classical states have positive quantum discord and, hence, are not CC. This inequivalence is further elucidated by considering different applications of P-classical and CC states. Our results suggest that there are quantum correlations in nature other than those revealed by entanglement and quantum discord.
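
For reference, the two classicality notions contrasted here can be stated in standard form (textbook definitions, not quoted from the paper): a CC state is diagonal in a product of local orthonormal bases, while a P-classical state is a probabilistic mixture of products of coherent states.

```latex
% Classical-classical (CC) state: {|i>}, {|j>} local orthonormal bases, p_{ij} a joint pmf
\rho_{\mathrm{CC}} = \sum_{i,j} p_{ij}\, |i\rangle\langle i| \otimes |j\rangle\langle j|

% P-classical state: P(\alpha,\beta) a genuine (non-negative) probability density
\rho_{P} = \int d^{2}\alpha\, d^{2}\beta\; P(\alpha,\beta)\,
           |\alpha\rangle\langle\alpha| \otimes |\beta\rangle\langle\beta|
```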

Relevance: 80.00%

Abstract:

This paper investigates the application of complex wavelet transforms to the field of digital data hiding. Complex wavelets offer improved directional selectivity and shift invariance over their discretely sampled counterparts, allowing for better adaptation of watermark distortions to the host media. Two methods of deriving visual models for the watermarking system are adapted to the complex wavelet transforms and their performances are compared. To improve capacity, a spread transform embedding algorithm is devised that combines the robustness of spread spectrum methods with the high capacity of quantization-based methods. Using established information theoretic methods, limits of watermark capacity are derived that demonstrate the superiority of complex wavelets over discretely sampled wavelets. Finally, results for the algorithm against commonly used attacks demonstrate its robustness and the improved performance offered by complex wavelet transforms.
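
A common way to combine spread-spectrum robustness with quantization-based capacity is spread-transform dither modulation: project the host coefficients onto a spreading direction and quantize the projection onto one of two dithered lattices. The sketch below shows that generic scheme only; it is not the paper's complex-wavelet algorithm, and the spreading vector and step size `delta` are arbitrary placeholders.

```python
import numpy as np

def st_dm_embed(x, bit, u, delta):
    """Embed one bit by quantizing the projection of host vector x onto direction u."""
    u = u / np.linalg.norm(u)
    proj = float(x @ u)
    dither = 0.0 if bit == 0 else delta / 2.0          # two shifted quantizers
    q = delta * np.round((proj - dither) / delta) + dither
    return x + (q - proj) * u                          # distortion only along u

def st_dm_detect(y, u, delta):
    """Decode the bit as the dithered quantizer closest to the projection."""
    u = u / np.linalg.norm(u)
    proj = float(y @ u)
    d0 = abs(proj - delta * np.round(proj / delta))
    d1 = abs(proj - (delta * np.round((proj - delta / 2) / delta) + delta / 2))
    return 0 if d0 <= d1 else 1

rng = np.random.default_rng(0)
host = rng.normal(size=64)                  # stand-in for wavelet coefficients
u = rng.normal(size=64)
marked = st_dm_embed(host, 1, u, delta=1.0)
print(st_dm_detect(marked, u, delta=1.0))   # -> 1
```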

Relevance: 80.00%

Abstract:

In the present paper, we introduce a notion of a style representing abstract, complex objects whose characteristics can be represented as structured objects. Furthermore, we provide some mathematical properties of such styles. As a main result, we present a novel approach to performing a meaningful comparative analysis of such styles by defining and using graph-theoretic measures. We compare two styles by structurally comparing the underlying feature sets, each representing a set of graphs. To determine the structural similarity between the underlying graphs, we use graph similarity measures that are computationally efficient. More precisely, in order to compare styles, we map each feature set to a so-called median graph and compare the resulting median graphs. As an application, we perform an experimental study to compare special styles representing sets of undirected graphs and present numerical results thereof. (C) 2007 Elsevier Inc. All rights reserved.
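
A minimal sketch of the median-graph idea, under simplifying assumptions not taken from the paper: each style is a set of graphs over a shared labelled node set, the graph distance is the symmetric difference of edge sets (a deliberately cheap stand-in for the paper's similarity measures), and the set median is the member minimising total distance to the others.

```python
import networkx as nx

def edge_set_distance(G, H):
    """Cheap graph distance: size of the symmetric difference of edge sets
    (assumes both graphs are defined over a shared, labelled node set)."""
    return len({frozenset(e) for e in G.edges()} ^ {frozenset(e) for e in H.edges()})

def set_median(graphs):
    """Set-median graph: the member minimising the sum of distances to the others."""
    return min(graphs, key=lambda G: sum(edge_set_distance(G, H) for H in graphs))

def compare_styles(style_a, style_b):
    """Compare two styles (sets of graphs) by comparing their median graphs."""
    return edge_set_distance(set_median(style_a), set_median(style_b))

style_a = [nx.path_graph(5), nx.path_graph(5), nx.cycle_graph(5)]
style_b = [nx.star_graph(4), nx.star_graph(4), nx.complete_graph(5)]
print(compare_styles(style_a, style_b))
```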

Relevance: 80.00%

Abstract:

Purpose: To compare long-term cognitive outcomes of patients treated with surgical clipping or endovascular coiling after subarachnoid haemorrhage (SAH). Method: A retrospective matched cohort study assessed neuropsychological functioning at least 12 months after aneurysmal SAH treatment. Fourteen patients treated by endovascular coiling and nine patients treated by surgical clipping participated. After gaining written consent, a comprehensive neuropsychological battery was completed. Standardised tests were employed to assess pre-morbid and current intellectual functioning (IQ), attention, speed of information processing, memory and executive function, as well as psychosocial functioning and affect. Results: Treatment groups were not significantly different in terms of age, pre-morbid IQ, time from injury to treatment or time since injury. A significant effect of treatment on full-scale IQ score (p = 0.025), performance IQ (p = 0.045) and verbal IQ score (p = 0.029), all favouring the coiled group, was observed. A between-groups difference of medium effect size in immediate memory (p = 0.19, partial η² = 0.08) was also observed. No significant between-group differences were noted on attention, executive functioning and speed of information processing measures, or on mood and psychosocial functioning. Both groups reported increased anxiety and deficits in memory, attention and speed of information processing relative to normative data. Conclusions: Study findings indicate fewer cognitive deficits following endovascular coiling. Cognitive deficits in the clipped group may be due in part to the invasive nature of neurosurgical clipping. Further prospective research with regard to long-term cognitive and emotional outcomes is warranted.

Relevance: 80.00%

Abstract:

Energy in today's short-range wireless communication is mostly spent on the analog and digital hardware rather than on radiated power. Hence, purely information-theoretic considerations fail to achieve the lowest energy per information bit, and the optimization process must carefully consider the overall transceiver. In this paper, we propose to perform cross-layer optimization, based on an energy-aware rate adaptation scheme combined with a physical layer that is able to properly adjust its processing effort to the data rate and the channel conditions in order to minimize the energy consumption per information bit. This energy-proportional behavior is enabled by extending the classical system modes with additional configuration parameters at the various layers. Fine-grained models of the power consumption of the hardware are developed to provide awareness of the physical layer capabilities to the medium access control layer. The joint application of the proposed energy-aware rate adaptation and modifications to the physical layer of an IEEE 802.11n system improves energy efficiency (averaged over many noise and channel realizations) in all considered scenarios by up to 44%.
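
The core quantity behind such a rate adaptation scheme is the energy per information bit: total power draw (radiated plus rate-dependent circuit power) divided by the data rate. The toy sketch below picks the configuration minimising joules per bit; the power figures are invented for illustration and are not the paper's hardware models.

```python
def energy_per_bit(rate_mbps, p_tx_mw, p_circuit_mw):
    """Joules per information bit for a given rate and power draw."""
    total_w = (p_tx_mw + p_circuit_mw) / 1000.0
    return total_w / (rate_mbps * 1e6)

# Illustrative (made-up) configurations: higher rates demand more hardware effort.
configs = [
    {"rate_mbps": 6.5,   "p_tx_mw": 10.0, "p_circuit_mw": 120.0},
    {"rate_mbps": 65.0,  "p_tx_mw": 10.0, "p_circuit_mw": 310.0},
    {"rate_mbps": 130.0, "p_tx_mw": 10.0, "p_circuit_mw": 650.0},
]
best = min(configs, key=lambda c: energy_per_bit(**c))
print(best, energy_per_bit(**best))
```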

Relevance: 40.00%

Abstract:

This article synthesizes the labor theoretic approach to information retrieval. Selection power is taken as the fundamental value for information retrieval and is regarded as produced by selection labor. Selection power remains relatively constant while selection labor modulates across oral, written, and computational modes. A dynamic, stemming principally from the costs of direct human mental labor and effectively compelling the transfer of aspects of human labor to computational technology, is identified. The decision practices of major information system producers are shown to conform with the motivating forces identified in the dynamic. An enhancement of human capacities, from the increased scope of description processes, is revealed. Decision variation and decision considerations are identified. The value of the labor theoretic approach is considered in relation to pre-existing theories, real world practice, and future possibilities. Finally, the continuing intractability of information retrieval is suggested.

Relevance: 40.00%

Abstract:

Game-theoretic security resource allocation problems have generated significant interest in the area of designing and developing security systems. These approaches traditionally utilize the Stackelberg game model for security resource scheduling in order to improve the protection of critical assets. The basic assumption in Stackelberg games is that a defender will act first, then an attacker will choose their best response after observing the defender’s strategy commitment (e.g., protecting a specific asset). Thus, it requires an attacker’s full or partial observation of a defender’s strategy. This assumption is unrealistic in real-time threat recognition and prevention. In this paper, we propose a new solution concept (i.e., a method to predict how a game will be played) for deriving the defender’s optimal strategy based on the principle of acceptable costs of minimax regret. Moreover, we demonstrate the advantages of this solution concept by analyzing its properties.
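
For a defender payoff matrix over pure strategies, the minimax-regret rule picks the defence whose worst-case regret (the best achievable payoff against that attacker move minus the payoff actually obtained) is smallest. The sketch below implements that textbook rule only; the paper's solution concept, based on acceptable costs of minimax regret, is richer, and the example payoffs are made up.

```python
import numpy as np

def minimax_regret_strategy(payoff):
    """Pick the defender pure strategy minimising maximum regret.
    payoff[i, j] = defender payoff when defending i and the attacker hits j."""
    payoff = np.asarray(payoff, dtype=float)
    best_per_column = payoff.max(axis=0)   # best achievable per attacker move
    regret = best_per_column - payoff      # how much each defence gives up
    worst_regret = regret.max(axis=1)      # each defence's worst case
    return int(worst_regret.argmin()), float(worst_regret.min())

# Illustrative 3-asset example (made-up payoffs).
payoff = [[ 5, -3, -2],
          [-4,  6, -1],
          [-2, -2,  4]]
print(minimax_regret_strategy(payoff))
```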

Relevance: 30.00%

Abstract:

This study examines the relation between selection power and selection labor for information retrieval (IR). It is the first part of the development of a labor theoretic approach to IR. Existing models for evaluation of IR systems are reviewed and the distinction of operational from experimental systems partly dissolved. The often covert, but powerful, influence from technology on practice and theory is rendered explicit. Selection power is understood as the human ability to make informed choices between objects or representations of objects and is adopted as the primary value for IR. Selection power is conceived as a property of human consciousness, which can be assisted or frustrated by system design. The concept of selection power is further elucidated, and its value supported, by an example of the discrimination enabled by index descriptions, the discovery of analogous concepts in partly independent scholarly and wider public discourses, and its embodiment in the design and use of systems. Selection power is regarded as produced by selection labor, with the nature of that labor changing with different historical conditions and concurrent information technologies. Selection labor can itself be decomposed into description and search labor. Selection labor and its decomposition into description and search labor will be treated in a subsequent article, in a further development of a labor theoretic approach to information retrieval.

Relevance: 30.00%

Abstract:

In previous papers, we have presented a logic-based framework based on fusion rules for merging structured news reports. Structured news reports are XML documents in which the textentries are restricted to individual words or simple phrases, such as names and domain-specific terminology, and numbers and units. We assume structured news reports do not require natural language processing. Fusion rules are a form of scripting language that defines how structured news reports should be merged. The antecedent of a fusion rule is a call to investigate the information in the structured news reports and the background knowledge, and the consequent of a fusion rule is a formula specifying an action to be undertaken to form a merged report. It is expected that a set of fusion rules is defined for any given application. In this paper we extend the approach to handle probability values, degrees of belief, or necessity measures associated with textentries in the news reports. We present the formal definition for each of these types of uncertainty and explain how they can be handled using fusion rules. We also discuss methods for detecting inconsistencies among sources.
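
A toy illustration of merging textentries that carry probability values: keep the more probable value per field and flag fields where two fairly confident sources disagree. The merge policy and the dictionary-based report format are illustrative assumptions, not the XML fusion-rule language of the papers.

```python
def merge_reports(report_a, report_b, conflict_threshold=0.5):
    """Merge two structured reports whose entries are (value, probability) pairs.
    Illustrative policy: keep the more probable value; record an inconsistency
    when both sources are fairly confident but disagree."""
    merged, inconsistencies = {}, []
    for field in set(report_a) | set(report_b):
        a, b = report_a.get(field), report_b.get(field)
        if a and b and a[0] != b[0] and min(a[1], b[1]) >= conflict_threshold:
            inconsistencies.append(field)
        merged[field] = max(filter(None, (a, b)), key=lambda v: v[1])
    return merged, inconsistencies

report_a = {"event": ("storm", 0.9), "region": ("north", 0.6)}
report_b = {"event": ("storm", 0.8), "region": ("south", 0.7)}
print(merge_reports(report_a, report_b))
```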

Relevance: 30.00%

Abstract:

Information retrieval in the age of Internet search engines has become part of ordinary discourse and everyday practice: "Google" is a verb in common usage. Thus far, more attention has been given to practical understanding of information retrieval than to a full theoretical account. In Human Information Retrieval, Julian Warner offers a comprehensive overview of information retrieval, synthesizing theories from different disciplines (information and computer science, librarianship and indexing, and information society discourse) and incorporating such disparate systems as WorldCat and Google into a single, robust theoretical framework. There is a need for such a theoretical treatment, he argues, one that reveals the structure and underlying patterns of this complex field while remaining congruent with everyday practice. Warner presents a labor theoretic approach to information retrieval, building on his previously formulated distinction between semantic and syntactic mental labor, arguing that the description and search labor of information retrieval can be understood as both semantic and syntactic in character. Warner's information science approach is rooted in the humanities and the social sciences but informed by an understanding of information technology and information theory. The chapters offer a progressive exposition of the topic, with illustrative examples to explain the concepts presented. Neither narrowly practical nor largely speculative, Human Information Retrieval meets the contemporary need for a broader treatment of information and information systems.

Relevance: 30.00%

Abstract:

The purpose of this study is to compare the inferability of various synthetic as well as real biological regulatory networks. In order to assess differences, we apply local network-based measures. That is, instead of applying global measures, we investigate and assess an inference algorithm locally, on the level of individual edges and subnetworks. We demonstrate the behaviour of our local network-based measures with respect to different regulatory networks by conducting large-scale simulations, using ARACNE as an exemplary inference algorithm. The results from our exploratory analysis allow us not only to gain new insights into the strengths and weaknesses of an inference algorithm with respect to the characteristics of different regulatory networks, but also to obtain information that could be used to design novel problem-specific statistical estimators.
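
One simple example of a local, edge-level measure is per-node precision and recall of the inferred edges against the known regulatory network, evaluated only on edges incident to each node. This is an illustrative stand-in for the paper's measures, not a reproduction of them; ARACNE itself is not run here, and the tiny networks are made up.

```python
def local_edge_scores(true_edges, inferred_edges, nodes):
    """Edge-level precision/recall evaluated locally, per node:
    only edges incident to that node are considered."""
    scores = {}
    for v in nodes:
        t = {e for e in true_edges if v in e}
        p = {e for e in inferred_edges if v in e}
        tp = len(t & p)
        precision = tp / len(p) if p else 1.0
        recall = tp / len(t) if t else 1.0
        scores[v] = (precision, recall)
    return scores

# Made-up toy networks; edges are undirected, hence frozensets.
true_edges = {frozenset(e) for e in [("A", "B"), ("B", "C"), ("C", "D")]}
inferred   = {frozenset(e) for e in [("A", "B"), ("B", "D")]}
print(local_edge_scores(true_edges, inferred, nodes="ABCD"))
```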