49 results for Knowledge representation (Information theory)
Abstract:
Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered a Shapley Inconsistency Value, and it can be completely axiomatized in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set carries the same amount of conflict. However, this conflicts with the intuition illustrated by the lottery paradox: as the size of a minimal inconsistent belief base increases, its degree of inconsistency decreases. To address this, we present two kinds of revised inconsistency measures for a belief base, derived from its minimal inconsistent subsets. Each of these measures takes into account the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of the belief base. More specifically, we first present a vectorial measure of the inconsistency of a belief base, which is more discriminative than MIVC. We then present a family of weighted inconsistency measures based on the vectorial measure, which capture the inconsistency of a belief base as a single numerical value, as usual. We also show that each of the two kinds of revised measures can be considered a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
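As an illustration of how such values are computed, here is a minimal Python sketch of the closed form usually given for MIVC, which scores each formula by the sum of 1/|M| over the minimal inconsistent subsets M containing it. The belief base and subsets are toy inputs; the paper's revised measures would additionally weight each subset by its size.

```python
def mivc(belief_base, minimal_inconsistent_subsets):
    """Score each formula by the sum, over the minimal inconsistent
    subsets M containing it, of 1/|M| (the closed form usually
    associated with MIVC as a Shapley Inconsistency Value)."""
    return {
        alpha: sum(1.0 / len(M) for M in minimal_inconsistent_subsets if alpha in M)
        for alpha in belief_base
    }

# Toy belief base {a, ~a, b, ~b|~a} with two minimal inconsistent subsets.
base = {"a", "~a", "b", "~b|~a"}
mis = [{"a", "~a"}, {"a", "b", "~b|~a"}]
print(mivc(base, mis))
# "a" lies in both subsets: 1/2 + 1/3 ~= 0.83; "~a" scores 1/2, the rest 1/3.
```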
Abstract:
Just as conventional institutions are organisational structures for coordinating the activities of multiple interacting individuals, electronic institutions provide a computational analogue for coordinating the activities of multiple interacting software agents. In this paper, we argue that open multi-agent systems can be effectively designed and implemented as electronic institutions, for which we provide a comprehensive computational model. More specifically, the paper provides an operational semantics for electronic institutions, specifying the essential data structures, the state representation and the key operations necessary to implement them. We specify the agent workflow structure that is the core component of such electronic institutions and particular instantiations of knowledge representation languages that support the institutional model. In so doing, we provide the first formal account of the electronic institution concept in a rigorous and unambiguous way.
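To make the flavour of such a computational model concrete, the following is a hypothetical Python sketch of the kind of state representation and key operation an operational semantics might specify; the names (Scene, InstitutionState, utter) are illustrative and not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """One institutional activity, governed by a finite workflow."""
    name: str
    states: set[str]                          # workflow states
    transitions: dict[tuple[str, str], str]   # (state, illocution) -> state

@dataclass
class InstitutionState:
    """Snapshot of a running institution."""
    roles: dict[str, str] = field(default_factory=dict)        # agent -> role
    scene_state: dict[str, str] = field(default_factory=dict)  # scene -> state

    def utter(self, scene: Scene, illocution: str) -> None:
        """Key operation: advance a scene's workflow on a permitted utterance."""
        current = self.scene_state[scene.name]
        try:
            self.scene_state[scene.name] = scene.transitions[(current, illocution)]
        except KeyError:
            raise ValueError(f"{illocution!r} not permitted in state {current!r}")

# Usage: a two-state negotiation scene.
nego = Scene("negotiation", {"open", "closed"}, {("open", "accept"): "closed"})
inst = InstitutionState(roles={"agent1": "buyer"}, scene_state={"negotiation": "open"})
inst.utter(nego, "accept")
print(inst.scene_state)   # {'negotiation': 'closed'}
```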
Abstract:
The information encoded in a quantum system is generally degraded by the influence of its environment, leading to a transition from pure to mixed states. Reducing the mixedness of a state is a fundamental step in the quest for a feasible implementation of quantum technologies. Here we show that it is impossible to transfer part of such mixedness to a trash system without losing some of the initial information. This loss is lower-bounded by a value determined by the properties of the initial state to be purified. We discuss this interesting phenomenon and its consequences for general quantum information theory, linking it to the information-theoretic primitive embodied by the quantum state-merging protocol and to the behaviour of general quantum correlations.
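The abstract does not fix a quantifier of mixedness, but a standard choice in this context is the normalized linear entropy, which vanishes exactly on pure states:

```latex
S_L(\rho) = \frac{d}{d-1}\left(1 - \operatorname{Tr}\rho^2\right),
\qquad 0 \le S_L \le 1, \quad S_L(\rho) = 0 \iff \rho \text{ pure}.
```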
Abstract:
Physical Access Control Systems are commonly used to secure doors in buildings such as airports, hospitals, government buildings and offices. These systems are designed primarily to provide an authentication mechanism, but they also log each door access as a transaction in a database. Unsupervised learning techniques can be used to detect inconsistencies or anomalies in the mobility data, such as a cloned or forged access badge, or unusual behaviour by staff members. In this paper, we present an overview of our method of inferring directed graphs to represent a physical building network and the flows of mobility within it. We demonstrate how these graphs can be used for visual data exploration, and outline how to apply algorithms based on information theory to the graph data in order to detect inconsistent or abnormal behaviour.
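As a rough sketch of the information-theoretic step, the hypothetical Python fragment below builds a directed movement graph from a toy access log and scores each door-to-door transition by its surprisal, -log2 p, so that rare movements surface as candidate anomalies. The actual method and data model in the paper may differ.

```python
import math
from collections import Counter

# Hypothetical door-access log, ordered in time: (badge_id, door).
log = [("b1", "lobby"), ("b1", "lab"), ("b1", "lobby"), ("b1", "lab"),
       ("b2", "lobby"), ("b2", "lab"),
       ("b3", "lobby"), ("b3", "server_room")]

# Infer a directed movement graph as edge counts between consecutive doors.
edges = Counter()
last_door = {}
for badge, door in log:
    if badge in last_door:
        edges[(last_door[badge], door)] += 1
    last_door[badge] = door

total = sum(edges.values())

def surprisal(edge):
    """Surprisal -log2 p(edge): rare transitions carry more information."""
    return -math.log2(edges[edge] / total)

# Rank transitions; the most surprising are candidate anomalies.
for edge in sorted(edges, key=surprisal, reverse=True):
    print(edge, round(surprisal(edge), 2), "bits")
```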
Abstract:
This article examines the relationship between the learning organisation and the implementation of curriculum innovation within schools. It also compares the extent of innovative activity undertaken by schools in the public and the private sectors. A learning organisation is characterised by long-term goals, participatory decision-making processes, collaboration with external stakeholders, effective mechanisms for the internal communication of knowledge and information, and the use of rewards for its members. These characteristics are expected to promote curriculum innovation, once a number of control factors have been taken into account. The article reports on a study carried out in 197 Greek public and private primary schools in the 1999-2000 school year. Structured interviews with school principals were used as a method of data collection. According to the statistical results, the most important determinants of the innovative activity of a school are the extent of its collaboration with other organisations (i.e. openness to society), and the implementation of development programmes for teachers and parents (i.e. communication of knowledge and information). Contrary to expectations, the existence of long-term goals, the extent of shared decision-making, and the use of teacher rewards had no impact on curriculum innovation. The study also suggests that the private sector, as such, has an additional positive effect on the implementation of curriculum innovation, once a number of human, financial, material, and management resources have been controlled for. The study concludes by making recommendations for future research that would shed more light on unexpected outcomes and would help explore the causal link between variables in the research model.
Abstract:
Oxytocin (OT) influences how humans process information about others. Whether OT affects the processing of information about oneself remains unknown. Using a double-blind, placebo-controlled within-subject design, we recorded event-related potentials (ERPs) from adults during trait judgments about oneself and a celebrity and during judgments on word valence, after intranasal OT or placebo administration. We found that OT vs. placebo treatment reduced the differential amplitudes of a fronto-central positivity at 220-280 ms (P2) during self- vs. valence-judgments. OT vs. placebo treatment tended to reduce the differential amplitude of a late positive potential at 520-1000 ms (LPP) during self-judgments but to increase the differential LPP amplitude during other-judgments. OT effects on the differential P2 and LPP amplitudes to self- vs. celebrity-judgments were positively correlated with a measure of interdependence of self-construals. Thus OT modulates the neural correlates of self-referential processing and this effect varies as a function of interdependence.
Abstract:
In order to formalize and extend previous ad hoc analysis and synthesis methods, a theoretical treatment of directional modulation (DM) systems using vector representations is introduced and used to achieve DM transmitter characteristics. An orthogonal vector approach is proposed which allows the artificial orthogonal noise concept derived from information theory to be brought to bear on DM analysis and synthesis. The orthogonal vector method is validated and discussed via bit error rate (BER) simulations.
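A minimal numerical sketch of the orthogonal-vector idea, assuming a uniform linear array and a single QPSK symbol: artificial noise is projected into the orthogonal complement of the steering vector of the intended direction, so the constellation is preserved on-axis and scrambled elsewhere. All names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                                  # array elements (assumed)
theta0 = np.deg2rad(30)                # legitimate receiver direction

# Steering vector of a uniform linear array, half-wavelength spacing.
h = np.exp(1j * np.pi * np.arange(N) * np.sin(theta0))

symbol = (1 + 1j) / np.sqrt(2)         # one QPSK symbol

w = h / N                              # beamformer: h^H w = 1

# Artificial noise confined to the orthogonal complement of h, so h^H an = 0.
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
an = z - h * (np.vdot(h, z) / np.vdot(h, h))

x = symbol * w + 0.5 * an              # transmitted vector

print(np.round(np.vdot(h, symbol * w), 3))  # -> the symbol: undistorted on-axis
print(abs(np.vdot(h, an)) < 1e-10)          # -> True: noise invisible at theta0
```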
Abstract:
In this paper, metrics for assessing the performance of directional modulation (DM) physical-layer secure wireless systems are discussed. DM systems are shown to fall into two categories, static and dynamic, and the behavior of each type of system is discussed for QPSK modulation. Besides EVM-like and BER metrics, the secrecy rate, as used in the information theory community, is also derived for the purpose of this QPSK DM system evaluation.
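For orientation, the secrecy rate in question is presumably the standard wiretap-channel quantity, the excess of the legitimate link's mutual information over the eavesdropper's:

```latex
R_s = \max\bigl(0,\; I(X; Y_B) - I(X; Y_E)\bigr)
```

where X is the transmitted QPSK symbol and Y_B, Y_E are the observations at the legitimate receiver and the eavesdropper, respectively.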
Abstract:
Polar codes are one of the most recent advances in coding theory and have attracted significant interest. While they are provably capacity-achieving over various channels, they have seen limited practical application. Unfortunately, the successive nature of successive-cancellation-based decoders hinders fine-grained adaptation of the decoding complexity to design constraints and operating conditions. In this paper, we propose a systematic method for enabling complexity-performance trade-offs by constructing polar codes based on an optimization problem that minimizes complexity under a suitably defined mutual-information-based performance constraint. Moreover, a low-complexity greedy algorithm is proposed to solve the optimization problem efficiently for very large code lengths.
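For context, the Python sketch below shows the classical Bhattacharyya-parameter construction of a polar code for a binary erasure channel, the textbook baseline on top of which a complexity-constrained selection could be formulated; it is not the paper's optimization.

```python
def bhattacharyya(n, eps):
    """Bhattacharyya parameters of the 2^n synthesized bit-channels of a
    length-2^n polar code over a BEC(eps), via the standard recursion
    Z(W-) = 2Z - Z^2, Z(W+) = Z^2 (exact for the erasure channel)."""
    z = [eps]
    for _ in range(n):
        z = [v for x in z for v in (2 * x - x * x, x * x)]
    return z

def select_information_set(n, eps, k):
    """Greedy rate allocation: pick the k most reliable bit-channels."""
    z = bhattacharyya(n, eps)
    return sorted(sorted(range(len(z)), key=lambda i: z[i])[:k])

# A (32, 16) polar code over a BEC with erasure probability 0.5.
print(select_information_set(5, 0.5, 16))
```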
Abstract:
The different quantum phases appearing in strongly correlated systems, as well as the transitions between them, are closely related to the entanglement shared between their constituents. In 1D systems, it is well established that the entanglement spectrum is linked to the symmetries that protect the different quantum phases. This relation extends even further at phase transitions, where a direct link associates the entanglement spectrum with the conformal field theory describing the transition. For 2D systems much less is known. The lattice geometry becomes a crucial aspect to consider when studying entanglement and phase transitions. Here, we analyze the entanglement properties of triangular spin-lattice models, also drawing on concepts borrowed from quantum information theory, such as geometric entanglement.
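For reference, geometric entanglement is conventionally defined through the overlap of a state with its closest product state:

```latex
E_G(\lvert\psi\rangle) = -\log_2 \max_{\lvert\phi\rangle\,\text{product}} \bigl\lvert \langle\phi\vert\psi\rangle \bigr\rvert^2
```

so that E_G vanishes exactly when the state itself is a product state; this is the textbook definition, not anything lattice-specific from the paper.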
Abstract:
We examine the representation of judgements of stochastic independence in probabilistic logics. We focus on a relational logic where (i) judgements of stochastic independence are encoded by directed acyclic graphs, and (ii) probabilistic assessments are flexible in the sense that they are not required to specify a single probability measure. We discuss issues of knowledge representation and inference that arise from our particular combination of graphs, stochastic independence, logical formulas and probabilistic assessments.
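As a small illustration of item (i), the following Python sketch (using networkx, with hypothetical variable names) encodes independence judgements as a DAG and prints the local Markov conditions it implies: each variable is independent of its non-descendants given its parents.

```python
import networkx as nx

# Toy DAG encoding judgements of stochastic independence:
# burglary -> alarm <- earthquake, alarm -> call.
G = nx.DiGraph([("burglary", "alarm"), ("earthquake", "alarm"), ("alarm", "call")])
assert nx.is_directed_acyclic_graph(G)

# Local Markov condition read off the DAG: every variable is
# independent of its non-descendants given its parents.
for v in nx.topological_sort(G):
    parents = set(G.predecessors(v))
    nondesc = set(G) - nx.descendants(G, v) - parents - {v}
    if nondesc:
        print(f"{v} ⊥ {sorted(nondesc)} | {sorted(parents)}")
# e.g. burglary ⊥ ['earthquake'] | []
#      call ⊥ ['burglary', 'earthquake'] | ['alarm']
```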