116 results for bounded rationality


Relevance: 10.00%

Abstract:

This paper presents a novel method that leverages reasoning capabilities in a computer vision system dedicated to human action recognition. The proposed methodology is decomposed into two stages. First, a machine-learning-based algorithm - known as bag of words - gives a first estimate of action classification from video sequences by performing an image feature analysis. These results are then passed to a common-sense reasoning system, which analyses, selects and corrects the initial estimation yielded by the machine learning algorithm. This second stage resorts to the knowledge implicit in the rationality that motivates human behaviour. Experiments are performed in realistic conditions, where poor recognition rates from the machine learning techniques are significantly improved by the second stage, in which common-sense knowledge and reasoning capabilities are leveraged. This demonstrates the value of integrating common-sense capabilities into a computer vision pipeline. © 2012 Elsevier B.V. All rights reserved.
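
A minimal sketch of the two-stage structure described above, assuming a generic statistical classifier and a hand-written rule layer as stand-ins for the paper's components; the labels, rules and scikit-learn pipeline are illustrative assumptions, not the authors' implementation.

```python
# Illustrative two-stage pipeline: a bag-of-words style classifier produces a
# probability estimate per action, and a simple common-sense rule layer
# re-scores the hypotheses. Labels, rules and features are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

ACTIONS = ["drinking", "phoning", "typing"]           # hypothetical labels

# Stage 1: classifier over precomputed video descriptors (toy training data).
clf = make_pipeline(StandardScaler(), SVC(probability=True))
rng = np.random.default_rng(0)
X_train = rng.normal(size=(30, 16))                   # toy descriptors
y_train = np.arange(30) % len(ACTIONS)                # ensures every class appears
clf.fit(X_train, y_train)

def common_sense_rescore(probs, scene_context):
    """Stage 2: damp hypotheses that contradict simple context rules."""
    probs = probs.copy()
    if "no_cup_visible" in scene_context:             # toy common-sense rule
        probs[ACTIONS.index("drinking")] *= 0.2
    return probs / probs.sum()

def recognise(descriptor, scene_context):
    probs = clf.predict_proba([descriptor])[0]        # stage-1 estimate
    probs = common_sense_rescore(probs, scene_context)
    return ACTIONS[int(np.argmax(probs))]

print(recognise(X_train[0], {"no_cup_visible"}))
```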

Relevance: 10.00%

Abstract:

We consider the problem of sharing the cost of a network that meets the connection demands of a set of agents. The agents simultaneously choose paths in the network connecting their demand nodes. A mechanism splits the total cost of the network formed among the participants. We introduce two new properties of implementation. The first property, Pareto Nash implementation (PNI), requires that the efficient outcome always be implemented in a Nash equilibrium and that it Pareto dominate any other Nash equilibrium. The average cost mechanism and other asymmetric variations are the only mechanisms that meet PNI. These mechanisms are also characterized under strong Nash implementation. The second property, weakly Pareto Nash implementation (WPNI), requires that the least inefficient equilibrium Pareto dominate any other equilibrium. The egalitarian mechanism (EG) and other asymmetric variations are the only mechanisms that meet WPNI and individual rationality. EG minimizes the price of stability across all individually rational mechanisms. © Springer-Verlag Berlin Heidelberg 2012

Relevance: 10.00%

Abstract:

The information encoded in a quantum system is generally spoiled by the influences of its environment, leading to a transition from pure to mixed states. Reducing the mixedness of a state is a fundamental step in the quest for a feasible implementation of quantum technologies. Here we show that it is impossible to transfer part of such mixedness to a trash system without losing some of the initial information. Such a loss is lower-bounded by a value determined by the properties of the initial state to be purified. We discuss this interesting phenomenon and its consequences for general quantum information theory, linking it to the information-theoretic primitive embodied by the quantum state-merging protocol and to the behaviour of general quantum correlations.

Relevance: 10.00%

Abstract:

We investigate the conditions under which the trace distance between two different states of a given open system increases in time due to the interaction with an environment, thereby signaling non-Markovianity. We find that the finite-time difference in trace distance is bounded by two sharply defined quantities that are strictly linked to the occurrence of system-environment correlations created throughout the interaction and affecting the subsequent evolution of the system. This allows us to shed light on the origin of non-Markovian behavior in quantum dynamics. We illustrate our findings with two physically relevant examples: a non-Markovian dephasing mechanism that has been the focus of a recent experimental endeavor and the open-system dynamics experienced by a spin connected to a finite-size quantum spin chain.
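
A short numerical illustration of the quantity at the heart of this criterion: the trace distance D(ρ1, ρ2) = ½ Tr|ρ1 − ρ2| between two states, whose growth in time would witness non-Markovianity. The single-qubit states below are arbitrary examples chosen for the sketch, not those of the paper.

```python
# Trace distance D(rho1, rho2) = 0.5 * Tr|rho1 - rho2|; an increase of this
# quantity during the evolution signals information flowing back to the system.
import numpy as np

def trace_distance(rho1, rho2):
    diff = rho1 - rho2
    # For a Hermitian difference, Tr|A| is the sum of |eigenvalues|.
    eigvals = np.linalg.eigvalsh(diff)
    return 0.5 * np.sum(np.abs(eigvals))

# Two arbitrary single-qubit states (Bloch vectors along +z and +x).
rho_a = np.array([[1.0, 0.0], [0.0, 0.0]])            # |0><0|
rho_b = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])      # |+><+|

print(trace_distance(rho_a, rho_b))                   # ~0.7071
```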

Relevance: 10.00%

Abstract:

The ability to distribute quantum entanglement is a prerequisite for many fundamental tests of quantum theory and numerous quantum information protocols. Two distant parties can increase the amount of entanglement between them by means of quantum communication encoded in a carrier that is sent from one party to the other. Intriguingly, entanglement can be increased even when the exchanged carrier is not entangled with the parties. However, in light of the defining property of entanglement stating that it cannot increase under classical communication, the carrier must be quantum. Here we show that, in general, the increase of relative entropy of entanglement between two remote parties is bounded by the amount of nonclassical correlations of the carrier with the parties as quantified by the relative entropy of discord. We study implications of this bound, provide new examples of entanglement distribution via unentangled states, and put further limits on this phenomenon.
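
In symbols, with notation assumed here rather than taken from the paper (A and B are the remote labs, C the exchanged carrier, and the vertical bar marks the bipartition), the stated bound can be written schematically as follows.

```latex
% Schematic form of the bound described above (notation assumed, not the
% paper's): the gain in relative entropy of entanglement across the two labs
% is at most the relative entropy of discord of the carrier C with the rest.
\[
  E_{\mathrm{rel}}\bigl(A \,\big|\, BC\bigr) - E_{\mathrm{rel}}\bigl(AC \,\big|\, B\bigr)
  \;\le\;
  D_{\mathrm{rel}}\bigl(C \,\big|\, AB\bigr)
\]
```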

Relevance: 10.00%

Abstract:

Healing algorithms play a crucial part in distributed peer-to-peer networks, where failures occur continuously and frequently. Whereas there are approaches to robustness that rely largely on built-in redundancy, we adopt a responsive approach that is more akin to that of biological networks, e.g. the brain. The general goal of self-healing distributed graphs is to maintain certain network properties while recovering from failure quickly and making only bounded alterations locally. Several self-healing algorithms have been suggested in the recent literature [IPDPS'08, PODC'08, PODC'09, PODC'11]; they heal various network properties while fulfilling competing requirements such as keeping the degree increase low while maintaining connectivity, expansion and low stretch of the network. In this work, we augment the previous algorithms by adding the notion of edge-preserving self-healing, which requires the healing algorithm not to delete any edges originally present or adversarially inserted. This reflects the cost of adding additional edges; more importantly, it immediately follows that edge preservation helps maintain any monotonic subgraph-induced property, in particular important properties such as graph and subgraph densities. Density is an important network property, and in certain distributed networks maintaining it preserves high connectivity among certain subgraphs and backbones. We introduce a general model of self-healing and present xheal+, an edge-preserving version of xheal [PODC'11]. © 2012 IEEE.
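
A toy illustration of the edge-preserving idea, not the xheal+ algorithm itself: the repair step only adds edges among the failed node's former neighbours and never removes existing ones, so edges among surviving nodes are untouched and any monotone subgraph-induced property cannot be harmed by the healing itself. The data structure and the repair rule below are assumptions made for illustration.

```python
# Toy edge-preserving repair: when a node fails, connect its orphaned
# neighbours in a cycle. The healing step never deletes an edge, it only adds.
# This illustrates the edge-preserving property only; it is not xheal+.
import networkx as nx

def heal_edge_preserving(g: nx.Graph, failed: int) -> None:
    neighbours = sorted(g.neighbors(failed))
    g.remove_node(failed)                      # the adversary's deletion
    # Repair: add a cycle through the orphaned neighbours (no edge removals).
    for u, v in zip(neighbours, neighbours[1:] + neighbours[:1]):
        if u != v:
            g.add_edge(u, v)

g = nx.path_graph(6)                           # 0-1-2-3-4-5
heal_edge_preserving(g, 2)                     # fail the middle node
print(sorted(g.edges()))                       # surviving edges kept, (1, 3) added
```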

Relevance: 10.00%

Abstract:

Purpose: In randomised clinical trials (RCTs), the selection of appropriate outcomes is crucial to assessing whether one intervention is better than another. The purpose of this review is to identify the different clinical outcomes reported in glaucoma trials.

Methods: We conducted a systematic review of glaucoma RCTs. A sample of glaucoma trials published within a defined time frame (2006 to March 2012) was included. Only English-language studies were considered. All measured and reported clinical outcomes were included. The possible variations of clinical outcomes were defined prior to data analysis. Information on reported clinical outcomes was tabulated and analysed using descriptive statistics. Other data recorded included type of intervention and glaucoma, duration of the study, defined primary outcomes and, where nominated, the outcomes used for sample size calculation.

Results: The search strategy identified 4323 potentially relevant abstracts. Of the 315 publications retrieved, 233 RCTs were included. A total of 967 clinical measures were reported. There were large variations in the definitions used to describe different outcomes and their measures. Intraocular pressure was the most commonly reported outcome (used in 201 RCTs, 86%), with a total of 422 measures (44%). Safety outcomes were reported in 145 RCTs (62%), whereas visual field outcomes were used in 38 RCTs (16%).

Conclusions: There is large variation in the reporting of clinical outcomes in glaucoma RCTs. This lack of standardisation may impair the ability to evaluate the evidence for glaucoma interventions.

Relevance: 10.00%

Abstract:

The key requirement for quantum networking is the distribution of entanglement between nodes. Surprisingly, entanglement can be generated across a network without direct transfer - or communication - of entanglement. In contrast to information gain, which cannot exceed the communicated information, the entanglement gain is bounded by the communicated quantum discord, a more general measure of quantum correlation that includes but is not limited to entanglement. Here, we experimentally entangle two communicating parties sharing three initially separable photonic qubits by exchange of a carrier photon that is unentangled with either party at all times. We show that distributing entanglement with separable carriers is resilient to noise and in some cases becomes the only way of distributing entanglement through noisy environments.

Relevance: 10.00%

Abstract:

The Supreme Court of the United States in Feist v. Rural (Feist, 1991) specified that compilations or databases, and other works, must have a minimal degree of creativity to be copyrightable. The significance and global diffusion of the decision are matched only by the difficulties it has posed for interpretation. The judgment does not specify what is to be understood by creativity, although it does give a full account of the negative of creativity, as 'so mechanical or routine as to require no creativity whatsoever' (Feist, 1991, p.362). The negative of creativity as highly mechanical has diffused particularly widely.

A recent interpretation has correlated 'so mechanical' (Feist, 1991) with an automatic mechanical procedure or computational process, using a rigorous exegesis fully to correlate the two uses of mechanical. The negative of creativity is then understood as an automatic computation and as a highly routine process. Creativity itself is conversely understood as non-computational activity, above a certain level of routinicity (Warner, 2013).

The distinction between the negative of creativity and creativity is strongly analogous to an independently developed distinction between forms of mental labour, between semantic and syntactic labour. Semantic labour is understood as human labour motivated by considerations of meaning and syntactic labour as concerned solely with patterns. Semantic labour is distinctively human while syntactic labour can be directly humanly conducted or delegated to machine, as an automatic computational process (Warner, 2005; 2010, pp.33-41).

The value of the analogy is to greatly increase the intersubjective scope of the distinction between semantic and syntactic mental labour. The global diffusion of the standard for extreme absence of copyrightability embodied in the judgment also indicates the possibility that the distinction fully captures the current transformation in the distribution of mental labour, where syntactic tasks which were previously humanly performed are now increasingly conducted by machine.

The paper has substantive and methodological relevance to the conference themes. Substantively, it is concerned with human creativity, with rationality as not reducible to computation, and has relevance to the language myth through its indirect endorsement of a non-computable, non-mechanical semantics. These themes are supported by the underlying idea of technology as a human construction. Methodologically, it is rooted in the humanities and conducts critical thinking through exegesis and empirically tested theoretical development.

References

Feist. (1991). Feist Publications, Inc. v. Rural Tel. Service Co., Inc. 499 U.S. 340.

Warner, J. (2005). Labor in information systems. Annual Review of Information Science and Technology, 39, 551-573.

Warner, J. (2010). Human Information Retrieval (History and Foundations of Information Science Series). Cambridge, MA: MIT Press.

Warner, J. (2013). Creativity for Feist. Journal of the American Society for Information Science and Technology, 64(6), 1173-1192.

Relevance: 10.00%

Abstract:

Background
When asked to solve mathematical problems, some people experience anxiety and threat, which can lead to impaired mathematical performance (Curr Dir Psychol Sci 11:181–185, 2002). The present studies investigated the link between mathematical anxiety and performance on the cognitive reflection test (CRT; J Econ Perspect 19:25–42, 2005). The CRT is a measure of a person’s ability to resist intuitive response tendencies, and it correlates strongly with important real-life outcomes, such as time preferences, risk-taking, and rational thinking.

Methods
In Experiments 1 and 2 the relationships between maths anxiety, mathematical knowledge/mathematical achievement, test anxiety and cognitive reflection were analysed using mediation analyses. Experiment 3 included a manipulation of working memory load. The effects of anxiety and working memory load were analysed using ANOVAs.

Results
Our experiments with university students (Experiments 1 and 3) and secondary school students (Experiment 2) demonstrated that mathematical anxiety was a significant predictor of cognitive reflection, even after controlling for the effects of general mathematical knowledge (in Experiment 1), school mathematical achievement (in Experiment 2) and test anxiety (in Experiments 1–3). Furthermore, Experiment 3 showed that mathematical anxiety and burdening working memory resources with a secondary task had similar effects on cognitive reflection.

Conclusions
Given earlier findings that showed a close link between cognitive reflection, unbiased decisions and rationality, our results suggest that mathematical anxiety might be negatively related to individuals’ ability to make advantageous choices and good decisions.

Relevance: 10.00%

Abstract:

Let C be a bounded cochain complex of finitely generated free modules over the Laurent polynomial ring L = R[x, x−1, y, y−1]. The complex C is called R-finitely dominated if it is homotopy equivalent over R to a bounded complex of finitely generated projective R-modules. Our main result characterises R-finitely dominated complexes in terms of Novikov cohomology: C is R-finitely dominated if and only if eight complexes derived from C are acyclic; these complexes are C ⊗L R[[x, y]][(xy)−1] and C ⊗L R[x, x−1][[y]][y−1], and their variants obtained by swapping x and y, and replacing either indeterminate by its inverse.

Relevance: 10.00%

Abstract:

Radio-frequency (RF) impairments in the transceiver hardware of communication systems (e.g., phase noise (PN), high power amplifier (HPA) nonlinearities, or in-phase/quadrature-phase (I/Q) imbalance) can severely degrade the performance of traditional multiple-input multiple-output (MIMO) systems. Although calibration algorithms can partially compensate for these impairments, the remaining distortion still has a substantial impact. Despite this, most prior works have not analyzed this type of distortion. In this paper, we investigate the impact of residual transceiver hardware impairments on MIMO system performance. In particular, we consider a transceiver impairment model that has been experimentally validated, and derive analytical ergodic capacity expressions for both exact and high signal-to-noise ratios (SNRs). We demonstrate that the capacity saturates in the high-SNR regime, thereby creating a finite capacity ceiling. We also present a linear approximation for the ergodic capacity in the low-SNR regime, and show that impairments have only a second-order impact on the capacity. Furthermore, we analyze the effect of transceiver impairments on large-scale MIMO systems; interestingly, we prove that if one increases the number of antennas at one side only, the capacity behaves similarly to the finite-dimensional case. In contrast, if the number of antennas on both sides increases with a fixed ratio, the capacity ceiling vanishes; thus, impairments cause only a bounded offset in the capacity compared with the ideal transceiver hardware case.
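
A simplified single-antenna illustration of the saturation effect described above, assuming the common model in which residual hardware distortion acts as extra noise whose power is a fixed fraction κ² of the signal power; the parameter κ and the values below are assumptions for illustration, not the paper's MIMO expressions.

```python
# Illustrative capacity ceiling under residual transceiver impairments:
# with distortion power kappa^2 * signal power, the effective SINR saturates
# at 1/kappa^2, so capacity flattens at log2(1 + 1/kappa^2) as the SNR grows.
import numpy as np

kappa = 0.1                                    # assumed aggregate distortion level
snr_db = np.arange(-10, 61, 10)
snr = 10.0 ** (snr_db / 10.0)

capacity = np.log2(1.0 + snr / (kappa**2 * snr + 1.0))   # bits/s/Hz
ceiling = np.log2(1.0 + 1.0 / kappa**2)

for s, c in zip(snr_db, capacity):
    print(f"SNR {int(s):3d} dB: {c:5.2f} bits/s/Hz (ceiling {ceiling:.2f})")
```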

Relevance: 10.00%

Abstract:

We present a fully distributed self-healing algorithm, DEX, that maintains a constant-degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders - whose expansion properties hold deterministically - that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion or deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table (DHT) on top of DEX, with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.

Relevance: 10.00%

Abstract:

In contingent valuation, the willingness to pay for hypothetical programs may be affected by the order in which the programs are presented to respondents. With inclusive lists, economic theory suggests that sequence effects should be expected. However, when policy makers allocate public budgets to several environmental programs, they may be interested in assessing the value of the programs without the valuations being affected by the order in which the programs are presented. Using single-bounded dichotomous choice contingent valuation questions, we show that if respondents have the opportunity to revise their willingness-to-pay answers, sequence effects are mitigated. (JEL Q51, Q54)
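
A minimal sketch of how mean willingness to pay is typically recovered from single-bounded dichotomous-choice responses, assuming a linear logit model in the bid; the simulated data, bid design and parameter values are illustrative assumptions, not the study's.

```python
# Single-bounded dichotomous choice: each respondent sees one bid and answers
# yes/no. Under the standard linear logit specification, mean/median WTP is
# -(intercept)/(bid coefficient). All numbers here are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
bids = rng.choice([5.0, 10.0, 20.0, 40.0], size=n)      # assumed bid design
true_wtp = rng.normal(loc=18.0, scale=8.0, size=n)       # latent WTP (simulated)
yes = (true_wtp >= bids).astype(int)                     # accept if WTP >= bid

X = sm.add_constant(bids)
fit = sm.Logit(yes, X).fit(disp=False)
alpha, beta = fit.params                                  # intercept, bid coefficient
print("estimated mean WTP:", -alpha / beta)               # close to the simulated 18
```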

Relevance: 10.00%

Abstract:

We undertake a detailed study of the sets of multiplicity in a second countable locally compact group G and their operator versions. We establish a symbolic calculus for normal completely bounded maps from the space B(L²(G)) of bounded linear operators on L²(G) into the von Neumann algebra VN(G) of G, and use it to show that a closed subset E ⊆ G is a set of multiplicity if and only if the set E* = {(s, t) ∈ G × G : ts⁻¹ ∈ E} is a set of operator multiplicity. Analogous results are established for M₁-sets and M₀-sets. We show that the property of being a set of multiplicity is preserved under various operations, including taking direct products, and establish an Inverse Image Theorem for such sets. We characterise the sets of finite width that are also sets of operator multiplicity, and show that every compact operator supported on a set of finite width can be approximated by sums of rank one operators supported on the same set. We show that, if G satisfies a mild approximation condition, pointwise multiplication by a given measurable function ψ : G → ℂ defines a closable multiplier on the reduced C*-algebra C*ᵣ(G) of G if and only if Schur multiplication by the function N(ψ) : G × G → ℂ, given by N(ψ)(s, t) = ψ(ts⁻¹), is a closable operator when viewed as a densely defined linear map on the space of compact operators on L²(G). Similar results are obtained for multipliers on VN(G).