78 results for One-shot information theory

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

In this paper, we argue that important labor market phenomena can be better understood if one takes (a) the inherent incompleteness and relational nature of most employment contracts and (b) the existence of reference-dependent fairness concerns among a substantial share of the population into account. Theory shows and experiments confirm that, even if fairness concerns were to exert only weak effects in one-shot interactions, repeated interactions greatly magnify the relevance of such concerns on economic outcomes. We also review evidence from laboratory and field experiments examining the role of wages and fairness on effort, derive predictions from our approach for entry-level wages and incumbent workers' wages, confront these predictions with the evidence, and show that reference-dependent fairness concerns may have important consequences for the effects of economic policies such as minimum wage laws.

Relevance:

100.00%

Publisher:

Abstract:

Plants such as Arabidopsis thaliana respond to foliar shade and to neighbors that may become competitors for light by elongation growth, securing access to unfiltered sunlight. The challenges faced during this shade avoidance response (SAR) differ under a light-absorbing canopy and during neighbor detection, where light remains abundant. In both situations, elongation growth depends on auxin and on transcription factors of the phytochrome interacting factor (PIF) class. Using a computational modeling approach to study the SAR regulatory network, we identify and experimentally validate a previously unidentified role for long hypocotyl in far red 1 (HFR1), a negative regulator of the PIFs. Moreover, we find that during neighbor detection, growth is promoted primarily by the production of auxin. In contrast, in true shade, the system operates with less auxin but with an increased sensitivity to the hormonal signal. Our data suggest that this latter signal is less robust, which may reflect a cost-to-robustness tradeoff, a system trait long recognized by engineers and forming the basis of information theory.

Relevance:

40.00%

Publisher:

Abstract:

Human cooperation is often based on reputation gained from previous interactions with third parties. Such reputation can be built on generous or punitive actions, and both one's own reputation and the reputation of others have been shown to influence decision making in experimental games that control for confounding variables. Here we test how reputation-based cooperation and punishment react to disruption of cognitive processing in different kinds of helping games with observers. Saying a few superfluous words before each interaction was used as a possible interference with working memory. In a first set of experiments, where reputation could only be based on generosity, the disruption reduced the frequency of cooperation and lowered mean final payoffs. In a second set of experiments, where reputation could only be based on punishment, the disruption increased the frequency of antisocial punishment (i.e. of punishing those who helped) and reduced the frequency of punishing defectors. Our findings suggest that working memory can easily become a constraint in reputation-based interactions within experimental games, even if these games are based on a few simple rules, with a visual display that provides all the information the subjects need to play the strategies predicted from current theory. Our findings also highlight a weakness of experimental games, namely that they can be very sensitive to environmental variation, and that quantitative conclusions about antisocial punishment or other behavioral strategies can easily be misleading.

Relevance:

30.00%

Publisher:

Abstract:

Drawing on social representations theory (SRT), this thesis examines the circulation and integration of scientific information into everyday thinking. As an alternative to traditional approaches to science communication, it considers the transformations between scientific and common-sense discourses as adaptive. Two studies on the spreading of information in the media (Studies 1 and 2) show variations in the themes of the discourses introduced to laypersons, and in the themes of laypersons' own discourses, according to different sources.
Anchoring in prior positioning toward science is then studied for the explanation it provides of the reception and transmission of scientific information into common sense. Anchoring effects in prior attitudes and beliefs are reported in different contexts of circulation of scientific information (Studies 3 to 7), using results from correlational, field, and experimental studies. Overall, this thesis provides arguments for the relevance of SRT to science communication research and suggests theoretical and methodological developments for both domains of research.

Relevance:

30.00%

Publisher:

Abstract:

One of the key problems in conducting surveys is convincing people to participate. However, it is often difficult or impossible to determine why people refuse. Panel surveys provide information from previous waves that can offer valuable clues as to why people refuse to participate. If we are able to anticipate the reasons for refusal, then we may be able to take appropriate measures to encourage potential respondents to participate in the survey. For example, special training could be provided for interviewers on how to convince potential participants to participate. This study examines different influences, as determined from the previous wave, on refusal reasons that were given by the respondents in the subsequent wave of the telephone Swiss Household Panel. These influences include socio-demography, social inclusion, answer quality, and interviewer assessment of question understanding and of future participation. Generally, coefficients are similar across reasons, and between-respondents effects rather than within-respondents effects are significant. While 'No interest' reasons are easier to predict, the other reasons are more situational. Survey-specific issues are able to distinguish different reasons to some extent.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: In order to evaluate the one-year evolution of web-based information on alcohol dependence, we re-assessed alcohol-related sites in July 2007 with the same evaluating tool that had been used to assess these sites in June 2006. Websites were assessed with a standardized form designed to rate sites on the basis of accountability, presentation, interactivity, readability, and content quality. The DISCERN scale, which aims to assist persons without content expertise in assessing the quality of written health publications, was also used. Scores were highly stable for all components of the form one year later (r = .77 to .95, p < .01). Analysis of variance for repeated measures showed no time effect, no interaction between time and scale, no interaction between time and group (affiliation categories), and no interaction between time, group, and scale. The study highlights the lack of change of alcohol-dependence-related web pages across one year.
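The stability coefficients reported above (r = .77 to .95) are test-retest correlations between the 2006 and 2007 scores. As a minimal sketch of how such a coefficient is computed (the score lists below are illustrative, not the study's data):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    score lists (e.g. site ratings from two assessment waves)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Illustrative (hypothetical) content-quality scores for six sites:
scores_2006 = [12, 18, 9, 15, 20, 11]
scores_2007 = [13, 17, 10, 14, 21, 10]
r = pearson_r(scores_2006, scores_2007)  # high positive r: stable ratings
```

A coefficient near 1 indicates that sites kept their relative ranking from one wave to the next, which is what the study reports.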

Relevance:

30.00%

Publisher:

Abstract:

Detection and discrimination of visuospatial input involve at least the extraction, selection, and encoding of relevant information, and decision-making processes that allow a response to be selected. These two operations are altered, respectively, by attentional mechanisms that change discrimination capacities, and by beliefs concerning the likelihood of uncertain events. Information processing is tuned by the attentional level, which acts like a filter on perception, while decision-making processes are weighted by the subjective probability of risk. In addition, it has been shown that anxiety can affect the detection of unexpected events through modification of the level of arousal. Consequently, the purpose of this study is to determine whether and how decision-making and brain dynamics are affected by anxiety. To investigate these questions, the performance of women with either a high (n = 12) or a low (n = 12) STAI-T score (State-Trait Anxiety Inventory, Spielberger, 1983) was examined in a visuospatial decision-making task in which subjects had to distinguish a target visual pattern from non-target patterns. The target pattern was a schematic image of furniture arranged in such a way as to give the impression of a living room. Non-target patterns were created by either compressing or dilating the distances between objects. Target and non-target patterns were always presented in the same configuration. Preliminary behavioral results show no group difference in reaction time. In addition, visuospatial abilities were analyzed through signal detection theory, which quantifies perceptual decisions made in the presence of uncertainty (Green and Swets, 1966). This theory treats the detection of a stimulus as a decision-making process determined by the nature of the stimulus and by cognitive factors. Surprisingly, no difference was observed in the d' index (the distance between the means of the signal and noise distributions) or the c index (related to the likelihood ratio).
Comparison of event-related potentials (ERPs) reveals that brain dynamics differ according to anxiety: component latencies differ, with a delay in anxious subjects over posterior electrode sites. However, these differences are compensated during later components by shorter latencies in anxious subjects compared with non-anxious ones. These inverted effects seem to indicate that the absence of a difference in reaction time relies on a compensation of the attentional level that tunes cortical activation in anxious subjects, but that they must work harder to maintain performance.
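The d' and c indices from signal detection theory mentioned above can be computed from the hit and false-alarm counts of a yes/no detection task. A minimal sketch (the counts are illustrative, not the study's data; the log-linear correction is one common convention for keeping rates away from 0 and 1):

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Signal detection indices for a yes/no task: sensitivity d'
    (distance between the means of the noise and signal distributions,
    in standard-deviation units) and criterion c (response bias)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    # Log-linear correction: avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

d, c = sdt_indices(hits=40, misses=10, false_alarms=10, correct_rejections=40)
```

With symmetric hit and false-alarm rates, as here, c is zero (no response bias) while d' is clearly positive, which is the kind of pattern the two anxiety groups failed to differ on.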

Relevance:

30.00%

Publisher:

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games considered. 
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
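Backward induction, for which sufficient epistemic conditions are discussed in Chapters 1 and 3, can be illustrated on a small perfect-information extensive-form game. A minimal sketch (the entry-game payoffs below are illustrative, not taken from the thesis):

```python
def backward_induction(node):
    """Solve a finite perfect-information game tree by backward induction.
    A leaf is a payoff tuple (u0, u1); an internal node is a pair
    (player_to_move, [children]). Returns (payoffs, list of choice indices)."""
    player, rest = node
    if not isinstance(rest, list):  # leaf: node itself is a payoff tuple
        return node, []
    solved = [backward_induction(child) for child in rest]
    # The player to move selects the subtree maximizing her own payoff.
    best = max(range(len(solved)), key=lambda i: solved[i][0][player])
    payoffs, path = solved[best]
    return payoffs, [best] + path

# Entry game: player 0 stays Out (payoffs (1, 3)) or goes In; after In,
# player 1 Fights (0, 0) or Accommodates (2, 2).
game = (0, [(1, 3), (1, [(0, 0), (2, 2)])])
payoffs, path = backward_induction(game)  # payoffs (2, 2), choices [1, 1]
```

Each recursive call solves a subgame first, mirroring the idea that rational players reason from the end of the game backwards before making their initial choice.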

Relevance:

30.00%

Publisher:

Abstract:

Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, many approaches allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information, and the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.
The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
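A probability statement about a population proportion of the kind described above can be obtained from a Beta-Binomial model. A minimal sketch (the uniform prior and the counts are illustrative assumptions, not the paper's worked example):

```python
def posterior_prob_above(k, n, theta0, alpha=1.0, beta=1.0, grid=10_000):
    """P(theta > theta0 | k of n sampled items show the feature), under a
    Beta(alpha, beta) prior: the tail mass of the Beta(alpha + k,
    beta + n - k) posterior, approximated by a midpoint-rule sum."""
    a, b = alpha + k, beta + n - k
    xs = [(i + 0.5) / grid for i in range(grid)]
    dens = [x ** (a - 1) * (1 - x) ** (b - 1) for x in xs]  # unnormalized
    total = sum(dens)
    return sum(d for x, d in zip(xs, dens) if x > theta0) / total

# E.g. all 4 sampled items contain an illegal substance, uniform prior:
p = posterior_prob_above(k=4, n=4, theta0=0.5)  # ≈ 1 - 0.5**5 = 0.96875
```

Such a posterior tail probability can then feed a decision rule, for instance by comparing the expected losses of declaring the seizure proportion above or below the threshold, which is where the decision-theoretic elements discussed in the paper come in.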

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work is to present a multitechnique approach to define the geometry, the kinematics, and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, South French Alps) by combining airborne and terrestrial laser scanning data with ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Because of the high point density at the ground surface (4.1 points m⁻²), the small laser footprint (0.09 m), and the accurate three-dimensional positioning (0.07 m), airborne laser scanning data are well suited for analyzing morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight low-seismic-velocity zones that characterize the presence of dense fracture networks at the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008 to May 2010) allow one to quantify the landslide activity in the direct vicinity of the identified discontinuities. An important subsidence of the crown area, with an average subsidence rate of 3.07 m year⁻¹, is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities.
The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.

Relevance:

30.00%

Publisher:

Abstract:

Access-to-information legislation is now present in over 50 countries worldwide. Lagging behind some of its own cantons, the Swiss federal government was until recently one of the few holdouts in Europe. But in December 2004 the Confederation adopted the 'Loi sur la transparence de l'administration', or Law on Transparency (LTrans), which came into effect in July 2006. This paper presents an overview of the new law and underlines the main institutional challenges to its introduction in Switzerland.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: DNA sequence integrity, mRNA concentrations, and protein-DNA interactions have been subject to genome-wide analyses based on microarrays, with ever increasing efficiency and reliability, over the past fifteen years. Very recently, however, novel technologies for Ultra High-Throughput DNA Sequencing (UHTS) have been harnessed to study these phenomena with unprecedented precision. As a consequence, the extensive bioinformatics environment available for array data management, analysis, interpretation, and publication must be extended to include these novel sequencing data types. DESCRIPTION: MIMAS was originally conceived as a simple, convenient, and local Microarray Information Management and Annotation System focused on GeneChips for expression profiling studies. MIMAS 3.0 enables users to manage data from high-density oligonucleotide SNP chips, expression arrays (both 3'UTR and tiling), promoter arrays, and BeadArrays, as well as UHTS data, using a MIAME-compliant standardized vocabulary. Importantly, researchers can export data in MAGE-TAB format and upload them to the EBI's ArrayExpress certified data repository in a one-step procedure. CONCLUSION: We have vastly extended the capability of the system such that it processes the data output of six types of GeneChips (Affymetrix), two different BeadArrays for mRNA and miRNA (Illumina), and the Genome Analyzer (a popular ultra-high-throughput DNA sequencer, Illumina), without compromising its flexibility and user-friendliness. MIMAS, appropriately renamed the Multiomics Information Management and Annotation System, is currently used by scientists working in approximately 50 academic laboratories and genomics platforms in Switzerland and France. MIMAS 3.0 is freely available via http://multiomics.sourceforge.net/.