965 results for Take Two Interactive, business strategy, video games,


Relevance:

30.00%

Publisher:

Abstract:

The aims of the present study were to test the association between insecure attachment and basal cortisol and catecholamine levels in a sample of obese children. The roles of familial vulnerability and gender were also investigated. Methods: Cortisol and catecholamine levels of 8- to 13-year-old obese children were measured. Self-report questionnaires were used to assess attachment pattern and current anxiety and depression, and parent-report questionnaires were used to assess attachment, current anxiety and depression, and familial vulnerability. Linear regression analyses were performed separately for individuals who scored low versus high on parental internalizing problems, and for boys and girls. Results: In the group with high parental internalizing problems, insecure attachment was significantly associated with reduced basal levels of cortisol in boys (p=0.007, b=-0.861, R2=73.0%). In the group with low parental internalizing problems, the association between insecure attachment and cortisol was not significant in either boys or girls; it was negative in boys (p=0.075, b=-0.606, R2=36.7%) and positive in girls (p=0.677, b=0.176, R2=3.1%). Conclusions: Apparently, physiological risk factors for psychopathology in obesity are more evident in individuals with high familial vulnerability. In addition, patterns of physiological risk for psychopathology in obesity differ between boys and girls. It is therefore important to take familial vulnerability and gender into account when investigating physiological risk factors for psychopathology in obesity. Insecure attachment in childhood may be a risk factor for obesity. Interventions to increase children's attachment security should examine the effects on children's weight.
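The subgroup regression reported above can be sketched as follows. The data here are fabricated for illustration, and `standardized_slope` is a hypothetical helper; the study's actual covariates and software are not specified:

```python
import numpy as np

def standardized_slope(x, y):
    """OLS slope with both variables standardized (comparable to a reported beta).

    With standardized x and y this equals Pearson's r."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.polyfit(x, y, 1)[0])  # polyfit returns [slope, intercept]

def r_squared(x, y):
    """Proportion of variance explained in a simple regression."""
    r = np.corrcoef(x, y)[0, 1]
    return float(r ** 2)

# Illustrative (fabricated) data: insecure-attachment score vs. basal cortisol,
# built with a negative association as in the boys/high-vulnerability subgroup
rng = np.random.default_rng(0)
attachment = rng.uniform(0, 10, 30)
cortisol = 5.0 - 0.8 * attachment + rng.normal(0, 1.0, 30)

b = standardized_slope(attachment, cortisol)
print(b, r_squared(attachment, cortisol))
```

In practice such analyses would be run once per subgroup (boys vs. girls, low vs. high parental internalizing problems), exactly as the study describes.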

Relevance:

30.00%

Publisher:

Abstract:

Following the Introduction, which surveys the existing literature on technological advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures, and find new solutions at the level of economic regulation that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare compared to fixed access pricing or a regulatory-holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks.
Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare by regulating the volume of advertising on TV might also need to regulate advertising quality or, if regulating quality proves impractical, take the effect of advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.

Relevance:

30.00%

Publisher:

Abstract:

Climate change is a crisis that is going to affect all of our lives in the future. Ireland is expected to have increased storms and rain throughout the country, which will affect our lives greatly unless we do something to change it. In an attempt to reduce the impacts of climate change, countries across the world met to address the problem; the resulting agreement became known as the Kyoto Protocol. The Kyoto Protocol set out objectives for each developed country with regard to returning carbon emissions to 1990 levels. Because the Irish economy was at a low point in 1990, Ireland was given a target of limiting carbon emissions to 13% above 1990 levels. In order to meet these targets Ireland produced two energy papers, the green paper and the white paper. The green paper identified the drivers for energy management and control: security of energy supply, economic competitiveness and environmental protection. The white paper set out targets we should aim to achieve in order to address the green paper's drivers. Among the targets was the plan to reduce energy consumption in the public sector by 33% by 2020 through energy conservation measures. Schools are part of the public sector and therefore share these targets; to help achieve them, the government has developed initiatives for schools. Energy audits should be performed in order to identify areas where schools can improve their current practices and where they can invest in the future to save money and reduce the school's overall environmental footprint. Grants are available to schools for insulation through the energy efficiency scheme and for renewable energy technologies through the ReHeat scheme. The promotion of energy-efficiency programmes in schools can also help students develop an understanding of energy conservation.
The Display Energy Certificate is a legal document that can be used to understand how each school is performing from an energy perspective. It can help schools understand why they need to change their current energy management structure; by improving their energy management, schools then improve their performance on the Display Energy Certificate. Schools should use these tools wisely and take advantage of the available grants, which can help them save money and reduce their carbon footprint in the short to long term.

Relevance:

30.00%

Publisher:

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games considered. 
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
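The backward-induction procedure for which Chapters 1 and 3 derive sufficient epistemic conditions can be sketched directly on a perfect-information game tree. The tree below is an arbitrary two-stage example, not one from the thesis:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    player: object = None              # None marks a terminal node
    payoffs: tuple = ()                # payoff vector at terminal nodes
    children: dict = field(default_factory=dict)  # action -> Node

def backward_induction(node):
    """Return (payoff vector, chosen action path) by solving the tree
    from the leaves upward, as backward induction prescribes."""
    if node.player is None:
        return node.payoffs, []
    best_action, best = None, None
    for action, child in node.children.items():
        payoffs, path = backward_induction(child)
        if best is None or payoffs[node.player] > best[0][node.player]:
            best_action, best = action, (payoffs, path)
    return best[0], [best_action] + best[1]

# Two-stage example: player 0 moves first, then player 1 responds.
tree = Node(player=0, children={
    "L": Node(player=1, children={
        "l": Node(payoffs=(2, 1)),
        "r": Node(payoffs=(0, 0)),
    }),
    "R": Node(player=1, children={
        "l": Node(payoffs=(3, 0)),
        "r": Node(payoffs=(1, 2)),
    }),
})

payoffs, path = backward_induction(tree)
print(payoffs, path)  # → (2, 1) ['L', 'l']
```

Player 1 would answer "R" with "r" (payoff 2 over 0), so player 0 anticipates payoff 1 from "R" and instead plays "L", where player 1 answers "l".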

Relevance:

30.00%

Publisher:

Abstract:

We present a novel hybrid (or multiphysics) algorithm, which couples pore-scale and Darcy descriptions of two-phase flow in porous media. The flow at the pore scale is described by the Navier-Stokes equations, and the Volume of Fluid (VOF) method is used to model the evolution of the fluid-fluid interface. An extension of the Multiscale Finite Volume (MsFV) method is employed to construct the Darcy-scale problem. First, a set of local interpolators for pressure and velocity is constructed by solving the Navier-Stokes equations; then, a coarse mass-conservation problem is constructed by averaging the pore-scale velocity over the cells of a coarse grid, which act as control volumes; finally, a conservative pore-scale velocity field is reconstructed and used to advect the fluid-fluid interface. The method relies on the localization assumptions used to compute the interpolators (which are quite straightforward extensions of the standard MsFV) and on the postulate that the coarse-scale fluxes are proportional to the coarse-pressure differences. By numerical simulations of two-phase problems, we demonstrate that these assumptions provide hybrid solutions that are in good agreement with reference pore-scale solutions and are able to model the transition from stable to unstable flow regimes. Our hybrid method can naturally take advantage of several adaptive strategies and allows considering pore-scale fluxes only in some regions, while Darcy fluxes are used in the rest of the domain. Moreover, since the method relies on the assumption that the relationship between coarse-scale fluxes and pressure differences is local, it can be used as a numerical tool to investigate the limits of validity of Darcy's law and to understand the link between pore-scale quantities and their corresponding Darcy-scale variables.
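The coarse averaging step, in which the pore-scale velocity is averaged over the cells of a coarse grid acting as control volumes, can be sketched as follows. The grid sizes and the velocity field itself are illustrative, not taken from the paper:

```python
import numpy as np

def coarse_average(fine, block):
    """Average a 2-D fine-scale field over non-overlapping coarse cells
    of size block x block (each coarse cell acts as a control volume)."""
    nx, ny = fine.shape
    assert nx % block == 0 and ny % block == 0
    return fine.reshape(nx // block, block, ny // block, block).mean(axis=(1, 3))

# Hypothetical pore-scale x-velocity on an 8x8 fine grid, coarsened to 2x2
fine_u = np.arange(64, dtype=float).reshape(8, 8)
coarse_u = coarse_average(fine_u, 4)
print(coarse_u)
```

Note that averaging preserves the field's global mean, which is the discrete analogue of the mass-conservation property the coarse problem is built on.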

Relevance:

30.00%

Publisher:

Abstract:

Reliable quantification of the macromolecule signals in short echo-time H-1 MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile), owing to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches to quantification which take the contribution of macromolecules into account in the quantification step. H-1 spectra were acquired from five rats on a 14.1 T/26 cm horizontal scanner using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters, and either the spectrum of macromolecules measured in vivo using an inversion-recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all features of the in vivo macromolecule spectrum at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.
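The spline-baseline idea can be illustrated schematically. The "spectrum" below is fabricated, and a low-order polynomial stands in for LCModel's built-in spline: a smooth function of this kind follows broad macromolecule-like components but cannot follow narrow metabolite peaks, which is exactly why it only approximates the true in vivo macromolecule contribution:

```python
import numpy as np

rng = np.random.default_rng(1)
ppm = np.linspace(0.5, 4.5, 400)

# Fabricated spectrum: a broad macromolecule-like hump plus two narrow "metabolite" peaks
broad = 0.5 * np.exp(-((ppm - 1.5) ** 2) / 2.0)
narrow = np.exp(-((ppm - 2.0) ** 2) / 0.001) + np.exp(-((ppm - 3.0) ** 2) / 0.001)
spectrum = broad + narrow + rng.normal(0.0, 0.01, ppm.size)

# A smooth low-order fit tracks the broad component but not the narrow peaks,
# so the peaks stand out clearly in the residual
baseline_fit = np.polyval(np.polyfit(ppm, spectrum, 5), ppm)
residual = spectrum - baseline_fit
print(residual.max())
```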

Relevance:

30.00%

Publisher:

Abstract:

Autophagy is a cellular mechanism for degrading proteins and organelles. It was first described as a physiological process essential for maintaining homeostasis and cell survival, but understanding its role in conditions of stress has been complicated by the recognition of a new type of cell death ("type 2") characterized by deleterious autophagic activity. This paradox is important in the central nervous system where the activation of autophagy seems to be protective in certain neurodegenerative diseases but deleterious in cerebral ischemia. The development of new therapeutic strategies based on the manipulation of autophagy will need to take into account these opposing roles of autophagy.

Relevance:

30.00%

Publisher:

Abstract:

Most integrodifference models of biological invasions are based on the nonoverlapping-generations approximation. However, the effect of multiple reproduction events (overlapping generations) on the front speed can be very important, especially for species with a long life span. This approximation has previously been relaxed only in one-dimensional space, although almost all biological invasions take place in two dimensions. Here we present a model that takes into account the overlapping-generations effect (or, more generally, the stage structure of the population), and we analyze the main differences with the corresponding nonoverlapping-generations results.
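As a baseline for comparison, the front speed of the classical one-dimensional integrodifference model with non-overlapping generations and a Gaussian dispersal kernel follows the standard formula c = min_s (1/s) ln[R0 M(s)], where M(s) is the kernel's moment-generating function. The numerical minimization below is a sketch with arbitrary parameter values, not the paper's stage-structured model:

```python
import numpy as np

def front_speed_nonoverlapping(R0, sigma):
    """Front speed c = min_s (1/s) ln(R0 * M(s)) for a 1-D integrodifference
    model with net reproductive rate R0 and a Gaussian dispersal kernel,
    whose moment-generating function is M(s) = exp(sigma^2 s^2 / 2)."""
    s = np.linspace(0.01, 10.0, 10000)          # grid over the shape parameter s
    c = (np.log(R0) + 0.5 * sigma**2 * s**2) / s
    return float(c.min())

# For R0 = e and sigma = 1 the analytic minimum is sigma * sqrt(2 ln R0) = sqrt(2)
print(front_speed_nonoverlapping(np.e, 1.0))  # ≈ 1.4142
```

Overlapping generations add further reproduction terms to the growth operator, which is what the two-dimensional model in the paper accounts for.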

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a new approach for tonic identification in Indian art music and present a proposal for a complete iterative system for the same. Our method splits the task of tonic pitch identification into two stages. In the first stage, which is applicable to both vocal and instrumental music, we perform a multi-pitch analysis of the audio signal to identify the tonic pitch class. Multi-pitch analysis allows us to take advantage of the drone sound, which constantly reinforces the tonic. The second stage, needed only for vocal performances, estimates the octave in which the singer's tonic lies: we analyse the predominant melody sung by the lead performer in order to establish the tonic octave. Both stages are individually evaluated on a sizable music collection and are shown to obtain good accuracy. We also discuss the types of errors made by the method. Further, we present a proposal for a system that aims to incrementally utilize all the available data, both audio and metadata, in order to identify the tonic pitch. It produces a tonic estimate and a confidence value, and is iterative in nature. At each iteration, more data is fed into the system until the confidence value for the identified tonic is above a defined threshold. Rather than obtain high overall accuracy for our complete database, ultimately our goal is to develop a system which obtains very high accuracy on a subset of the database with maximum confidence.
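The first-stage idea, pooling pitch estimates into a pitch-class histogram in which the ever-present drone makes the tonic dominate, can be sketched as follows. The frame values, reference frequency, and bin resolution are illustrative; the paper's actual multi-pitch front end is not reproduced here:

```python
import numpy as np

def tonic_pitch_class(f0_hz, ref_hz=55.0, bins_per_octave=120):
    """Fold a pitch track into one octave and return the most frequent
    pitch class, expressed in cents above the reference frequency."""
    f0 = np.asarray(f0_hz, dtype=float)
    f0 = f0[f0 > 0]                                   # drop unvoiced frames
    cents = 1200.0 * np.log2(f0 / ref_hz) % 1200.0    # fold into one octave
    width = 1200.0 / bins_per_octave
    hist, _ = np.histogram(cents, bins=bins_per_octave, range=(0.0, 1200.0))
    return (int(hist.argmax()) + 0.5) * width         # bin centre, in cents

# A drone reinforcing one tone: many frames at 110 Hz (an octave above the
# reference) plus a few melodic outliers and unvoiced (zero) frames
frames = [110.0] * 50 + [164.8] * 10 + [0.0] * 5
print(tonic_pitch_class(frames))  # → 5.0 (centre of the bin containing 0 cents)
```

Folding by octave is what makes the first stage octave-agnostic, which is why a second stage is needed to place the vocal tonic in the correct octave.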

Relevance:

30.00%

Publisher:

Abstract:

Observing infants in triadic situations has revealed their triangular competence; namely, their ability to interact with both parents by simultaneously sharing their attention and affects with them. Infants' triangular interaction is linked with the coparenting unit's degree of coordination: in high-coordination (HC) families, parents act as a team in relation to the child, thus drawing clear and flexible boundaries with them; in low-coordination (LC) families, parents either avoid direct interaction with each other and include the child in their unit, or join together against the child and exclude him or her, thus drawing inconsistent boundaries with the child. We explored the interactive strategies of LC 9-month-olds (n = 15) and their parents, comparing them with HC families (n = 23) in two conditions: playing with both parents at the same time and witnessing their parents' dialogue. LC infants' affects were less positive; they addressed fewer positive triangular bids to their parents and tended to use a less triangular interactive mode. Thus, LC infants had fewer opportunities than HC infants to acquire the skills necessary for coping with triangular interaction.

Relevance:

30.00%

Publisher:

Abstract:

This research investigates the phenomenon of translationese in two monolingual comparable corpora of original and translated Catalan texts. Translationese has been defined as the dialect, sub-language or code of translated language. This study aims at giving empirical evidence of translation universals regardless of the source language. Traditionally, research on translation strategies has been mainly intuition-based. Computational Linguistics and Natural Language Processing techniques provide reliable information on lexical frequencies and on morphological and syntactic distributions in corpora, and have therefore been applied to observe which translation strategies occur in these corpora. The results seem to support the simplification, interference and explicitation hypotheses, whereas no sign of normalization has been detected with the methodology used. The data collected and the resources created for identifying lexical, morphological and syntactic patterns of translations can be useful for Translation Studies teachers, scholars and students: teachers will have more tools to help students avoid reproducing translationese patterns. The resources developed will help in detecting non-genuine or inadequate structures in the target language, which may imply an improvement in the stylistic quality of translations. Translation professionals can also take advantage of these resources to improve their translation quality.
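Corpus measures of the kind used to test the simplification hypothesis can be sketched with two crude indicators, type/token ratio and mean sentence length; simplified (translated) text tends to repeat vocabulary and so shows a lower type/token ratio. The two toy "corpora" below are invented for illustration, and the tokenization is deliberately minimal:

```python
import re

def simplification_metrics(text):
    """Crude simplification indicators: type/token ratio and mean sentence length."""
    tokens = re.findall(r"[\w']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    ttr = len(set(tokens)) / len(tokens)          # lexical variety
    msl = len(tokens) / len(sentences)            # average sentence length
    return ttr, msl

original = "The old sailor watched the restless harbour. Gulls wheeled; ropes creaked."
translated = "The sailor watched the harbour. The sailor watched the birds."

ttr_o, _ = simplification_metrics(original)
ttr_t, _ = simplification_metrics(translated)
print(ttr_o, ttr_t)  # the repetitive "translated" text has the lower ratio
```

Real studies of this kind control for text length, since type/token ratio falls as texts grow; this sketch ignores that.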

Relevance:

30.00%

Publisher:

Abstract:

Abstract The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends them to do the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work was based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs. 
Among other things, they usually display broad degree distributions and show small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice, nor by a random graph. Furthermore, it is a known fact that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and ideas or information get transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks could explain why one finds higher levels of cooperation in populations of human beings or animals than what is prescribed by classical game theory. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing how it differs from biological or technological networks using diverse statistical measures. Furthermore, I extract and describe its community structure, taking into account the intensity of a collaboration. Finally, I investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting also an effective view of it as opposed to a historical one.
Thereafter, I combine evolutionary game theory with several network models, along with the studied coauthorship network, in order to highlight which specific network properties foster cooperation and to shed some light on the various mechanisms responsible for maintaining that cooperation. I point out that, to resist defection, cooperators take advantage, whenever possible, of the degree heterogeneity of social networks and their underlying community structure. Finally, I show that the level and stability of cooperation depend not only on the game played, but also on the evolutionary dynamic rules used and on how individual payoffs are calculated.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a pilot project to reinforce participatory practices in standardization. The INTERNORM project is funded by the University of Lausanne, Switzerland. It aims to create an interactive knowledge center based on the sharing of academic skills and the experience accumulated by civil society, especially consumer associations, environmental associations and trade unions, to strengthen the participatory process of standardization. The first objective of the project is action-oriented: INTERNORM provides a common knowledge pool supporting the participation of civil society actors in international standard-setting activities by bringing them together with academic experts in working groups and by providing logistic and financial support for their participation in meetings of national and international technical committees. The second objective of the project is analytical: the standardization action initiated through INTERNORM provides a research field for a better understanding of the participatory dynamics underpinning international standardization. The paper presents three incentives that explain civil society (non-)involvement in standardization and that go beyond conventional resource-based hypotheses: an operational incentive, related to the use of standards in the selective goods that associations provide to their membership; a thematic incentive, provided by the priority-setting of strategic committees created in some standardization organizations; and a rhetorical incentive, related to the discursive resource that civil society concerns offer to the different stakeholders.

Relevância:

30.00%

Publicador:

Resumo:

"Beauty-contest" is a game in which participants have to choose, typically, a number in [0,100], the winner being the person whose number is closest to a proportion of the average of all chosen numbers. We describe and analyze Beauty-contest experiments run in newspapers in UK, Spain, and Germany and find stable patterns of behavior across them, despite the uncontrollability of these experiments. These results are then compared with lab experiments involving undergraduates and game theorists as subjects, in what must be one of the largest empirical corroborations of interactive behavior ever tried. We claim that all observed behavior, across a wide variety of treatments and subject pools, can be interpretedas iterative reasoning. Level-1 reasoning, Level-2 reasoning and Level-3 reasoning are commonly observed in all the samples, while the equilibrium choice (Level-Maximum reasoning) is only prominently chosen by newspaper readers and theorists. The results show the empirical power of experiments run with large subject-pools, and open the door for more experimental work performed on the rich platform offered by newspapers and magazines.

Relevância:

30.00%

Publicador:

Resumo:

The evolution of society, large cities, industrialization and many other factors have changed people's lifestyles, inevitably accentuating sedentary habits and the avoidance of physical exercise. Sport and/or physical exercise, practiced under the conditions recommended by professionals, is beneficial for improving or maintaining one's health, since it induces beneficial changes in the metabolism, the cardiovascular system and the locomotor system. Unfortunately, the habit of exercising is not common to everyone, whether because of lifestyles that create scheduling conflicts with work, children and family, or out of laziness or reluctance to devote free time to the care of body and health. The negative health effects of a sedentary lifestyle are considerable, so systems are needed to increase the population's interest in sport and physical activity. This project was born from the idea of combining a method for increasing people's interest in physical exercise with the technological advances in web and multimedia development made over the last decade. Broadly speaking, the general idea of the project is based on the case of a real, active gym that needs a web portal serving both as an informative website and as a tool for managing the center's class schedule, providing clients with certain functionalities while introducing a new way of doing guided physical exercise: doing it from home.
To develop the computer system that carries this out, after researching, analyzing and choosing the appropriate tools, the web environment was built with the HTML and PHP languages in combination with CSS style sheets. Notepad++ was used as the development environment and WAMP Server as the test environment. Finally, the multimedia content (videos of the activity sessions) is streamed with Flash Media Interactive Server in combination with Flash Media Live Encoder, which encodes the content. The end user, from anywhere on the planet, can follow the guided classes held at the center live and in real time, provided they have the necessary time and equipment and an Internet connection. A virtual shop was also developed where anyone can buy, among other items related to physical exercise, all the equipment needed for any of the activities taught at the gym, and have it conveniently delivered at home. The aim is to take advantage of adverse economic circumstances to generate a new way of attracting clients by offering them an affordable, different, new and original alternative way of going to the gym. Times of crisis, times of opportunity. That is the moral this project seeks to convey.