945 results for The Impossible Is Possible
Resumo:
We introduce attention games. Alternatives ranked by quality (producers, politicians, sexual partners...) desire to be chosen and compete for the imperfect attention of a chooser by investing in their own salience. We prove that if alternatives can control the attention they get, then "the showiest is the best": the equilibrium ordering of salience (weakly) reproduces the quality ranking and the best alternative is the one that gets picked most often. This result also holds under more general conditions. However, if those conditions fail, then even the worst alternative can be picked most often.
Resumo:
Host cell factor-1 (HCF-1), a transcriptional co-regulator of human cell-cycle progression, undergoes proteolytic maturation in which any of six repeated sequences is cleaved by the nutrient-responsive glycosyltransferase, O-linked N-acetylglucosamine (O-GlcNAc) transferase (OGT). We report that the tetratricopeptide-repeat domain of O-GlcNAc transferase binds the carboxyl-terminal portion of an HCF-1 proteolytic repeat such that the cleavage region lies in the glycosyltransferase active site above uridine diphosphate-GlcNAc. The conformation is similar to that of a glycosylation-competent peptide substrate. Cleavage occurs between cysteine and glutamate residues and results in a pyroglutamate product. Conversion of the cleavage site glutamate into serine converts an HCF-1 proteolytic repeat into a glycosylation substrate. Thus, protein glycosylation and HCF-1 cleavage occur in the same active site.
Resumo:
This paper examines the persistence of under-employment amongst UK higher education graduates. For the cohort of individuals who graduated in 2002/3, micro-data collected by the Higher Education Statistical Agency are used to calculate the rates of "non-graduate job" employment 6 months and 42 months after graduation. A logistic regression analysis suggests that under-employment is not a short-term phenomenon and is systematically related to a set of observable characteristics. It is also found that under-employment 6 months after graduation is associated with under-employment 42 months after graduation, which is consistent with the view that the nature of the first job after graduation is important in terms of occupational attainment later in the life-cycle.
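The kind of logistic regression the abstract describes can be sketched as follows; the covariates and data here are invented for illustration and are not the HESA micro-data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Hypothetical covariates (e.g. degree class, institution type, early under-employment)
X = rng.normal(size=(n, 3))
true_beta = np.array([0.8, -0.5, 1.2])      # simulated effects, chosen arbitrarily
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))  # probability of under-employment at 42 months
y = (rng.random(n) < p).astype(float)

# Fit by gradient ascent on the logistic log-likelihood
beta = np.zeros(3)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(X @ beta)))
    beta += 0.1 * X.T @ (y - pred) / n

print(np.round(beta, 1))  # estimates recover the signs of the simulated effects
```

With enough observations the fitted coefficients recover the simulated signs, which is all the abstract's claim of a "systematic relation" to observables requires.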
Resumo:
As computer chip implementation technologies evolve to obtain more performance, computer chips use smaller components, with a higher density of transistors, working at lower supply voltages. All these factors make computer chips less robust and increase the probability of a transient fault. A transient fault may occur once and never recur in the same way during a computer system's lifetime. A transient fault can have distinct consequences: the operating system might abort execution if the change produced by the fault is detected through misbehaviour of the application, but the biggest risk is that the fault produces an undetected data corruption that silently modifies the application's final result (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of HP and AMD's joint full-system simulation environment, named COTSon. This extension allows the injection of faults that change a single bit in the processor registers and memory of the simulated computer. The developed fault injection system makes it possible to: evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against such faults, and validate fault detection mechanisms and strategies.
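The single-bit-flip injection described above can be illustrated with a minimal sketch. COTSon itself is a full-system simulator; this toy version, with names of our own invention, just inverts one bit of a value standing in for a processor register:

```python
import random

def flip_bit(value: int, bit: int, width: int = 64) -> int:
    """Return `value` with bit number `bit` inverted, emulating a transient fault."""
    return (value ^ (1 << bit)) & ((1 << width) - 1)

# Inject a fault at a random bit position of a simulated register file
registers = {"rax": 0x00FF, "rbx": 0x1234}
target = "rax"
bit = random.randrange(64)
registers[target] = flip_bit(registers[target], bit)
```

Flipping the same bit twice restores the original value, which is convenient when checking whether an application detected the injected fault.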
Resumo:
This paper presents an initial challenge to tackle the ever so "tricky" points encountered when dealing with energy accounting, and thereafter illustrates how such a system of accounting can be used when assessing the metabolic changes in societies. The paper is divided into four main sections. The first three present a general discussion of the main issues encountered when conducting energy analyses. The last section then combines this heuristic approach with its actual formalization, in quantitative terms, for the analysis of possible energy scenarios. Section one covers the broader issue of how to account for the relevant categories used when accounting for Joules of energy, emphasizing the clear distinction between Primary Energy Sources (PES), the physical exploited entities that are used to derive useable energy forms (energy carriers), and Energy Carriers (EC), the actual useful energy that is transmitted for the appropriate end uses within a society. Section two sheds light on the concept of Energy Return on Investment (EROI). Here, it is emphasized that a certain amount of energy carriers must already be available in order to extract/exploit Primary Energy Sources and thereafter generate a net supply of energy carriers. It is pointed out that the current trend of intense energy supply has only been possible due to the great use of, and dependence on, fossil energy. Section three follows up on the discussion of EROI, indicating that a single numeric indicator such as an output/input ratio is not sufficient for assessing the performance of energetic systems. Rather, an integrated approach is called for, one that incorporates (i) how big the net supply of Joules of EC can be, given an amount of extracted PES (the external constraints); (ii) how much EC needs to be invested to extract an amount of PES; and (iii) the power level that it takes for both processes to succeed.
Section four, ultimately, puts the theoretical concepts at play, assessing how the metabolic performance of societies can be accounted for within this analytical framework.
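The EROI bookkeeping of sections two and three can be made concrete with a toy calculation (all figures are invented):

```python
def eroi(ec_delivered_joules: float, ec_invested_joules: float) -> float:
    """Energy Return on Investment: energy-carrier output per unit of EC invested."""
    return ec_delivered_joules / ec_invested_joules

# Invented figures: 100 GJ of energy carriers delivered from an exploited PES,
# at a cost of 5 GJ of carriers invested in extraction and conversion.
gross_out = 100e9   # J of EC delivered
invested = 5e9      # J of EC invested
ratio = eroi(gross_out, invested)   # the single output/input indicator
net_supply = gross_out - invested   # net Joules of EC available to society
print(ratio, net_supply)  # 20.0 95000000000.0
```

As section three argues, the ratio alone hides both the net supply and the power level: the same ratio of 20 can describe energetic systems of very different size and speed.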
Resumo:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
Resumo:
The thymus is a central lymphoid organ, in which T cell precursors differentiate and generate most of the so-called T cell repertoire. In a variety of acute infectious diseases, we and others determined important changes in both the microenvironmental and lymphoid compartments of the organ. For example, one major and common feature observed in acute viral, bacterial and parasitic diseases is a depletion of cortical thymocytes, mostly those bearing the CD4-CD8 double positive phenotype. This occurs simultaneously with a relative enrichment in medullary CD4 or CD8 single positive cells expressing high densities of the CD3 complex. Additionally, we noticed a variety of changes in the thymic microenvironment (and particularly its epithelial component), comprising abnormal location of thymic epithelial cell subsets as well as a denser Ia-bearing cellular network. Moreover, the extracellular matrix network was altered, with an intralobular increase of basement membrane proteins that positively correlated with the degree of thymocyte death. Lastly, anti-thymic cell antibodies were detected in both human and animal models of infectious diseases, and in some of them a phenomenon of molecular mimicry could be evidenced. Taken together, the data reviewed herein clearly show that the thymus should be regarded as a target in infectious diseases.
Resumo:
In this work we have studied the modifications in the biological properties of Trypanosoma cruzi when the parasite is maintained for a long time in axenic culture. The studies were done with a clone from an avirulent strain (Dm30L) and a non-cloned virulent strain (EP) of T. cruzi. Both parasites were maintained, for at least three years, by successive triatomine/mouse alternate passage (control condition), by serial passage in axenic medium (culture condition), or only in the mouse (mouse condition). The comparison between parasites of the culture and control conditions showed that metacyclogenesis capacity was reduced in the former and that the resulting metacyclics displayed an attenuated virulence. In order to compare the virulence of metacyclics from the urine of the insect vector, Rhodnius prolixus were infected by artificial feeding with parasites of the control or culture condition. After three triatomine/triatomine passages, an almost identical biological behavior was observed for these parasites, indicating that the maintenance of T. cruzi for a long time in axenic culture affects the differentiation capacity and the virulence of the parasite. Additionally, it was demonstrated that it is possible to maintain T. cruzi exclusively through passages in the invertebrate host.
Resumo:
Introduction In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centered around the expected impact of a specific policy on the real economy, both in terms of efficiency and equity. The insights of this part indicate in which direction the fine-tuning of economic policies should go. However, fine-tuning of economic policies will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies. The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking.
The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births. The second part of my thesis, which corresponds to the second and third chapter, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed. Thereby, researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal and microeconomically founded approach, while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in terms of institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view. However, I try to borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality. In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in presence of special interest groups' lobbying. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. 
However, these models lack important (institutional) features of reality. That is why, in my model, I split up the government into a legislative and an executive branch, which can both be lobbied by special interest groups. Furthermore, the legislative has the option to delegate its trade policy authority to the executive. I allow the executive to compensate the legislative in exchange for delegation. Despite ample anecdotal evidence, bargaining over delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation has an impact on policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislative at the expense of the lobbies. Therefore, the findings of this model can shed light on why the U.S. Congress often practices delegation to the executive. In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on the individual politician level of policy-making to explore how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model of the second chapter show how campaign contributions from lobbies to politicians can influence economic policies. There exists an abundant empirical literature that analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence in the shape of personal connections between politicians and firms through board membership.
We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.
Resumo:
Vector species have not hitherto been studied as influencing the metacyclogenesis of Trypanosoma cruzi, while the role of the parasite strain has frequently been stressed as of dominant importance in this process. In order to fill this gap in our knowledge, metacyclogenesis was monitored in nine triatomine species. The first part of this paper presents photographs of the main and intermediate parasite stages in each vector species studied. In the second part of the study the proportional distribution of all these forms, as seen in Giemsa-stained smears, is summarized, thus providing an opportunity to analyze both the length of time between the ingestion of blood trypomastigotes and the appearance of metacyclic forms, and the rates of the developmental stages leading to the latter. The most remarkable observation was that metacyclogenesis rates in vivo appear to be vector dependent, reaching 50% in Rhodnius neglectus, 37% in its congener R. prolixus and being dramatically lower in the majority of Triatoma species (5% in T. sordida, 3% in T. brasiliensis and 0% in T. pseudomaculata) at the 120th day of infection. These observations suggest that through screening of different vector species it is possible to find some that are capable of minimizing or maximizing metacyclic production.
Resumo:
Oxytocin is a neuropeptide that can reduce neophobia and improve social affiliation. In vitro, oxytocin induces a massive release of GABA from neurons in the lateral division of the central amygdala, which results in inhibition of a subpopulation of peripherally projecting neurons in the medial division of the central amygdala (CeM). Common anxiolytics, such as diazepam, act as allosteric modulators of GABA(A) receptors. Because oxytocin and diazepam both act on GABAergic transmission, it is possible that oxytocin can potentiate the inhibitory effects of diazepam if the two exert their pre- and postsynaptic effects, respectively, on the same inhibitory circuit in the central amygdala. We found that in CeM neurons in which diazepam increased the inhibitory postsynaptic current (IPSC) decay time, TGOT (a specific oxytocin receptor agonist) increased IPSC frequency. Combined application of diazepam and TGOT resulted in the generation of IPSCs with increased frequency, decay times and amplitudes. While individual saturating concentrations of TGOT and diazepam each decreased the spontaneous spiking frequency of CeM neurons to a similar extent, co-application of the two still caused a significantly larger decrease. These findings show that oxytocin and diazepam act on different components of the same GABAergic circuit in the central amygdala and that oxytocin can facilitate diazepam effects when the two are used in combination. This raises the possibility that neuropeptides could be used clinically in combination with current anxiolytic treatments to improve their therapeutic efficacy.
Resumo:
BACKGROUND: Sudden cardiac death (SCD) among the young is a rare and devastating event, but its exact incidence in many countries remains unknown. An autopsy is recommended in every case because some of the cardiac pathologies may have a genetic origin, which can have an impact on the living family members. The aims of this retrospective study, completed in the canton of Vaud, Switzerland, were to determine both the incidence of SCD and the autopsy rate for individuals from 5 to 39 years of age. METHODS: The study was conducted from 2000 to 2007 on the basis of official statistics and analysis of the International Classification of Diseases codes for potential SCDs and other deaths that might have been due to cardiac disease. RESULTS: During the 8-year study period there was an average population of 292'546 persons aged 5-39 and there were a total of 1122 deaths, certified as potential SCDs in 3.6% of cases. The calculated incidence is 1.71/100'000 person-years (2.73 for men and 0.69 for women). If all possible cases of SCD (unexplained deaths, drowning, traffic accidents, etc.) are included, the incidence increases to 13.67/100'000 person-years. However, the quality of the officially available data was insufficient to provide an accurate incidence of SCD as well as autopsy rates. The presumed autopsy rate for sudden deaths classified as diseases of the circulatory system is 47.5%. For deaths of unknown cause (11.1% of the deaths), an autopsy was conducted in 13.7% of the cases according to the codified data. CONCLUSIONS: The incidence of presumed SCD in the canton of Vaud, Switzerland, is comparable to the data published in the literature for other geographic regions but may be underestimated, as it does not take into account other potential SCDs such as unexplained deaths.
Increasing the autopsy rate of SCD in the young, better management of the information obtained from autopsies, as well as the development of a structured registry, could improve the reliability of the statistical data, optimize the diagnostic procedures, and improve the preventive measures for the family members.
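The reported incidence can be approximately reproduced from the figures given in the abstract; the small difference from 1.71 comes from rounding in the 3.6% figure:

```python
population = 292_546   # average persons aged 5-39 over the study period
years = 8
deaths = 1122          # total deaths in the cohort
scd_fraction = 0.036   # fraction certified as potential SCDs

person_years = population * years
scd_deaths = deaths * scd_fraction
incidence = scd_deaths / person_years * 100_000  # per 100'000 person-years
print(round(incidence, 2))  # 1.73, close to the reported 1.71
```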
Resumo:
The dilemma of efficiency versus equity, together with political partisan interests, has received increasing attention as an explanation for the territorial allocation of investments. However, centralization intended to introduce or reinforce hierarchization in the political system has not so far been the object of empirical analysis. Our main contribution to the literature is to provide evidence that meta-political objectives related to the ordering of political power and administration influence regional investment. In this way, we find evidence that investment programs for network modes (roads and railways) are influenced by the centralization strategy of investing near the political capital, while investment effort in non-network modes (airports and ports) appears to be positively related to distance. Since investment in surface transportation infrastructures is much higher than that in airports and ports, and taking into account that the regions surrounding the political capital are poorer than the average, we suggest that centralization rather than redistribution has been the driver of the concentration of public investment in these regions.
Resumo:
"Dawn or the Galaxy" is a final-year degree project whose main objective is the creation and development of a demo version of an MMORTS (massive multiplayer online real-time strategy) game, trying to include innovative elements in this genre and offering a wide strategic range from the start of the game. To achieve this objective, a small initial market survey and a study of strategy game models will be carried out. The game will comprise more than sixty code files and a database with fourteen interrelated, non-normalized tables, and will be able to accommodate about five hundred players. Once the application has been programmed, the game will be tested in a real environment, with real users. In order to resolve problems quickly during the course of the game, the application will be subjected to exhaustive monitoring. The players' collaboration at this point will be fundamental.
Resumo:
In natural populations, dispersal tends to be limited so that individuals are in local competition with their neighbours. As a consequence, most behaviours tend to have a social component, e.g. they can be selfish, spiteful, cooperative or altruistic as usually considered in social evolutionary theory. How social behaviours translate into fitness costs and benefits depends considerably on life-history features, as well as on local demographic and ecological conditions. Over the last four decades, evolutionists have been able to explore many of the consequences of these factors for the evolution of social behaviours. In this paper, we first recall the main theoretical concepts required to understand social evolution. We then discuss how life history, demography and ecology promote or inhibit the evolution of helping behaviours, but the arguments developed for helping can be extended to essentially any social trait. The analysis suggests that, on a theoretical level, it is possible to contrast three critical benefit-to-cost ratios beyond which costly helping is selected for (three quantitative rules for the evolution of altruism). But comparison between theoretical results and empirical data has always been difficult in the literature, partly because of the perennial question of the scale at which relatedness should be measured under localized dispersal. We then provide three answers to this question.
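The abstract does not spell out its three benefit-to-cost thresholds, but the canonical condition they generalize is Hamilton's rule: costly helping is selected when r·b > c, i.e. when the benefit-to-cost ratio exceeds 1/r. A minimal check with invented numbers:

```python
def helping_selected(relatedness: float, benefit: float, cost: float) -> bool:
    """Hamilton's rule: helping is favoured when r*b > c,
    equivalently when b/c exceeds the critical ratio 1/r."""
    return relatedness * benefit > cost

print(helping_selected(0.5, 3.0, 1.0))   # True: b/c = 3 exceeds 1/r = 2
print(helping_selected(0.25, 3.0, 1.0))  # False: b/c = 3 falls below 1/r = 4
```

The paper's point is that demography and ecology shift such critical ratios, and that the relatedness term depends on the scale at which it is measured under localized dispersal.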