5 results for Forced eruption
in Scottish Institute for Research in Economics (SIRE), United Kingdom
Abstract:
Society often allocates valuable resources - such as prestigious positions, salaries, or marriage partners - via tournament-like institutions. In such situations, inequality affects incentives to compete and hence has a direct effect on equilibrium choices and material outcomes. We introduce a new distinction between inequality in initial endowments (e.g. ability, inherited wealth) and inequality in the rewards one can obtain (e.g. prestigious positions, money). We show that these two types of inequality have opposing effects on equilibrium behavior and wellbeing. Greater inequality of rewards tends to hurt most people, both the middle class and the poor, who are forced into greater effort. In contrast, greater inequality of endowments tends to benefit the middle class. Thus, which type of inequality is considered strongly affects whether our intuitions about the implications of inequality are correct.
Abstract:
This paper examines the performance of monetary policy under the new framework established in 1997 up to the end of the Labour government in May 2010. Performance was relatively good in the years before the crisis, but much weaker from 2008. The new framework largely neglected open economy issues, while the Treasury's EMU assessment in 2003 can be interpreted in different ways. Inflation targeting in the UK and elsewhere may have contributed in some way to the eruption and depth of the financial crisis from 2008, but UK monetary policy responded in a bold and innovative way. Overall, the design and operation of monetary policy were much better than in earlier periods, but there remains scope for significant further evolution.
Abstract:
This paper is an investigation into the dynamics of asset markets with adverse selection à la Akerlof (1970). The particular question asked is: can market failure at some later date precipitate market failure at an earlier date? The answer is yes: there can be "contagious illiquidity" from the future back to the present. The mechanism works as follows. If the market is expected to break down in the future, then agents holding assets they know to be lemons (assets with low returns) will be forced to hold them for longer - they cannot quickly resell them. As a result, the effective difference in payoff between a lemon and a good asset is greater. But it is known from the static Akerlof model that the greater the payoff differential between lemons and non-lemons, the more likely the market is to break down. Hence market failure in the future is more likely to lead to market failure today. Conversely, if the market is not anticipated to break down in the future, assets can be readily sold and hence an agent discovering that his or her asset is a lemon can quickly jettison it. In effect, there is little difference in payoff between a lemon and a good asset. The logic of the static Akerlof model then runs the other way: the small payoff differential is unlikely to lead to market breakdown today. The conclusion of the paper is that the nature of today's market - liquid or illiquid - hinges critically on the nature of tomorrow's market, which in turn depends on the next day's, and so on. The tail wags the dog.
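The backward-induction logic of this abstract can be sketched in a few lines of code. The payoff-gap values, the breakdown threshold, and the function name below are illustrative assumptions for exposition, not parameters or notation taken from the paper:

```python
def market_open_path(horizon, gap_if_liquid=1.0, gap_if_illiquid=3.0,
                     breakdown_threshold=2.0, terminal_open=False):
    """Return a list open[t] for t = 0..horizon-1, computed backward.

    If the market is open next period, a lemon can be resold quickly, so
    the effective payoff gap between a lemon and a good asset is small;
    if the market is closed, the holder is stuck with the lemon and the
    gap is large. Following the static Akerlof logic, the market at t
    breaks down when the gap exceeds the threshold buyers will tolerate.
    All numeric values here are illustrative assumptions.
    """
    open_next = terminal_open
    path = []
    for _ in range(horizon):
        gap = gap_if_liquid if open_next else gap_if_illiquid
        open_now = gap <= breakdown_threshold
        path.append(open_now)
        open_next = open_now
    path.reverse()  # computed from the last period backward to t = 0
    return path
```

Under these toy parameters, an expected breakdown at the terminal date propagates all the way back (`terminal_open=False` yields a closed market in every period), while an anticipated open terminal market keeps every earlier period liquid: the tail wags the dog.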
Abstract:
This paper shows how one of the developers of QWERTY continued to use the trade secret that underlay its development to seek further efficiency improvements after its introduction. It provides further evidence that this was the principle used to design QWERTY in the first place and adds further weight to arguments that QWERTY itself was a consequence of creative design and an integral part of a highly efficient system rather than an accident of history. This further serves to raise questions over QWERTY's forced servitude as the 'paradigm case' of an inferior standard in the path dependence literature. The paper also shows how complementarities between forms of intellectual property rights protection played integral roles in the development of QWERTY and the search for improvements on it, and also helped effectively conceal the source of the efficiency advantages that QWERTY helped deliver.
Abstract:
In this paper we study decision making in situations where the individual's preferences are not assumed to be complete. First, we identify conditions that are necessary and sufficient for choice behavior in general domains to be consistent with maximization of a possibly incomplete preference relation. In this model of maximally dominant choice, the agent defers or avoids choosing at those and only those menus where a most preferred option does not exist. This allows for simple explanations of conflict-induced deferral and choice overload. It also suggests a criterion for distinguishing between indifference and incomparability based on observable data. A simple extension of this model also incorporates decision costs and provides a theoretical framework that is compatible with the experimental design that we propose to elicit possibly incomplete preferences in the lab. The design builds on the introduction of monetary costs that induce choice of a most preferred feasible option if one exists and deferral otherwise. Based on this design we found evidence suggesting that a quarter of the subjects in our study had incomplete preferences, and that these subjects made significantly more consistent choices than a group of subjects who were forced to choose. The latter effect, however, is mitigated once data on indifferences are accounted for.
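The choice rule this abstract describes can be sketched directly: pick the option preferred to every other option in the menu if one exists, and defer otherwise. The function name and the callback signature below are illustrative assumptions, not the paper's notation:

```python
def maximally_dominant_choice(menu, prefers):
    """Return the option preferred to every other option in the menu,
    or None (deferral) if no such option exists.

    `prefers(a, b)` is True when a is weakly preferred to b. With an
    incomplete relation it may be False in both directions, capturing
    incomparability; any menu containing two incomparable, non-dominated
    options then triggers deferral.
    """
    for a in menu:
        if all(prefers(a, b) for b in menu if b != a):
            return a  # a most preferred option exists: choose it
    return None  # no option dominates the menu: defer
```

For example, if x is preferred to both y and z but y and z are incomparable, the rule chooses x from {x, y, z} yet defers on {y, z}, illustrating conflict-induced deferral.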