28 results for uncertainty aversion
in the Digital Repository of Fundação Getúlio Vargas - FGV
Abstract:
In this paper we apply the theory of decision making with expected utility and non-additive priors to the choice of optimal portfolio. This theory describes the behavior of a rational agent who is averse to pure 'uncertainty' (as well as, possibly, to 'risk'). We study the agent's optimal allocation of wealth between a safe and an uncertain asset. We show that there is a range of prices at which the agent neither buys nor sells short the uncertain asset. In contrast, the standard theory of expected utility predicts that there is exactly one such price. We also provide a definition of an increase in uncertainty aversion and show that it causes the range of prices to increase.
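For reference, a hedged sketch of the no-trade result in standard Choquet expected utility notation (the symbols below are assumed for illustration, not quoted from the paper): for an uncertain asset with payoff X and price p, and a convex capacity \nu representing uncertainty-averse beliefs, in the simplest (risk-neutral) case the agent neither buys nor sells short whenever

\[ E_\nu[X] \;\le\; p \;\le\; -E_\nu[-X], \]

where E_\nu denotes the Choquet expectation. Under uncertainty aversion E_\nu[X] < -E_\nu[-X], so this is a non-degenerate interval of prices; with an additive prior the two bounds coincide, recovering the single no-trade price of expected utility theory.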
Abstract:
This article applies a theorem on Nash equilibrium under uncertainty (Dow & Werlang, 1994) to the classic Cournot model of oligopolistic competition. It shows, in particular, how one can map all Cournot equilibria (which include the monopoly and the null solutions) using only a function of the producers' uncertainty aversion coefficients. The effects of variations in these parameters on the equilibrium quantities are studied, also assuming exogenous increases in the number of firms in the game. The Cournot solutions under uncertainty are compared with the monopolistic one. It shows, principally, that there is an uncertainty aversion level in the industry such that every aversion coefficient beyond it induces firms to produce an aggregate output smaller than the monopoly output. At the end of the article, the equilibrium solutions are specialized to linear demand and to Cournot duopoly. Equilibrium analysis in the symmetric case allows one to identify the uncertainty aversion coefficient of the whole industry as a proportional lack-of-information cost of the kind that would be conveyed by the market price in the perfect competition case (Lerner index).
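As an aside on the last sentence (standard definition, assumed here for illustration): the Lerner index of market power is

\[ L = \frac{p - c}{p}, \]

where p is the market price and c the marginal cost. The symmetric-case result reads the industry-wide uncertainty aversion coefficient as playing this same role of a proportional cost, here interpreted as a lack-of-information cost, relative to the perfectly competitive benchmark.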
Abstract:
We define Nash equilibrium for two-person normal form games in the presence of uncertainty, in the sense of Knight (1921). We use the formalization of uncertainty due to Schmeidler and Gilboa. We show that there exist Nash equilibria for any degree of uncertainty, as measured by the uncertainty aversion (Dow and Werlang (1992a)). We show by example that prudent behaviour (maxmin) can be obtained as an outcome even when it is not rationalizable in the usual sense. Next, we break down backward induction in the twice repeated prisoner's dilemma. We link these results with those on cooperation in the finitely repeated prisoner's dilemma obtained by Kreps-Milgrom-Roberts-Wilson (1982), and with the literature on epistemological conditions underlying Nash equilibrium. The knowledge notion implicit in this model of equilibrium does not display logical omniscience.
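The uncertainty aversion measure cited here is, assuming the standard Dow-Werlang (1992) definition, computed event by event from the capacity \nu:

\[ c(\nu, A) = 1 - \nu(A) - \nu(A^c). \]

For an additive probability c(\nu, A) = 0 for every event A, while a convex (uncertainty-averse) capacity yields c(\nu, A) \ge 0; the existence result in the abstract holds for any such degree.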
Abstract:
In this paper I will investigate the conditions under which a convex capacity (or a non-additive probability which exhibits uncertainty aversion) can be represented as a squeeze of an (additive) probability measure associated with an uncertainty aversion function. Then I will present two alternative formulations of the Choquet integral (and I will extend these formulations to the Choquet expected utility) in a parametric approach that will enable me to perform comparative statics exercises over the uncertainty aversion function in an easy way.
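For reference, the textbook form of the Choquet integral of a measurable function f with respect to a capacity \nu (the standard definition, not necessarily either of the paper's parametric formulations):

\[ \int f \, d\nu = \int_0^{\infty} \nu(\{f \ge t\}) \, dt + \int_{-\infty}^{0} \left[ \nu(\{f \ge t\}) - 1 \right] dt. \]

Choquet expected utility then evaluates an act a by \int u(a) \, d\nu, which reduces to ordinary expected utility when \nu is additive.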
Abstract:
We investigate the effect of aggregate uncertainty shocks on real variables. More specifically, we introduce a shock in the volatility of productivity in an RBC model with long-run volatility risk and preferences that exhibit generalised disappointment aversion (GDA). We find that, when combined with a negative productivity shock, a volatility shock leads to a further decline in real variables, such as output, consumption, hours worked and investment. For instance, out of the 2% decrease in output as a result of both shocks, we attribute 0.25% to the effect of an increase in volatility. We also find that this effect is the same as the one obtained in a model with Epstein-Zin-Weil preferences, but higher than that of a model with expected utility. Moreover, GDA preferences yield superior asset pricing results, when compared to both Epstein-Zin-Weil preferences and expected utility.
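As a hedged sketch of the preference structure involved (the parametrization below is the standard Epstein-Zin-Weil recursion, assumed here for illustration):

\[ V_t = \left[ (1-\beta)\, c_t^{1-\rho} + \beta\, \mu_t(V_{t+1})^{1-\rho} \right]^{\frac{1}{1-\rho}}, \qquad \mu_t(V_{t+1}) = \left( E_t\!\left[ V_{t+1}^{1-\gamma} \right] \right)^{\frac{1}{1-\gamma}}. \]

Generalised disappointment aversion replaces the certainty equivalent \mu_t with one that places extra weight on continuation values falling below a fraction of the certainty equivalent itself, which is what separates the GDA results from the Epstein-Zin-Weil benchmark in the abstract.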
Abstract:
This paper aims at contributing to the research agenda on the sources of price stickiness, showing that the adoption of nominal price rigidity may be an optimal firms' reaction to the consumers' behavior, even if firms have no adjustment costs. With regular, broadly accepted assumptions on economic agents' behavior, we show that firms' competition can lead to the adoption of sticky prices as a (sub-game perfect) equilibrium strategy. We introduce the concept of a consumption centers model economy in which there are several complete markets. Moreover, we weaken some traditional assumptions used in standard monetary policy models, by assuming that households have imperfect information about the inefficient time-varying cost shocks faced by the firms, e.g. the ones regarding inefficient equilibrium output levels under flexible prices. Furthermore, the timing of events is assumed in such a way that, at every period, consumers have access to the actual prices prevailing in the market only after choosing a particular consumption center. Since such choices under uncertainty may decrease the expected utilities of risk averse consumers, competitive firms adopt some degree of price stickiness in order to minimize the price uncertainty and "attract more customers".
Abstract:
Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed by the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues on model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm where short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria choose inconsistent models too frequently, with too small a lag length. Criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
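A sketch of the two restriction types in vector error-correction form, using standard notation assumed here rather than quoted from the paper:

\[ \Delta y_t = \alpha \beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \, \Delta y_{t-i} + \varepsilon_t. \]

Cointegration restricts the long run by giving \alpha\beta' reduced rank; an SCCF restriction requires a cofeature vector \delta with \delta'\alpha = 0 and \delta'\Gamma_i = 0 for all i, so that \delta'\Delta y_t = \delta'\varepsilon_t is unpredictable. The proposed algorithm, as described in the abstract, lets these two sets of restrictions interact when estimating the spaces spanned by \beta and \delta.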
Abstract:
Lucas (1987) has shown the surprising result that the welfare cost of business cycles is quite small. Using standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-WWII era using the multivariate Beveridge-Nelson decomposition for trends and cycles, which considers not only business-cycle uncertainty but also uncertainty from the stochastic trend in consumption. The post-WWII period is relatively quiet, with the welfare costs of uncertainty being about 0.9% of per-capita consumption. Although changing the decomposition method substantially changed initial results, the welfare cost of uncertainty is qualitatively small in the post-WWII era: about $175.00 a year per capita in the U.S. We also computed the marginal welfare cost of macroeconomic uncertainty using this same technique. It is about twice as large as the welfare cost: $350.00 a year per capita.
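For context, a hedged sketch of the Lucas-style compensation measure these figures are based on, in its simplest form (CRRA utility with curvature \gamma and lognormal consumption deviations with variance \sigma^2; the paper's multivariate Beveridge-Nelson calculation is richer than this):

\[ \lambda \approx \tfrac{1}{2}\, \gamma\, \sigma^2, \]

where \lambda is the fraction of consumption a representative agent would permanently give up to eliminate the uncertainty. The dollar figures quoted in the abstract amount to this fraction applied to per-capita consumption.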
Abstract:
Consider the demand for a good whose consumption must be chosen prior to the resolution of uncertainty regarding income. How do changes in the distribution of income affect the demand for this good? In this paper we show that normality is sufficient to guarantee that consumption increases if the Radon-Nikodym derivative of the new distribution with respect to the old is non-decreasing on the whole domain. However, if only first order stochastic dominance is assumed, more structure must be imposed on preferences to guarantee the validity of the result. Finally, a converse of the first result also obtains: if the change in measure is characterized by a non-decreasing Radon-Nikodym derivative, consumption of such a good will always increase if and only if the good is normal.
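A hedged formalization of the setting (notation assumed for illustration): with income y distributed according to F and the quantity x committed before y is realized, demand is

\[ x^*(F) = \arg\max_x \int u(x,\, y - p x)\, dF(y), \]

and the result says that for a normal good x^*(G) \ge x^*(F) whenever dG/dF is non-decreasing. Note that a non-decreasing Radon-Nikodym derivative is the monotone likelihood ratio ordering, which is strictly stronger than first order stochastic dominance, consistent with the abstract's remark that FOSD alone requires more structure on preferences.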
Abstract:
With standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-war U.S. using the Beveridge-Nelson decomposition. Welfare costs are about 0.9% of per-capita consumption ($175.00), and marginal welfare costs are about twice as large.
Abstract:
We present two alternative definitions of Nash equilibrium for two-person games in the presence of uncertainty, in the sense of Knight. We use the formalization of uncertainty due to Schmeidler and Gilboa. We show that, with one of the definitions, prudent behaviour (maxmin) can be obtained as an outcome even when it is not rationalizable in the usual sense. Most striking is that, with the same definition, we break down backward induction in the twice repeated prisoner's dilemma. We also link these results with the Kreps-Milgrom-Roberts-Wilson explanation of cooperation in the finitely repeated prisoner's dilemma.
Abstract:
This paper discusses the prospects for economic regulation in Brazil. To that end, it first presents the historical evolution of regulation in the country, discussing the main issues related to the federal regulatory agencies. Second, the regulatory frameworks of five different sectors (telecommunications, electricity, basic sanitation, oil and natural gas) are analyzed. Third, the issue of financing infrastructure investments is addressed, emphasizing the role of public-private partnerships (PPPs). A final section contains a possible agenda for regulation in Brazil.
Abstract:
This paper investigates the causes of municipal secession in Brazil. The theoretical model proposes that the median voter is not fully informed about the efficiency effect of secession on public good provision and uses the break-up decisions undertaken by neighboring municipalities within the state to inform his vote. Our empirical results confirm this prediction.
Abstract:
Recently, Kajii and Ui (2008) proposed to characterize interim efficient allocations in an exchange economy under asymmetric information when uncertainty is represented by multiple posteriors. When agents have Bewley's incomplete preferences, Kajii and Ui (2008) proposed a necessary and sufficient condition on the set of posteriors. However, when agents have Gilboa-Schmeidler's MaxMin expected utility preferences, they only propose a sufficient condition. The objective of this paper is to complete Kajii and Ui's work by proposing a necessary and sufficient condition for interim efficiency for various models of ambiguity aversion, and in particular MaxMin expected utility. Our proof is based on a direct application of some results proposed by Rigotti, Shannon and Strzalecki (2008).
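For reference, the Gilboa-Schmeidler MaxMin expected utility criterion referred to here, in its standard form (notation assumed): agent i with a closed convex set of priors P_i evaluates an act f by

\[ V_i(f) = \min_{p \in P_i} \int u_i(f)\, dp. \]

The Rigotti-Shannon-Strzalecki results applied in the proof characterize efficiency through the agents' sets of supporting beliefs at an allocation, which is what allows a single argument to cover several models of ambiguity aversion.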