1000 results for Oligopolies - Econometric models
Abstract:
I consider the problem of assigning agents to objects where each agent must pay the price of the object he gets and prices must sum to a given number. The objective is to select an assignment-price pair that is envy-free with respect to the true preferences. I prove that the proposed mechanism implements the set of envy-free allocations both in Nash and in strong Nash equilibrium. The distinguishing feature of the mechanism is that it treats the announced preferences as the true ones and selects an envy-free allocation with respect to the announced preferences.
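To make the envy-freeness condition concrete, here is a minimal sketch (illustrative only; the names `is_envy_free`, `values`, `assignment` and `prices` are assumptions, not the paper's notation) that checks whether a given assignment-price pair is envy-free with respect to reported preferences: no agent should prefer another agent's object, at that object's price, to her own bundle.

```python
def is_envy_free(values, assignment, prices):
    """Check envy-freeness of an assignment-price pair.

    values[i][j]  -- agent i's reported value for object j
    assignment[i] -- index of the object assigned to agent i
    prices[j]     -- price paid by whoever receives object j
    The pair is envy-free if no agent gets higher utility from another
    agent's object-price bundle than from her own.
    """
    n = len(values)
    for i in range(n):
        own = values[i][assignment[i]] - prices[assignment[i]]
        for k in range(n):
            j = assignment[k]
            if values[i][j] - prices[j] > own:
                return False  # agent i envies agent k's bundle
    return True


# Two agents, two objects, prices required to sum to 10.
values = [[8, 5], [6, 9]]      # reported valuations
assignment = [0, 1]            # agent 0 gets object 0, agent 1 gets object 1
prices = [6, 4]                # prices sum to the given number, 10
print(is_envy_free(values, assignment, prices))  # True
```

The check runs in O(n^2) time for n agents, since each agent's bundle is compared against every other agent's bundle.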
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band where the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility. Narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model where this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating) suggested by some authors.
Abstract:
Actual tax systems do not follow the normative recommendations of the theory of optimal taxation. There are two reasons for this. First, the informational difficulties of knowing or estimating all relevant elasticities and parameters. Second, the political complexities that would arise if a new tax implementation departed too much from current systems that are perceived as somewhat egalitarian. Hence an ex-novo overhaul of the tax system might just be non-viable. In contrast, a small marginal tax reform could be politically more palatable and economically simpler to implement. The goal of this paper is to evaluate, as a step prior to any tax reform, the marginal welfare cost of the current tax system in Spain. We do this by using a computational general equilibrium model calibrated to a point-in-time micro database. The simulation results show that the Spanish tax system gives rise to a considerable marginal excess burden: its order of magnitude is about 0.50 money units for each additional money unit collected through taxes.
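To fix ideas about that last figure, here is a back-of-the-envelope illustration using one common definition of the marginal excess burden (the marginal cost of public funds minus one), which may differ in detail from the measure computed in the paper:

```latex
% Marginal excess burden: welfare loss over and above each extra unit of revenue.
\mathrm{MEB} \;=\; \frac{\text{marginal social cost of funds} - \text{marginal revenue}}{\text{marginal revenue}}
\;\approx\; \frac{1.50 - 1.00}{1.00} \;=\; 0.50
```

On this reading, raising one additional money unit of tax revenue would cost society roughly 1.50 money units in total.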
Abstract:
We analyze the effects of uncertainty and private information on horizontal mergers. Firms face uncertain demands or costs and receive private signals. They may decide to merge, sharing their private information. If the uncertainty parameters are independent and the signals are perfect, uncertainty generates an informational advantage only for the merging firms, increasing merger incentives and decreasing free-riding effects. Thus, mergers become more profitable and stable. These results generalize to the case of correlated parameters if the correlation is not very severe, and to the case of perfect correlation if the firms receive noisy signals. From a normative point of view, mergers are less socially harmful than in deterministic markets and may even be welfare enhancing. If the signals are instead publicly observed, uncertainty does not necessarily give more incentives to merge, and mergers are not always less socially harmful.
Abstract:
We use structural methods to assess equilibrium models of bidding with data from first-price auction experiments. We identify conditions to test the Nash equilibrium models for homogeneous and for heterogeneous constant relative risk aversion when bidders' private valuations are independent and uniformly drawn. The outcomes of our study indicate that behavior may have been affected by the procedure used to conduct the experiments and that the usual Nash equilibrium model for heterogeneous constant relative risk averse bidders does not consistently explain the observed overbidding. From an empirical standpoint, our analysis shows the possible drawbacks of overlooking the homogeneity hypothesis when testing symmetric equilibrium models of bidding, and it puts in perspective the sensitivity of structural inferences to the available information.
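For reference, under the textbook benchmark of homogeneous constant relative risk aversion with independent, uniformly drawn valuations, the symmetric Nash equilibrium bid has a simple closed form. The sketch below is a minimal illustration of that benchmark (assuming CRRA utility u(x) = x^(1-r) and valuations uniform on [0, v_bar]; the experimental designs and estimation in the paper are richer):

```python
def crra_equilibrium_bid(value, n_bidders, r):
    """Symmetric Nash equilibrium bid in a first-price auction.

    Textbook benchmark: valuations i.i.d. uniform on [0, v_bar], all bidders
    share CRRA utility u(x) = x**(1 - r) with 0 <= r < 1.  The equilibrium
    bid is b(v) = (n - 1) / (n - r) * v; r = 0 recovers the risk-neutral bid.
    """
    if not 0.0 <= r < 1.0:
        raise ValueError("CRRA coefficient r must lie in [0, 1)")
    return (n_bidders - 1) / (n_bidders - r) * value


# Risk aversion pushes bids above the risk-neutral benchmark ("overbidding").
print(crra_equilibrium_bid(10.0, n_bidders=4, r=0.0))  # 7.5
print(crra_equilibrium_bid(10.0, n_bidders=4, r=0.5))  # about 8.57
```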
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
Abstract:
Labour market reforms very often face opposition from employed workers because they normally reduce their wages. Product market regulations, likewise, are regularly biased towards benefiting firms too much. As a result, many frictions remain in both the labour and product markets that hinder the optimal functioning of the economy. These issues have recently received a lot of attention in the economics literature, and scholars have been looking for politically viable reforms in both markets. However, despite its potential importance, virtually no research has been done on the interaction between reforms in product and labour markets. We find that when reforms are combined, the opposition to reform decreases considerably. This is because complementarities exist and the gains in total welfare can be more evenly distributed over the interest groups. Moreover, the interaction of reforms offers a way out of the so-called 'sclerosis' effect.
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the other regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some known stylized facts in the empirical literature that previous models were not able to produce, namely the positive relation between the exchange rate and the interest rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate stochastic distribution.
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This line of research has developed very successfully. Nevertheless, several empirical studies seem to show that the performance of such models is not always appropriate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, are studied, as well as two estimators (Method of Moments and Maximum Likelihood). Two statistical tests are presented: the first tests for homoskedasticity and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US dollar exchange rate.
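The QMACH specification itself is not given in this abstract, so purely as background, the sketch below simulates the original ARCH(1) process of Engle (1982) that this family of models extends: the conditional variance is sigma_t^2 = omega + alpha * eps_{t-1}^2, with eps_t = sigma_t * z_t and z_t standard normal. Parameter values are arbitrary, not estimates from the paper.

```python
import math
import random

def simulate_arch1(n, omega=0.1, alpha=0.5, seed=0):
    """Simulate an ARCH(1) process.

    sigma_t^2 = omega + alpha * eps_{t-1}^2 and eps_t = sigma_t * z_t with
    z_t ~ N(0, 1).  Parameters are illustrative only.
    """
    rng = random.Random(seed)
    eps_prev, series = 0.0, []
    for _ in range(n):
        sigma2 = omega + alpha * eps_prev ** 2   # conditional variance
        eps = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        series.append(eps)
        eps_prev = eps
    return series


returns = simulate_arch1(10_000)
# Unconditional variance should be close to omega / (1 - alpha) = 0.2,
# with large shocks clustering because they raise next period's variance.
print(sum(e ** 2 for e in returns) / len(returns))
```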
Abstract:
We study markets where the characteristics or decisions of certain agents are relevant but not known to their trading partners. Assuming exclusive transactions, the environment is described as a continuum economy with indivisible commodities. We characterize incentive-efficient allocations as solutions to linear programming problems and appeal to duality theory to demonstrate the generic existence of external effects in these markets. Because under certain conditions such effects may generate non-convexities, randomization emerges as a theoretical possibility. In characterizing market equilibria we show that, consistent with the personalized nature of transactions, prices are generally non-linear in the underlying consumption. On the other hand, external effects may have critical implications for market efficiency. With adverse selection, in fact, cross-subsidization across agents with different private information may be necessary for optimality, and so the market need not even achieve an incentive-efficient allocation. In contrast, for the case of a single commodity, we find that when informational asymmetries arise after the trading period (e.g. moral hazard; ex post hidden types), external effects are fully internalized at a market equilibrium.
Abstract:
We study the relation between public capital, employment and growth under different assumptions concerning wage formation. We show that public capital increases economic growth and that, if there is wage inertia, employment depends positively on both economic growth and public capital.
Abstract:
In this paper we propose the infimum of the Arrow-Pratt index of absolute risk aversion as a measure of global risk aversion of a utility function. We then show that, for any arbitrary pair of distributions, there exists a threshold level of global risk aversion such that all increasing concave utility functions with at least as much global risk aversion would rank the two distributions in the same way. Furthermore, this threshold level is sharp in the sense that, for any lower level of global risk aversion, we can find two utility functions in this class yielding opposite preference relations for the two distributions.
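In standard notation (which may differ from the paper's), the Arrow-Pratt index of absolute risk aversion and the global measure proposed here are:

```latex
A_u(x) \;=\; -\,\frac{u''(x)}{u'(x)},
\qquad
\underline{A}(u) \;=\; \inf_{x} A_u(x)
```

The result then says that, for any pair of distributions, there is a threshold level of global risk aversion such that every increasing concave utility function u with \underline{A}(u) at or above that threshold ranks the two distributions in the same way.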
Abstract:
The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by the monetary authorities has both positive and normative implications for economic performance. We reexamine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by means of issuing currency. When we evaluate the performance of the two monetary instruments in terms of the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of these two targeting procedures displays unambiguously higher welfare levels.
Abstract:
We extend the basic tax evasion model to a multi-period economy exhibiting sustained growth. When individuals conceal part of their true income from the tax authority, they face the risk of being audited and hence of paying the corresponding fine. Both taxes and fines determine individual saving and the rate of capital accumulation. In this context we show that the sign of the relation between the level of the tax rate and the amount of evaded income is the same as that obtained in static setups. Moreover, high tax rates on income are typically associated with low growth rates as occurs in standard growth models that disregard the tax evasion phenomenon.