7 results for no fit polygon

in Scottish Institute for Research in Economics (SIRE), United Kingdom


Relevance:

10.00%

Publisher:

Abstract:

This paper shows that introducing weak property rights in the standard real business cycle (RBC) model can help to explain economic fluctuations. This is motivated by the empirical observation that changes in institutions in emerging markets are related to the evolution of the main macroeconomic variables. In particular, in Mexico, the movements in productivity in the data are associated with changes in institutions, so that we can explain productivity shocks to a large extent as shocks to the quality of institutions. We find that the model with shocks to the degree of protection of property rights only - without technology shocks - can match the second moments in the data for Mexico well. In particular, the fit is better than that of the standard neoclassical model with full protection of property rights regarding the auto-correlations and cross-correlations in the data, especially those related to labor. Viewing productivity shocks as shocks to institutions is also consistent with the stylized fact of falling productivity and non-decreasing labor hours in Mexico over 1980-1994, which is a feature that the neoclassical model cannot match.
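The kind of second-moment comparison described above can be sketched on a toy process. A minimal example, assuming (purely for illustration, not the paper's calibration) that the institutional-quality shock follows a log AR(1) process, and computing the standard deviation and first-order autocorrelation that such matching exercises compare against data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical AR(1) for log institutional quality, standing in for the
# productivity-like shock in the paper's RBC variant (illustrative
# parameters, not the paper's calibration).
rho, sigma, T = 0.95, 0.01, 10_000
eps = rng.normal(0.0, sigma, T)
log_q = np.empty(T)
log_q[0] = 0.0
for t in range(1, T):
    log_q[t] = rho * log_q[t - 1] + eps[t]

# Second moments of the kind matched to data in such exercises.
std = log_q.std()
autocorr = np.corrcoef(log_q[:-1], log_q[1:])[0, 1]
print(f"std = {std:.4f}, autocorr(1) = {autocorr:.3f}")
```

With persistence of 0.95 the simulated series is strongly autocorrelated and its unconditional standard deviation is close to the theoretical sigma / sqrt(1 - rho^2).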

Relevance:

10.00%

Publisher:

Abstract:

We report experiments designed to test between Nash equilibria that are stable and unstable under learning. The “TASP” (Time Average of the Shapley Polygon) gives a precise prediction about what happens when there is divergence from equilibrium under fictitious-play-like learning processes. We use two 4×4 games, each with a unique mixed Nash equilibrium; one is stable and one is unstable under learning. Both games are versions of Rock-Paper-Scissors with the addition of a fourth strategy, Dumb. Nash equilibrium places a weight of 1/2 on Dumb in both games, but the TASP places no weight on Dumb when the equilibrium is unstable. We also vary the level of monetary payoffs, with higher payoffs predicted to increase instability. We find that the high payoff unstable treatment differs from the others. Frequency of Dumb is lower and play is further from Nash than in the other treatments. That is, we find support for the comparative statics prediction of learning theory, although the frequency of Dumb is substantially greater than zero in the unstable treatments.
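The learning dynamics referred to can be illustrated on the classic 3×3 Rock-Paper-Scissors game (the paper's games are 4×4 variants with the extra strategy Dumb, which are not reproduced here). A minimal fictitious-play simulation in which play best-replies to empirical frequencies; in this zero-sum game the time average of play converges to the mixed equilibrium (1/3, 1/3, 1/3), even though period-by-period play cycles:

```python
import numpy as np

# Fictitious play on standard Rock-Paper-Scissors (win = 1, tie = 0,
# loss = -1). Illustrative only: the experiments described use 4x4
# variants with a fourth strategy, "Dumb", and varying payoff levels.
A = np.array([[0, -1, 1],
              [1, 0, -1],
              [-1, 1, 0]], dtype=float)

counts = np.array([1.0, 0.0, 0.0])  # empirical play so far (initial belief)
T = 20_000
for t in range(T):
    beliefs = counts / counts.sum()
    br = int(np.argmax(A @ beliefs))  # best reply to empirical frequencies
    counts[br] += 1                   # symmetric self-play: record the choice

freqs = counts / counts.sum()
print("time-average play:", np.round(freqs, 3))
```

Instantaneous play cycles R, P, S with ever-longer runs, but the time average (the analogue of the TASP object) settles near uniform, which is exactly the distinction between period-by-period play and its time average that the abstract exploits.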

Relevance:

10.00%

Publisher:

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
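The MCMC machinery mentioned can be sketched on a toy model. Assuming, for illustration only, a single AR(1) parameter with a flat prior (the paper's target is the posterior of a full RBC model with VARMA errors), a random-walk Metropolis-Hastings sampler looks like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: sample the posterior of an AR(1) coefficient rho by
# random-walk Metropolis-Hastings. The sampler mechanics are the same
# as in full model estimation; only the likelihood differs.
rho_true, sigma, T = 0.8, 1.0, 500
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = rho_true * y[t - 1] + rng.normal(0.0, sigma)

def log_post(rho):
    if not -1.0 < rho < 1.0:          # flat prior on the stationary region
        return -np.inf
    resid = y[1:] - rho * y[:-1]      # Gaussian conditional likelihood
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

draws, rho, lp = [], 0.0, log_post(0.0)
for _ in range(5000):
    prop = rho + rng.normal(0.0, 0.1)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        rho, lp = prop, lp_prop
    draws.append(rho)

post = np.array(draws[1000:])           # drop burn-in
print(f"posterior mean = {post.mean():.3f}")
```

The retained draws approximate the posterior; with 500 observations the posterior mean sits close to the true coefficient of 0.8.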

Relevance:

10.00%

Publisher:

Abstract:

The monetary policy reaction function of the Bank of England is estimated by the standard GMM approach and the ex-ante forecast method developed by Goodhart (2005), with particular attention to the horizons for inflation and output at which each approach gives the best fit. The horizons for the ex-ante approach are much closer to what is implied by the Bank’s view of the transmission mechanism, while the GMM approach produces an implausibly slow adjustment of the interest rate, and suffers from a weak instruments problem. These findings suggest a strong preference for the ex-ante approach.
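The GMM estimation of a reaction function can be sketched in its simplest, just-identified form, where GMM collapses to the classical IV estimator. The model, instrument, and coefficients below are synthetic illustrations, not the Bank of England's rule or Goodhart's specification:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stylized reaction function i_t = c + a * pi_t + e_t, with pi_t
# endogenous (correlated with e_t), estimated by just-identified IV/GMM
# using an instrument z_t (e.g. a lagged variable). Illustrative only.
n = 5000
z = rng.normal(size=n)                       # instrument
e = rng.normal(size=n)                       # policy shock
pi = 0.8 * z + 0.5 * e + rng.normal(size=n)  # endogenous regressor
i = 1.0 + 1.5 * pi + e                       # "true" reaction coefficients

X = np.column_stack([np.ones(n), pi])
Z = np.column_stack([np.ones(n), z])

# Just-identified GMM collapses to the IV estimator (Z'X)^(-1) Z'y.
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ i)
beta_ols = np.linalg.solve(X.T @ X, X.T @ i)  # biased under endogeneity
print("IV :", np.round(beta_iv, 2))
print("OLS:", np.round(beta_ols, 2))
```

With a strong instrument the IV slope is close to the true 1.5 while OLS is biased upward by the endogeneity; the weak-instruments problem noted in the abstract arises when the covariance between instrument and regressor is small, making Z'X nearly singular.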

Relevance:

10.00%

Publisher:

Abstract:

This paper is inspired by articles in the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known, long-lasting contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by empirically testing it, using explicit econometric modelling (rather than case study evidence) involving estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging economy context, using field work evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed, and estimated by maximum likelihood, using a cross section of 83 private Chinese firms. The probit was found to be a good fit to the data, and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model, in terms of specification, interpretation and applications area.
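The ordered probit at the heart of the study can be sketched with a hand-rolled maximum-likelihood fit on synthetic data (two generic regressors stand in for the contingency factors; none of the paper's data or variable definitions are reproduced):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)

# Synthetic stand-in: an ordinal outcome (think "organisational form")
# driven by a latent index in the regressors, carved into 3 ordered
# categories by two cut points.
n, beta_true = 2000, np.array([1.0, -0.5])
X = rng.normal(size=(n, 2))
latent = X @ beta_true + rng.normal(size=n)
y = np.digitize(latent, bins=[-0.5, 0.5])   # categories 0, 1, 2

def neg_loglik(params):
    beta, cuts = params[:2], params[2:]      # cuts must stay increasing
    xb = X @ beta
    # Upper/lower thresholds for each observation's category.
    upper = np.where(y == 2, np.inf, cuts[np.minimum(y, 1)])
    lower = np.where(y == 0, -np.inf, cuts[np.maximum(y - 1, 0)])
    p = norm.cdf(upper - xb) - norm.cdf(lower - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(neg_loglik, x0=np.array([0.0, 0.0, -1.0, 1.0]),
               method="BFGS")
print("beta_hat:", res.x[:2].round(2), "cuts:", res.x[2:].round(2))
```

The fitted slopes recover the generating coefficients, and the estimated cut points recover the thresholds; this is the same likelihood an off-the-shelf ordered probit routine maximises.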

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this note is to supplement the author’s earlier remarks on the unsatisfactory nature of the neoclassical account of how the return on capital is determined. (See Strathclyde Discussion Paper 12-03: “The Marginal Productivity Theory of the Price of Capital: An Historical Perspective on the Origins of the Codswallop”). The point is made via a simple illustration that certain matters which are problematical in neoclassical terms are perfectly straightforward when viewed from a classical perspective. Basically, the marginalist model of the nature of an economic system is not fit for purpose in that it fails to comprehend the essential features of a surplus-producing economic system as distinct from one merely of exchange.