2 results for 340208 Macroeconomics (incl. Monetary and Fiscal Theory)

at Massachusetts Institute of Technology


Relevance: 100.00%

Abstract:

The Japanese economy entered a long recession in spring 1997. Its economic growth has been much lower than in the US and the EU despite large fiscal stimulus packages, a monetary policy that has kept interest rates at zero since 1999, injections of public money to recapitalize banks, and programs of liberalization and deregulation. How could all these policies have failed to bring the Japanese economy back onto a sustainable growth path? This paper argues that the failure of Japan's efforts to restore a sound economic environment is the result of having deliberately chosen inappropriate and inadequate monetary and fiscal instruments to tackle the macroeconomic and structural problems that have burdened the Japanese economy since the bursting of the financial bubble at the beginning of the 1990s. These choices were deliberate, since the "right" policies (first and foremost the resolution of the banking crisis) carried unbearable political costs, not only for the ruling parties, but also for the bureaucratic and business elites. The misfortunes of the Japanese economy during the long recession not only allow us to draw important economic policy lessons, but also prompt reflection on the disruptive effect that powerful vested interests have on economic policy when an economy needs broad and deep structural changes. The final part of the paper focuses on ways to tackle Japan's banking crisis. In particular, it explores the Scandinavian solution, which, mutatis mutandis, might serve Japanese policy-makers well.

Relevance: 100.00%

Abstract:

Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multi-dimensional function, that is, solving the problem of hypersurface reconstruction. From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. This paper considers the problems of an exact representation and, in more detail, of the approximation of linear and nonlinear mappings in terms of simpler functions of fewer variables. Kolmogorov's theorem concerning the representation of functions of several variables in terms of functions of one variable turns out to be almost irrelevant in the context of networks for learning. We develop a theoretical framework for approximation based on regularization techniques that leads to a class of three-layer networks that we call Generalized Radial Basis Functions (GRBF), since they are mathematically related to the well-known Radial Basis Functions, mainly used for strict interpolation tasks. GRBF networks are not only equivalent to generalized splines, but are also closely related to pattern recognition methods such as Parzen windows and potential functions, and to several neural network algorithms, such as Kanerva's associative memory, backpropagation, and Kohonen's topology-preserving map. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage. The paper introduces several extensions and applications of the technique and discusses intriguing analogies with neurobiological data.
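To make the idea concrete, the following is a minimal sketch of function approximation with a radial-basis-function layer, in the spirit of the networks the abstract describes. All names, the Gaussian width, the number of centers, and the ridge (regularization) term are illustrative assumptions, not the paper's formulation: a fixed set of Gaussian "prototypes" is placed over the input domain, and the output weights are found by regularized least squares.

```python
import numpy as np

def gaussian_rbf(x, centers, width):
    """Design matrix of Gaussian radial basis functions (1-D inputs)."""
    # Squared distance from every input to every center.
    d2 = (x[:, None] - centers[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(x, y, centers, width, ridge=1e-6):
    """Solve the regularized least-squares problem for the output weights."""
    phi = gaussian_rbf(x, centers, width)
    a = phi.T @ phi + ridge * np.eye(len(centers))
    return np.linalg.solve(a, phi.T @ y)

def predict_rbf(x, centers, width, weights):
    """Evaluate the fitted RBF expansion at new inputs."""
    return gaussian_rbf(x, centers, width) @ weights

# Reconstruct a smooth 1-D "hypersurface" (here, sin) from noisy samples.
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 2.0 * np.pi, 40)
y_train = np.sin(x_train) + 0.05 * rng.standard_normal(40)
centers = np.linspace(0.0, 2.0 * np.pi, 10)  # the "prototypes"
w = fit_rbf(x_train, y_train, centers, width=0.7)
y_hat = predict_rbf(x_train, centers, 0.7, w)
```

With centers spread evenly over the domain and a moderate width, the fitted expansion tracks the underlying function closely despite the noise; the ridge term plays the role of the regularization (smoothness) penalty discussed in the abstract.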