902 results for 340208 Macroeconomics (incl. Monetary and Fiscal Theory)
Abstract:
In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables exists and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the choice of sample period, the set of endogenous variables and the deterministic terms). For VAR models we use the Granger causality test to verify the ability of one variable to predict another; in the case of cointegration we use VECM models to estimate the long-run and short-run coefficients jointly; and in the case of small data sets and overfitting problems we use Bayesian VAR models, with impulse response functions and variance decomposition, to analyse the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating different hypotheses. Three VAR models are used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); third, to evaluate the effects of ageing on health expenditure in Italy in terms of its economic policy implications (paper 3). The thesis opens with Chapter 1, which outlines the context, motivation and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters.
Chapter 2 examines, using a VAR model in first differences with quarterly Euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results show a causal relationship running from the deviation between the growth rates of nominal GDP and target GDP to changes in the three-month market interest rate. The same analysis does not appear to confirm the existence of a significant causal relationship in the opposite direction, from changes in the market interest rate to the deviation between the growth rates of nominal GDP and target GDP. Similar results were obtained by replacing the market interest rate with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule and, more generally, casts doubt on the applicability of the Taylor rule and of all conventional monetary policy rules to the case in question. The results appear instead to be more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, and more specifically on the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015). These lines of research dispute the simplistic thesis that the scope of monetary policy is the stabilization of inflation, real GDP or nominal income around a "natural" equilibrium level. Rather, they suggest that central banks actually pursue a more complex goal: the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the relative solvency of economic units.
Chapter 3 analyses the supply of loans, considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate short- and long-run money endogeneity in a study of the United States during its two major crises: the burst of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel, using the loan series adjusted for securitization in order to test whether, under restrictive monetary policy, the American banking system is driven to seek cheaper sources of funding such as securitization (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analysing how far monetary policy affects short-run deviations from the long-run relationship. The results show that securitization influences the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as a preventive hedge against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases.
Impulse response functions and variance decomposition highlight a positive relationship running from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely linked to ageing may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps improve patient well-being without increasing total health expenditure. Nevertheless, policies that improve the health status of older people may be needed to lower per capita demand for health and social services.
Abstract:
Robert J. Barro, professor at Harvard University, is known among economists mainly for his results in the macroeconomic modelling of economic policy. His work covers both theoretical and empirical research. The present study summarizes the assumptions and results of those of Barro's investigations which, starting from the Ricardian equivalence principle, fostered the emergence of novel results explaining the theory of fiscal policy. In the 1980s, the high budget deficit of the United States prompted many economists to develop theories on the subject. Since the excessive size of the budget deficit, which endangers Hungary's participation in the monetary union, is a source of almost daily debate at home, it is particularly interesting and timely to review how a modern economist thinks about the causes and consequences of budget deficits.

The question of budgetary discipline emerges in relation to the criteria of the Economic and Monetary Union in almost all European specialist journals today. Much less attention is paid to budgetary overspending, whose adjustment posed a serious puzzle for the government and the economists of the United States in the 1980s. The Lucasian world of new classical economics questioned the effectiveness of government intervention, refuting above all the efficiency of fiscal policy. The macroeconomic models of Barro (1979, 1986) introduced in the present study, building the theoretical approach to economic policy upon similar foundations, examine the effect of budgetary spending principally from a long-run perspective. His empirical analysis, spanning almost seventy years (1916–1982), is based upon the time series of the variables affecting the budget deficit of the United States, distinguishing the effect of usual government expenses from the above-average items within them.
On the basis of his investigation of the United States and the United Kingdom, he furthermore did not reject the invigorating economic role of government spending; he opposed Lucas's conclusions and, in this sense, moved a modest step closer to the Keynesian standpoint. Barro, however, argues firmly on classical grounds: he recalls and re-evaluates the Ricardian equivalence principle, summarizes the critiques raised against it and, unintentionally, praises the Classical economists. According to Barro, we cannot ignore Ricardo's one-time theorem if we endeavour to model government spending; we have to reckon with it, if not definitely as a positive, then at least as a normative economic relationship.
Abstract:
Last year the economics profession celebrated the centenary of the birth of the Nobel laureate economist Milton Friedman. The jubilee commemoration is given particular topicality by the fact that, against the background of the world financial crisis under way since 2008, the dispute between the two defining currents of 20th-century economics flared up again: monetarism, marked by Friedman's name, and the Keynesianism pursued by Keynes and his followers. One "gem" of this wide-ranging series of disputes was the clash between two internationally known and recognized economists, Tim Congdon and Robert (Lord) Skidelsky, in the columns of Standpoint in 2009. The author shows that the debate was in fact not about the importance of money or the truth of the quantity theory of money, but on the one hand about a much more abstract concept, the economic role of uncertainty, and on the other hand about practical, economic-policy questions: the potential effectiveness of monetary and fiscal policy. In the debate, which continues to this day, "the pendulum has swung" several times, now in favour of the Keynesians, now of the monetarists, but nothing has yet been decided.

Last year economists marked the centenary of the birth of a genius among them, Milton Friedman. The commemoration was especially topical because the world financial crisis that erupted in 2008 has brought sharply into focus again the old division in 20th-century economics between monetarism and Keynesianism. One highlight in this series of disputes was the 2009 clash between two internationally known and appreciated economists, Tim Congdon and Robert (Lord) Skidelsky, in the columns of Standpoint. The central element in the discussion is the role of money: what kind of economic policy to pursue, monetary or fiscal, to pull troubled economies out of crisis. The question closely resembles a decisive dilemma for Keynes in the 1930s. Though Keynes turned against some basic propositions of neoclassical economics, he never challenged the importance of money to the functioning of the economy, or the validity of the quantity theory of money.
The author argues here that the issue is not the formal category of money or the demand for it, but the far deeper economic concept of the role of uncertainty in economics. Another aspect concerns the relative efficiency of various kinds of economic policy, i.e. the strengths and weaknesses of monetary and fiscal policies.
Abstract:
In this paper, the nonlinear dynamic equations of a wheeled mobile robot are described in state-space form, where the parameters are part of the state (the angular velocities of the wheels). This representation, known as quasi-linear parameter varying, is useful for control designs based on nonlinear H∞ approaches. Two nonlinear H∞ controllers that guarantee an induced L2-norm, between input (disturbance) and output signals, bounded by an attenuation level γ are used to control a wheeled mobile robot. These controllers are computed via linear matrix inequalities and an algebraic Riccati equation. Experimental results are presented, with a comparative study among these robust control strategies and the standard computed-torque plus proportional-derivative controller.
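For a stable linear system, the induced L2-norm the abstract refers to equals the H∞ norm, the supremum over frequency of the largest singular value of the transfer function G(jω) = C(jωI − A)⁻¹B + D. A frequency sweep gives a quick numeric check that a candidate attenuation level γ bounds it; the system below is a made-up second-order example, not the paper's robot model:

```python
# Numerically estimate the H-infinity norm of a stable LTI system by
# sweeping frequency and taking the peak largest singular value of G(jw).
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # stable: eigenvalues -1, -2
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

def hinf_norm(A, B, C, D, n_freq=2000):
    """Peak singular value of G(jw) over a log-spaced frequency grid."""
    I = np.eye(A.shape[0])
    peak = 0.0
    for w in np.logspace(-3, 3, n_freq):
        G = C @ np.linalg.solve(1j * w * I - A, B) + D
        peak = max(peak, np.linalg.svd(G, compute_uv=False)[0])
    return peak

gamma = 1.0
print(hinf_norm(A, B, C, D) <= gamma)  # True: this G peaks at 0.5 (w -> 0)
```

The controllers in the paper go the other way: they choose the gains so that the closed loop satisfies such a γ bound by construction, via the LMI/Riccati machinery.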
Abstract:
The Systems Theory Framework (STF) was developed to produce a metatheoretical framework through which the contribution of all theories to our understanding of career behaviour could be recognised. In addition, it emphasises the individual as the site for the integration of theory and practice. Its utility has become more broadly acknowledged through its application to a range of cultural groups and settings, qualitative assessment processes, career counselling, and multicultural career counselling. For these reasons, the STF is a very valuable addition to the field of career theory. In viewing the field of career theory as a system, open to changes and developments from within itself and through constantly interrelating with other systems, the STF and this book are adding to the pattern of knowledge and relationships within the career field. The contents of this book will be integrated within the field as representative of a shift in understanding existing relationships within and between theories. In the same way, each reader will integrate the contents of the book within their existing views about the current state of career theory and within their current theory-practice relationship. This book should be required reading for anyone involved in career theory. It is also highly suitable as a text for an advanced career counselling or theory course.
Abstract:
The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference it is useful to ask whether inferences from a probit model are sensitive to a choice between Bayesian and sampling theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
Abstract:
We present a mathematical framework that combines extinction-colonization dynamics with the dynamics of patch succession. We draw an analogy between the epidemiological categorization of individuals (infected, susceptible, latent and resistant) and the patch structure of a spatially heterogeneous landscape (occupied-suitable, empty-suitable, occupied-unsuitable and empty-unsuitable). This approach allows one to consider life-history attributes that influence persistence in patchy environments (e.g., longevity, colonization ability) in concert with extrinsic processes (e.g., disturbances, succession) that lead to spatial heterogeneity in patch suitability. It also allows the incorporation of seed banks and other dormant life forms, thus broadening patch occupancy dynamics to include sink habitats. We use the model to investigate how equilibrium patch occupancy is influenced by four critical parameters: colonization rate, extinction rate, disturbance frequency and the rate of habitat succession. This analysis leads to general predictions about how the temporal scaling of patch succession and extinction-colonization dynamics influences long-term persistence. We apply the model to herbaceous, early-successional species that inhabit open patches created by periodic disturbances. We predict the minimum disturbance frequency required for viable management of such species in the Florida scrub ecosystem. (C) 2001 Academic Press.
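One plausible concrete instance of such a four-state model (occupied/empty × suitable/unsuitable) can be written as a small ODE system and integrated to its equilibrium occupancy. The transition structure and parameter values below are illustrative assumptions, not the authors' exact equations: c = colonization, e = extinction, s = succession (suitable becomes unsuitable), d = disturbance (resets patches to empty-suitable).

```python
# Levins-style patch occupancy with succession and disturbance. The four
# state variables are fractions of patches and are conserved (sum to 1).
import numpy as np
from scipy.integrate import solve_ivp

c, e, s, d = 2.0, 0.3, 0.2, 0.5

def rhs(t, p):
    os_, es, ou, eu = p
    return [
        c * os_ * es - (e + s) * os_,                     # occupied-suitable
        e * os_ - c * os_ * es - s * es + d * (ou + eu),  # empty-suitable
        s * os_ - (d + e) * ou,                           # occupied-unsuitable (sink)
        s * es + e * ou - d * eu,                         # empty-unsuitable
    ]

sol = solve_ivp(rhs, (0, 200), [0.1, 0.9, 0.0, 0.0], rtol=1e-8)
os_, es, ou, eu = sol.y[:, -1]
print(f"equilibrium occupancy (suitable + sink): {os_ + ou:.3f}")
print(f"states sum to {os_ + es + ou + eu:.3f}")
```

Lowering the disturbance rate d starves the system of empty-suitable patches, which is the mechanism behind the paper's minimum-disturbance-frequency prediction; sweeping d until occupancy hits zero reproduces that threshold in this toy version.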
Abstract:
In this paper, we develop a theory for diffusion and flow of pure sub-critical adsorbates in microporous activated carbon over a wide range of pressure, from very low pressure to high pressure where capillary condensation occurs. This theory does not require any fitting parameter; the only information needed for the prediction is the complete pore size distribution of the activated carbon. Various interesting behaviors of permeability versus loading are observed, such as a maximum in permeability at high loading (occurring at about 0.8-0.9 relative pressure). The theory is tested against diffusion and flow of benzene through a commercial activated carbon, and the agreement is found to be very good, especially given that there are no fitting parameters in the model. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
This paper proposes an alternative framework for examining the international macroeconomic impact of domestic monetary and fiscal policies and focuses on the distinction between national spending and national production and on the reactive behavior of foreign investors to changing external account balances. It demonstrates that under a floating exchange rate regime, monetary and fiscal policies can affect aggregate expenditure and output quite differently, with important implications for the behavior of the exchange rate, the current account balance, and national income in the short run, as well as for the economy's price level in the long run. In particular, this paper predicts that expansionary monetary and fiscal policies tend to depreciate the currency and only temporarily raise gross domestic product and the current account surplus, while permanently raising the domestic price level. This is a revised version of a paper presented at the Forty-Ninth International Atlantic Economic Conference, March 14–21, 2000, Munich, Germany.
Abstract:
This article advances the theoretical integration between securitization theory and the framing approach, resulting in a set of criteria hereby called security framing. It seeks to make a twofold contribution: to sharpen the study of the ideational elements that underlie the construction of threats, and to advance towards a greater assessment of the audience's preferences. The case study under examination is the 2011 military intervention of the countries of the Gulf Cooperation Council in Bahrain. The security framing of this case will help illuminate the dynamics at play in one of the most important recent events in Gulf politics.