9 results for decomposition analysis

in the Fundação Getúlio Vargas (FGV) digital repository


Relevance:

60.00%

Publisher:

Abstract:

This paper investigates the degree of short-run and long-run co-movement in U.S. sectoral output data by estimating sectoral trends and cycles. A theoretical model based on Long and Plosser (1983) is used to derive a reduced form for sectoral output from first principles. Cointegration and common features (cycles) tests are performed; sectoral output data seem to share a relatively high number of common trends and a relatively low number of common cycles. A special trend-cycle decomposition of the data set is performed, and the results indicate very similar cyclical behavior across sectors and very different behavior for trends. Indeed, sectors' cyclical components appear to move as one. In a variance decomposition analysis, prominent sectors such as Manufacturing and Wholesale/Retail Trade exhibit relatively important transitory shocks.
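The paper's trend-cycle decomposition is model-based and not reproduced in the abstract. Purely as an illustration of splitting a series into a trend and a cyclical component, a minimal OLS-detrending sketch on synthetic data (the linear-trend form and the data are assumptions, not the paper's method):

```python
import numpy as np

def detrend(y):
    """Split a series into a fitted linear trend and a residual 'cycle'.

    Generic OLS detrending; the paper's own trend-cycle decomposition is
    model-based and not reproduced here.
    """
    t = np.arange(len(y), dtype=float)
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    trend = X @ beta
    return trend, y - trend

# Hypothetical series: linear trend + cycle + noise
rng = np.random.default_rng(0)
t = np.arange(200, dtype=float)
y = 0.5 * t + 2.0 * np.sin(2 * np.pi * t / 40) + rng.normal(0.0, 0.3, 200)
trend, cycle = detrend(y)
# trend + cycle reconstructs y, and the cycle is mean-zero by construction
```

Because the regression includes an intercept, the residual "cycle" has mean zero by construction, mirroring the idea that cyclical components are transitory deviations from trend.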

Relevance:

60.00%

Publisher:

Abstract:

From a methodological point of view, this paper makes two contributions to the literature. One contribution is the proposal of a new measure of pro-poor growth. This new measure provides the linkage between growth rates in mean income and in income inequality. In this context, growth is defined as pro-poor (or anti-poor) if there is a gain (or loss) in the growth rate due to a decrease (or increase) in inequality. The other contribution is a decomposition methodology that explores linkages between growth patterns and social policies. Through the decomposition analysis, we assess the contribution of different income sources to growth patterns. The proposed methodologies are then applied to the Brazilian National Household Survey (PNAD) covering the period 1995-2004. The paper analyzes the evolution of Brazilian social indicators based on per capita income, exploring links with adverse labour market performance and social policy change, with particular emphasis on the expansion of targeted cash transfers and the design of more pro-poor social security benefits.
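The abstract does not reproduce the measure itself. As an illustrative stand-in, the sketch below classifies growth as pro-poor when an inequality-adjusted (Sen-type) welfare measure grows faster than mean income; the welfare form mu*(1-Gini) and the toy income vectors are assumptions, not the paper's formula:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a nonnegative income vector."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return (2.0 * (ranks * x).sum()) / (n * x.sum()) - (n + 1) / n

def pro_poor_gain(y0, y1):
    """Illustrative pro-poor gain: log-growth of Sen welfare mu*(1-G)
    minus log-growth of mean income. Positive => inequality fell
    (pro-poor); negative => inequality rose (anti-poor). A stand-in for
    the paper's measure, which the abstract does not spell out."""
    g_mean = np.log(np.mean(y1)) - np.log(np.mean(y0))
    w0 = np.mean(y0) * (1.0 - gini(y0))
    w1 = np.mean(y1) * (1.0 - gini(y1))
    return (np.log(w1) - np.log(w0)) - g_mean

y0 = np.array([1.0, 2.0, 3.0, 10.0])
y1 = 1.1 * np.array([2.0, 3.0, 4.0, 9.0])  # mean grows, inequality falls
gain = pro_poor_gain(y0, y1)
# gain > 0 here: growth came with falling inequality, i.e. pro-poor
```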

Relevance:

60.00%

Publisher:

Abstract:

From a methodological point of view, this paper makes two contributions to the literature. One contribution is the proposal of a new measure of pro-poor growth. This new measure provides the linkage between growth rates in mean income and in income inequality. In this context, growth is defined as pro-poor (or anti-poor) if there is a gain (or loss) in the growth rate due to a decrease (or increase) in inequality. The other contribution is a decomposition methodology that explores linkages between growth patterns and labour market performance. Through the decomposition analysis, growth in per capita income is explained in terms of four labour market components: the employment rate, hours of work, the labour force participation rate, and productivity. The proposed methodology is then applied to the Brazilian National Household Survey (PNAD) covering the period 1995-2004. The paper analyzes the evolution of Brazilian social indicators based on per capita income, exploring links with adverse labour market performance.
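The four-component decomposition follows from the accounting identity Y/N = (Y/H)(H/E)(E/L)(L/N): productivity, hours per worker, employment rate, and participation rate. A minimal sketch with hypothetical figures, in which the four log-growth contributions sum exactly to per capita income growth:

```python
import numpy as np

# Identity: Y/N = (Y/H) * (H/E) * (E/L) * (L/N)
# productivity x hours per worker x employment rate x participation rate.
# Y = output, H = total hours, E = employed, L = labour force, N = population.
# Figures below are hypothetical; the paper applies this to PNAD data.

def decompose(p0, p1):
    """Log-growth contribution of each labour market component; the
    contributions sum exactly to per capita income growth."""
    def comps(p):
        return {
            "productivity": p["Y"] / p["H"],
            "hours_per_worker": p["H"] / p["E"],
            "employment_rate": p["E"] / p["L"],
            "participation_rate": p["L"] / p["N"],
        }
    c0, c1 = comps(p0), comps(p1)
    return {k: float(np.log(c1[k]) - np.log(c0[k])) for k in c0}

p0 = dict(Y=100.0, H=50.0, E=25.0, L=30.0, N=60.0)
p1 = dict(Y=115.0, H=52.0, E=27.0, L=31.0, N=61.0)
contrib = decompose(p0, p1)
total = sum(contrib.values())
# total equals log((Y1/N1) / (Y0/N0)): the decomposition is exact
```

The exactness is the point of working in logs: the product identity becomes an additive decomposition, so no interaction term is left over.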

Relevance:

60.00%

Publisher:

Abstract:

From a methodological point of view, this paper makes two contributions to the literature. One contribution is the proposal of a new measure of pro-poor growth. This new measure provides the linkage between growth rates in mean income and in income inequality. In this context, growth is defined as pro-poor (or anti-poor) if there is a gain (or loss) in the growth rate due to a decrease (or increase) in inequality. The other contribution is a decomposition methodology that explores linkages between three dimensions: growth patterns, labour market performance, and social policies. Through the decomposition analysis, growth in per capita income is explained in terms of four labour market components: the employment rate, hours of work, the labour force participation rate, and productivity. We also assess the contribution of different non-labour income sources to growth patterns. The proposed methodologies are then applied to the Brazilian National Household Survey (PNAD) covering the period 1995-2004. The paper analyzes the evolution of Brazilian social indicators based on per capita income, exploring links with adverse labour market performance and social policy change, with particular emphasis on the expansion of targeted cash transfers and the design of more pro-poor social security benefits.
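The source-level part of the analysis can be sketched with the additive identity that mean income is the sum of source means, so each source's share of growth is its change over the total change. The income sources and figures below are hypothetical, not PNAD values:

```python
# Illustrative decomposition of mean income growth by income source:
# with mean income mu = sum_k mu_k over sources, absolute growth splits
# additively as d(mu) = sum_k d(mu_k), so each source contributes the
# share d(mu_k) / d(mu). All figures are hypothetical.

sources0 = {"labour": 500.0, "cash_transfers": 20.0, "social_security": 80.0}
sources1 = {"labour": 510.0, "cash_transfers": 45.0, "social_security": 95.0}

d_total = sum(sources1.values()) - sum(sources0.values())
contrib = {k: (sources1[k] - sources0[k]) / d_total for k in sources0}
# shares sum to 1.0; in this toy case cash transfers drive half the gain
```

Because the identity is additive in levels, the shares sum to one exactly; here growth in mean income is 50, of which cash transfers contribute 25, i.e. a 0.5 share.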

Relevance:

40.00%

Publisher:

Abstract:

We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We prove a formula for the computation of the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. We prove the almost sure convergence of these decomposition methods when the relatively complete recourse assumption holds. We also prove the almost sure convergence of these algorithms when applied to risk-averse multistage stochastic linear programs that do not satisfy the relatively complete recourse assumption. The analysis is first done assuming the underlying stochastic process is interstage independent and discrete, with a finite set of possible realizations at each stage. We then indicate two ways of extending the methods and convergence analysis to the case when the process is interstage dependent.
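The cut formula proved in the paper is specific to its risk-averse convex setting. The sketch below only illustrates the general outer-linearization idea on a toy one-dimensional recourse function: a subgradient of the expected recourse cost yields a cut that supports the function from below, exact at the point where it was generated. The toy problem and numbers are assumptions:

```python
import numpy as np

# Toy recourse function  Q(x) = E[ c * max(d - x, 0) ]  with discrete
# demand d. Q is convex and piecewise linear; a subgradient at x0 is
#   g = E[ -c * 1{d > x0} ],
# giving the cut  Q(x) >= Q(x0) + g * (x - x0).
# This is the generic cut idea, not the paper's risk-averse formula.

c = 3.0
demands = np.array([2.0, 4.0, 6.0])   # equiprobable scenarios (hypothetical)
probs = np.full(3, 1.0 / 3.0)

def Q(x):
    """Expected second-stage cost at first-stage decision x."""
    return float(probs @ (c * np.maximum(demands - x, 0.0)))

def cut_at(x0):
    """Build the supporting cut of Q generated at x0."""
    g = float(probs @ (-c * (demands > x0)))   # subgradient of Q at x0
    q0 = Q(x0)
    return lambda x: q0 + g * (x - x0)

cut = cut_at(3.0)
# cut(x) <= Q(x) everywhere, with equality at x0 = 3.0
```

Decomposition methods of this family accumulate such cuts to build an outer (lower) polyhedral approximation of the recourse function, refining it where candidate solutions visit.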

Relevance:

30.00%

Publisher:

Abstract:

This paper tackles the problem of aggregate TFP measurement using stochastic frontier analysis (SFA). Data from Penn World Table 6.1 are used to estimate a world production frontier for a sample of 75 countries over a long period (1950-2000), taking advantage of the model offered by Battese and Coelli (1992). We also apply the decomposition of TFP suggested by Bauer (1990) and Kumbhakar (2000) to a smaller sample of 36 countries over the period 1970-2000 in order to evaluate the effects of changes in efficiency (technical and allocative), scale effects, and technical change. This allows us to analyze the role of productivity and its components in the economic growth of developed and developing nations, in addition to the importance of factor accumulation. Although not much explored in the study of economic growth, frontier techniques seem to be of particular interest for that purpose, since the separation of efficiency effects and technical change has a direct interpretation in terms of the catch-up debate. The estimated technical efficiency scores reveal the efficiency of nations in the production of non-tradable goods, since the GDP series used is PPP-adjusted. We also provide a second set of efficiency scores, corrected in order to reveal efficiency in the production of tradable goods, and rank them. When compared to the rankings of productivity indexes offered by the non-frontier studies of Hall and Jones (1996) and Islam (1995), our ranking shows a somewhat more intuitive order of countries. Rankings of the technical change and scale effects components of TFP change are also very intuitive. We also show that productivity is responsible for virtually all the differences in performance between developed and developing countries in terms of rates of growth of income per worker. More important, we find that changes in allocative efficiency play a crucial role in explaining differences in the productivity of developed and developing nations, even larger than the role played by the technology gap.
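The frontier-based split of TFP change into efficiency, scale, and technical-change components cannot be compressed into a few lines. As a generic first step only, the sketch below computes aggregate TFP growth as a Solow residual under an assumed Cobb-Douglas capital share; the figures are hypothetical and this is not the paper's SFA estimation:

```python
import numpy as np

# Solow-residual TFP growth under Cobb-Douglas with capital share alpha:
#   dln(TFP) = dln(y) - alpha * dln(k) - (1 - alpha) * dln(l)
# The paper goes further, splitting TFP change (Bauer 1990; Kumbhakar 2000)
# into technical change, efficiency change, and scale effects via a
# stochastic frontier. alpha and all figures below are assumptions.

alpha = 1.0 / 3.0  # assumed capital share

def tfp_growth(y0, y1, k0, k1, l0, l1):
    """Log TFP growth as the part of output growth not explained by inputs."""
    return float(np.log(y1 / y0)
                 - alpha * np.log(k1 / k0)
                 - (1.0 - alpha) * np.log(l1 / l0))

g = tfp_growth(y0=100.0, y1=106.0, k0=300.0, k1=309.0, l0=50.0, l1=50.5)
# output grew ~5.8% in logs; inputs explain part of it; the residual is TFP
```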

Relevance:

30.00%

Publisher:

Abstract:

Despite the commonly held belief that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of this feature of the data. We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions. This is due to possible differences in the lag-lengths chosen by model selection criteria for the two alternative models. Second, we show that the costs of ignoring common cyclical features in vector autoregressive modelling can be high, both in terms of forecast accuracy and efficient estimation of variance decomposition coefficients. Third, we find that the Hannan-Quinn criterion performs best among model selection criteria in simultaneously selecting the lag-length and rank of vector autoregressions.
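The lag-selection step the paper studies can be sketched directly. The sketch below selects a VAR lag length by the standard multivariate Hannan-Quinn criterion, HQ(p) = ln det(Sigma_hat) + 2k ln(ln T)/T, on hypothetical VAR(1) data; it does not impose the common-cycle (rank) restrictions that are the paper's focus:

```python
import numpy as np

def hq_lag(Y, pmax):
    """Select the VAR lag length by the Hannan-Quinn criterion,
    HQ(p) = ln det(Sigma_hat) + 2*k*ln(ln T)/T, over a common sample.
    Y is (T, n); k counts all estimated coefficients incl. intercepts."""
    T_full, n = Y.shape
    scores = {}
    for p in range(1, pmax + 1):
        # Regressors [1, y_{t-1}, ..., y_{t-p}], same sample for every p
        X = np.array([np.concatenate([[1.0], Y[t - p:t][::-1].ravel()])
                      for t in range(pmax, T_full)])
        Yt = Y[pmax:]
        T = Yt.shape[0]
        B, *_ = np.linalg.lstsq(X, Yt, rcond=None)   # equation-by-equation OLS
        U = Yt - X @ B
        Sigma = (U.T @ U) / T                        # residual covariance
        k = n * (n * p + 1)
        scores[p] = float(np.log(np.linalg.det(Sigma))
                          + 2.0 * k * np.log(np.log(T)) / T)
    return min(scores, key=scores.get), scores

# Hypothetical VAR(1) data just to exercise the selection
rng = np.random.default_rng(0)
T, n = 300, 2
A = np.array([[0.5, 0.1], [0.0, 0.4]])
Y = np.zeros((T, n))
for t in range(1, T):
    Y[t] = A @ Y[t - 1] + rng.normal(size=n)
best, scores = hq_lag(Y, pmax=4)
```

Fitting all candidate lag lengths on the same effective sample (dropping the first pmax observations throughout) keeps the criterion values comparable across p.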

Relevance:

30.00%

Publisher:

Abstract:

Despite the belief, supported by recent applied research, that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of these data "features." We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive and error-correction models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions, due to the use of information criteria for choosing the lag order of the two alternative models. Second, we show that the costs of ignoring common-cyclical features in VAR analysis may be high in terms of forecasting accuracy and efficiency of estimates of variance decomposition coefficients. Although these costs are more pronounced when the lag order of VAR models is known, they are also non-trivial when it is selected using the conventional tools available to applied researchers. Third, we find that if the data have common-cyclical features and the researcher wants to use an information criterion to select the lag length, the Hannan-Quinn criterion is the most appropriate, since the Akaike and the Schwarz criteria have a tendency to over- and under-predict the lag length, respectively, in our simulations.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we investigate the small-sample properties and robustness of parameter estimates of DSGE models. We take the Smets and Wouters (2007) model as a baseline and evaluate the performance of two estimation procedures: the Simulated Method of Moments (SMM) and Maximum Likelihood (ML). We examine the empirical distribution of the parameter estimates and its implications for impulse-response and variance decomposition analyses under both correct specification and misspecification. Our results point to poor performance of SMM and to some bias patterns in impulse-response and variance decomposition analyses with ML estimates in the misspecification cases considered.
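The simulated-moments idea being evaluated can be illustrated on a toy AR(1) instead of a DSGE model: choose the parameter whose simulated moment best matches the data moment. Everything below (the moment, the grid, the data) is an assumption for illustration, not the paper's SMM setup:

```python
import numpy as np

# Toy SMM: estimate rho in  y_t = rho * y_{t-1} + eps_t  by matching the
# lag-1 autocorrelation of a simulated series to that of the "observed"
# data, over a parameter grid. Common random numbers (a fixed simulation
# seed) are used across the grid so the objective varies smoothly in rho.

def acf1(y):
    """Lag-1 autocorrelation of a series."""
    y = y - y.mean()
    return float((y[1:] * y[:-1]).sum() / (y * y).sum())

def simulate_ar1(rho, T, seed):
    rng = np.random.default_rng(seed)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

data = simulate_ar1(0.6, 2000, seed=1)   # stand-in for observed data
m_data = acf1(data)

grid = np.linspace(0.0, 0.9, 91)
losses = [(acf1(simulate_ar1(r, 2000, seed=2)) - m_data) ** 2 for r in grid]
rho_hat = float(grid[int(np.argmin(losses))])
# rho_hat should land near the true value 0.6, up to simulation noise
```

The paper's point is precisely that this procedure, while conceptually simple, can behave poorly in small samples relative to maximum likelihood.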