31 results for dual-factor model


Relevance:

80.00%

Publisher:

Abstract:

The approach proposed here explores the hierarchical nature of item-level data on price changes. On the one hand, price data are naturally organized around a regional structure, with variations observed in separate cities. On the other hand, the items that make up a CPI are also normally interpreted in terms of economically meaningful groups, such as tradables and non-tradables, energy-related items, raw foodstuffs, monitored prices, etc. The hierarchical dynamic factor model allows the estimation of multiple factors that are naturally interpreted as relating to each of these regional and economic levels.
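
The two-level structure described here can be sketched with a sequential principal-components extraction: a common factor from the full panel, then regional factors from the residuals of each city block. The code below is only an illustration on simulated data; the dimensions, the simulated hierarchy, and the PCA-based estimation are assumptions, not the paper's estimator (which is a fully specified hierarchical dynamic factor model).

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_cities, n_items = 120, 4, 10            # months, cities, items per city (hypothetical)

# simulate item-level inflation: one national factor, one factor per city, noise
national = rng.normal(size=T)
city_factors = rng.normal(size=(T, n_cities))
panels = []
for c in range(n_cities):
    load_nat = rng.uniform(0.5, 1.5, n_items)
    load_city = rng.uniform(0.3, 1.0, n_items)
    noise = rng.normal(scale=0.5, size=(T, n_items))
    panels.append(np.outer(national, load_nat) + np.outer(city_factors[:, c], load_city) + noise)

def first_pc(M):
    """First principal component (factor estimate) of a T x N panel."""
    M = M - M.mean(axis=0)
    u, s, vt = np.linalg.svd(M, full_matrices=False)
    return u[:, 0] * s[0]

# level 1: a national factor extracted from the full stacked panel
f_national = first_pc(np.hstack(panels))

# level 2: regional factors from the residuals of each city block on the national factor
f_regional = []
for c in range(n_cities):
    block = panels[c] - panels[c].mean(axis=0)
    beta = np.linalg.lstsq(f_national[:, None], block, rcond=None)[0]
    f_regional.append(first_pc(block - np.outer(f_national, beta)))

print("|corr(national estimate, true national factor)| =",
      round(abs(np.corrcoef(f_national, national)[0, 1]), 2))
```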

Relevance:

80.00%

Publisher:

Abstract:

The objective of this study is to verify whether Multimercado (multimarket) investment funds in Brazil generate significantly positive alphas, that is, whether their managers have skill and contribute positively to the returns of their funds. To compute fund alphas, we use a seven-factor model based mainly on Edwards and Caglayan (2001), with the addition of a stock illiquidity factor. The sample covers 2003 to 2013. We find that, on average, multimarket funds generate negative alpha. However, although the share of funds with positive intercepts is low, their magnitude is substantial. Results differ considerably across Anbima classifications and across the databases used. We also test whether the performance of these funds is persistent, using a non-parametric approach based on contingency tables. We find no evidence of persistence, even when funds are separated by classification.
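
As an illustration of the alpha estimation step, the sketch below regresses a simulated fund's excess returns on seven generic factors by OLS with HAC standard errors. The factor labels and data are placeholders; they are not the Edwards and Caglayan (2001) factors or the paper's sample.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 132                                              # monthly observations, e.g. 2003-2013

factor_names = ["MKT", "SMB", "HML", "MOM", "TERM", "CREDIT", "ILLIQ"]   # hypothetical labels
factors = rng.normal(scale=0.03, size=(T, len(factor_names)))
true_betas = np.array([0.6, 0.2, 0.1, 0.05, 0.3, 0.1, 0.15])
true_alpha = -0.001                                  # slightly negative alpha, for illustration
fund_excess = true_alpha + factors @ true_betas + rng.normal(scale=0.01, size=T)

X = sm.add_constant(factors)
res = sm.OLS(fund_excess, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})
print("alpha estimate:", res.params[0].round(4), "  t-stat:", res.tvalues[0].round(2))
```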

Relevance:

80.00%

Publisher:

Abstract:

Differences-in-Differences (DID) is one of the most widely used identification strategies in applied economics. However, how to draw inferences in DID models when there are few treated groups remains an open question. We show that the usual inference methods used in DID models might not perform well when there are few treated groups and errors are heteroskedastic. In particular, we show that when there is variation in the number of observations per group, inference methods designed to work when there are few treated groups tend to (under-) over-reject the null hypothesis when the treated groups are (large) small relative to the control groups. This happens because larger groups tend to have lower variance, generating heteroskedasticity in the group x time aggregate DID model. We provide evidence from Monte Carlo simulations and from placebo DID regressions with the American Community Survey (ACS) and the Current Population Survey (CPS) datasets to show that this problem is relevant even in datasets with large numbers of observations per group. We then derive an alternative inference method that provides accurate hypothesis testing in situations where there are few treated groups (or even just one) and many control groups in the presence of heteroskedasticity. Our method assumes that we know how the heteroskedasticity is generated, which is the case when it is generated by variation in the number of observations per group. With many pre-treatment periods, we show that this assumption can be relaxed. Instead, we provide an alternative application of our method that relies on assumptions about stationarity and convergence of the moments of the time series. Finally, we consider two recent alternatives to DID when there are many pre-treatment groups. We extend our inference method to linear factor models when there are few treated groups. We also propose a permutation test for the synthetic control estimator that provided a better heteroskedasticity correction in our simulations than the test suggested by Abadie et al. (2010).
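
The heteroskedasticity mechanism invoked here, group-period means of differently sized groups having variances roughly proportional to 1/N_g, can be seen in a short simulation. The group sizes and the sqrt(N_g) rescaling mentioned in the final comment are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
group_sizes = [50, 50, 50, 50, 5000]        # illustrative: several small groups, one large
T, sims = 8, 2000

emp_var = np.zeros(len(group_sizes))
for _ in range(sims):
    for g, N in enumerate(group_sizes):
        # individual-level errors averaged within each group-period cell
        cell_means = rng.normal(size=(T, N)).mean(axis=1)
        emp_var[g] += cell_means.var(ddof=1)
emp_var /= sims

for N, v in zip(group_sizes, emp_var):
    print(f"N = {N:5d}   empirical var of group-period mean = {v:.4f}   1/N = {1/N:.4f}")

# an inference correction in this spirit rescales group-level residuals by sqrt(N_g)
# before comparing the treated group's residual with the control distribution
```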

Relevance:

80.00%

Publisher:

Abstract:

Using a three-factor model proposed by Huse (2007), this study relates observable macroeconomic and financial variables to the term structure of interest rates in Latin American countries (Brazil, Chile, Colombia, and Mexico). We consider the following macroeconomic determinants: the inflation rate, the growth rate of economic activity, the change in the exchange rate, the level of credit default swaps (CDS), the unemployment rate, the nominal interest rate, and global factors (the slope of the U.S. yield curve and changes in commodity indices). The models explain more than 75% of the variation in the cases of Brazil, Chile, and Colombia, and 68% in the case of Mexico. Positive changes in activity and inflation are accompanied, in all countries, by an increase in the yield curve. Increases in CDS, except in Chile, lead to higher long rates. Increases in the unemployment rate, in turn, have different effects across countries. At the same time, exchange-rate depreciations are not accompanied by interest-rate hikes, which may be explained by central banks viewing depreciations as having only transitory effects on inflation. In Mexico, increases in the yield curve are directly related to energy and metals commodity indices. In the Brazilian case, where gasoline prices are regulated and do not feed into inflation, this channel is not relevant. Positive changes in the slope of the U.S. curve have similar effects on Latin American curves, reducing short rates and raising long rates.
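
A minimal sketch of the kind of regression exercise described above: yields at several maturities regressed on observable macro and financial determinants, reporting the R-squared by maturity. All variable names and data below are simulated placeholders rather than the Latin American series used in the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 150
macro = {
    "inflation": rng.normal(size=T), "activity": rng.normal(size=T),
    "fx_change": rng.normal(size=T), "cds": rng.normal(size=T),
    "unemployment": rng.normal(size=T), "us_slope": rng.normal(size=T),
    "commodities": rng.normal(size=T),
}
X = sm.add_constant(np.column_stack(list(macro.values())))

for m in [3, 12, 36, 60, 120]:                 # maturities in months
    # simulated yield: loadings on inflation and activity grow with maturity
    y = (0.5 * macro["inflation"] * np.log1p(m) + 0.3 * macro["activity"]
         + rng.normal(scale=0.5, size=T))
    res = sm.OLS(y, X).fit()
    print(f"maturity {m:3d}m   R2 = {res.rsquared:.2f}")
```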

Relevance:

80.00%

Publisher:

Abstract:

The synthetic control (SC) method has been recently proposed as an alternative to estimate treatment effects in comparative case studies. The SC relies on the assumption that there is a weighted average of the control units that reconstruct the potential outcome of the treated unit in the absence of treatment. If these weights were known, then one could estimate the counterfactual for the treated unit using this weighted average. With these weights, the SC would provide an unbiased estimator for the treatment effect even if selection into treatment is correlated with the unobserved heterogeneity. In this paper, we revisit the SC method in a linear factor model where the SC weights are considered nuisance parameters that are estimated to construct the SC estimator. We show that, when the number of control units is fixed, the estimated SC weights will generally not converge to the weights that reconstruct the factor loadings of the treated unit, even when the number of pre-intervention periods goes to infinity. As a consequence, the SC estimator will be asymptotically biased if treatment assignment is correlated with the unobserved heterogeneity. The asymptotic bias only vanishes when the variance of the idiosyncratic error goes to zero. We suggest a slight modification in the SC method that guarantees that the SC estimator is asymptotically unbiased and has a lower asymptotic variance than the difference-in-differences (DID) estimator when the DID identification assumption is satisfied. If the DID assumption is not satisfied, then both estimators would be asymptotically biased, and it would not be possible to rank them in terms of their asymptotic bias.
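
The sketch below estimates synthetic control weights by constrained least squares on pre-treatment outcomes and also shows a demeaned variant, which is one reading of the kind of modification discussed above; it is not necessarily the paper's exact estimator. The factor structure, sample sizes, and treatment effect are simulated.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
J, T0, T1 = 20, 40, 10                      # control units, pre- and post-treatment periods
factor = rng.normal(size=T0 + T1)
loadings = rng.uniform(0, 2, size=J + 1)    # unit 0 is the treated unit
Y = np.outer(factor, loadings) + rng.normal(scale=0.5, size=(T0 + T1, J + 1))
effect = 1.0
Y[T0:, 0] += effect                         # add a treatment effect after T0

def sc_weights(y1_pre, Y0_pre):
    """Non-negative weights summing to one that best fit the treated pre-treatment path."""
    n_controls = Y0_pre.shape[1]
    obj = lambda w: np.sum((y1_pre - Y0_pre @ w) ** 2)
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    res = minimize(obj, np.full(n_controls, 1.0 / n_controls),
                   bounds=[(0.0, 1.0)] * n_controls, constraints=cons, method="SLSQP")
    return res.x

y1_pre, Y0_pre = Y[:T0, 0], Y[:T0, 1:]
w = sc_weights(y1_pre, Y0_pre)
att_sc = np.mean(Y[T0:, 0] - Y[T0:, 1:] @ w)

# demeaned variant (an assumption about the "slight modification"): match deviations
# from pre-treatment means, then compare demeaned post-treatment outcomes
mu1, mu0 = y1_pre.mean(), Y0_pre.mean(axis=0)
w_dm = sc_weights(y1_pre - mu1, Y0_pre - mu0)
att_dm = np.mean((Y[T0:, 0] - mu1) - (Y[T0:, 1:] - mu0) @ w_dm)

print(f"ATT, standard SC: {att_sc:.2f}   ATT, demeaned SC: {att_dm:.2f}   true effect: {effect:.2f}")
```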

Relevance:

80.00%

Publisher:

Abstract:

Knowledge of the current state of the economy is crucial for policy makers, economists, and analysts. However, a key economic variable, gross domestic product (GDP), is typically collected on a quarterly basis and released with substantial delays by national statistical agencies. The first aim of this paper is to use a dynamic factor model to forecast current Russian GDP using a set of timely monthly information. This approach can cope with the typical data-flow problems of non-synchronous releases, mixed frequencies, and the curse of dimensionality. Given that the Russian economy is largely dependent on the commodity market, our second motivation is to study the effects of innovations in Russian macroeconomic fundamentals on commodity price predictability. We identify these innovations through a news index, which summarizes deviations of official data releases from the expectations generated by the DFM, and perform a forecasting exercise comparing the performance of different models.
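
A heavily simplified nowcasting sketch in this spirit: a single factor is extracted from monthly indicators by principal components and bridged to quarterly GDP growth by OLS. A full dynamic factor model would handle ragged edges and mixed frequencies with a Kalman filter; that machinery is omitted and all data are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n_months, n_series = 120, 15
f = rng.normal(size=n_months)                      # latent monthly state
monthly = np.outer(f, rng.uniform(0.5, 1.5, n_series)) + rng.normal(scale=0.7, size=(n_months, n_series))

# monthly common factor via the first principal component of the standardized panel
Z = (monthly - monthly.mean(axis=0)) / monthly.std(axis=0)
u, s, vt = np.linalg.svd(Z, full_matrices=False)
factor_m = u[:, 0] * s[0]

# bridge equation: quarterly GDP growth regressed on the quarterly average of the factor
factor_q = factor_m.reshape(-1, 3).mean(axis=1)               # 40 quarters
gdp_q = 0.8 * f.reshape(-1, 3).mean(axis=1) + rng.normal(scale=0.3, size=40)
bridge = sm.OLS(gdp_q[:-1], sm.add_constant(factor_q[:-1])).fit()

# nowcast the latest quarter from the already-available monthly factor
nowcast = bridge.predict(np.array([[1.0, factor_q[-1]]]))[0]
print(f"nowcast of latest quarter: {nowcast:.2f}   realized: {gdp_q[-1]:.2f}")
```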

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the income inequality generated by a job-search process when different cohorts of homogeneous workers are allowed to have different degrees of impatience. Using the fact that the average wage under the invariant Markovian distribution is a decreasing function of the discount factor (Cysne (2004, 2006)), I show that the Lorenz curve and the between-cohort Gini coefficient of income inequality can be easily derived in this case. An example with arbitrary measures regarding the wage offers and the distribution of time preferences among cohorts provides some insights into how much income inequality can be generated, and into how it varies as a function of the probability of unemployment and of the probability that the worker does not find a job offer each period.
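
The between-cohort objects mentioned above can be illustrated with a short worked example: given each cohort's population share and its average wage under the invariant distribution, the Lorenz curve and the between-cohort Gini coefficient follow directly. The cohort shares and wages below are invented for illustration.

```python
import numpy as np

shares = np.array([0.25, 0.25, 0.25, 0.25])     # cohort population shares (hypothetical)
avg_wage = np.array([1.0, 1.3, 1.8, 2.5])       # cohort average wages, sorted ascending

income_share = shares * avg_wage / np.sum(shares * avg_wage)
L = np.concatenate([[0.0], np.cumsum(income_share)])    # Lorenz curve ordinates
P = np.concatenate([[0.0], np.cumsum(shares)])          # cumulative population shares

# Gini = 1 - 2 * area under the Lorenz curve (trapezoidal rule over cohorts)
gini = 1.0 - np.sum((P[1:] - P[:-1]) * (L[1:] + L[:-1]))
print(f"between-cohort Gini = {gini:.3f}")
```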

Relevance:

30.00%

Publisher:

Abstract:

Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, risk management, and the study of monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have mainly been adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage affects forecasting. We construct cross-section (allowing arbitrages) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
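
A minimal sketch of the forecast evaluation described above: bias and root mean square error of two competing yield forecasts compared against realized yields. The forecast series are simulated stand-ins for the cross-section and arbitrage-free model versions.

```python
import numpy as np

rng = np.random.default_rng(8)
realized = rng.normal(loc=4.0, scale=0.5, size=60)                    # realized yields, %
forecast_cs = realized + rng.normal(loc=0.15, scale=0.30, size=60)    # "cross-section" version
forecast_af = realized + rng.normal(loc=0.05, scale=0.20, size=60)    # "arbitrage-free" version

def bias_rmse(forecast, actual):
    err = forecast - actual
    return err.mean(), np.sqrt(np.mean(err ** 2))

for name, fc in [("cross-section", forecast_cs), ("arbitrage-free", forecast_af)]:
    b, r = bias_rmse(fc, realized)
    print(f"{name:>15}:  bias = {b:+.3f}   RMSE = {r:.3f}")
```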

Relevance:

30.00%

Publisher:

Abstract:

We develop and calibrate a model where differences in factor endowments lead countries to trade intermediate goods, and gains from trade are reflected in total factor productivity. We perform several output and growth decompositions to assess the impact that barriers to trade, as well as changes in the terms of trade, have on measured TFP. We find that for very poor economies gains from trade are large, in some cases representing a doubling of GDP. We also find that an improvement in the terms of trade, by allowing the use of a better mix of intermediate inputs in the production process, translates into productivity growth.
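
The terms-of-trade channel described above can be illustrated with a stylized CES example: better terms of trade let a given amount of exported labor buy more imported intermediates, which raises output per unit of labor and hence measured TFP. The functional form and parameter values are arbitrary assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rho, share = 0.5, 0.5        # CES curvature and weight, hypothetical values

def measured_tfp(tot):
    """Maximized CES output with one unit of labor: a fraction x of labor produces the
    domestic intermediate, the rest produces exports swapped for imports at terms of
    trade `tot` (imported units per exported unit). Output per worker = measured TFP."""
    neg_ces = lambda x: -((share * x ** rho + (1 - share) * (tot * (1 - x)) ** rho) ** (1 / rho))
    res = minimize_scalar(neg_ces, bounds=(1e-6, 1 - 1e-6), method="bounded")
    return -res.fun

for tot in [0.5, 1.0, 1.5]:
    print(f"terms of trade {tot:.1f}  ->  measured TFP {measured_tfp(tot):.3f}")
```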

Relevance:

30.00%

Publisher:

Abstract:

We develop and calibrate a model where differences in factor endowments lead countries to trade different goods, so that the existence of international trade changes the sectoral composition of output from one country to another. Gains from trade are reflected in total factor productivity. We perform a development decomposition to assess the impact of trade and barriers to trade on measured TFP. In our sample, the median size of that effect is about 6.5% of output, with a median of 17% and a maximum of 89%. The model also predicts that changes in the terms of trade cause a change in productivity, and that effect has an average elasticity of 0.71.
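
A tiny numerical illustration of the development-decomposition idea: the log output gap between two countries is split into a trade-related TFP component and a residual. The numbers are invented and only show the arithmetic, not the paper's results.

```python
import numpy as np

y_rich, y_poor = 1.00, 0.20                    # relative output per worker (hypothetical)
tfp_trade_rich, tfp_trade_poor = 1.00, 0.85    # trade-related TFP component (hypothetical)

gap = np.log(y_rich / y_poor)                  # total log output gap
gap_trade = np.log(tfp_trade_rich / tfp_trade_poor)
print(f"share of the output gap attributed to trade barriers: {gap_trade / gap:.1%}")
```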

Relevance:

30.00%

Publisher:

Abstract:

In this paper we construct common-factor portfolios using a novel linear transformation of standard factor models extracted from large data sets of asset returns. The simple transformation proposed here keeps the basic properties of the usual factor transformations, while attaching some new and interesting properties to them. Some theoretical advantages are shown to be present. Their practical importance is also confirmed in two applications: the performance of common-factor portfolios is shown to be superior to that of asset returns and factors commonly employed in the finance literature.
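
One simple way to make statistical factors tradable, offered here only as an illustrative guess at what a portfolio-forming linear transformation could look like, is to rescale each principal-component weight vector so its weights sum to one. The sketch below checks that the resulting portfolios span the same factors; it is not necessarily the transformation proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(10)
T, N, k = 500, 50, 3
common = rng.normal(size=(T, k))
loadings = rng.normal(size=(N, k))
returns = common @ loadings.T + rng.normal(scale=0.5, size=(T, N))

R = returns - returns.mean(axis=0)
u, s, vt = np.linalg.svd(R, full_matrices=False)
pc_weights = vt[:k]                                        # k x N eigenvector weight rows

# rescale each weight vector so it sums to one, i.e. it defines a portfolio
# (this assumes no eigenvector has weights summing to nearly zero)
portfolio_weights = pc_weights / pc_weights.sum(axis=1, keepdims=True)
factor_portfolios = returns @ portfolio_weights.T          # T x k portfolio returns

pc_scores = R @ vt[:k].T
for j in range(k):
    c = np.corrcoef(factor_portfolios[:, j], pc_scores[:, j])[0, 1]
    print(f"portfolio {j}: weights sum to {portfolio_weights[j].sum():.3f}, "
          f"|corr with PC score| = {abs(c):.3f}")
```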

Relevance:

30.00%

Publisher:

Abstract:

We extend the standard price discovery analysis to estimate the information share of dual-class shares across domestic and foreign markets. By examining both common and preferred shares, we aim to extract information not only about the fundamental value of the firm, but also about the dual-class premium. In particular, our interest lies in the price discovery mechanism regulating the prices of common and preferred shares in the BM&FBovespa as well as the prices of their ADR counterparts in the NYSE and in the Arca platform. However, in the presence of contemporaneous correlation between the innovations, the standard information share measure depends heavily on the ordering we attribute to prices in the system. To remain agnostic about which are the leading share class and market, one could for instance compute some weighted average information share across all possible orderings. This is extremely inconvenient given that we are dealing with 2 share prices in Brazil, 4 share prices in the US, plus the exchange rate (and hence over 5,000 permutations!). We thus develop a novel methodology to carry out price discovery analyses that does not impose any ex-ante assumption about which share class or trading platform conveys more information about shocks in the fundamental price. As such, our procedure yields a single measure of information share, which is invariant to the ordering of the variables in the system. Simulations of a simple market microstructure model show that our information share estimator works well in practice. We then employ transactions data to study price discovery in two dual-class Brazilian stocks and their ADRs. We uncover two interesting findings. First, the foreign market is at least as informative as the home market. Second, shocks in the dual-class premium have a permanent effect in normal times, but a transitory one in periods of financial distress. We argue that the latter is consistent with the expropriation of preferred shareholders as a class.
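
The ordering problem described above can be made concrete with the standard Hasbrouck information-share formula, IS_j = ([psi F]_j)^2 / (psi Omega psi'), where Omega is the covariance of the reduced-form innovations, F its Cholesky factor, and psi the long-run (common-trend) weights. The two-market numbers below are hypothetical and only show how the shares move with the ordering; they do not implement the paper's ordering-invariant measure.

```python
import numpy as np

psi = np.array([0.6, 0.4])                   # hypothetical long-run impact weights
Omega = np.array([[1.00, 0.70],
                  [0.70, 1.20]])             # hypothetical innovation covariance

def info_shares(psi, Omega, order):
    """Hasbrouck information shares under a given Cholesky ordering of the prices."""
    idx = list(order)
    F = np.linalg.cholesky(Omega[np.ix_(idx, idx)])        # lower-triangular factor
    contrib = (psi[idx] @ F) ** 2                          # orthogonalized contributions
    shares = contrib / (psi @ Omega @ psi)
    out = np.empty(len(psi))
    out[idx] = shares                                      # map back to original positions
    return out

is_12 = info_shares(psi, Omega, (0, 1))      # price 1 first: upper bound for its share
is_21 = info_shares(psi, Omega, (1, 0))      # price 2 first: lower bound for price 1
print("ordering (1,2):", is_12.round(3))
print("ordering (2,1):", is_21.round(3))
print("average across orderings:", ((is_12 + is_21) / 2).round(3))
```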

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes that firms issuing only voting shares use more debt financing than firms issuing both voting and non-voting shares. The study demonstrates the relevance of relating leverage to whether or not a firm issues non-voting shares, in light of the main capital structure theories and the Brazilian setting. Since the theoretical models that explain firms' leverage still lack explanatory power, the search for new determinants remains an active theme in the capital structure literature. The issuance of shares in separate classes (dual-class) as a factor affecting leverage was analyzed from three angles: the market as a whole, individual sectors, and firms that unified their share classes. All three investigations support the view that leverage is lower when preferred (non-voting) shares are issued, given Brazil's trading and regulatory environment. Accepting the thesis has theoretical implications, identifying a factor that should be taken into account in capital structure models, and highlights the importance for managers, investors, and creditors of recognizing that a firm's being dual-class affects not only its control structure but, above all, its capital structure. Among the implications of accepting the thesis is the recognition that firms joining the Novo Mercado are, in practice and over the long run, replacing the use of preferred shares as a form of financing with debt issuance.
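
A minimal sketch of the kind of cross-sectional test implied by the thesis: regress a leverage measure on a dual-class indicator plus standard capital-structure controls. The variable names, controls, and simulated data are placeholders, not the thesis's sample or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(12)
n = 300
df = pd.DataFrame({
    "dual_class": rng.integers(0, 2, n),         # 1 = issues non-voting (preferred) shares
    "size": rng.normal(6, 1, n),                 # log assets
    "tangibility": rng.uniform(0, 1, n),
    "profitability": rng.normal(0.08, 0.05, n),
})
# simulated leverage: lower for dual-class firms, as the thesis argues
df["leverage"] = (0.35 - 0.06 * df["dual_class"] + 0.02 * df["size"]
                  + 0.10 * df["tangibility"] - 0.5 * df["profitability"]
                  + rng.normal(0, 0.05, n))

X = sm.add_constant(df[["dual_class", "size", "tangibility", "profitability"]])
res = sm.OLS(df["leverage"], X).fit(cov_type="HC1")
print(res.params.round(3))
```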

Relevance:

30.00%

Publisher:

Abstract:

External debt service requires a dual resource transfer. Trade surpluses have to be generated in order to make foreign exchange revenues available for debt repayment. In addition, with developing countries' external debt being largely a public liability, debt service requires that resources can be effectively transferred from the private to the public sector. This paper derives a statistical model for dealing with dual constraints in the presence of binary dependent variables and applies it to the dual resource transfer problem. The results from the estimation of the model for a sample of 31 middle-income developing countries in the period from 1980 to 1990 strongly support the hypothesis that both external and fiscal constraints are important in explaining external debt service disruptions.
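
A hedged sketch of a dual-constraint binary model in this spirit: a disruption is observed when either of two latent constraints binds, so P(y=1|x) = 1 - Phi2(x'b1, x'b2; rho), a bivariate-probit-type likelihood. The regressors, sample size, and parameter values are illustrative, not the paper's specification.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import minimize

rng = np.random.default_rng(14)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])          # constant + one regressor
b1_true, b2_true, rho_true = np.array([0.5, 1.0]), np.array([0.2, -0.8]), 0.3

eps = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
# a disruption occurs if either the external or the fiscal constraint binds
y = ((X @ b1_true + eps[:, 0] > 0) | (X @ b2_true + eps[:, 1] > 0)).astype(int)

def negloglik(theta):
    b1, b2, rho = theta[:2], theta[2:4], np.tanh(theta[4])      # tanh keeps rho in (-1, 1)
    cov = [[1.0, rho], [rho, 1.0]]
    # P(y = 0 | x) = P(both latent indices non-positive) = Phi2(-x'b1, -x'b2; rho)
    p0 = np.array([multivariate_normal.cdf([-x @ b1, -x @ b2], mean=[0, 0], cov=cov) for x in X])
    p0 = np.clip(p0, 1e-10, 1 - 1e-10)
    return -np.sum(np.where(y == 0, np.log(p0), np.log(1 - p0)))

# note: with symmetric regressors the two constraint equations are only identified
# up to relabeling; this is a small illustrative maximum-likelihood fit
res = minimize(negloglik, np.zeros(5))
print("estimated b1:", res.x[:2].round(2), " b2:", res.x[2:4].round(2),
      " rho:", np.tanh(res.x[4]).round(2))
```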

Relevance:

30.00%

Publisher:

Abstract:

In this article we use factor models to describe a certain class of covariance structures for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings in the factor model structure to be time-varying and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions of the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
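
A simulation sketch of the model class discussed here: a single latent factor with stochastic volatility and slowly time-varying loadings driving a small panel of returns. Parameter values are arbitrary and estimation (typically by MCMC) is not attempted.

```python
import numpy as np

rng = np.random.default_rng(15)
T, N = 1000, 3                       # days, number of series

# log-variance of the factor follows an AR(1) (stochastic volatility)
h = np.zeros(T)
for t in range(1, T):
    h[t] = -0.1 + 0.95 * h[t - 1] + 0.2 * rng.normal()
factor = np.exp(h / 2) * rng.normal(size=T)

# loadings follow independent random walks (time-varying factor loadings)
loadings = np.zeros((T, N))
loadings[0] = [1.0, 0.8, 0.6]
for t in range(1, T):
    loadings[t] = loadings[t - 1] + 0.01 * rng.normal(size=N)

returns = loadings * factor[:, None] + 0.1 * rng.normal(size=(T, N))

# implied time-varying covariance between series 0 and 1
implied_cov = loadings[:, 0] * loadings[:, 1] * np.exp(h)
print("sample corr(series 0, series 1):", np.corrcoef(returns[:, 0], returns[:, 1])[0, 1].round(2))
print("average implied covariance:", implied_cov.mean().round(3))
```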