847 results for Excess returns


Relevance: 60.00%

Abstract:

The current study applies a two-state switching regression model to examine the behavior of a hypothetical portfolio of ten socially responsible (SRI) equity mutual funds during the expansion and contraction phases of US business cycles between April 1991 and June 2009, based on the Carhart four-factor model and using monthly data. The model identified a business-cycle effect on the performance of SRI equity mutual funds. Fund returns were less volatile during expansions/peaks than during contractions/troughs, as indicated by the standard deviation of returns. During contractions/troughs, fund excess returns were explained by the return differential between small and large companies, the difference between the returns on stocks trading at high and low book-to-market value, the market excess return over the risk-free rate, and fund objective. During contractions/troughs, smaller companies offered higher returns than larger companies (c_i = 0.26, p = 0.01), undervalued stocks outperformed high-growth stocks (h_i = 0.39, p < 0.0001), and funds with growth objectives outperformed funds with other objectives (o_i = 0.01, p = 0.02). The hypothetical SRI portfolio was less risky than the market (b_i = 0.74, p < 0.0001). During expansions/peaks, fund excess returns were explained by the market excess return over the risk-free rate and fund objective. Funds with other objectives, such as balanced and income funds, outperformed funds with growth objectives (o_i = −0.01, p = 0.03). The hypothetical SRI portfolio exhibited risk similar to that of the market (b_i = 0.93, p < 0.0001). The SRI investor adds a third criterion to the risk-return trade-off of traditional portfolio theory: social performance. The research suggests that managers of SRI equity mutual funds may diminish value by using social and ethical criteria to select stocks, but add value through superior stock selection. The result is that the performance of SRI mutual funds is very similar to that of the market. There was no difference in the value added among secular SRI, religious SRI, and vice screens.
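The kind of factor regression the abstract reports can be sketched as a single-regime Carhart four-factor OLS fit. Everything below is synthetic and illustrative (the variable names, loadings, and sample size are assumptions, and the two-state switching structure is omitted):

```python
import numpy as np

# Synthetic monthly data: a fund's excess returns regressed on the four
# Carhart factors (market excess return, SMB, HML, momentum).
rng = np.random.default_rng(0)
n = 219  # months, roughly April 1991 - June 2009
mkt = rng.normal(0.005, 0.04, n)   # market excess return
smb = rng.normal(0.001, 0.02, n)   # small-minus-big
hml = rng.normal(0.002, 0.02, n)   # high-minus-low book-to-market
mom = rng.normal(0.004, 0.03, n)   # momentum
eps = rng.normal(0.0, 0.01, n)

# "True" loadings used to generate the fund's excess return
r_fund = 0.001 + 0.74 * mkt + 0.26 * smb + 0.39 * hml + 0.05 * mom + eps

# OLS via least squares
X = np.column_stack([np.ones(n), mkt, smb, hml, mom])
beta, *_ = np.linalg.lstsq(X, r_fund, rcond=None)
alpha, b_i, s_i, h_i, m_i = beta
print(f"alpha={alpha:.4f} b={b_i:.2f} s={s_i:.2f} h={h_i:.2f} m={m_i:.2f}")
```

A two-state version would estimate this regression separately on expansion and contraction months, which is what lets the coefficients above differ across phases of the cycle.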

Relevance: 60.00%

Abstract:

I study the link between capital markets and sources of macroeconomic risk. In Chapter 1 I show that expected inflation risk is priced in the cross-section of stock returns even after controlling for cash-flow growth and volatility risks. Motivated by this evidence, I study a long-run risk model with a built-in inflation non-neutrality channel that allows me to decompose the real stochastic discount factor into news about current and expected cash-flow growth, news about expected inflation, and news about volatility. The model successfully prices a broad menu of assets and provides a setting for analyzing cross-sectional variation in the expected inflation risk premium. For industries such as retail and durable goods, inflation risk can account for nearly a third of the overall risk premium, while the energy industry and a broad commodity index act as inflation hedges. Nominal bonds are exposed to expected inflation risk and carry inflation premiums that increase with bond maturity. The price of expected inflation risk was very high during the 1970s and 1980s but has since come down substantially, remaining very close to zero over the past decade. On average, the expected inflation price of risk is negative, consistent with the view that periods of high inflation represent a "bad" state of the world and are associated with low economic growth and poor stock-market performance. In Chapter 2 I look at the way capital markets react to predetermined macroeconomic announcements. I document significantly higher excess returns on the US stock market on macro release dates than on days when no macroeconomic news hits the market. Almost the entire equity premium since 1997 has been realized on days when macroeconomic news is released. At high frequency, returns rise in the hours prior to the predetermined announcement time, peak around the time of the announcement, and drop thereafter.
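The announcement-day comparison in Chapter 2 amounts to splitting daily excess returns by whether a scheduled macro release fell on that date and comparing means. A toy version with entirely synthetic data (the release calendar, frequencies, and return magnitudes are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 2500
# Hypothetical release calendar: roughly one announcement per week
is_release = rng.random(n_days) < 0.2
# Synthetic daily excess returns with a higher mean on release days
ret = np.where(is_release,
               rng.normal(0.002, 0.01, n_days),    # release days
               rng.normal(0.000, 0.01, n_days))    # other days

mean_release = ret[is_release].mean()
mean_other = ret[~is_release].mean()
print(f"release days: {mean_release:.5f}, other days: {mean_other:.5f}")
```

The paper's finding corresponds to `mean_release` carrying almost all of the average excess return, with `mean_other` close to zero.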

Relevance: 60.00%

Abstract:

This thesis develops bootstrap methods for factor models, which have been widely used to generate forecasts since the pioneering article of Stock and Watson (2002) on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. My thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two in collaboration with Sílvia Gonçalves and Benoit Perron. In the first article, we study how bootstrap methods can be used for inference in models forecasting h periods into the future. To this end, it examines bootstrap inference in a factor-augmented regression setting where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for the confidence intervals of the estimated coefficients using these approaches, compared with asymptotic theory and the wild bootstrap, in the presence of serial correlation in the regression errors. The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of normally distributed innovations. We propose bootstrap prediction intervals for an observation h periods into the future and for its conditional mean. We assume that these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat these factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed the construction of asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct prediction intervals that are valid under more general assumptions. Moreover, even under Gaussianity, the bootstrap leads to more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014). In the third chapter, we propose consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes the bootstrap approximation of Shao (1996) to factor-augmented regressions. Simulations show an improved probability of parsimoniously selecting the estimated factors compared with the available selection methods. The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the US stock market. Among the factors estimated from a large panel of US macroeconomic and financial data, the factors strongly correlated with interest-rate spreads and the Fama-French factors have good predictive power for excess returns.
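The wild bootstrap idea at the heart of the first chapter can be illustrated in a stripped-down form: a plain residual-based wild bootstrap for an OLS slope, with no factors and no serial dependence (so it is neither of the thesis's block/dependent variants, just the common ancestor):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=n)

# OLS fit and residuals
X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat

B = 999
slopes = np.empty(B)
for b in range(B):
    # Wild bootstrap: multiply residuals by Rademacher weights (+/-1),
    # preserving each observation's conditional heteroskedasticity
    y_star = X @ beta_hat + resid * rng.choice([-1.0, 1.0], size=n)
    slopes[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"95% bootstrap CI for slope: [{lo:.3f}, {hi:.3f}]")
```

The block and dependent wild bootstraps replace the independent Rademacher weights with weights that are constant within blocks (or smoothly dependent across observations) so that serial correlation in the errors survives into the bootstrap samples.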

Relevance: 60.00%

Abstract:

This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and applications of state-of-the-art Bayesian econometrics. Except for Chapters 1 and 5, which set out the general introduction and conclusion, each chapter can be read as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data-rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better because it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as uncertainty in learning speed and model restrictions. The empirical evidence shows that, apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
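The Nelson-Siegel machinery underlying Chapter 2 fits the yield curve with three factors (level, slope, curvature) and fixed exponential loadings. A minimal sketch of the yield function, with λ set to 0.0609 (a common choice for monthly maturities) and made-up parameter values, not the thesis's estimates:

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam=0.0609):
    """Nelson-Siegel yield at maturity tau (in months):
    level + slope * (1 - e^{-lam*tau})/(lam*tau)
          + curvature * [(1 - e^{-lam*tau})/(lam*tau) - e^{-lam*tau}]."""
    x = lam * np.asarray(tau, dtype=float)
    slope_load = (1 - np.exp(-x)) / x
    curv_load = slope_load - np.exp(-x)
    return beta0 + beta1 * slope_load + beta2 * curv_load

maturities = np.array([3, 12, 36, 60, 120])  # months
yields = nelson_siegel(maturities, beta0=4.0, beta1=-2.0, beta2=1.0)
print(np.round(yields, 3))
```

With a negative slope factor the fitted curve is upward-sloping and approaches the level factor `beta0` at long maturities; the dynamic versions in the chapter let the three betas evolve over time.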

Relevance: 30.00%

Abstract:

One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with 9 different error distributions on Standard & Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard & Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts but on the contrary makes them slightly worse. In Essay 3, a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed. The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard & Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
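The GARCH(1,1) variance recursion that Essay 1 builds on can be written out directly. This is only the plain recursion on synthetic data with made-up parameters, not the dissertation's estimation with alternative error distributions:

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1):
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()          # initialize at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1]**2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(3)
r = rng.normal(0, 0.01, 1000)          # synthetic daily returns
s2 = garch11_variance(r, omega=5e-6, alpha=0.05, beta=0.90)
# Unconditional variance implied by these parameters:
print(5e-6 / (1 - 0.05 - 0.90))
```

Swapping the normal likelihood for a leptokurtic one (Student-t, NIG, ...) changes how `omega`, `alpha`, `beta` are estimated but leaves this recursion unchanged, which is why the essays can compare error distributions while holding the variance dynamics fixed.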

Relevance: 30.00%

Abstract:

A vast literature documents negative skewness and excess kurtosis in stock return distributions on several markets. We approach the issue of negative skewness from a different angle than previous studies by suggesting a model, which we denote the "negative news threshold" hypothesis, that builds on asymmetrically distributed information and symmetric market responses. Our empirical tests reveal that returns on days when non-scheduled news is disclosed are the source of negative skewness in stock returns. This finding lends solid support to our model and suggests that negative skewness in stock returns is induced by asymmetries in the news disclosure policies of firm management.
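The empirical test boils down to computing sample skewness separately for news and non-news days and checking which group drives the asymmetry. A mechanical illustration on synthetic data (the split, sample sizes, and distributions are invented; only the computation mirrors the test):

```python
import numpy as np

def skewness(x):
    """Sample skewness: third central moment over the cubed std dev."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    return ((x - m)**3).mean() / x.std()**3

rng = np.random.default_rng(4)
# Synthetic returns: non-scheduled news days drawn with a long left tail,
# all other days symmetric around zero
news_days = -rng.exponential(0.02, 300) + 0.01
other_days = rng.normal(0, 0.01, 2000)

print(round(skewness(news_days), 2), round(skewness(other_days), 2))
```

Under the "negative news threshold" story, the strongly negative skew shows up only in the news-day subsample, as in this toy split.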

Relevance: 30.00%

Abstract:

As an alternative to the present system of intermediation of the German savings surplus, this paper suggests that the risk-adjusted rate of return could be improved by creating a sovereign wealth fund for Germany (designated DESWF), which could invest excess German savings globally. Such a DESWF would offer German savers a secure vehicle paying a guaranteed positive minimum real interest rate, with a top-up when real investment returns allowed. The vehicle would invest the funds in a portfolio that is highly diversified by geography and asset class. Positive real returns can be expected in the long run based on positive real global growth. Since a significant amount of funds would in this case flow outside the euro area, the euro would depreciate, which would help crisis countries presently struggling to revive growth through exports and to close their external deficits so as to recoup their international creditworthiness. Target imbalances would gradually disappear, and German claims abroad would move from nominal claims on the ECB to diversified real and nominal claims on a variety of private and public foreign entities across asset classes.

Relevance: 30.00%

Abstract:

The position of real estate within a multi-asset portfolio has received considerable attention recently. Previous research has concentrated on the percentage holding property would achieve given its risk/return characteristics. Such studies have invariably used Modern Portfolio Theory, and these approaches have been criticised both for the quality of the real estate data and for problems with the methodology itself. The first problem is now well understood, and the second can be addressed by the use of realistic constraints on asset holdings. This paper takes a different approach: we determine the level of return that real estate needs to achieve to justify an allocation within the multi-asset portfolio. In order to test the importance of data quality, we use historic appraisal-based and desmoothed returns to examine the sensitivity of the results. Consideration is also given to the holding period and to the imposition of realistic constraints on asset holdings in order to model portfolios held by pension fund investors. We conclude, using several benchmark levels of portfolio risk and return, that with appraisal-based data the required level of return for real estate was less than that achieved over the period 1972-1993. The use of desmoothed series can reverse this result at the highest levels of desmoothing, although within a restricted holding period real estate offered returns in excess of those required to enter the portfolio and might have a role to play in the multi-asset portfolio.
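The hurdle-rate logic the paper applies can be illustrated with the standard textbook condition for an asset to improve an existing portfolio: it raises the portfolio's Sharpe ratio when its expected excess return exceeds its beta to the portfolio times the portfolio's excess return. The numbers below are made up, not the paper's estimates:

```python
# Illustrative check: does real estate clear the return required to
# enter an existing multi-asset portfolio?
rf = 0.05        # risk-free rate (assumed)
er_p = 0.11      # expected return of the existing portfolio (assumed)
beta_re = 0.4    # beta of real estate w.r.t. that portfolio (assumed)
er_re = 0.09     # expected return of real estate (assumed)

# Required (hurdle) return for inclusion
hurdle = rf + beta_re * (er_p - rf)
print(hurdle, er_re > hurdle)
```

The paper's question is whether real estate's historically achieved return sits above or below this kind of hurdle, and how the answer moves once appraisal smoothing is removed from the return series.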

Relevance: 30.00%

Abstract:

This study aims to contribute to the forecasting literature on stock returns for emerging markets. We use Autometrics to select relevant predictors among macroeconomic, microeconomic and technical variables. We develop predictive models for the Brazilian market premium, measured as the excess return over the Selic interest rate, and for Itaú SA, Itaú-Unibanco and Bradesco stock returns. We find that for the market premium, an ADL with error correction is able to outperform the benchmarks in terms of economic performance. For individual stock returns, there is a trade-off between statistical properties and out-of-sample performance of the model.

Relevance: 30.00%

Abstract:

This paper investigates whether the non-normality typically observed in daily stock-market returns could arise because of the joint existence of breaks and GARCH effects. It proposes a data-driven procedure to credibly identify the number and timing of breaks and applies it on the benchmark stock-market indices of 27 OECD countries. The findings suggest that a substantial element of the observed deviations from normality might indeed be due to the co-existence of breaks and GARCH effects. However, the presence of structural changes is found to be the primary reason for the non-normality and not the GARCH effects. Also, there is still some remaining excess kurtosis that is unlikely to be linked to the specification of the conditional volatility or the presence of breaks. Finally, an interesting sideline result implies that GARCH models have limited capacity in forecasting stock-market volatility.

Relevance: 30.00%

Abstract:

Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been a lot of recent interest regarding the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory; we focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on the returns for very low exceedance probabilities. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Gumbel (thin-tailed), Fréchet (thick-tailed) or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
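A standard way to estimate the tail index in the Fréchet case is the Hill estimator over the k largest observations. The abstract does not say which estimator the study used, so this is a generic sketch on synthetic Pareto-tailed data standing in for extreme returns:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha from the k largest values:
    1 / mean of log(X_(i) / X_(k+1)) over the top k order statistics."""
    order = np.sort(np.asarray(x, dtype=float))[::-1]   # descending
    logs = np.log(order[:k]) - np.log(order[k])
    return 1.0 / logs.mean()

rng = np.random.default_rng(5)
# Pareto-tailed sample with true tail index alpha = 3 (Frechet domain)
alpha_true = 3.0
sample = rng.pareto(alpha_true, 50_000) + 1.0
alpha_hat = hill_estimator(sample, k=500)
print(round(alpha_hat, 2))
```

A finite Hill estimate that is stable across choices of k is the usual diagnostic for a Fréchet-type (thick) tail; the estimate then feeds directly into high-quantile bounds of the sort the study derives.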