980 results for INDEX RETURNS
Abstract:
In this article, we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Many articles have been written on appraisal smoothing but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue we analyze a large sample of appraisal data at the individual property level from the Investment Property Databank. We find that commonly used unsmoothing estimates at the index level overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns at the index level and an ARMA model at the individual property level.
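For illustration, below is a minimal Python sketch of the standard index-level AR(1) unsmoothing filter that the article re-examines; the data file and column name are placeholders, and this is the textbook reverse filter rather than the article's property-level analysis.

```python
# Minimal sketch: estimate the first-order autocorrelation 'a' of an
# appraisal-based index and apply the reverse filter
# r*_t = (r_t - a * r_{t-1}) / (1 - a). Placeholder file and column names.
import pandas as pd

idx = pd.read_csv("appraisal_index.csv", index_col=0, parse_dates=True)["ret"]
a = idx.autocorr(lag=1)                          # estimated smoothing parameter
unsmoothed = (idx - a * idx.shift(1)) / (1 - a)  # reverse-filtered returns
print("a =", a, "volatility ratio:", unsmoothed.std() / idx.std())
```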
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. What has been lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these newer, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with nine different error distributions on Standard & Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly and monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard & Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time varying does not further improve the density forecasts; on the contrary, it makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed. The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard & Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis does not appear to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
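As a minimal illustration of the kind of comparison made in Essay 1, the sketch below fits a GARCH(1,1) with normal and with Student-t (leptokurtic) errors using the arch package and prints one-step-ahead variance forecasts; the data file and column name are placeholders, and the thesis's nine distributions and realized-variance evaluation are not reproduced here.

```python
# Fit GARCH(1,1) under two error distributions and compare forecasts.
import pandas as pd
from arch import arch_model

# Placeholder data: daily futures returns, scaled to percent for numerical stability
returns = 100 * pd.read_csv("sp500_futures.csv", index_col=0, parse_dates=True)["ret"]

fits = {}
for dist in ("normal", "t"):                          # Student-t allows fat tails
    model = arch_model(returns, vol="GARCH", p=1, q=1, dist=dist)
    fits[dist] = model.fit(disp="off")

for dist, res in fits.items():
    forecast = res.forecast(horizon=1)
    print(dist,
          "log-likelihood:", round(res.loglikelihood, 1),
          "next-day variance forecast:", forecast.variance.iloc[-1, 0])
```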
Abstract:
This paper considers the effect of short- and long-term interest rates, and of interest rate spreads, upon real estate index returns in the UK. Using Johansen's vector autoregressive framework, it is found that the real estate index cointegrates with the term spread, but not with the short or long rates themselves. Granger causality tests indicate that movements in short-term interest rates and in the spread cause movements in the returns series. However, decomposition of the forecast error variances from VAR models indicates that changes in these variables can explain only a small proportion of the overall variability of the returns, and that the effect has fully worked through after two months. The results suggest that these financial variables could potentially be used as leading indicators for real estate markets, with corresponding implications for return predictability.
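A minimal sketch of the two tests described above (Johansen cointegration and Granger causality) using statsmodels; the data file, column names and lag orders are placeholders chosen for illustration.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

df = pd.read_csv("uk_real_estate.csv", index_col=0, parse_dates=True)
# assumed columns: 'index_level', 'index_ret' (real estate index), 'spread' (term spread)

# Johansen test: does the real estate index cointegrate with the term spread?
joh = coint_johansen(df[["index_level", "spread"]], det_order=0, k_ar_diff=2)
print("trace statistics:", joh.lr1)
print("95% critical values:", joh.cvt[:, 1])

# Granger causality: does the spread help predict index returns?
grangercausalitytests(df[["index_ret", "spread"]].dropna(), maxlag=2)
```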
Abstract:
A "self-exciting" market is one in which the probability of observing a crash increases in response to the occurrence of a crash. It essentially describes cases where the initial crash serves to weaken the system to some extent, making subsequent crashes more likely. This thesis investigates if equity markets possess this property. A self-exciting extension of the well-known jump-based Bates (1996) model is used as the workhorse model for this thesis, and a particle-filtering algorithm is used to facilitate estimation by means of maximum likelihood. The estimation method is developed so that option prices are easily included in the dataset, leading to higher quality estimates. Equilibrium arguments are used to price the risks associated with the time-varying crash probability, and in turn to motivate a risk-neutral system for use in option pricing. The option pricing function for the model is obtained via the application of widely-used Fourier techniques. An application to S&P500 index returns and a panel of S&P500 index option prices reveals evidence of self excitation.
Abstract:
Using a data set consisting of three years of 5-minute intraday stock index returns for major European stock indices and U.S. macroeconomic surprises, the conditional mean and volatility behavior in European markets was investigated. The findings suggested that the opening of the U.S. market significantly raised the level of volatility in Europe, and that all markets responded in an identical fashion. Furthermore, U.S. macroeconomic surprises exerted an immediate and major impact on both the returns and the volatilities of European stock markets. Thus, high-frequency data appear to be critical for identifying the news that moves the markets.
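A stylized sketch of the kind of regression behind such results: absolute 5-minute returns regressed on a U.S.-opening dummy and the absolute macroeconomic surprise. The file and column names are placeholders, and the study's actual specification may differ.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("intraday_eu.csv", index_col=0, parse_dates=True)
# assumed columns: 'ret' (5-minute index return), 'us_open' (0/1 dummy), 'surprise'

X = sm.add_constant(pd.DataFrame({
    "us_open": df["us_open"],
    "abs_surprise": df["surprise"].abs(),
}))
res = sm.OLS(df["ret"].abs(), X).fit()   # volatility proxy on opening dummy and surprises
print(res.params)
```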
Abstract:
My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first chapter, we develop a computationally efficient state-smoothing procedure for linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyze the computational efficiency of methods based on the Kalman filter, of the Cholesky-factor algorithm, and of our new method, using operation counts and computational experiments. We show that for many important cases our method is the most efficient. The gains are particularly large when the dimension of the observed variables is large or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, which is used to analyze count data on transactions in financial markets.

In the second chapter, we propose a new technique for analyzing multivariate stochastic volatility models. The proposed method is based on drawing the volatility efficiently from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginals with asset-specific degrees of freedom to capture the heterogeneity of the returns. The volatility is drawn as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and for two multivariate models.

In the third chapter, we evaluate the information contributed by realized volatility to the estimation and forecasting of volatility when prices are measured with and without error, using stochastic volatility models. We take the point of view of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that contains information about it. We employ Bayesian Markov chain Monte Carlo methods to estimate the models, which allow us to form not only posterior densities of the volatility but also predictive densities of future volatility. We compare volatility forecasts, and the hit rates of those forecasts, with and without the information contained in realized volatility. This approach differs from those in the existing empirical literature, which most often limit themselves to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns on stock indices and exchange rates. The competing models are applied to the second half of 2008, a defining period of the recent financial crisis.
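As background for the first chapter, here is a minimal numpy sketch of the Kalman filter for the simplest linear Gaussian state-space model, a local-level model; it illustrates the building block being compared, not the thesis's new smoothing method, and all values are simulated.

```python
# Kalman filter for a local-level model: y_t = x_t + noise, x_t = x_{t-1} + noise.
import numpy as np

def kalman_filter(y, q, r, x0=0.0, p0=1e6):
    """Filtered state means and variances for a local-level model."""
    n = len(y)
    x, p = np.empty(n), np.empty(n)
    x_pred, p_pred = x0, p0
    for t in range(n):
        k = p_pred / (p_pred + r)              # Kalman gain
        x[t] = x_pred + k * (y[t] - x_pred)    # update with observation t
        p[t] = (1 - k) * p_pred
        x_pred, p_pred = x[t], p[t] + q        # predict next state
    return x, p

rng = np.random.default_rng(1)
states = np.cumsum(rng.normal(0, 0.1, 500))    # latent random-walk state
obs = states + rng.normal(0, 0.5, 500)         # noisy observations
means, variances = kalman_filter(obs, q=0.1 ** 2, r=0.5 ** 2)
print("RMSE of filtered state:", np.sqrt(np.mean((means - states) ** 2)))
```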
Abstract:
This paper presents gamma stochastic volatility models and investigates their distributional and time-series properties. The parameter estimators obtained by the method of moments are shown analytically to be consistent and asymptotically normal. The simulation results indicate that the estimators behave well. The in-sample analysis shows that return models with gamma autoregressive stochastic volatility processes capture the leptokurtic nature of return distributions and the slowly decaying autocorrelation functions of squared stock index returns for the USA and the UK. In comparison with GARCH and EGARCH models, the gamma autoregressive model picks up the persistence in volatility for the US and UK index returns, but not the volatility persistence for the Canadian and Japanese index returns. The out-of-sample analysis indicates that the gamma autoregressive model has superior volatility forecasting performance compared to the GARCH and EGARCH models.
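The stylized fact mentioned above, the slowly decaying autocorrelation of squared index returns, can be checked directly; the data file and column name below are placeholders.

```python
import pandas as pd
from statsmodels.tsa.stattools import acf

ret = pd.read_csv("index_returns.csv", index_col=0, parse_dates=True)["ret"]
# Slowly decaying autocorrelations of squared returns indicate persistent volatility.
print(acf(ret.dropna() ** 2, nlags=50, fft=True)[:10])
```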
Abstract:
Efficient markets should guarantee the existence of zero spreads for total return swaps. However, real estate markets have recorded values that are significantly different from zero in both directions. Possible explanations might suggest non-rational behaviour by inexperienced market players or unusual features of the underlying asset market. We find that institutional characteristics in the underlying market lead to market inefficiencies and, hence, to the creation of a rational trading window with upper and lower bounds within which transactions do not offer arbitrage opportunities. Given the existence of this rational trading window, we also argue that the observed spreads can substantially be explained by trading imbalances due to the limited liquidity of a newly formed market and/or to the effect of market sentiment, complementing explanations based on the lag between underlying market returns and index returns.
Abstract:
This paper considers how trading volume impacts the first three moments of REIT returns. Consistent with previous studies of the broader stock market, we find that volume is a significant factor with respect to both returns and volatility. We also find evidence supportive of Hong and Stein's (2003) investor heterogeneity theory, in that skewness in REIT index returns is significantly related to volume. Furthermore, we report findings that show the influence of the variability of volume on skewness.
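An illustrative sketch of the kind of test described: monthly skewness of daily REIT index returns regressed on average volume and the variability of volume. File and column names are placeholders, and the paper's actual specification may differ.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("reit_index.csv", index_col=0, parse_dates=True)  # columns 'ret', 'volume'

monthly = pd.DataFrame({
    "skew": df["ret"].resample("M").apply(lambda x: x.skew()),   # monthly return skewness
    "avg_volume": df["volume"].resample("M").mean(),
    "vol_of_volume": df["volume"].resample("M").std(),           # variability of volume
}).dropna()

X = sm.add_constant(monthly[["avg_volume", "vol_of_volume"]])
print(sm.OLS(monthly["skew"], X).fit().params)
```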
Abstract:
This paper uses a recently developed nonlinear Granger causality test to determine whether linear orthogonalization really does remove general stock market influences on real estate returns to leave pure industry effects in the latter. The results suggest that there is no nonlinear relationship between the US equity-based property index returns and returns on a general stock market index, although there is evidence of nonlinear causality for the corresponding UK series.
Abstract:
This thesis comprises studies in the field of finance. The studies fall within the subfields of market microstructure and asset pricing, but there is also a contribution to corporate finance, since I deal with firms' corporate governance. In the first chapter I estimate the information asymmetry coefficient embedded in the bid-ask spread of Brazilian stocks. In addition, I examine whether there are patterns in this coefficient, and in the spread itself, with respect to trade size and time of day. In chapter two, I investigate which firm characteristics are related to the variables estimated in chapter 1, namely the information asymmetry coefficient embedded in the bid-ask spread of Brazilian stocks and the spread itself. Firms' corporate governance is one of the characteristics examined. In the third chapter, I examine which corporate governance mechanisms give rise to the negative relationship between Brazilian stock returns and the corporate governance index documented by Carvalhal and Nobili (2011). In this investigation, I emphasize the ownership concentration of Brazilian firms, which is extremely high compared with that of more developed countries.
Abstract:
The present study compares the Bayesian estimates obtained for the parameters of ARCH-family processes under normal and Student's t distributions for the conditional distribution of the return series. A non-informative prior distribution was adopted, and a reparameterization of the models under analysis was used to map the parameter space onto the real line. The procedure adopts a normal prior distribution for the transformed parameters. The posterior summaries were obtained by Markov chain Monte Carlo (MCMC) simulation methods. The methodology was evaluated on a series of Bovespa index returns, and the predictive ordinate criterion was employed to select the model that best fits the data. Results show that, as a rule, the proposed Bayesian approach provides satisfactory estimates and that the GARCH process with Student's t distribution fitted the data best.
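A minimal sketch of the approach described, assuming Gaussian errors only: GARCH(1,1) parameters are log-transformed onto the real line, given a normal prior, and sampled with a random-walk Metropolis algorithm. Simulated data stand in for the Bovespa returns, and all tuning constants are placeholders; the Student-t variant would add a transformed degrees-of-freedom parameter.

```python
import numpy as np

def garch_loglik(theta, r):
    """Gaussian log-likelihood of a GARCH(1,1) with theta = log(omega, alpha, beta)."""
    omega, alpha, beta = np.exp(theta)           # back-transform to positive parameters
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

def log_posterior(theta, r, prior_sd=10.0):
    # Normal prior on the transformed (real-valued) parameters
    return garch_loglik(theta, r) - 0.5 * np.sum((theta / prior_sd) ** 2)

def metropolis(r, n_iter=5000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.log([0.1 * r.var(), 0.1, 0.8])    # starting values on the log scale
    lp = log_posterior(theta, r)
    draws = np.empty((n_iter, 3))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(3)
        lp_prop = log_posterior(prop, r)
        if np.log(rng.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        draws[i] = np.exp(theta)                 # store (omega, alpha, beta)
    return draws

rng = np.random.default_rng(1)
r = rng.standard_normal(1000) * 0.01             # simulated stand-in for index returns
posterior = metropolis(r)
print(posterior[len(posterior) // 2:].mean(axis=0))   # posterior means after burn-in
```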
Abstract:
This thesis focuses on the limits that may prevent an entrepreneur from maximizing her value, and on the benefits of diversification in reducing her cost of capital. After reviewing the relevant literature on the differences between traditional corporate finance and entrepreneurial finance, we focus on the biases that occur when traditional finance techniques are applied in the entrepreneurial context. In particular, using the portfolio theory framework, we determine the degree of under-diversification of entrepreneurs. Borrowing the methodology developed by Kerins et al. (2004), we test a model for the cost of capital according to the firm's industry and the entrepreneur's wealth commitment to the firm. This model takes three market inputs (the standard deviation of market returns, the expected return of the market, and the risk-free rate) and two firm-specific inputs (the standard deviation of the firm's returns and the correlation between firm and market returns) as parameters, and returns an appropriate cost of capital as output. We determine the expected market return and the risk-free rate according to the large literature on the market risk premium. The market return volatility is estimated using a GARCH specification for market index returns. Furthermore, we assume that the firm-specific inputs can be obtained from newly listed firms similar in risk to the firm being evaluated. After building a database including all the data needed for our analysis, we perform an empirical investigation to understand how much of a firm's total risk depends on market risk, and which explanatory variables can explain it. Our results show that the cost of capital declines as the entrepreneur's level of commitment decreases. Therefore, maximizing value for the entrepreneur depends on the fraction of her wealth invested in the firm and the fraction she sells to outside investors. These results are interesting both for entrepreneurs and for policy makers: the former can benefit from an unbiased model for their valuation; the latter can obtain some guidelines for overcoming the recent financial market crisis.
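A heavily simplified sketch of the contrast at the heart of the cost-of-capital model: a diversified investor prices only covariance risk (CAPM), while a fully committed entrepreneur bears total risk. All numbers are illustrative assumptions, and the thesis's partial-commitment weighting is not reproduced.

```python
# Illustrative inputs: market volatility, firm volatility, firm-market correlation,
# risk-free rate and equity risk premium. None of these are estimates from the thesis.
sigma_m, sigma_i, rho = 0.18, 0.60, 0.25
rf, erp = 0.03, 0.06

beta = rho * sigma_i / sigma_m
cost_diversified = rf + beta * erp                     # CAPM: only covariance risk priced
cost_entrepreneur = rf + (sigma_i / sigma_m) * erp     # total-risk analogue, full commitment
print(round(cost_diversified, 3), round(cost_entrepreneur, 3))
```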
Abstract:
The primary purpose of the paper is to analyze the conditional correlations, conditional covariances, and co-volatility spillovers between international crude oil and associated financial markets. The paper investigates co-volatility spillovers (namely, the delayed effect of a returns shock in one physical or financial asset on the subsequent volatility or co-volatility of another physical or financial asset) between the oil and financial markets. The oil industry has four major regions, namely the North Sea, the USA, the Middle East, and South-East Asia. Associated with these regions are two major financial centers, namely the UK and the USA. For these reasons, the data to be used are the returns on alternative crude oil markets, the returns on crude oil derivatives, specifically futures, and stock index returns in the UK and the USA. The paper will also analyze the Chinese financial markets, where the data are more recent. The empirical analysis will be based on the diagonal BEKK model, from which the conditional covariances will be used for testing co-volatility spillovers and for policy recommendations. Based on these results, dynamic hedging strategies will be suggested for managing market fluctuations in crude oil prices and associated financial markets.
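A minimal numpy sketch of the diagonal BEKK(1,1) covariance recursion on which the analysis is based, H_t = C'C + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B with diagonal A and B; the parameter values and simulated shocks are purely illustrative, not estimates.

```python
import numpy as np

def diagonal_bekk(eps, C, a, b):
    """Conditional covariances for shocks eps (T x n), with A = diag(a), B = diag(b)."""
    T, n = eps.shape
    CC = C.T @ C
    H = np.empty((T, n, n))
    H[0] = np.cov(eps, rowvar=False)                       # initialize at sample covariance
    for t in range(1, T):
        outer = np.outer(eps[t - 1], eps[t - 1])
        # With diagonal A and B the recursion is element-wise: a_i a_j e_i e_j + b_i b_j H_ij
        H[t] = CC + np.outer(a, a) * outer + np.outer(b, b) * H[t - 1]
    return H

# Two assets (e.g. a crude oil return and a stock index return), simulated shocks:
rng = np.random.default_rng(0)
eps = rng.standard_normal((500, 2)) * 0.01
C = np.array([[0.002, 0.0005],
              [0.0,   0.002]])
H = diagonal_bekk(eps, C, a=np.array([0.3, 0.25]), b=np.array([0.9, 0.92]))
print("last conditional covariance matrix:\n", H[-1])
```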