785 results for idiosyncratic volatility


Relevance: 20.00%

Publisher:

Abstract:

The value premium is well established in empirical asset pricing, but to date there is little understanding as to its fundamental drivers. We use a stochastic earnings valuation model to establish a direct link between the volatility of future earnings growth and firm value. We illustrate that risky earnings growth affects growth and value firms differently. We provide empirical evidence that the volatility of future earnings growth is a significant determinant of the value premium. Using data on individual firms and characteristic-sorted test portfolios, we also find that earnings growth volatility is significant in explaining the cross-sectional variation of stock returns. Our findings imply that the value premium is the rational consequence of accounting for risky earnings growth in the firm valuation process.
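One elementary channel behind the link between earnings-growth volatility and firm value can be sketched numerically; this is a generic Jensen's-inequality illustration with hypothetical numbers, not the paper's stochastic earnings valuation model:

```python
import math

# Illustrative sketch (not the paper's model): with lognormal earnings
# growth exp(g), g ~ N(mu, sigma^2), expected next-period earnings are
# current * exp(mu + sigma^2 / 2), so they rise with the volatility of
# growth by Jensen's inequality.
def expected_earnings(current, mu, sigma):
    return current * math.exp(mu + 0.5 * sigma**2)

low_vol = expected_earnings(100.0, 0.05, 0.10)   # modest growth risk
high_vol = expected_earnings(100.0, 0.05, 0.40)  # risky growth, same mean g
```

How this mechanical effect interacts with discounting for growth versus value firms is the substantive question the paper addresses.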

Relevance: 20.00%

Publisher:

Abstract:

This paper describes a parallel semi-Lagrangian finite difference approach to the pricing of early-exercise Asian options on assets with stochastic volatility. A multigrid procedure is described for the fast iterative solution of the discrete linear complementarity problems that result. The accuracy and performance of this approach are improved considerably by a strike-price-related analytic transformation of asset prices. Asian options are contingent claims with payoffs that depend on the average price of an asset over some time interval. The payoff may depend on this average and a fixed strike price (fixed-strike Asians) or on the average and the asset price (floating-strike Asians). The option may also permit early exercise (American contract) or confine the holder to a fixed exercise date (European contract). The fixed-strike Asian with early exercise is considered here, with continuous arithmetic averaging. Pricing such an option where the asset price has stochastic volatility requires solving a tri-variate partial differential inequality in the three state variables of asset price, average price and volatility (or, equivalently, variance). The similarity transformations [6] used with floating-strike Asian options to reduce the dimensionality of the problem are not applicable to fixed strikes, so the numerical solution of a tri-variate problem is necessary. The computational challenge is to provide accurate solutions sufficiently quickly to support real-time trading activities at a reasonable cost in terms of hardware requirements.
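As a point of reference for the contract described above, a fixed-strike arithmetic Asian call without early exercise can be priced by plain Monte Carlo under constant-volatility Black-Scholes dynamics. This is only an illustrative baseline with hypothetical parameters, not the paper's semi-Lagrangian PDE method with stochastic volatility:

```python
import numpy as np

def asian_fixed_strike_mc(s0, k, r, sigma, t, n_steps=252, n_paths=20000, seed=0):
    """Monte Carlo price of a European fixed-strike arithmetic Asian call
    under constant-volatility Black-Scholes dynamics."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    # Simulate all log-price increments at once.
    z = rng.standard_normal((n_paths, n_steps))
    increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    paths = s0 * np.exp(np.cumsum(increments, axis=1))
    average = paths.mean(axis=1)          # discrete proxy for the continuous average
    payoff = np.maximum(average - k, 0.0)
    return np.exp(-r * t) * payoff.mean()

price = asian_fixed_strike_mc(s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0)
```

Because the payoff depends on the running average, the simulation must carry the whole path, which hints at why the PDE formulation needs the average price as an extra state variable.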

Relevance: 20.00%

Publisher:

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two develop multivariate methodologies. In Chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In Chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of the disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
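A minimal sketch of the kind of realized measure these models are built around, on simulated prices rather than the thesis data (the thesis additionally uses pre-averaging and realized kernels to handle microstructure noise):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: the sum of squared intraday log returns."""
    log_returns = np.diff(np.log(np.asarray(intraday_prices)))
    return float(np.sum(log_returns**2))

# Hypothetical 5-minute price path for one trading day (78 intervals).
rng = np.random.default_rng(1)
prices = 100.0 * np.exp(np.cumsum(0.001 * rng.standard_normal(78)))
rv = realized_variance(np.concatenate(([100.0], prices)))
```

A daily series of such realized measures is what a Realized GARCH-type model pairs with the daily return in its measurement equation.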

Relevance: 20.00%

Publisher:

Abstract:

We estimate the monthly volatility of the US economy from 1968 to 2006 by extending the coincident index model of Stock and Watson (1991). Our volatility index, which we call VOLINX, has four applications. First, it sheds light on the Great Moderation. VOLINX captures the decrease in the volatility in the mid-80s as well as the different episodes of stress over the sample period. In the 70s and early 80s the stagflation and the two oil crises marked the pace of the volatility, whereas 9/11 is the most relevant shock after the moderation. Second, it helps to understand the economic indicators that cause volatility. While the main determinant of the coincident index is industrial production, VOLINX is mainly affected by employment and income. Third, it adapts the confidence bands of the forecasts. In- and out-of-sample evaluations show that the confidence bands may differ up to 50% with respect to a model with constant variance. Last, the methodology we use permits us to estimate monthly GDP, which has conditional volatility that is partly explained by VOLINX. These applications can be used by policy makers for monitoring and surveillance of the stress of the economy.
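The "adaptive confidence bands" point can be made concrete with a generic sketch (illustrative numbers only, not VOLINX estimates): a normal forecast interval built from a stress-episode conditional volatility is proportionally wider than one built under a constant-variance assumption.

```python
# Hypothetical growth forecast with 95% normal bands under two volatility
# assumptions; a 50% higher conditional volatility widens the band by 50%.
point_forecast = 2.0   # hypothetical point forecast, in percent
constant_sd = 1.0      # constant-variance (unconditional) volatility
stressed_sd = 1.5      # conditional volatility during a stress episode

z = 1.96               # 95% two-sided normal quantile
band_constant = (point_forecast - z * constant_sd, point_forecast + z * constant_sd)
band_stressed = (point_forecast - z * stressed_sd, point_forecast + z * stressed_sd)
width_ratio = stressed_sd / constant_sd
```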

Relevance: 20.00%

Publisher:

Abstract:

Drawing on historical research, personal interviews, performance analysis, and my own embodied experience as a participant-observer in several clown workshops, I explore the diverse historical influences on clown theatre as it is conceived today. I then investigate how the concept of embodied knowledge is reflected in red-nose clown pedagogy. Finally, I argue that through shared embodied knowledge spectators are able to perceive and appreciate the humor of clown theatre in performance. I propose that clown theatre represents a reaction to the eroding personal connections prompted by the so-called information age, and that humor in clown theatre is a revealing index of socio-cultural values, attitudes, dispositions, and concerns.

Relevance: 20.00%

Publisher:

Abstract:

International research shows that low-volatility stocks have beaten high-volatility stocks in terms of returns for decades on multiple markets. This deviation from the traditional risk-return framework is known as the low-volatility anomaly. This study focuses on explaining the anomaly and measuring how strongly it appears on the NASDAQ OMX Helsinki stock exchange. The data consist of all listed companies, starting from 2001 and ending close to 2015. The methodology follows Baker and Haugen (2012) closely: companies are sorted into deciles according to 3-month volatility, and monthly returns are then calculated for these volatility groups. The annualized return for the lowest-volatility decile is 8.85%, while the highest-volatility decile destroys wealth at a rate of -19.96% per annum. Results are parallel in quintiles, which contain a larger number of companies and thus dilute outliers. The observation period captures the financial crisis of 2007-2008 and the European debt crisis, reflected in a low main-index annual return of 1%, but this at the same time demonstrates the success of the low-volatility strategy. The low-volatility anomaly is driven by multiple factors, such as leverage-constrained trading and managerial incentives, which both prompt investment in risky assets, but behavioral factors also carry major weight in maintaining the anomaly.
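The decile-sorting procedure described above can be sketched on simulated data (not OMXH data; the negative volatility-return relation is built into the simulation purely for illustration):

```python
import numpy as np
import pandas as pd

# Simulated cross-section: higher trailing volatility is given a lower
# expected next-month return, mimicking the anomaly; the sort itself is
# the Baker-Haugen-style decile procedure.
rng = np.random.default_rng(42)
n_stocks = 200
trailing_vol = rng.uniform(0.1, 0.8, n_stocks)         # trailing 3-month volatility
next_month_ret = 0.01 - 0.02 * trailing_vol + rng.normal(0.0, 0.05, n_stocks)

df = pd.DataFrame({"vol": trailing_vol, "ret": next_month_ret})
df["decile"] = pd.qcut(df["vol"], 10, labels=False) + 1  # 1 = lowest volatility
decile_returns = df.groupby("decile")["ret"].mean()      # mean return per decile
```

Repeating this each month and compounding the per-decile returns yields the annualized decile performance figures reported in the study.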

Relevance: 20.00%

Publisher:

Abstract:

I investigate the effects of information frictions on price-setting decisions. I show that firms' output prices and wages are less sensitive to aggregate economic conditions when firms and workers cannot perfectly understand (or know) the aggregate state of the economy. Prices and wages respond with a lag to aggregate innovations because agents learn slowly about those changes, and this delayed adjustment in prices makes output and unemployment more sensitive to aggregate shocks. In the first chapter of this dissertation, I show that workers' noisy information about the state of the economy helps explain why real wages are sluggish. In the context of a search and matching model, wages do not immediately respond to a positive aggregate shock because workers do not (yet) have enough information to demand higher wages. This increases firms' incentives to post more vacancies, and it makes unemployment volatile and sensitive to aggregate shocks. This mechanism is robust to two major criticisms of existing theories of sluggish wages and volatile unemployment: the flexibility of wages for new hires and the cyclicality of the opportunity cost of employment. Calibrated to U.S. data, the model explains 60% of the overall unemployment volatility. Consistent with empirical evidence, the response of unemployment to TFP shocks predicted by my model is large, hump-shaped, and peaks one year after the TFP shock, while the response of the aggregate wage is weak and delayed, peaking after two years. In the second chapter of this dissertation, I study the role of information frictions and inventories in firms' price-setting decisions in the context of a monetary model. In this model, intermediate-goods firms accumulate output inventories, observe aggregate variables with a one-period lag, and observe their nominal input prices and demand at all times. Firms face idiosyncratic shocks and cannot perfectly infer the state of nature.
After a contractionary nominal shock, nominal input prices go down, and firms accumulate inventories because they perceive some positive probability that the nominal price decline is due to a good productivity shock. This prevents firms' prices from decreasing and makes current profits, households' income, and aggregate demand go down. According to my model simulations, a 1% decrease in the money growth rate causes output to decline 0.17% in the first quarter and 0.38% in the second, followed by a slow recovery to the steady state. Contractionary nominal shocks also have significant effects on total investment, which remains 1% below the steady state for the first 6 quarters.
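The slow-learning mechanism can be sketched with a stylized filtering exercise (illustrative parameters, not the dissertation's calibration): agents update beliefs about a persistent aggregate shock with a constant gain below one, so the belief, and any wage or price set on it, responds only gradually to the innovation.

```python
import numpy as np

# Stylized belief updating: a one-time persistent shock hits at t = 0,
# and agents revise beliefs with a Kalman-style gain < 1, producing the
# delayed response discussed above. Noise is omitted to show the mean path.
rho, gain = 0.95, 0.3
T = 12
state = np.zeros(T)
belief = np.zeros(T)
state[0] = 1.0                     # positive aggregate innovation at t = 0
for t in range(1, T):
    state[t] = rho * state[t - 1]  # shock decays with persistence rho
for t in range(1, T):
    signal = state[t]
    belief[t] = belief[t - 1] + gain * (signal - belief[t - 1])
```

On impact the belief absorbs only a fraction `gain` of the observed signal, which is the sense in which wages "do not immediately respond" in the model.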

Relevance: 20.00%

Publisher:

Abstract:

Entrepreneurship is having the courage to transform an idea into reality and, with it, achieve personal, financial and recognition satisfaction. The psychological ability to handle failure has proven essential to success. We analysed the importance of idiosyncratic psychological aspects in the success of entrepreneurs through an observational study accompanying 20 entrepreneurs from the idea-presentation phase to company incorporation. During the observation period, four distinct psychological phases of the entrepreneurs were observed, which can be described as follows: absorption of information and knowledge; application of the gathered knowledge to their specific cases; frustration generated by criticism, namely from investors who did not recognise the value of their projects; and realism and implementation of the project. More than six months after the analysis period, the entrepreneurs who travelled through all four phases are today developing their projects, while the remaining ones are in a similar situation to that at the end of the initial two months. Conclusion: the ability to cope with frustration and rejection is a determinant factor in the success of the entrepreneur. The ability to learn from rejection, more than resilience, helps the entrepreneur to proceed.

Relevance: 20.00%

Publisher:

Abstract:

This Ph.D. thesis contains four essays in mathematical finance, focusing on pricing Asian options (Chapter 4), pricing futures and futures options (Chapters 5 and 6) and time-dependent volatility in futures options (Chapter 7). In Chapter 4, the applicability of the Albrecher et al. (2005) comonotonicity approach is investigated in the context of various benchmark models for equities and commodities. Instead of the classical Levy models of Albrecher et al. (2005), the focus is on the Heston stochastic volatility model, the constant elasticity of variance (CEV) model and the Schwartz (1997) two-factor model. It is shown that the method delivers rather tight upper bounds for the prices of Asian options in these models and, as a by-product, delivers super-hedging strategies which can be easily implemented. In Chapter 5, two types of three-factor models are studied for valuing commodity futures contracts, both allowing volatility to be stochastic. Both models have closed-form solutions for futures contract prices. However, it is shown that Model 2 is better than Model 1 theoretically and also performs very well empirically. Moreover, Model 2 can easily be implemented in practice. In comparison to the Schwartz (1997) two-factor model, Model 2 has its own unique advantages; hence, it is also a good choice for pricing commodity futures contracts. Furthermore, if the two models are used at the same time, a more accurate price for commodity futures contracts can be obtained in most situations. In Chapter 6, the applicability of the asymptotic approach developed in Fouque et al. (2000b) is investigated for pricing commodity futures options in a Schwartz (1997) multi-factor model, featuring both stochastic convenience yield and stochastic volatility.
It is shown that the zero-order term in the expansion coincides with the Schwartz (1997) two-factor term, with averaged volatility, and an explicit expression for the first-order correction term is provided. With empirical data from the natural gas futures market, it is also demonstrated that a significantly better calibration can be achieved by using the correction term, as compared to the standard Schwartz (1997) two-factor expression, at virtually no extra effort. In Chapter 7, a new pricing formula is derived for futures options in the Schwartz (1997) two-factor model with time-dependent spot volatility. The pricing formula can also be used to back out the time-dependent spot volatility from futures option prices in the market. Furthermore, the limitations of the method used to find the time-dependent spot volatility are explained, and it is shown how to verify its accuracy.
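The role of an "averaged volatility" term can be illustrated with a generic sketch (not the thesis's Schwartz two-factor formula): in the simpler Black (1976) futures-option setting, a deterministic time-dependent volatility enters the price only through its root-mean-square over the option's life. All parameter values below are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(f, k, r, sigma, t):
    """Black (1976) price of a European call on a futures contract."""
    d1 = (math.log(f / k) + 0.5 * sigma**2 * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return math.exp(-r * t) * (f * norm_cdf(d1) - k * norm_cdf(d2))

def averaged_vol(sigma_fn, t, n=1000):
    """Root-mean-square of a deterministic volatility function over [0, t]."""
    grid = ((i + 0.5) * t / n for i in range(n))
    return math.sqrt(sum(sigma_fn(s)**2 for s in grid) / n)

# Hypothetical linearly decaying spot volatility.
sigma_t = lambda s: 0.3 - 0.1 * s
price = black76_call(f=100.0, k=100.0, r=0.05, sigma=averaged_vol(sigma_t, 1.0), t=1.0)
```

Running the same logic in reverse, matching market option prices to recover the volatility function, is the "backing out" exercise of Chapter 7, though in the richer two-factor setting.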

Relevance: 20.00%

Publisher:

Abstract:

Understanding how imperfect information affects firms' investment decisions helps answer important questions in economics, such as how we may better measure economic uncertainty; how firms' forecasts affect their decision-making when their beliefs are not backed by economic fundamentals; and how important the business-cycle impacts of changes in firms' productivity uncertainty are in an environment of incomplete information. This dissertation provides a synthetic answer to all of these questions, both empirically and theoretically. The first chapter provides empirical evidence demonstrating that survey-based forecast dispersion identifies a distinctive type of second-moment shock, different from the canonical volatility shocks to productivity, i.e. uncertainty shocks. Such forecast-disagreement disturbances can affect the distribution of firm-level beliefs regardless of whether belief changes are backed by changes in economic fundamentals. At the aggregate level, innovations that increase the dispersion of firms' forecasts lead to persistent declines in aggregate investment and output, followed by a slow recovery. By contrast, a larger dispersion of future firm-specific productivity innovations, the standard way to measure economic uncertainty, delivers the "wait and see" effect: aggregate investment experiences a sharp decline, followed by a quick rebound, and then overshoots. At the firm level, the data show that more productive firms increase investment when the dispersion of future productivity rises, whereas investment drops when firms disagree more about the well-being of their future business conditions. These findings challenge the view that the dispersion of firms' heterogeneous beliefs captures the concept of economic uncertainty as defined by a model of uncertainty shocks.
The second chapter presents a general equilibrium model of heterogeneous firms subject to real productivity uncertainty shocks and informational disagreement shocks. Because firms cannot perfectly disentangle aggregate from idiosyncratic productivity under imperfect information, information quality drives the wedge between the unobserved productivity fundamentals and firms' beliefs about how productive they are. The distribution of firms' beliefs is no longer perfectly aligned with the distribution of firm-level productivity across firms. This model not only explains why, at the macro and micro level, disagreement shocks differ from uncertainty shocks, as documented in Chapter 1, but also helps reconcile a key challenge faced by the standard framework for studying economic uncertainty: a trade-off between sizable business-cycle effects due to changes in uncertainty and the right amount of pro-cyclicality of firm-level investment-rate dispersion, as measured by its correlation with the output cycle.
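The survey-based disagreement measure discussed above is, mechanically, a cross-sectional dispersion of forecasts computed period by period; the sketch below uses simulated forecasts, not an actual survey:

```python
import numpy as np

# Simulated panel of firm forecasts: 500 firms over 8 periods, with the
# cross-sectional spread of beliefs widening over time. "Disagreement" is
# the cross-sectional standard deviation of forecasts in each period,
# distinct from the dispersion of realized productivity innovations.
rng = np.random.default_rng(0)
n_firms, n_periods = 500, 8
scales = 1.0 + 0.1 * np.arange(n_periods)      # belief dispersion per period
forecasts = rng.normal(2.0, scales, (n_firms, n_periods))
disagreement = forecasts.std(axis=0, ddof=1)   # one dispersion value per period
```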

Relevance: 20.00%

Publisher:

Abstract:

This paper applies two measures to assess spillovers across markets: the Diebold and Yilmaz (2012) spillover index and the Hafner and Herwartz (2006) analysis of multivariate GARCH models using volatility impulse responses. We use two data sets: daily realized volatility estimates taken from the Oxford-Man RV library, running from the beginning of 2000 to October 2016, for the S&P 500 and the FTSE; and ten years of daily return series for the New York Stock Exchange Index and the FTSE 100 index, from 3 January 2005 to 31 January 2015. Both data sets capture the Global Financial Crisis (GFC) and the subsequent European Sovereign Debt Crisis (ESDC). The spillover index captures the transmission of volatility to and from markets, plus net spillovers. The key difference between the measures is that the spillover index captures an average of spillovers over a period, whilst volatility impulse responses (VIRF) have to be calibrated to conditional volatility estimated at a particular point in time. The VIRF provide information about the impact of independent shocks on volatility. In the latter analysis, we explore the impact of three different shocks: the onset of the GFC, which we date as 9 August 2007 (GFC1); the point at which the financial crisis came to a head a year later, on 15 September 2008 (GFC2); and a third shock on 9 May 2010. Our modelling includes leverage and asymmetric effects in the context of a multivariate GARCH model, analysed using both BEKK and diagonal BEKK (DBEKK) specifications. A key result is that the impact of negative shocks is larger, in terms of the effects on variances and covariances, but shorter in duration, in this case a difference between three and six months.
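The total spillover index can be sketched in a few lines once a generalized forecast-error variance decomposition (FEVD) matrix is in hand; the 2x2 matrix below is hypothetical, not estimated from the paper's data:

```python
import numpy as np

# Hypothetical 2-market generalized FEVD matrix: entry D[i, j] is the share
# of market i's H-step forecast-error variance attributable to shocks from
# market j. Rows are normalized to sum to one, as in Diebold-Yilmaz (2012).
D = np.array([[0.8, 0.2],
              [0.3, 0.7]])
D = D / D.sum(axis=1, keepdims=True)

n = D.shape[0]
cross = D.sum() - np.trace(D)        # cross-market (off-diagonal) variance shares
total_spillover = 100.0 * cross / n  # total spillover index, in percent
```

Directional "to" and "from" spillovers for each market come from the column and row off-diagonal sums of the same matrix, and their difference gives the net spillovers mentioned above.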

Relevance: 20.00%

Publisher:

Abstract:

Ph.D. in the Faculty of Business Administration

Relevance: 20.00%

Publisher:

Abstract:

We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly useful with high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that, for Colombian stock market securities, the asynchronicity of intraday data reduces the analysis to a selected group of stocks; however, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
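The core of the synthetic-portfolio idea, choosing donor weights so the portfolio tracks the treated asset over the pre-event window, can be sketched on simulated returns. This simplified version uses unconstrained least squares, whereas the Synthetic Control Method proper adds non-negativity and sum-to-one constraints on the weights:

```python
import numpy as np

# Simulated pre-event returns: 120 periods for 5 donor stocks, and a
# treated asset that is (up to noise) a fixed combination of the donors.
rng = np.random.default_rng(7)
pre_donor = rng.normal(0.0, 0.01, (120, 5))
true_w = np.array([0.4, 0.3, 0.2, 0.1, 0.0])
pre_treated = pre_donor @ true_w + rng.normal(0.0, 0.001, 120)

# Fit tracking weights on the pre-event window by least squares.
w, *_ = np.linalg.lstsq(pre_donor, pre_treated, rcond=None)
synthetic_pre = pre_donor @ w    # in-sample fit of the tracking portfolio
tracking_error = np.sqrt(np.mean((pre_treated - synthetic_pre) ** 2))
```

In the event study, the fitted weights are then applied to post-event donor returns to form the counterfactual against which the effect of the volatility call auction is measured.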