860 results for Volatility Models, Volatility, Equity Markets
Abstract:
We model the effects of quantitative easing (QE) on the volatility of returns to individual gilts, examining both the effects of QE overall and of the specific days of asset purchases. QE successfully neutralized the sixfold increase in volatility that gilts had experienced since the start of the financial crisis. The volatility of longer-term bonds fell more quickly than that of short- to medium-term bonds. The reversion of shorter-term bond volatility to pre-crisis levels was found to be more sensitive to the specific operational actions of QE, particularly where those bonds experienced relatively greater purchase activity.
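A minimal sketch of one way such an event study can be set up (not the authors' own specification): fit a GARCH(1,1) to a gilt's daily returns and relate the fitted conditional volatility to a dummy marking QE purchase days. The file name and the columns 'ret' and 'qe_day' are hypothetical placeholders; the `arch` and `statsmodels` packages are assumed.

```python
# Illustrative two-step sketch: GARCH(1,1) conditional volatility, then an OLS
# of that volatility on a QE purchase-day dummy. Data and columns are placeholders.
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

df = pd.read_csv("gilt_returns.csv", parse_dates=["date"], index_col="date")

# Conditional volatility from a plain GARCH(1,1) on percentage returns
res = arch_model(100 * df["ret"], vol="GARCH", p=1, q=1).fit(disp="off")
df["cond_vol"] = res.conditional_volatility

# Does volatility differ on QE purchase days? A simple OLS with a dummy.
X = sm.add_constant(df["qe_day"].astype(float))
print(sm.OLS(df["cond_vol"], X).fit().summary())
```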
Abstract:
This paper applies the vector AR-DCC-FIAPARCH model to the daily returns of eight national stock market indices from 1988 to 2010, taking into account the structural breaks of each time series linked to the Asian and the recent global financial crises. We find significant cross effects, as well as long-range volatility dependence, an asymmetric volatility response to positive and negative shocks, and the power of returns that best fits the volatility pattern. One of the main findings of the analysis is the higher dynamic correlations of the stock markets after a crisis event, which indicates increased contagion effects between the markets. The fact that the conditional correlations remain at a high level during a crisis points to continuous herding behaviour during these periods of increased market volatility. Finally, during the recent global financial crisis the correlations remain at a much higher level than during the Asian financial crisis.
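A simplified stand-in for the full vector AR-DCC-FIAPARCH estimation, under stated assumptions: fit univariate asymmetric (TARCH-type) models per index with the `arch` package, then track rolling correlations of the standardized residuals as a rough proxy for dynamic conditional correlations. The paper's FIAPARCH additionally estimates the power and long-memory parameters, and DCC models the correlations parametrically; tickers and the input file here are hypothetical.

```python
# Rough proxy for dynamic correlations: univariate asymmetric GARCH per index,
# then rolling correlations of standardized residuals. Not the paper's estimator.
import pandas as pd
from arch import arch_model

rets = 100 * pd.read_csv("index_returns.csv", parse_dates=["date"],
                         index_col="date")   # daily % returns, one column per index

std_resid = {}
for col in rets.columns:
    # power=1.0 with an asymmetry term (o=1) gives a TARCH-style specification
    res = arch_model(rets[col], vol="GARCH", p=1, o=1, q=1, power=1.0,
                     dist="t").fit(disp="off")
    std_resid[col] = res.std_resid

std_resid = pd.DataFrame(std_resid)
# 250-day rolling pairwise correlations of standardized residuals
dyn_corr = std_resid.rolling(250).corr()
print(dyn_corr.tail())
```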
Abstract:
2000 Mathematics Subject Classification: 65M06, 65M12.
Abstract:
The aim of this article is to give an overview of some of the main stages of the process set in motion by the option-pricing papers of Black, Scholes, and Merton in the early 1970s, a process that revolutionized both the developed Western financial markets and financial theory. The article compares the development of financial theory within and outside Hungary over the last three decades, starting with the Black-Scholes revolution. Problems such as the term structure of interest rate volatilities, the focus of much research internationally, have not received proper attention among Hungarian economists. The article gives an overview of no-arbitrage pricing, the partial differential equation approach, and the related numerical techniques, such as lattice methods, for pricing financial derivatives. The relevant concepts of the martingale approach are reviewed, with a special focus on the HJM framework for interest rate dynamics. The idea that volatility and correlation can be traded opens a new horizon for the Hungarian capital market.
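A minimal sketch of the lattice methods mentioned above: a Cox-Ross-Rubinstein binomial tree for a European call, which converges to the Black-Scholes price as the number of steps grows. All parameter values are illustrative.

```python
# CRR binomial lattice pricer for a European call (illustrative parameters).
import math

def crr_call(S0, K, r, sigma, T, steps):
    """Price a European call on a Cox-Ross-Rubinstein binomial lattice."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1.0 / u                               # down factor
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)

    # terminal payoffs at each node (j up-moves)
    values = [max(S0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    # backward induction through the tree
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, steps=500))
# about 10.45, close to the Black-Scholes value for these parameters
```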
Abstract:
It is a widely supported empirical fact that greater volatility in itself decreases the liquidity of a market: the more volatile a market is, the higher the price impact of a transaction is expected to be. In this paper I examine whether the temporary decrease in liquidity observed during the 2007/2008 crisis in the market for the OTP stock, traded on the Budapest Stock Exchange, can be attributed simply to the increased volatility, or whether other factors also played a role (e.g. a drastic change in the composition and behaviour of market participants, a general reduction of financing sources, etc.). I represent volatility with the standard deviation of log returns and with the true range, and illiquidity with the Budapest Liquidity Measure (BLM). First, I find that in the case of OTP the true range correlates more closely with the BLM than the standard deviation does. Second, it is also clear that the pre-crisis relationship between volatility and liquidity changed notably during and after the crisis: during the crisis illiquidity was far greater than the increase in volatility alone would suggest, while after the crisis subsided this relation reversed.
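A sketch of the two volatility proxies used above and their correlation with an illiquidity series. The OHLC file and the 'blm' column are hypothetical inputs; the BLM itself is a Budapest Stock Exchange measure not computed here.

```python
# Two volatility proxies (rolling std of log returns, true range) correlated
# with an illiquidity column. File and column names are placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("otp_daily.csv", parse_dates=["date"], index_col="date")
# expected columns: open, high, low, close, blm

# Proxy 1: rolling standard deviation of log returns (20-day window)
logret = np.log(df["close"]).diff()
df["sigma"] = logret.rolling(20).std()

# Proxy 2: true range, following Wilder's definition
prev_close = df["close"].shift(1)
df["true_range"] = np.maximum.reduce([
    df["high"] - df["low"],
    (df["high"] - prev_close).abs(),
    (df["low"] - prev_close).abs(),
])

print(df[["sigma", "true_range"]].corrwith(df["blm"]))
```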
Abstract:
Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapter Three and Chapter Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with skewed student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate negative cross-correlations between real exchange rates and relative consumption across two countries as observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in an opposite direction with respect to relative consumption.
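A hedged sketch of the kind of forecasting comparison Chapter Two describes: fit a fractionally integrated GARCH model with a skewed Student-t error distribution (recent versions of the `arch` package provide both) to daily exchange-rate returns and compare one-step-ahead forecasts against a driftless random walk. The file, column name, and AR(1) mean choice are illustrative assumptions, not the dissertation's actual specification.

```python
# AR(1)-FIGARCH with skew-t errors vs. a random walk, out-of-sample RMSE.
import numpy as np
import pandas as pd
from arch import arch_model

rets = 100 * pd.read_csv("fx_returns.csv", parse_dates=["date"],
                         index_col="date")["ret"]

split = int(len(rets) * 0.8)
am = arch_model(rets, mean="AR", lags=1, vol="FIGARCH", p=1, q=1, dist="skewt")
res = am.fit(last_obs=rets.index[split], disp="off")   # estimate on the first 80%

# One-step-ahead forecasts over the hold-out sample (parameters held fixed)
fc = res.forecast(horizon=1, start=rets.index[split], reindex=False)
pred = fc.mean["h.1"].shift(1).dropna()   # forecast made at t-1 targets date t
actual = rets.loc[pred.index]

rmse_model = np.sqrt(((actual - pred) ** 2).mean())
rmse_rw = np.sqrt((actual ** 2).mean())   # a driftless random walk predicts zero
print(f"model RMSE: {rmse_model:.4f}   random-walk RMSE: {rmse_rw:.4f}")
```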
Abstract:
We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying, and the covariance of the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
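A sketch of the deterministic intraday seasonal filter that essay 1 generalizes: estimate a time-of-day volatility factor from average absolute returns per intraday bin and deflate returns by it. The 5-minute returns file and column name are hypothetical; the essay's autoregressive conditional filter would let this pattern evolve over time rather than fixing it.

```python
# Deterministic U-shape filter: bin-average |returns| by time of day, then deflate.
import numpy as np
import pandas as pd

df = pd.read_csv("intraday_returns.csv", parse_dates=["timestamp"],
                 index_col="timestamp")          # column: 'ret' (5-minute returns)

bins = df.index.strftime("%H:%M")                # time-of-day bin label
seasonal = df["ret"].abs().groupby(bins).mean()  # U-shaped average |return| profile

# deflate each return by its bin's seasonal factor
df["filtered_ret"] = df["ret"] / seasonal.loc[bins].to_numpy()

print(seasonal.head())
print(df[["ret", "filtered_ret"]].describe())
```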
Abstract:
In their dialogue entitled - The Food Service Industry Environment: Market Volatility Analysis - by Alex F. De Noble, Assistant Professor of Management, San Diego State University, and Michael D. Olsen, Associate Professor and Director, Division of Hotel, Restaurant & Institutional Management at Virginia Polytechnic Institute and State University, De Noble and Olsen preface the discussion by saying: “Hospitality executives, as a whole, do not believe they exist in a volatile environment and spend little time or effort in assessing how current and future activity in the environment will affect their success or failure. The authors highlight potential differences that may exist between executives' perceptions and objective indicators of environmental volatility within the hospitality industry and suggest that executives change these perceptions by incorporating the assumption of a much more dynamic environment into their future strategic planning efforts. Objective, empirical evidence of the dynamic nature of the hospitality environment is presented and compared to several studies pertaining to environmental perceptions of the industry.” That weighty thesis statement presumes that hospitality executives/managers do not fully comprehend the environment in which they operate. The authors provide a contrast, which conventional wisdom would seem to support and satisfy. “Broadly speaking, the operating environment of an organization is represented by its task domain,” say the authors. “This task domain consists of such elements as a firm's customers, suppliers, competitors, and regulatory groups.” These are dynamic actors and the underpinnings of change, say the authors by way of citation. “The most difficult aspect for management in this regard tends to be the development of a proper definition of the environment of their particular firm. Being able to precisely define who the customers, competitors, suppliers, and regulatory groups are within the environment of the firm is no easy task, yet is imperative if proper planning is to occur,” De Noble and Olsen further contribute to support their thesis statement. The article is packed with tables, both survey-based and empirically driven, to illustrate market volatility, and that is not necessarily a bad thing. One such table is the Bates and Eldredge outline, Table 6 in the article. “This comprehensive outline…should prove to be useful to most executives in expanding their perception of the environment of their firm,” say De Noble and Olsen. “It is, however, only a suggested outline,” they advise. “…risk should be incorporated into every investment decision, especially in a volatile environment,” say the authors. De Noble and Olsen close with an intriguing formula to gauge volatility in an environment.
Abstract:
Estimating the required rate of return for hotel properties is a daunting task because a lodging property is considered a hybrid between a real estate asset and a revenue-generating enterprise affiliated with a hotel brand. Computing the expected rate of return for a hotel becomes even more complicated when a third-party foreign investor/entrepreneur is the one performing the computation for an investment hotel in an emerging country. This clinical case illustrates the challenges surrounding the estimation of a project's cost of equity in the multinational hotel industry. The results reveal that estimating the cost of equity for hotel investments in emerging markets continues to be a conundrum. Future investors should make multiple adjustments and use several models when making their capital investment decisions.
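One widely used adjustment for this setting, shown purely as an illustration and not necessarily the case's own method, is to augment the CAPM cost of equity with a country risk premium for the emerging market. All numbers below are illustrative placeholders.

```python
# CAPM cost of equity plus a (possibly scaled) country risk premium.
def cost_of_equity(rf, beta, erp, country_risk_premium=0.0, lambda_exposure=1.0):
    """Risk-free rate + beta * equity risk premium + scaled country risk premium."""
    return rf + beta * erp + lambda_exposure * country_risk_premium

# e.g. 4% risk-free rate, hotel equity beta of 1.1, 5.5% mature-market ERP,
# 3% country risk premium with full exposure
ke = cost_of_equity(rf=0.04, beta=1.1, erp=0.055, country_risk_premium=0.03)
print(f"estimated cost of equity: {ke:.2%}")   # roughly 13%
```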
Abstract:
In the finance literature many economic theories and models have been proposed to explain and estimate the relationship between risk and return. Assuming risk aversion and rational behavior on the part of the investor, models are developed that are supposed to help in forming efficient portfolios that either maximize the expected rate of return for a given level of risk or minimize risk for a given expected return. One of the most widely used models for forming these efficient portfolios is Sharpe's Capital Asset Pricing Model (CAPM). In the development of this model it is assumed that investors have homogeneous expectations about the future probability distribution of the rates of return; that is, every investor assumes the same values of the parameters of the probability distribution. Financial volatility homogeneity is likewise commonly assumed, where volatility is taken as investment risk and is usually measured by the variance of the rates of return, with the square root of the variance defining financial volatility. Furthermore, it is often assumed that the data-generating process consists of independent and identically distributed random variables, which again implies that financial volatility is measured from homogeneous time series with stationary parameters. In this dissertation we investigate these homogeneity assumptions about market agents. Using non-parametric wavelet analysis on trading data, we provide evidence of heterogeneity in market participants' information, objectives, and expectations about the parameters of the probability distribution of prices, as reflected in the differences between the empirical distributions corresponding to different time scales, which in this study are associated with different classes of investors. We also demonstrate that the statistical properties of the underlying data-generating processes, including the volatility of the rates of return, are quite heterogeneous. The results show heterogeneity of financial volatility at different time scales, and time scale is one of the most important aspects in which trading behavior differs. We conclude that heterogeneity, as posited by the Heterogeneous Markets Hypothesis, is the norm and not the exception.
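A hedged sketch of a wavelet-based look at volatility across time scales, in the spirit of the non-parametric analysis described above. The PyWavelets package and the returns file are assumptions; the variance of the detail coefficients at each level gives a scale-by-scale decomposition of volatility.

```python
# Wavelet variance by scale: variance of detail coefficients at each level.
import numpy as np
import pandas as pd
import pywt

rets = pd.read_csv("trade_returns.csv")["ret"].to_numpy()

# Discrete wavelet decomposition; level j roughly corresponds to
# fluctuations at horizons of 2**j observations.
coeffs = pywt.wavedec(rets, wavelet="db4", level=6)
approx, details = coeffs[0], coeffs[1:]          # [cD6, cD5, ..., cD1]

for j, d in enumerate(reversed(details), start=1):
    print(f"scale 2^{j} obs: wavelet variance = {np.var(d):.6e}")
```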
Abstract:
Liquidity is an important market characteristic for participants in every financial market. One of the three components of liquidity is market depth. Prior literature lacks a comprehensive analysis of depth in U.S. futures markets due to past limitations on the availability of data. However, recent innovations in data collection and dissemination provide new opportunities to investigate the depth dimension of liquidity. In this dissertation, the Chicago Mercantile Exchange (CME) Group's proprietary database on depth is employed to study the dynamics of depth in U.S. futures markets. This database allows for the analysis of depth along the entire limit order book rather than just at the first level. The first essay examines the characteristics of depth within the context of the five-deep limit order book. Results show that a large amount of depth is present in the book beyond the best level. Furthermore, the findings show that the characteristics of five-deep depth vary between day and night trading and that depth is unequal across levels within the limit order book. The second essay examines the link between five-deep market depth and the bid-ask spread. The results suggest an inverse relation between the spread and depth after adjusting for control factors. The third essay explores transitory volatility in relation to depth in the limit order book. Evidence supports a relation between an increase in volatility and a subsequent decrease in market depth. Overall, the results of this dissertation are consistent with limit order traders actively managing depth along the limit order book in electronic U.S. futures markets.
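An illustrative sketch, with hypothetical column names, of the two quantities the second essay relates: cumulative five-deep depth and the inside bid-ask spread computed from limit-order-book snapshots, followed by a simple OLS of the spread on depth (the dissertation's analysis additionally includes control factors).

```python
# Five-deep depth and inside spread from book snapshots; simple OLS relation.
import pandas as pd
import statsmodels.api as sm

book = pd.read_csv("book_snapshots.csv", parse_dates=["timestamp"])
# expected columns: bid_px_1..5, ask_px_1..5, bid_qty_1..5, ask_qty_1..5

qty_cols = [f"{side}_qty_{i}" for side in ("bid", "ask") for i in range(1, 6)]
book["depth5"] = book[qty_cols].sum(axis=1)            # five-deep depth, both sides
book["spread"] = book["ask_px_1"] - book["bid_px_1"]   # inside spread

X = sm.add_constant(book[["depth5"]])
print(sm.OLS(book["spread"], X).fit().summary())       # an inverse relation implies a negative slope
```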
Abstract:
This dissertation investigates, based on Post-Keynesian theory and its concept of a monetary economy of production, the exchange rate behavior of the Brazilian real in the presence of the Brazilian Central Bank's interventions by means of so-called swap transactions over 2002-2015. Initially, the work analyzes the essential properties of an open monetary economy of production and then presents the basic propositions of the Post-Keynesian view on exchange rate determination, highlighting the properties of foreign exchange markets and the peculiarities of Brazil's position in the international monetary and financial system. The research thereby accounts for the various segments of the Brazilian foreign exchange market. To accomplish this purpose, we first review the Post-Keynesian literature on the topic. Then we undertake empirical examinations of exchange rate determination using two statistical methods. On the one hand, to measure the volatility of the exchange rate, we estimate autoregressive conditional heteroskedasticity (ARCH) and generalized ARCH (GARCH) models. On the other hand, to measure how the exchange rate varies in relation to real and financial variables and to the swaps, we estimate a vector autoregression (VAR) model. Both exercises are performed for the nominal and the real effective exchange rates. The results show that the swaps respond to exchange rate movements, trying to offset its volatility; this reveals that the exchange rate is, at least to a certain degree, sensitive to the swap transactions conducted by the Central Bank. In addition, another empirical result is that the real effective exchange rate responds more strongly to the swap auctions than the nominal rate does.
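A hedged sketch of the two empirical exercises described above: a GARCH(1,1) estimate of exchange-rate volatility and a small VAR including the central bank's swap position. The variable names, data frequency, and file are placeholders, not the dissertation's actual dataset; the `arch` and `statsmodels` packages are assumed.

```python
# GARCH(1,1) volatility of the exchange rate plus a VAR with swaps and controls.
import pandas as pd
from arch import arch_model
from statsmodels.tsa.api import VAR

df = pd.read_csv("brl_monthly.csv", parse_dates=["date"], index_col="date")
# expected columns: fx_ret (exchange-rate return), swaps (net swap position),
# interest (policy rate), embi (risk spread)

# Volatility of the exchange rate: GARCH(1,1) on percentage returns
garch_res = arch_model(100 * df["fx_ret"], vol="GARCH", p=1, q=1).fit(disp="off")
print(garch_res.summary())

# Interaction between the exchange rate, swaps, and financial variables: VAR
var_data = df[["fx_ret", "swaps", "interest", "embi"]].dropna()
var_res = VAR(var_data).fit(maxlags=4, ic="aic")
print(var_res.summary())
```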