84 results for return volatility
in Queensland University of Technology - ePrints Archive
Abstract:
Forecasts generated by time series models traditionally place greater weight on more recent observations. This paper develops an alternative semi-parametric method for forecasting that does not rely on this convention and applies it to the problem of forecasting asset return volatility. In this approach, a forecast is a weighted average of historical volatility, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in volatility across time (as a measure of market conditions) by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are significantly more accurate than a number of competing approaches at both short and long forecast horizons.
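The weighting mechanism described above can be illustrated with a short sketch. The function below is only a minimal, hypothetical rendering of the idea: a Gaussian kernel compares the most recent run of volatilities with every historical run of the same length, and the forecast is the kernel-weighted average of the volatilities that followed those runs. The window length, bandwidth, and function names are assumptions for illustration, not the authors' specification; the abstract's multivariate kernel scheme would, for example, use a separate bandwidth per dimension of the market-condition vector.

```python
import numpy as np

def semiparametric_vol_forecast(vol, window=5, bandwidth=0.005):
    """Kernel-weighted average of historical volatility, with weight placed
    on past periods whose recent volatility trend resembles the current one.
    Illustrative sketch only; bandwidth is on the scale of the series."""
    vol = np.asarray(vol, dtype=float)
    current = vol[-window:]                       # current market conditions
    weights, candidates = [], []
    for t in range(window, len(vol) - 1):
        past = vol[t - window:t]                  # trend observed before time t
        dist2 = np.sum(((past - current) / bandwidth) ** 2)
        weights.append(np.exp(-0.5 * dist2))      # Gaussian kernel weight
        candidates.append(vol[t])                 # volatility that followed that trend
    weights = np.asarray(weights)
    return float(np.dot(weights, candidates) / weights.sum())

# Example with simulated daily volatilities
rng = np.random.default_rng(0)
vol = np.abs(rng.normal(0.01, 0.003, size=300))
print(semiparametric_vol_forecast(vol))
```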
Abstract:
Early models of bankruptcy prediction employed financial ratios drawn from pre-bankruptcy financial statements and performed well both in-sample and out-of-sample. Since then there has been an ongoing effort in the literature to develop models with even greater predictive performance. A significant innovation in the literature was the introduction into bankruptcy prediction models of capital market data such as excess stock returns and stock return volatility, along with the application of the Black–Scholes–Merton option-pricing model. In this note, we test five key bankruptcy models from the literature using an up-to-date data set and find that they each contain unique information regarding the probability of bankruptcy but that their performance varies over time. We build a new model comprising key variables from each of the five models and add a new variable that proxies for the degree of diversification within the firm. The degree of diversification is shown to be negatively associated with the risk of bankruptcy. This more general model outperforms the existing models in a variety of in-sample and out-of-sample tests.
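As a rough illustration of how accounting ratios, market-based variables, and a diversification proxy might be combined into a single bankruptcy model, here is a hedged sketch using a standard logit. The variable list, data, and estimates are placeholders and do not reproduce the five models tested in the note.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical firm-year data: market variables, a leverage ratio, and a
# diversification proxy (e.g. number of business segments), plus a
# bankruptcy indicator. Column meanings are illustrative, not the paper's.
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.normal(size=n),        # excess stock return
    np.abs(rng.normal(size=n)),# stock return volatility
    rng.normal(size=n),        # leverage ratio
    rng.poisson(2, size=n),    # diversification proxy
])
y = rng.binomial(1, 0.1, size=n)   # 1 = bankrupt within the horizon

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
prob_bankruptcy = model.predict(sm.add_constant(X))
print(model.params)
```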
Abstract:
The price formation of financial assets is a complex process. It extends beyond the standard economic paradigm of supply and demand to the understanding of the dynamic behavior of price variability, the price impact of information, and the implications of the trading behavior of market participants for prices. In this thesis, I study aggregate market and individual asset volatility, liquidity dimensions, and causes of mispricing for US equities over a recent sample period. How are volatility forecasts modeled, what determines intradaily jumps and changes in intradaily volatility, and what drives the premium of traded equity indexes? Are they induced, for example, by the information content of lagged volatility and return parameters, or by macroeconomic news and changes in liquidity and volatility? Besides satisfying our intellectual curiosity, answers to these questions are of direct importance to investors developing trading strategies, policy makers evaluating macroeconomic policies, and arbitrageurs exploiting mispricing in exchange-traded funds. Results show that the leverage effect and lagged absolute returns improve forecasts of the continuous components of daily realized volatility as well as jumps. Implied volatility does not subsume the information content of lagged returns in forecasting realized volatility and its components. The reported results are linked to the heterogeneous market hypothesis and demonstrate the validity of extending the hypothesis to returns. Depth shocks, signed order flow, the number of trades, and resiliency are the most important determinants of intradaily volatility. In contrast, spread shocks and resiliency are predictive of signed intradaily jumps. There are fewer macroeconomic news announcement surprises that cause extreme price movements or jumps than there are surprises that elevate intradaily volatility. Finally, the premium of exchange-traded funds is significantly associated with momentum in net asset value and a number of liquidity parameters, including the spread, traded volume, and illiquidity. The mispricing of industry exchange-traded funds suggests that limits to arbitrage are driven by potential illiquidity.
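One way to make the volatility-forecasting strand of the thesis concrete is a HAR-style regression in which daily, weekly, and monthly realized volatility, a signed-return (leverage) term, and a lagged jump component explain next-day realized volatility. The sketch below is illustrative only, assuming simulated series and simple variable definitions rather than the thesis's exact models.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def har_design(rv, ret, jump):
    """HAR-type regressors: daily/weekly/monthly realized volatility, a
    negative-return (leverage) term and the lagged jump component.
    Variable choices are illustrative, not the thesis specification."""
    df = pd.DataFrame({"rv": rv, "ret": ret, "jump": jump})
    df["rv_d"] = df["rv"].shift(1)
    df["rv_w"] = df["rv"].rolling(5).mean().shift(1)
    df["rv_m"] = df["rv"].rolling(22).mean().shift(1)
    df["neg_ret"] = np.abs(df["ret"]).where(df["ret"] < 0, 0.0).shift(1)  # leverage effect
    df["jump_l"] = df["jump"].shift(1)
    return df.dropna()

rng = np.random.default_rng(2)
n = 600
df = har_design(np.abs(rng.normal(0.01, 0.004, n)),
                rng.normal(0.0, 0.01, n),
                np.abs(rng.normal(0.0, 0.002, n)))
X = sm.add_constant(df[["rv_d", "rv_w", "rv_m", "neg_ret", "jump_l"]])
print(sm.OLS(df["rv"], X).fit().params)
```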
Abstract:
We test the predictive ability of investor sentiment for returns and volatility at the aggregate market level in the U.S., the four largest European countries, and three Asia-Pacific countries. We find that in the U.S., France, and Italy, periods of high consumer confidence are followed by low market returns. In Japan, both the level of and the change in consumer confidence boost the market return in the next month. Further, shifts in sentiment significantly move conditional volatility in most of the countries, and in Italy such shifts increase returns by 4.7% in the next month.
Abstract:
This paper employs a VAR-GARCH model to investigate return links and volatility transmission between the S&P 500 and commodity price indices for energy, food, gold, and beverages over the turbulent period from 2000 to 2011. Understanding the behavior of commodity prices and the volatility transmission mechanism between these markets and the stock exchanges is crucial for every participant, including governments, traders, portfolio managers, consumers, and producers. For return and volatility spillover, the results show significant transmission among the S&P 500 and commodity markets. Past shocks and volatility of the S&P 500 strongly influenced the oil and gold markets. This study finds that the highest conditional correlations are between the S&P 500 and the gold index and between the S&P 500 and the WTI index. We also analyze the optimal weights and hedge ratios for commodity/S&P 500 portfolio holdings using the estimates for each index. Overall, our findings illustrate several important implications for portfolio hedgers making optimal portfolio allocations, engaging in risk management, and forecasting future volatility in equity and commodity markets.
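The optimal weights and hedge ratios mentioned above are, in this literature, usually computed from the conditional variances and covariance produced by the bivariate GARCH model. The sketch below applies the standard Kroner-Ng portfolio weight and Kroner-Sultan hedge ratio formulas; whether the paper uses exactly these variants is an assumption, and the numerical inputs are purely illustrative.

```python
def optimal_weight_and_hedge(h_ss, h_cc, h_sc):
    """Optimal weight of the stock index in a stock/commodity pair
    (Kroner-Ng style) and the hedge ratio for a long stock position
    (Kroner-Sultan style), from conditional (co)variances:
    h_ss = stock variance, h_cc = commodity variance, h_sc = covariance.
    Standard textbook formulas, not necessarily the paper's exact variants."""
    w = (h_cc - h_sc) / (h_ss - 2.0 * h_sc + h_cc)
    w = min(max(w, 0.0), 1.0)          # restrict to [0, 1], as is conventional
    beta = h_sc / h_cc                 # units of commodity shorted per long stock dollar
    return w, beta

# Example with illustrative daily conditional moments (decimal returns)
print(optimal_weight_and_hedge(h_ss=2.1e-4, h_cc=3.5e-4, h_sc=0.8e-4))
```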
Abstract:
Contemporary debates on the role of journalism in society continue the tradition of downplaying the role of proactive journalism - generally situated under the catchphrase of the Fourth Estate - in public policy making. This paper puts the case for the retention of a notion of a proactive form of journalism, which can be broadly described as "investigative", because it is important to the public policy process in modern democracies. It argues that critiques that downplay the potential of this form of journalism are flawed and overly deterministic. Finally, it seeks to illustrate how journalists can proactively inquire in ways that are relevant to the lives of people in a range of settings, and that question elite sources in the interests of those people.
Abstract:
Agricultural production is one of the major industries in New Zealand and accounts for over 60% of all export trade. The farming industry comprises 70,000 entities ranging in size from small individually run farms to large corporate operations. The reliance of the New Zealand economy on the international rural sector has seen considerable volatility in rural land markets over the past four decades, with significant shifts in rural land prices based on location, land use, and underlying international rural commodity prices. With the increasing attention being paid to the rural sector, especially in relation to food production and bio-fuels, there has been increasing corporate interest in rural land ownership in countries with relatively low agricultural subsidies, such as New Zealand and Australia. A factor that has previously limited the participation of institutional investors has been a lack of reliable and up-to-date investment performance data for this asset class. This paper is the initial phase in the development of a New Zealand South Island rural land investment performance index and covers the period 1990-2007. The research in this paper analyses all rural sales transactions in the South Island and develops a capital return index for rural property based on major rural property land use. Additional work on this index will cover both total return performance and geographic location.
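A capital return index built from sales transactions can be illustrated, in very simplified form, by chaining period-on-period changes in the median sale price per hectare within each land use. The sketch below is a hypothetical construction with made-up column names and data; the paper's actual index methodology is not described in the abstract.

```python
import pandas as pd

def capital_return_index(sales: pd.DataFrame) -> pd.DataFrame:
    """Chain a simple capital return index from sales transactions.
    Expects columns 'year', 'land_use', 'price_per_ha'. Illustrative
    median-price index only, not the paper's methodology."""
    median = (sales.groupby(["land_use", "year"])["price_per_ha"]
                   .median()
                   .unstack("land_use")
                   .sort_index())
    returns = median.pct_change()
    return (1.0 + returns.fillna(0.0)).cumprod() * 100.0   # base year = 100

# Example usage with hypothetical transactions
sales = pd.DataFrame({
    "year": [1990, 1990, 1991, 1991, 1992, 1992],
    "land_use": ["dairy", "grazing", "dairy", "grazing", "dairy", "grazing"],
    "price_per_ha": [4000, 1500, 4300, 1550, 4600, 1700],
})
print(capital_return_index(sales))
```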
Abstract:
As the paper's subtitle suggests, broadband has had a remarkably checkered trajectory in Australia. It was synonymous with the early 1990s information superhighway and seemed to presage a moment in which "content is [to be] king". It disappeared almost entirely as a public priority in the mid to late 1990s, as infrastructure and content were disconnected in services frameworks focused on information and communication technologies. And it came back in the 2000s as a critical infrastructure for innovation and the knowledge economy. But this time content was not king but rather an intermediate input at the service of innovating industries and processes. Broadband was a critical infrastructure for the digitally based creative industries. Today the quality of the broadband infrastructure in Australia - itself an outcome of these different policy frameworks - is identified as "fraudband", holding back business, creativity, and consumer uptake. In this paper I use the checkered trajectory of broadband on Australian political and policy horizons as a stepping-off point to reflect on the ideas governing these changing governmental and public settings. This history enables me to explore how content and infrastructure are simultaneously connected and disconnected in our thinking. And, finally, I want to make some remarks about the way communication, particularly media communication, has been marginally positioned after initially appearing so central.
Abstract:
Particle emissions, volatility, and the concentration of reactive oxygen species (ROS) were investigated for a pre-Euro I compression ignition engine to study the potential health impacts of employing ethanol fumigation technology. Engine testing was performed in two separate experimental campaigns, with most testing performed at intermediate speed with four different load settings and various ethanol substitutions. A scanning mobility particle sizer (SMPS) was used to determine particle size distributions, a volatilization tandem differential mobility analyzer (V-TDMA) was used to explore particle volatility, and a new profluorescent nitroxide probe, BPEAnit, was used to investigate the potential toxicity of particles. The greatest particulate mass reduction was achieved with ethanol fumigation at full load, which contributed to the formation of a nucleation mode. Ethanol fumigation increased the volatility of particles by coating the particles with organic material or by making extra organic material available as an external mixture. In addition, the particle-related ROS concentrations increased with ethanol fumigation and were associated with the formation of a nucleation mode. The smaller particles, the increased volatility, and the increase in potential particle toxicity with ethanol fumigation may present a substantial barrier to the uptake of fumigation technology using ethanol as a supplementary fuel.
Abstract:
Forecasting volatility has received a great deal of research attention, with the relative performances of econometric model-based and option implied volatility forecasts often being considered. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved, including the relative merit of combining forecasts and whether the relative performances of various forecasts are statistically different. By utilising recent econometric advances, this paper considers whether combination forecasts of S&P 500 volatility are statistically superior to a wide range of model-based forecasts and implied volatility. It is found that a combination of model-based forecasts is the dominant approach, indicating that implied volatility cannot simply be viewed as a combination of various model-based forecasts. Therefore, while often viewed as a superior volatility forecast, implied volatility is in fact an inferior forecast of S&P 500 volatility relative to model-based forecasts.
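A combination forecast of the kind compared here can be formed in many ways; one simple, illustrative scheme regresses realized volatility on the competing model-based forecasts and normalises the fitted weights to a convex combination. The sketch below assumes simulated forecasts and is not the paper's econometric procedure.

```python
import numpy as np

def combine_forecasts(forecasts, realized):
    """Weight competing volatility forecasts by least squares against
    realized volatility, then normalise to a convex combination.
    Illustrative combination scheme only."""
    F = np.column_stack(forecasts)
    w, *_ = np.linalg.lstsq(F, realized, rcond=None)
    w = np.clip(w, 0.0, None)
    w = w / w.sum()                      # convex combination weights
    return F @ w, w

# Example: two hypothetical model-based forecasts of S&P 500 volatility
rng = np.random.default_rng(3)
true_vol = np.abs(rng.normal(0.01, 0.003, 250))
garch_fc = true_vol + rng.normal(0, 0.0020, 250)
har_fc = true_vol + rng.normal(0, 0.0015, 250)
combo, weights = combine_forecasts([garch_fc, har_fc], true_vol)
print(weights, np.mean((combo - true_vol) ** 2))
```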
Abstract:
This paper investigates the suitability of existing performance measures under the assumption of a clearly defined benchmark. A range of measures is examined, including the Sortino ratio, the Sharpe selection ratio (SSR), the Student's t-test, and a decay rate measure. A simulation study is used to assess the power and bias of these measures based on variations in sample size and the mean performance of two simulated funds. The Sortino ratio is found to be the superior performance measure, exhibiting more power and less bias than the SSR when the distribution of excess returns is skewed.
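For reference, the two main ratios compared in the paper are commonly defined as mean excess return over total standard deviation (Sharpe-type) versus mean excess return over downside deviation (Sortino). The sketch below uses these textbook definitions; the exact SSR variant examined in the paper may differ.

```python
import numpy as np

def sharpe_ratio(excess_returns):
    """Sharpe-type selection ratio: mean excess return over its standard deviation."""
    return np.mean(excess_returns) / np.std(excess_returns, ddof=1)

def sortino_ratio(excess_returns, target=0.0):
    """Sortino ratio: mean excess return over downside deviation below `target`."""
    downside = np.minimum(excess_returns - target, 0.0)
    downside_dev = np.sqrt(np.mean(downside ** 2))
    return (np.mean(excess_returns) - target) / downside_dev

# Example: excess returns of a simulated fund relative to its benchmark
rng = np.random.default_rng(4)
excess = rng.normal(0.002, 0.02, size=120)      # ten years of monthly observations
print(sharpe_ratio(excess), sortino_ratio(excess))
```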
Abstract:
The term structure of interest rates is often summarized using a handful of yield factors that capture shifts in the shape of the yield curve. In this paper, we develop a comprehensive model for volatility dynamics in the level, slope, and curvature of the yield curve that simultaneously includes level and GARCH effects along with regime shifts. We show that the level of the short rate is useful in modeling the volatility of the three yield factors and that there are significant GARCH effects present even after including a level effect. Further, we find that allowing for regime shifts in the factor volatilities dramatically improves the model’s fit and strengthens the level effect. We also show that a regime-switching model with level and GARCH effects provides the best out-of-sample forecasting performance of yield volatility. We argue that the auxiliary models often used to estimate term structure models with simulation-based estimation techniques should be consistent with the main features of the yield curve that are identified by our model.
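The combination of level and GARCH effects can be sketched with one common parameterisation in which a GARCH variance is scaled by the lagged short rate raised to twice an elasticity parameter. The recursion below is an illustrative rendering under that assumption; the paper's exact specification, and its regime-switching extension, are not reproduced here.

```python
import numpy as np

def level_garch_variance(eps, r_short, omega, alpha, beta, gamma):
    """Illustrative level-GARCH conditional variance for a yield-factor
    innovation: sigma2_t = h_t * r_{t-1}^(2*gamma), with
    h_t = omega + alpha*eps_{t-1}^2 + beta*h_{t-1}. One common
    parameterisation; not necessarily the paper's."""
    h = np.empty(len(eps))
    sigma2 = np.empty(len(eps))
    h[0] = omega / (1.0 - alpha - beta)            # unconditional GARCH level
    sigma2[0] = h[0] * r_short[0] ** (2 * gamma)
    for t in range(1, len(eps)):
        h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        sigma2[t] = h[t] * r_short[t - 1] ** (2 * gamma)
    return sigma2

rng = np.random.default_rng(5)
eps = rng.normal(0.0, 0.1, 250)                    # yield-factor innovations
r_short = np.abs(rng.normal(0.03, 0.005, 250))     # short rate level
print(level_garch_variance(eps, r_short, 1e-4, 0.05, 0.90, 0.5)[:5])
```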
Abstract:
Much research has investigated the differences between option implied volatilities and econometric model-based forecasts. Implied volatility is a market-determined forecast, in contrast to model-based forecasts that employ some degree of smoothing of past volatility to generate forecasts. Implied volatility therefore has the potential to reflect information that a model-based forecast could not. This paper considers two issues relating to the informational content of the S&P 500 VIX implied volatility index: first, whether it subsumes information on how historical jump activity contributed to price volatility, and second, whether the VIX reflects any incremental information pertaining to future jump activity relative to model-based forecasts. It is found that the VIX index both subsumes information relating to past jump contributions to total volatility and reflects incremental information pertaining to future jump activity. This issue has not been examined previously and expands our understanding of how option markets form their volatility forecasts.
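A simple way to frame the second question is an encompassing-style regression of the future jump component on a model-based forecast and the VIX: a significant VIX coefficient would indicate incremental information. The sketch below uses simulated data and placeholder variable construction, not the paper's procedure.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative encompassing-style regression: does the VIX add information
# about the future jump component beyond a model-based forecast?
rng = np.random.default_rng(6)
n = 500
model_fc = np.abs(rng.normal(0.01, 0.003, n))       # model-based jump forecast
vix = model_fc + rng.normal(0, 0.002, n)            # implied volatility proxy
future_jump = 0.5 * model_fc + 0.3 * vix + rng.normal(0, 0.002, n)

X = sm.add_constant(np.column_stack([model_fc, vix]))
res = sm.OLS(future_jump, X).fit()
print(res.params, res.pvalues)   # a significant VIX coefficient suggests incremental information
```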