927 results for: implied volatility, VIX, volatility forecasts, informational efficiency
Abstract:
Public Works is pleased to present the following Iowa Efficiency Review Report to Governor Chet Culver and Lieutenant Governor Patty Judge. This report is the product of a collaboration between our consulting team and dedicated Iowa state employees, who worked with us to share ideas and cost-saving proposals under very difficult circumstances caused by the national financial crisis affecting state budgets throughout the country. For example, during the course of this review, Iowa departments were also asked to develop across-the-board cuts to achieve immediate reductions in state spending. It is a credit to Iowa state government that departmental staff continued to work on this Efficiency Review Report despite the additional challenge of having to develop across-the-board budget cuts to achieve a balanced budget. We hope that these ideas will set the stage for further budget improvements from achieving efficiencies, eliminating outdated practices, increasing the use of information technology solutions and finding new sources of non-tax funding. The Efficiency Review Team faced a second challenge: statewide efficiency reviews usually take nine to 12 months to complete, but in Iowa we worked with dedicated department staff to complete our work in less than four months. The Governor challenged all of us to work intensely and to give him our best thinking on efficiency proposals so that he could act as quickly as possible to position state government for success over the next several years.
Abstract:
Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate parameters of the affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimation of parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is non-observable. There are several estimation methodologies that deal with estimation problems of latent variables. One appeared to be particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for the stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts. Each one is written as an independent and self-contained article. At the same time, questions that are answered by the second and third parts of this work arise naturally from the issues investigated and results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for the stochastic volatility models with jumps both in the asset price and variance processes. The estimation procedure is based on the joint unconditional characteristic function for the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for the stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that besides a stochastic volatility, jumps both in the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which has been chosen as a general representative of the stock asset class. Hence, the next question is: what jump process to use to model returns of the S&P500. The decision about the jump process in the framework of the affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size which are currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, there is no way to be sure that our parameter estimates and the true parameters of the models coincide.
The conclusion of the second chapter provides one more reason to do that kind of test. Thus, the third part of this thesis concentrates on the estimation of parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter proves that our estimator indeed has the ability to do so. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function is working, the next question immediately arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function which is used for its construction. Theoretically, the more dimensions there are, the more efficient is the estimation procedure. In practice, however, this relationship is not so straightforward due to the increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated. Thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on the simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of parameters of the stochastic volatility jump-diffusion models.
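For readers unfamiliar with the approach, the following is a minimal, purely illustrative sketch of an empirical characteristic function estimator. It is not the thesis's estimator: in place of the closed-form joint unconditional characteristic function of a stochastic volatility jump-diffusion model, the "model" here is a plain Gaussian whose characteristic function is known in closed form, and the integration grid, exponential weight and function names (empirical_cf, gaussian_cf, ecf_objective) are assumptions made only for this example.

```python
# Illustrative sketch of a one-dimensional empirical characteristic function
# estimator. The thesis applies this idea to the joint unconditional CF of
# stochastic volatility jump-diffusion models; here, purely for illustration,
# the "model" is a plain Gaussian, whose CF is known in closed form.
import numpy as np
from scipy.optimize import minimize

def empirical_cf(x, u):
    """Empirical characteristic function of the sample x on the grid u."""
    return np.exp(1j * np.outer(u, x)).mean(axis=1)

def gaussian_cf(u, mu, sigma):
    """Closed-form CF of N(mu, sigma^2): a stand-in for the model CF."""
    return np.exp(1j * u * mu - 0.5 * (sigma * u) ** 2)

def ecf_objective(theta, x, u, weights):
    mu, log_sigma = theta
    diff = empirical_cf(x, u) - gaussian_cf(u, mu, np.exp(log_sigma))
    # Weighted L2 distance between the empirical and model CF on the grid.
    return np.sum(weights * np.abs(diff) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(0.05, 0.2, size=2000)          # simulated "returns"
u = np.linspace(-10, 10, 201)                  # integration grid
weights = np.exp(-0.5 * u ** 2)                # exponential weight, a common choice

res = minimize(ecf_objective, x0=[0.0, np.log(0.1)], args=(x, u, weights),
               method="Nelder-Mead")
print("estimated mu, sigma:", res.x[0], np.exp(res.x[1]))
```

In the thesis's setting, gaussian_cf would be replaced by the closed-form joint unconditional characteristic function of the jump-diffusion model, evaluated over pairs (bi-dimensional) or triples (three-dimensional) of observations.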
Abstract:
The purpose of this project was to determine the feasibility of using pavement condition data collected for the Iowa Pavement Management Program (IPMP) as input to the Iowa Quadrennial Need Study. The need study, conducted by the Iowa Department of Transportation (Iowa DOT) every four years, currently uses manually collected highway infrastructure condition data (roughness, rutting, cracking, etc.). Because of the Iowa DOT's 10-year data collection cycles, condition data for a given highway segment may be up to 10 years old. In some cases, the need study process has resulted in wide fluctuations in funding allocated to individual Iowa counties from one study to the next. This volatility in funding levels makes it difficult for county engineers to plan and program road maintenance and improvements. One possible remedy is to input more current and less subjective infrastructure condition data. The IPMP was initially developed to satisfy the Intermodal Surface Transportation Efficiency Act (ISTEA) requirement that federal-aid-eligible highways be managed through a pavement management system. Currently all metropolitan planning organizations (MPOs) in Iowa and 15 of Iowa's 18 RPAs participate in the IPMP. The core of this program is a statewide database of pavement condition and construction history information. The pavement data are collected by machine in two-year cycles. Using pilot areas, researchers examined the implications of using the automated data collected for the IPMP as input to the need study computer program, HWYNEEDS. The results show that using the IPMP automated data in HWYNEEDS is feasible and beneficial, resulting in less volatility in the level of total need between successive quadrennial need studies. In other words, the more current the data, the smaller the shift in total need.
Abstract:
We study the determinants of political myopia in a rational model of electoral accountability where the key elements are informational frictions and uncertainty. We build a framework where political ability is ex-ante unknown and policy choices are not perfectly observable. On the one hand, elections improve accountability and make it possible to retain well-performing incumbents. On the other, politicians invest too little in costly policies with future returns in an attempt to signal high ability and increase their reelection probability. Contrary to the conventional wisdom, uncertainty reduces political myopia and may, under some conditions, increase social welfare. We use the model to study how political rewards can be set so as to maximise social welfare, and the desirability of imposing a one-term limit on governments. The predictions of our theory are consistent with a number of stylised facts and with a new empirical observation documented in this paper: aggregate uncertainty, measured by economic volatility, is associated with better fiscal discipline in a panel of 20 OECD countries.
Abstract:
The increasing interest aroused by more advanced forecasting techniques, together with the requirement for more accurate forecasts of tourism demand at the destination level due to the constant growth of world tourism, has led us to evaluate the forecasting performance of neural modelling relative to that of time series methods at a regional level. Seasonality and volatility are important features of tourism data, which makes it a particularly favourable context in which to compare the forecasting performance of linear models to that of nonlinear alternative approaches. Pre-processed official statistical data on overnight stays and tourist arrivals from all the different countries of origin to Catalonia from 2001 to 2009 are used in the study. When comparing the forecasting accuracy of the different techniques for different time horizons, autoregressive integrated moving average models outperform self-exciting threshold autoregressions and artificial neural network models, especially for shorter horizons. These results suggest that there is a trade-off between the degree of pre-processing and the accuracy of the forecasts obtained with neural networks, which are more suitable in the presence of nonlinearity in the data. In spite of the significant differences between countries, which can be explained by different patterns of consumer behaviour, we also find that forecasts of tourist arrivals are more accurate than forecasts of overnight stays.
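As a rough illustration of the kind of out-of-sample comparison described above, the sketch below fits a seasonal ARIMA model to a simulated monthly series and compares its hold-out mean absolute error against a seasonal-naive benchmark. The data, model order and 12-month hold-out are assumptions made for the example; it does not reproduce the Catalan tourism series or the SETAR and neural network competitors.

```python
# A minimal sketch of an out-of-sample comparison: an ARIMA model against a
# seasonal-naive benchmark on a simulated monthly series with trend and
# seasonality (the study itself uses arrivals and overnight stays, 2001-2009).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
t = np.arange(120)
y = 100 + 0.5 * t + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)

train, test = y[:108], y[108:]                 # hold out the last 12 months

model = ARIMA(train, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit()
arima_fc = model.forecast(steps=test.size)

seasonal_naive = train[-12:]                   # last observed year as forecast

mae = lambda f: np.mean(np.abs(test - f))
print("ARIMA MAE:", mae(arima_fc), "seasonal-naive MAE:", mae(seasonal_naive))
```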
Abstract:
This work investigated the effect of high flash point diluents on the efficiency and safety of solvent extraction. The literature part contains a review of major accidents that have occurred at solvent extraction plants, the hazards caused by static electricity at extraction plants, and commercially available diluents. In addition, the literature part examines the effect of the molecular structure of hydrocarbons on their flash point, volatility, viscosity and solvent properties. The experimental part investigated properties describing extraction efficiency, namely the droplet size produced by mixing, the phase separation rate, the kinetics of extraction and back-extraction, and the viscosity and density of the organic phase. The safety properties of the extraction solutions were studied by measuring the flash points of synthetic extraction solutions and diluents and the charge relaxation times of electrically charged diluents. Orfom SX 11 was used as the high flash point diluent, with Shellsol D70 and Escaid 100 as reference diluents. The extraction of copper from an acidic sulphate solution with a hydroxyoxime reagent was used as the model extraction system. The experiments showed that the viscosity of the high flash point diluent was considerably higher than that of Shellsol D70. The high viscosity slowed phase separation in extraction, but it had no effect on the extraction kinetics or on the droplet size produced by mixing. The reagent concentration of the extraction solution was found to affect its flash point, whereas the degree of loading of the extraction solution had no observable effect. There were slight differences in the charge relaxation times of the electrically charged diluents, but for all diluents the relaxation times were too long to eliminate the hazard posed by static electricity.
Abstract:
The main objective of this master's thesis was to quantitatively study the reliability of the market and sales forecasts of a certain company by measuring the bias, precision and accuracy of these forecasts against actual values. Secondly, the differences in bias, precision and accuracy between markets were explained by various macroeconomic variables and market characteristics. The accuracy and precision of the forecasts seem to vary significantly depending on the market being forecasted, the variable being forecasted, the estimation period, the length of the estimated period, the forecast horizon and the granularity of the data. High inflation, a low income level and high year-on-year market volatility seem to be related to higher annual market forecast uncertainty, and high year-on-year sales volatility to higher sales forecast uncertainty. When quarterly market size is forecasted, the correlation between macroeconomic variables and forecast errors is weaker. The uncertainty of the sales forecasts cannot be explained with macroeconomic variables. Longer forecasts are more uncertain, a shorter estimated period leads to higher uncertainty, and more recent market forecasts are usually less uncertain. Sales forecasts seem to be more uncertain than market forecasts, because they incorporate both market size and market share risks. When the lead time is more than one year, forecast risk seems to grow as a function of the square root of the forecast horizon. When the lead time is less than a year, sequential error terms are typically correlated, and therefore forecast errors are trending or mean-reverting. The bias of the forecasts seems to change in cycles, and therefore future forecasts cannot be systematically adjusted for it. The MASE cannot be used to measure whether a forecast anticipates year-on-year volatility; instead, we constructed a new relative accuracy measure to cope with this particular situation.
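The measures named above can be made concrete with a short sketch. The function below computes bias (mean error), precision (standard deviation of the errors), accuracy (mean absolute error) and the MASE, which scales the MAE by the in-sample one-step naive forecast error; the input numbers are invented for illustration, and the thesis's own relative accuracy measure is not reproduced here.

```python
# A sketch of standard forecast-quality measures: bias, precision, accuracy
# (MAE) and the (non-seasonal) MASE, scaled by the in-sample naive forecast.
import numpy as np

def forecast_metrics(actual, forecast, insample):
    err = np.asarray(actual, float) - np.asarray(forecast, float)
    bias = err.mean()                            # systematic over/under-forecasting
    precision = err.std(ddof=1)                  # spread of the errors
    mae = np.abs(err).mean()                     # accuracy
    # Scale by the mean absolute error of the one-step naive forecast in-sample.
    naive_mae = np.abs(np.diff(insample)).mean()
    return {"bias": bias, "precision": precision, "mae": mae, "mase": mae / naive_mae}

insample = np.array([100, 104, 103, 108, 110, 115, 112, 118])   # history
actual   = np.array([120, 117, 125, 130])                        # realised values
forecast = np.array([118, 121, 122, 127])                        # point forecasts

print(forecast_metrics(actual, forecast, insample))
```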
Abstract:
The main aim of this study was to examine the influence of implementing ultrafiltration on washing and bleaching efficiency. Four cases corresponding to four washing stages were observed: two with hardwood pulp and two with softwood pulp; each case had a reference and a trial experiment. The experiments with hardwood pulp were arranged primarily to explore whether bleaching performance could be improved by using for washing the treated filtrate (trial case) instead of the untreated filtrate (reference case). Although ultrafiltration reduced the COD of the wash filtrates, allowing a decrease in the COD carry-over to the bleaching stage, it did not affect bleaching performance. A different set-up was used in the experiments with softwood pulp: the filtrate was ultrafiltered and recirculated to the same washing stage in order to reduce the volume and pollutant load of the bleaching effluents. In one case a negative result was obtained, expressed as poorer pulp parameters after bleaching. The other case showed the opportunity to replace hot water with the filtrate and reduce fresh water consumption.
Abstract:
The objective of this work was to determine how the availability of a food wholesaler's catering products could be ensured. At present, fluctuations in the demand for catering foodstuffs are difficult to manage, which raises their inventory levels and causes problems in the case company's increasingly cramped warehouse. In addition, ordering the products occupies four employees, and any growth in order volumes would further increase the need for personnel. As a result of the work, the company's products and suppliers were divided into four groups: the best A group, a challenge group, a test group and a removal group. Three approaches were proposed for managing the inventories and orders of these groups: automatic purchase orders suit all products with steady demand; for products with large demand fluctuations, the current ordering practice can be used, supported where possible by demand forecasts obtained from customers; and for the problematic products with large demand fluctuations and low sales volume, the products can either be removed from the company's assortment altogether or their ordering can be changed from stock-driven to order-driven.
Abstract:
A trade-off between return and risk plays a central role in financial economics. The intertemporal capital asset pricing model (ICAPM) proposed by Merton (1973) provides a neoclassical theory for expected returns on risky assets. The model assumes that risk-averse investors (seeking to maximize their expected utility of lifetime consumption) demand compensation for bearing systematic market risk and the risk of unfavorable shifts in the investment opportunity set. Although the ICAPM postulates a positive relation between the conditional expected market return and its conditional variance, the empirical evidence on the sign of the risk-return trade-off is conflicting. In contrast, autocorrelation in stock returns is one of the most consistent and robust findings in empirical finance. While autocorrelation is often interpreted as a violation of market efficiency, it can also reflect factors such as market microstructure or time-varying risk premia. This doctoral thesis investigates the relation between the mixed risk-return trade-off results and autocorrelation in stock returns. The results suggest that, in the case of the US stock market, the relative contribution of the risk-return trade-off and autocorrelation in explaining the aggregate return fluctuates with volatility. This effect is then shown to be even more pronounced in the case of emerging stock markets. During high-volatility periods, expected returns can be described using rational (intertemporal) investors acting to maximize their expected utility. During low-volatility periods, market-wide persistence in returns increases, leading to a failure of traditional equilibrium-model descriptions for expected returns. Consistent with this finding, traditional models yield conflicting evidence concerning the sign of the risk-return trade-off. The changing relevance of the risk-return trade-off and autocorrelation can be explained by heterogeneous agents or, more generally, by the inadequacy of the neoclassical view of asset pricing with unboundedly rational investors and perfect market efficiency. In the latter case, the empirical results imply that the neoclassical view is valid only under certain market conditions. This offers an economic explanation as to why it has been so difficult to detect a positive trade-off between the conditional mean and variance of the aggregate stock return. The results highlight the importance, especially in the case of emerging stock markets, of accounting for both the risk-return trade-off and autocorrelation in applications that require estimates of expected returns.
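As a stylised illustration of the regression underlying such a risk-return trade-off with return persistence, the sketch below builds an EWMA proxy for the conditional variance and regresses returns on that proxy and the lagged return. The simulated data, the EWMA smoothing constant and the two-step OLS shortcut are assumptions made for the example rather than the estimator actually used in the thesis.

```python
# A stylised two-step sketch of a risk-return regression with autocorrelation:
# r_t = a + b*sigma2_t + c*r_{t-1} + e_t, where sigma2_t is an EWMA variance
# proxy. The returns are simulated stand-ins for aggregate index returns.
import numpy as np

rng = np.random.default_rng(2)
r = rng.normal(0.0003, 0.01, size=2500)          # simulated daily returns

# EWMA (RiskMetrics-style) conditional variance proxy.
lam, sigma2 = 0.94, np.empty_like(r)
sigma2[0] = r.var()
for t in range(1, r.size):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2

# OLS: regress r_t on the variance proxy and the lagged return.
y = r[1:]
X = np.column_stack([np.ones(y.size), sigma2[1:], r[:-1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, risk-return slope, autocorrelation coefficient:", beta)
```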
Abstract:
Maritime transport is the foundation for trade in the Baltic Sea area. It represents over 15% of the world's cargo traffic and is predicted to increase by over 100% in the future. There are currently over 2,000 ships sailing on the Baltic Sea, and both the number and the size of ships have been growing in recent years. Due to the importance of maritime traffic in the Baltic Sea Region, ports have to be ready to face future challenges and adapt to the changing operational environment. The companies within the transportation industry – in this context ports, shipowners and logistics companies – compete continuously, and although the number of companies in the business is not particularly substantial because the products offered are very similar, other motives for managing the supply chain arise. The factors creating competitive advantage are often financial and related to cost efficiency, but geographical location, road infrastructure in the hinterland and vessel connections are among the most important ones. The PENTA project focuses on increasing openness and transparency and on sharing knowledge and information, so that the challenges of the future can be better addressed through cooperation. This report presents three scenario-based traffic forecasts for routes between the PENTA ports in 2020. The chosen methodology is PESTE, in which the focus is on economic factors affecting future traffic flows. The report further analyses the findings and results of the first PENTA WP2 report "Drivers of demand in cargo and passenger traffic between PENTA ports" and utilises the same material, which was obtained through interviews and mail surveys.
Abstract:
This Master's Thesis analyses the effectiveness of different hedging models in the BRICS (Brazil, Russia, India, China, and South Africa) countries. Hedging performance is examined by comparing two dynamic hedging models to a conventional OLS regression-based model. The dynamic hedging models employed are the Constant Conditional Correlation (CCC) GARCH(1,1) and the Dynamic Conditional Correlation (DCC) GARCH(1,1) with Student's t-distribution. In order to capture both the Great Moderation and the latest financial crisis, the sample period extends from 2003 to 2014. To determine whether the dynamic models outperform the conventional one, the reduction of portfolio variance for in-sample data with contemporaneous hedge ratios is first determined, and then the holding period of the portfolios is extended to one and two days. In addition, the accuracy of hedge ratio forecasts is examined on the basis of out-of-sample variance reduction. The results are mixed and suggest that dynamic hedging models may not provide enough benefits to justify the more demanding estimation and daily portfolio adjustment. In this sense, the results are consistent with the existing literature.
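For reference, the conventional OLS benchmark mentioned above reduces to a single static hedge ratio: the covariance of spot and futures returns divided by the variance of futures returns. The sketch below computes it and the resulting in-sample variance reduction on simulated data; the series are invented, and the CCC- and DCC-GARCH hedges compared in the thesis are not reproduced here.

```python
# A minimal sketch of the conventional OLS hedge and its variance reduction,
# the benchmark against which dynamic GARCH-based hedges are usually compared.
import numpy as np

rng = np.random.default_rng(3)
futures = rng.normal(0, 0.012, size=1500)
spot = 0.9 * futures + rng.normal(0, 0.004, size=1500)   # correlated spot returns

# Static OLS hedge ratio: cov(spot, futures) / var(futures).
h_ols = np.cov(spot, futures)[0, 1] / np.var(futures, ddof=1)

hedged = spot - h_ols * futures
variance_reduction = 1 - hedged.var(ddof=1) / spot.var(ddof=1)
print(f"OLS hedge ratio: {h_ols:.3f}, variance reduction: {variance_reduction:.1%}")
```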
Abstract:
In the last few decades, banking has become strongly internationalized and more complex. Hence, bank supervision and regulation have taken on a global perspective, too. The most important pieces of international regulation are the Basel frameworks issued by the Basel Committee on Banking Supervision. This study examines the effects of bank supervision and regulation, especially Basel II, on bank risk and risk-taking. In order to separate and recognize the efficiency of these effects, the combined effects of many supervisory and regulatory tools, together with other relevant factors, must be taken into account. The focus of the study is on the effects of asymmetric information and banking procyclicality on the efficiency of Basel II. This study seeks to answer whether Basel II, implemented in 2008, has decreased bank risk in banks of European Union member states. It examines empirically whether the volatility of bank stock returns has changed after the implementation of Basel II. The panel data consist of 62 bank stock returns, bank-specific variables, economic variables and variables concerning the regulatory environment between 2003 and 2011. Fixed effects regression is used for the panel data analysis. The results indicate that the volatility of bank stock returns has increased after 2008 and the implementation of Basel II. The result is statistically very significant, and its robustness has been verified in different model specifications. This result contradicts the goal of Basel II of banking system stability. Banking procyclicality and wrong incentives for regulatory arbitrage under asymmetric information, explained in the theoretical part, may explain this result. On the other hand, simultaneously with the implementation of Basel II, the global financial crisis emerged and caused severe losses in banks and increased stock volatility. However, it is clear that supervision and regulation were unable to prevent the global financial crisis. After the financial crisis, supervision and regulation have been reformed globally. The main problems of Basel II, examined in the theoretical part, have been recognized in order to prevent problems of procyclicality and wrong incentives in the future.
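The fixed effects design can be illustrated with a small sketch: bank-level volatility is regressed on a post-2008 Basel II indicator after demeaning both variables within each bank (the within transformation). The data below are simulated, and the single-regressor set-up omits the bank-specific, macroeconomic and regulatory controls used in the study, so it is only a schematic illustration.

```python
# A schematic fixed-effects (within) regression: bank-level volatility on a
# post-Basel II dummy, after demeaning by bank. Data are simulated stand-ins
# for the 62 EU bank stocks over 2003-2011 used in the study.
import numpy as np

rng = np.random.default_rng(4)
n_banks, n_years = 62, 9
bank = np.repeat(np.arange(n_banks), n_years)
year = np.tile(np.arange(2003, 2012), n_banks)
basel2 = (year >= 2008).astype(float)            # post-implementation dummy

bank_effect = rng.normal(0, 0.05, n_banks)[bank] # unobserved heterogeneity
volatility = 0.20 + bank_effect + 0.03 * basel2 + rng.normal(0, 0.02, bank.size)

def demean_by(group, x):
    """Subtract group means (the within transformation)."""
    means = np.array([x[group == g].mean() for g in np.unique(group)])
    return x - means[group]

y_w = demean_by(bank, volatility)
x_w = demean_by(bank, basel2)
beta_fe = (x_w @ y_w) / (x_w @ x_w)              # within estimator, single regressor
print("estimated effect of Basel II on volatility:", beta_fe)
```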
Abstract:
In this paper, we consider testing marginal normal distributional assumptions. More precisely, we propose tests based on moment conditions implied by normality. These moment conditions are known as the Stein (1972) equations. They coincide with the first class of moment conditions derived by Hansen and Scheinkman (1995) when the random variable of interest is a scalar diffusion. Among other examples, the Stein equation implies that the mean of Hermite polynomials is zero. The GMM approach we adopt is well suited for two reasons. First, it allows us to study in detail the parameter uncertainty problem, i.e., when the tests depend on unknown parameters that have to be estimated. In particular, we characterize the moment conditions that are robust against parameter uncertainty and show that Hermite polynomials are special examples. This is the main contribution of the paper. The second reason for using GMM is that our tests are also valid for time series. In this case, we adopt a heteroskedasticity- and autocorrelation-consistent (HAC) approach to estimate the weighting matrix when the dependence of the data is unspecified. We also make a theoretical comparison of our tests with the Jarque and Bera (1980) test and the OPG regression tests of Davidson and MacKinnon (1993). Finite sample properties of our tests are derived through a comprehensive Monte Carlo study. Finally, three applications to GARCH and realized volatility models are presented.
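To illustrate the flavour of these moment conditions, the sketch below evaluates the third and fourth probabilists' Hermite polynomials on standardised data and forms simple t-type statistics, using the fact that Var[He_k(Z)] = k! under standard normality. It is a plug-in illustration with invented data only; the paper's full GMM treatment, including the HAC weighting matrix for time series, is not reproduced.

```python
# Moment-based normality checks from Hermite polynomials: under normality
# E[He3(Z)] = E[Z^3 - 3Z] = 0 and E[He4(Z)] = E[Z^4 - 6Z^2 + 3] = 0, with
# Var[He_k(Z)] = k!. Data are standardised with estimated mean and variance.
import numpy as np

def hermite_normality_stats(x):
    z = (x - x.mean()) / x.std(ddof=1)           # plug-in standardisation
    he3 = z**3 - 3 * z                           # He3, targets skewness
    he4 = z**4 - 6 * z**2 + 3                    # He4, targets excess kurtosis
    n = z.size
    t3 = np.sqrt(n) * he3.mean() / np.sqrt(6.0)  # Var[He3(Z)] = 3! = 6
    t4 = np.sqrt(n) * he4.mean() / np.sqrt(24.0) # Var[He4(Z)] = 4! = 24
    return t3, t4                                # each approx N(0,1) under H0

rng = np.random.default_rng(5)
print("normal sample:   ", hermite_normality_stats(rng.normal(size=5000)))
print("student-t sample:", hermite_normality_stats(rng.standard_t(df=5, size=5000)))
```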
Abstract:
The attached file was created with Scientific WorkPlace (LaTeX).