11 results for out-of-sample forecast
in Helda - Digital Repository of University of Helsinki
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. What has been lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these newer, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with nine different error distributions on Standard and Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly and monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts but, on the contrary, makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed. The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts for both 1% and 5% VaR. Taken together, the results of the thesis show that kurtosis does not appear to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
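A minimal sketch of the kind of comparison described in Essay 1, assuming the arch Python package and synthetic placeholder data rather than the actual S&P 500 futures series: fit GARCH(1,1) under Gaussian and leptokurtic (Student's t) error distributions, produce one-step-ahead variance forecasts, and compute a realized-variance benchmark from intra-day returns.

```python
# Illustrative only: synthetic data, not the essay's S&P 500 futures sample.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)

# Placeholder daily returns (in percent).
daily_returns = pd.Series(rng.standard_t(df=5, size=2000) * 0.8)

for dist in ("normal", "t"):  # Gaussian vs. leptokurtic error distribution
    model = arch_model(daily_returns, vol="GARCH", p=1, q=1, dist=dist)
    result = model.fit(disp="off")
    forecast = result.forecast(horizon=1)
    print(dist, float(forecast.variance.iloc[-1, 0]))  # one-step-ahead variance

# Realized variance for one day: sum of squared intra-day (e.g. 5-minute) returns,
# used as the ex post benchmark when scoring the variance forecasts.
intraday_returns = rng.normal(scale=0.1, size=78)  # hypothetical 5-minute returns
realized_variance = np.sum(intraday_returns**2)
print(realized_variance)
```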
Abstract:
The four scientific articles comprising this doctoral dissertation offer new information on the presentation and construction of addiction in the mass media during the period 1968–2008. Diachronic surveys as well as quantitative and qualitative content analyses were undertaken to discern trends during the period in question and to investigate underlying conceptions of the problems in contemporary media presentations. The research material for the first three articles consists of a sample of 200 texts from Finland's biggest daily newspaper, Helsingin Sanomat, from the period 1968–2006. The fourth study examines English-language tabloid material published on the Internet in 2005–2008. A number of principal trends are identified. In addition to a significant increase in addiction reporting over time, the study shows that an internalisation of addiction problems took place in the media presentations under study. The phenomenon is portrayed and tackled from within the problems themselves, often from the viewpoint of the individuals concerned. The tone becomes more personal, and technical and detailed accounts are more and more frequent. Secondly, the concept of addiction is broadened. This can be dated to the 1990s. The concept undergoes a conventionalisation: it is used more frequently in a manner that is not thought to require explanation. The word riippuvuus (the closest equivalent to addiction in Finnish) was adopted more commonly in the reporting at the same time, in the 1990s. Thirdly, the results highlight individual self-governance as a superordinate principle in contemporary descriptions of addiction. If the principal demarcation in earlier texts was between "us" and "them", it is now focused primarily on the individual's competence and ability to govern the self, to restrain and master one's behaviour. Finally, in the fourth study investigating textual constructions of female celebrities (Amy Winehouse, Britney Spears and Kate Moss) in Internet tabloids, various relations and functions of addiction problems, intoxication, body and gender were observed to function as cultural symbols. Addiction becomes a sign, or a style, that represents different significations in relation to the main characters in the tabloid stories. Tabloids, as a genre, play an important role by introducing other images of the problems than those featured in mainstream media. The study is positioned within the framework of modernity theory and its views on the need for self-reflexivity and biographies as tools for the creation and definition of the self. Traditional institutions such as the church, occupation and family no longer play an important role in self-definition. This circumstance creates a need for a culture conveying stories of success and failure in relation to which individuals can position their own behaviour and life content. I propose that "addiction", as a theme in media reporting, resolves the conflict that emanates from the ambivalence between the accessibility and the individualisation of consumer society, on the one hand, and the problematic behavioural patterns (addictions) that they may induce, on the other.
Abstract:
First, in Essay 1, we test whether it is possible to forecast Finnish Options Index return volatility by examining the out-of-sample predictive ability of several common volatility models with alternative well-known methods, and find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Secondly, in Essay 2, the aggregated volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is larger than the other two components, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Thirdly, in Essay 3, we are among the first in the literature to search for firm-specific determinants of idiosyncratic volatility in a multivariate setting, and find for the cross-section of stocks listed on the Helsinki Stock Exchange that industrial focus, trading volume and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama and French three-factor model with local and international benchmark portfolios), whereas firm age and size are negatively related to idiosyncratic volatility.
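As an illustration of the idiosyncratic-volatility measure discussed in Essays 2 and 3, the following hedged sketch (using synthetic placeholder data, not the Helsinki Stock Exchange sample) estimates a stock's idiosyncratic volatility as the standard deviation of the residuals from a CAPM regression.

```python
# Illustrative only: synthetic market and stock returns, not the essays' data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
market_excess = rng.normal(0.0, 1.0, 250)                      # hypothetical market factor
stock_excess = 0.9 * market_excess + rng.normal(0, 1.5, 250)   # hypothetical stock returns

# CAPM regression: stock excess return on a constant and the market factor.
X = sm.add_constant(market_excess)
residuals = sm.OLS(stock_excess, X).fit().resid

# Annualised idiosyncratic volatility from the daily residuals.
idio_vol = residuals.std(ddof=1) * np.sqrt(250)
print(idio_vol)
```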
Development of Sample Pretreatment and Liquid Chromatographic Techniques for Antioxidative Compounds
Abstract:
In this study, novel methodologies for the determination of antioxidative compounds in herbs and beverages were developed. Antioxidants are compounds that can reduce, delay or inhibit oxidative events. They are a part of the human defense system and are obtained through the diet. Antioxidants are naturally present in several types of foods, e.g. in fruits, beverages, vegetables and herbs. Antioxidants can also be added to foods during manufacturing to suppress lipid oxidation and the formation of free radicals under conditions of cooking or storage and to reduce the concentration of free radicals in vivo after food ingestion. There is growing interest in natural antioxidants, and effective compounds have already been identified from antioxidant classes such as carotenoids, essential oils, flavonoids and phenolic acids. The wide variety of sample matrices and analytes presents quite a challenge for the development of analytical techniques. Growing demands have been placed on sample pretreatment. In this study, three novel extraction techniques, namely supercritical fluid extraction (SFE), pressurised hot water extraction (PHWE) and dynamic sonication-assisted extraction (DSAE), were studied. SFE was used for the extraction of lycopene from tomato skins and PHWE was used in the extraction of phenolic compounds from sage. DSAE was applied to the extraction of phenolic acids from Lamiaceae herbs. In the development of extraction methodologies, the main parameters of the extraction were studied and the recoveries were compared to those achieved by conventional extraction techniques. In addition, the stability of lycopene was followed under different storage conditions. For the separation of the antioxidative compounds in the extracts, liquid chromatographic (LC) methods were utilised. Two novel LC techniques, namely ultra performance liquid chromatography (UPLC) and comprehensive two-dimensional liquid chromatography (LCxLC), were studied and compared with conventional high performance liquid chromatography (HPLC) for the separation of antioxidants in beverages and Lamiaceae herbs. In LCxLC, the selection of LC mode, column dimensions and flow rates was studied and optimised to obtain efficient separation of the target compounds. In addition, the separation powers of HPLC, UPLC, HPLCxHPLC and HPLCxUPLC were compared. To exploit the benefits of an integrated system, in which sample preparation and final separation are performed in a closed unit, dynamic sonication-assisted extraction was coupled on-line to a liquid chromatograph via a solid-phase trap. The increased sensitivity was utilised in the extraction of phenolic acids from Lamiaceae herbs. The results were compared to those achieved by the LCxLC system.
Abstract:
The outcome of the successfully resuscitated patient is mainly determined by the extent of hypoxic-ischemic cerebral injury, and hypothermia has multiple mechanisms of action in mitigating such injury. The present study was undertaken from 1997 to 2001 in Helsinki as a part of the European multicenter study Hypothermia after cardiac arrest (HACA) to test the neuroprotective effect of therapeutic hypothermia in patients resuscitated from out-of-hospital ventricular fibrillation (VF) cardiac arrest (CA). The aim of this substudy was to examine the neurological and cardiological outcome of these patients, and especially to study and develop methods for prediction of outcome in the hypothermia-treated patients. A total of 275 patients were randomized to the HACA trial in Europe. In Helsinki, 70 patients were enrolled in the study according to the inclusion criteria. Those randomized to hypothermia were actively cooled externally to a core temperature of 33 ± 1 °C for 24 hours with a cooling device. Serum markers of ischemic neuronal injury, NSE and S-100B, were sampled at 24, 36, and 48 hours after CA. Somatosensory and brain stem auditory evoked potentials (SEPs and BAEPs) were recorded 24 to 28 hours after CA; 24-hour ambulatory electrocardiography recordings were performed three times during the first two weeks, and arrhythmias and heart rate variability (HRV) were analyzed from the tapes. The clinical outcome was assessed 3 and 6 months after CA. Neuropsychological examinations were performed on the conscious survivors 3 months after the CA. Quantitative electroencephalography (Q-EEG) and auditory P300 event-related potentials were studied at the same time-point. Therapeutic hypothermia of 33 °C for 24 hours led to an increased chance of good neurological outcome and survival after out-of-hospital VF CA. In the HACA study, 55% of hypothermia-treated patients and 39% of normothermia-treated patients reached a good neurological outcome (p=0.009) at 6 months after CA. Use of therapeutic hypothermia was not associated with any increase in clinically significant arrhythmias. The levels of serum NSE, but not the levels of S-100B, were lower in hypothermia- than in normothermia-treated patients. A decrease in NSE values between 24 and 48 hours was associated with good outcome at 6 months after CA. Decreasing levels of serum NSE but not of S-100B over time may indicate selective attenuation of delayed neuronal death by therapeutic hypothermia, and the time-course of serum NSE between 24 and 48 hours after CA may help in clinical decision-making. In SEP recordings, bilaterally absent N20 responses predicted permanent coma with a specificity of 100% in both treatment arms. Recording of BAEPs provided no additional benefit in outcome prediction. Preserved 24- to 48-hour HRV may be a predictor of favorable outcome in CA patients treated with hypothermia. At 3 months after CA, no differences appeared in any cognitive functions between the two groups: 67% of patients in the hypothermia group and 44% in the normothermia group were cognitively intact or had only very mild impairment. No significant differences emerged in any of the Q-EEG parameters between the two groups. The amplitude of the P300 potential was significantly higher in the hypothermia-treated group. These results give further support to the use of therapeutic hypothermia in patients with sudden out-of-hospital CA.
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2–4 to the case of a bivariate model. A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained, and the bivariate model is found to outperform the univariate models in terms of predictive power.
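The following sketch illustrates the general idea of Chapter 2 under simplifying assumptions: a static probit of a binary recession indicator on financial predictors, and a simple dynamic variant obtained by adding the lagged indicator as a regressor. The data are synthetic placeholders, and the thesis's autoregressive probit specification is more elaborate than shown here.

```python
# Illustrative only: synthetic recession indicator and predictors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
term_spread = rng.normal(1.0, 1.0, n)    # hypothetical interest-rate spread
stock_return = rng.normal(0.5, 4.0, n)   # hypothetical stock market return
latent = -1.0 - 0.8 * term_spread - 0.05 * stock_return + rng.normal(size=n)
recession = (latent > 0).astype(int)     # binary recession indicator

data = pd.DataFrame({"recession": recession,
                     "spread": term_spread,
                     "ret": stock_return})
data["recession_lag"] = data["recession"].shift(1)
data = data.dropna()

# Static probit: financial predictors only.
static = sm.Probit(data["recession"],
                   sm.add_constant(data[["spread", "ret"]])).fit(disp=0)
# Simple dynamic variant: add the lagged recession state as a regressor.
dynamic = sm.Probit(data["recession"],
                    sm.add_constant(data[["spread", "ret", "recession_lag"]])).fit(disp=0)
print(static.prsquared, dynamic.prsquared)  # compare in-sample fit
```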
Abstract:
The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX- and ESX-index options markets. Continuing the line of research on Implied Volatility Functions, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity and is tested out-of-sample. Considering the dynamics, the results support the hypotheses put forward in this study, implying that the smile increases in magnitude when maturity and ATM volatility decrease, and that changes in the underlying asset and in time to maturity are, respectively, negatively and positively correlated with implied ATM volatility. Further, the Standardized Log-Moneyness model improves pricing accuracy compared to previous Implied Volatility Function models, although the parameters of the models must be re-estimated continuously for the models to fully capture the changing dynamics of the volatility smiles.
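A hedged sketch of the pricing idea described above, using an illustrative volatility grid rather than the fitted Standardized Log-Moneyness model: the constant Black & Scholes volatility is replaced by a value looked up from a matrix indexed by log-moneyness and maturity.

```python
# Illustrative only: the grid values and nearest-point lookup are assumptions,
# not the paper's estimated model.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Plain Black & Scholes European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical implied-volatility surface: rows = log-moneyness buckets,
# columns = maturities in years.
moneyness_grid = np.array([-0.10, 0.0, 0.10])
maturity_grid = np.array([0.25, 0.5, 1.0])
vol_surface = np.array([[0.28, 0.26, 0.24],
                        [0.22, 0.21, 0.20],
                        [0.25, 0.23, 0.22]])

def surface_call(S, K, T, r):
    # Replace the constant sigma with the surface value nearest in moneyness and maturity.
    m = np.log(K / S)
    i = np.abs(moneyness_grid - m).argmin()
    j = np.abs(maturity_grid - T).argmin()
    return bs_call(S, K, T, r, vol_surface[i, j])

print(surface_call(S=100.0, K=105.0, T=0.5, r=0.02))
```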
Abstract:
The growing interest in higher-throughput sequencing over the last decade has led to the development of new sequencing applications. This thesis concentrates on optimizing DNA library preparation for the Illumina Genome Analyzer II sequencer. The library preparation steps that were optimized include fragmentation, PCR purification and quantification. DNA fragmentation was performed with focused sonication at different concentrations and durations. Two column-based PCR purification methods, a gel matrix method and a magnetic bead-based method were compared. Quantitative PCR and chip-based gel electrophoresis were compared for DNA quantification. Magnetic bead purification was found to be the most efficient and flexible purification method. The fragmentation protocol was changed to produce longer fragments to be compatible with longer sequencing reads. Quantitative PCR correlates better with the cluster number and should thus be considered the default quantification method for sequencing. As a result of this study, more data have been acquired from sequencing at lower cost, and troubleshooting has become easier as qualification steps have been added to the protocol. New sequencing instruments and applications will create a demand for further optimization in the future.
Abstract:
This thesis addresses modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models such as GARCH, ACD and CARR models. They are able to capture many well-established features of financial time series including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that have values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships of the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and modeling of the returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model as well as an expression for the autocorrelation function are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results obtained is established when the information content of the volatility forecasts derived is examined.
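To illustrate the multiplicative error structure underlying the models in this thesis, the sketch below simulates a CARR(1,1)-type process for daily price ranges, with illustrative parameter values and a unit-mean gamma error; it is not the thesis's estimated specification.

```python
# Illustrative only: hypothetical parameters and error distribution.
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.05, 0.15, 0.80   # hypothetical CARR(1,1) parameters
n = 1000

ranges = np.empty(n)
mu = omega / (1 - alpha - beta)          # start at the unconditional mean
for t in range(n):
    eps = rng.gamma(shape=4.0, scale=0.25)       # positive error with mean 1
    ranges[t] = mu * eps                          # multiplicative error: x_t = mu_t * eps_t
    mu = omega + alpha * ranges[t] + beta * mu    # conditional mean recursion

print(ranges.mean(), ranges.std())
```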