790 results for Forecasting Volatility
Abstract:
Doctoral thesis (PhD) in Economics
Abstract:
For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to comparing probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values, which makes the approach appealing for risk assessments where probabilities of extremes are often more informative than measures of central tendency. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept beyond time-dependent measures to other variables of interest. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates for investigating the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for the provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples of El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing the merits or limitations of the ENSO-based predictors.
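The sketch below is a minimal illustration, not the authors' implementation, of how a Cox-type fit can be turned into CDF estimates with accompanying uncertainty information, using the Python lifelines library. The file name and the columns onset_day, observed and enso_phase are hypothetical placeholders for a wet-season-onset data set.

```python
# Minimal sketch: Cox proportional-hazards fit -> CDF of wet-season onset date.
# Data file and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("wet_season_onset.csv")  # assumed columns: onset_day, observed, enso_phase

cph = CoxPHFitter()
cph.fit(df, duration_col="onset_day", event_col="observed")
cph.print_summary()  # coefficients with standard errors and confidence intervals

# Predicted onset-date distribution for two ENSO phases; the CDF is 1 - S(t).
profiles = pd.DataFrame({"enso_phase": [0, 1]})
survival = cph.predict_survival_function(profiles)
cdf = 1.0 - survival
print(cdf.head())
```

The confidence intervals reported by the fit can be propagated to the predicted curves to obtain the kind of uncertainty envelope the abstract refers to.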
Abstract:
Dissertation (Master's), Universidade de Brasília, Department of Administration, Graduate Program in Administration, 2016.
Abstract:
Comparing the accuracy of different forecasting approaches requires an error measure. Many error measures have been proposed in the literature; however, in practice there are situations where different measures lead to different choices of forecasting approach, and there is no agreement on which measure should be used. Forecasting error measures are generally ratios or percentages that give an overall picture of how well a forecasting technique fits the observations. This paper proposes a multiplicative Data Envelopment Analysis (DEA) model for ranking several forecasting techniques. We demonstrate the proposed model by applying it to the set of yearly time series of the M3 competition, where five error measures are computed and aggregated into a single DEA score.
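As a rough illustration of the idea, each forecasting technique can be treated as a decision-making unit whose inputs are its error measures and whose efficiency score is used for ranking. The sketch uses a plain input-oriented CCR model solved by linear programming, not the multiplicative DEA formulation of the paper, and the error matrix is invented.

```python
# Illustrative DEA ranking of forecasting methods from several error measures.
# Simplified CCR model (inputs = error measures, single unit output); the data are made up.
import numpy as np
from scipy.optimize import linprog

# rows = forecasting methods (DMUs), columns = error measures (e.g. sMAPE, MASE, MAE)
errors = np.array([
    [14.2, 1.10, 0.21],
    [13.1, 1.05, 0.19],
    [15.8, 1.22, 0.25],
])

def dea_score(x, o):
    """CCR multiplier form: max u  s.t.  v.x_o = 1,  u - v.x_j <= 0,  u, v >= 0."""
    n, m = x.shape
    c = np.r_[-1.0, np.zeros(m)]            # minimise -u
    A_ub = np.c_[np.ones(n), -x]            # u - v.x_j <= 0 for every method j
    b_ub = np.zeros(n)
    A_eq = np.r_[0.0, x[o]].reshape(1, -1)  # normalisation: v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

scores = [dea_score(errors, o) for o in range(len(errors))]
print("ranking (best first):", np.argsort(scores)[::-1])
```

Methods that are not dominated on any mix of error measures receive a score of 1; the ranking follows the scores.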
Abstract:
The ontology engineering research community has focused for many years on supporting the creation, development and evolution of ontologies. Ontology forecasting, which aims at predicting semantic changes in an ontology, represents instead a new challenge. In this paper, we contribute to this novel endeavour by focusing on the task of forecasting semantic concepts in the research domain. Indeed, ontologies representing scientific disciplines contain only research topics that are already popular enough to be selected by human experts or automatic algorithms. They are thus unfit to support tasks which require the ability to describe and explore the forefront of research, such as trend detection and horizon scanning. We address this issue by introducing the Semantic Innovation Forecast (SIF) model, which predicts new concepts of an ontology at time t + 1, using only data available at time t. Our approach relies on lexical innovation and adoption information extracted from historical data. We evaluated the SIF model on a very large dataset consisting of over one million scientific papers in the Computer Science domain: the outcomes show that the proposed approach offers a competitive boost in mean average precision at ten compared to the baselines when forecasting over five years.
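For reference, the evaluation metric mentioned above, mean average precision at ten (MAP@10), can be computed over ranked lists of forecast concepts as in the short sketch below; the example predictions and gold sets are placeholders, not data from the paper.

```python
# MAP@10 over ranked lists of forecast concepts; example inputs are placeholders.
def average_precision_at_k(ranked, relevant, k=10):
    hits, score = 0, 0.0
    for i, concept in enumerate(ranked[:k], start=1):
        if concept in relevant:
            hits += 1
            score += hits / i          # precision at each hit position
    return score / min(len(relevant), k) if relevant else 0.0

def map_at_k(all_ranked, all_relevant, k=10):
    pairs = zip(all_ranked, all_relevant)
    return sum(average_precision_at_k(r, g, k) for r, g in pairs) / len(all_ranked)

predicted = [["deep_learning", "semantic_web", "edge_computing"]]
gold = [{"deep_learning", "edge_computing"}]
print(map_at_k(predicted, gold))
```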
Abstract:
This paper empirically investigates volatility transmission among the stock and foreign exchange markets of seven major world economies over the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure total volatility connectedness over the entire period (the system-wide approach) using the framework recently proposed by Diebold and Yilmaz (2014). Second, we use a dynamic analysis to evaluate the net directional connectedness of each market. To gain further insights, we examine the time-varying behaviour of net pairwise directional connectedness during the financial turmoil episodes within the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.
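A compact sketch of how a total connectedness index in the spirit of Diebold and Yilmaz can be computed from a VAR forecast-error variance decomposition is given below. For simplicity it uses a Cholesky-orthogonalised decomposition from statsmodels instead of the generalised FEVD of the original framework, and the input file of volatility series is hypothetical.

```python
# Total volatility connectedness from a VAR forecast-error variance decomposition.
# Simplified: Cholesky-orthogonalised FEVD rather than the generalised FEVD.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

vols = pd.read_csv("market_volatilities.csv", index_col=0, parse_dates=True)  # hypothetical
res = VAR(vols).fit(maxlags=4, ic="aic")

H = 10                                  # forecast horizon
theta = res.orth_ma_rep(H - 1)          # orthogonalised MA coefficients, shape (H, k, k)
contrib = (theta ** 2).sum(axis=0)      # variance of variable i attributed to shock j
shares = contrib / contrib.sum(axis=1, keepdims=True)

k = shares.shape[0]
total_connectedness = 100 * (shares.sum() - np.trace(shares)) / k
print(f"Total volatility connectedness: {total_connectedness:.1f}%")
```

The off-diagonal variance shares are the cross-market spillovers the abstract refers to; rolling-window estimates of the same quantity give the time-varying picture.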
Abstract:
This paper estimates the small open economy model with financial frictions of Bejarano and Charry (2014) for the Colombian economy using Bayesian estimation techniques. Additionally, I compute the welfare gains of adding an optimal response to credit spreads to an augmented Taylor rule. The main result is that reacting to credit spreads does not imply significant welfare gains unless the volatility of economic disturbances increases, as in the disruption implied by a financial crisis; otherwise, its impact on the macroeconomic variables is nil.
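An augmented rule of the kind referred to above typically takes the following form (an illustrative specification, not necessarily the exact rule estimated in the paper), where s_t is the credit spread and phi_s governs the policy response to it:

    i_t = rho * i_{t-1} + (1 - rho) * (phi_pi * pi_t + phi_y * y_t + phi_s * s_t)

The welfare comparison then amounts to evaluating household welfare under optimised values of phi_s against the case phi_s = 0.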
Abstract:
We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly well suited to high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that for Colombian Stock Market securities the asynchronicity of intraday data restricts the analysis to a selected group of stocks; nonetheless, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism does not enhance the price discovery process.
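A bare-bones sketch of the synthetic-portfolio idea is given below: choose non-negative weights summing to one so that a portfolio of donor stocks tracks the pre-event returns of the stock under study; the post-event gap is then the event-study effect of interest. The return matrices and variable names are simulated placeholders, not Colombian market data.

```python
# Synthetic-portfolio weights: non-negative, sum to one, minimise pre-event tracking error.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
donor_returns = rng.normal(0, 0.01, size=(120, 5))              # pre-event returns of 5 donor stocks
target_returns = donor_returns @ np.array([0.4, 0.3, 0.2, 0.1, 0.0]) \
    + rng.normal(0, 0.002, 120)                                 # pre-event returns of the studied stock

def tracking_error(w):
    return np.sum((target_returns - donor_returns @ w) ** 2)

n = donor_returns.shape[1]
res = minimize(tracking_error, np.full(n, 1.0 / n),
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
               method="SLSQP")
weights = res.x
print("synthetic portfolio weights:", np.round(weights, 3))

# After the event (e.g. a volatility call auction), the difference between the stock's
# returns and the synthetic portfolio's returns measures the auction's effect.
```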
Abstract:
This paper presents a methodology for short-term load forecasting based on genetic algorithm feature selection and artificial neural network modeling. A feed-forward artificial neural network is used to model the 24-hour-ahead load based on past consumption, weather and stock index data. A genetic algorithm is used to find the best subset of variables for modeling. Three data sets from different geographical locations, encompassing areas of different dimensions with distinct load profiles, are used to evaluate the methodology. The developed approach was found to generate models achieving a minimum mean average percentage error below 2 %. The feature selection algorithm was able to significantly reduce the number of features used and increase the accuracy of the models.
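A condensed sketch of this kind of pipeline is shown below: a genetic algorithm searches binary feature masks, scoring each mask by the validation percentage error of a feed-forward network trained on the selected features. The synthetic data, network size and GA settings are placeholders, not the paper's configuration.

```python
# GA feature selection for a feed-forward load-forecasting model (illustrative settings).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 20))                                    # lagged load, weather, index features
y = 100 + 10 * X[:, :5].sum(axis=1) + rng.normal(scale=1.0, size=1000)  # stand-in for 24-h-ahead load
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

def mape(mask):
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    model.fit(X_tr[:, cols], y_tr)
    pred = model.predict(X_va[:, cols])
    return np.mean(np.abs((y_va - pred) / y_va)) * 100

pop = rng.integers(0, 2, size=(12, X.shape[1]))                    # initial population of feature masks
for generation in range(10):
    scores = np.array([mape(m) for m in pop])
    parents = pop[np.argsort(scores)[:6]]                          # keep the fittest half
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(0, len(parents), 2)]
        child = np.where(rng.random(X.shape[1]) < 0.5, a, b)       # uniform crossover
        flip = rng.random(X.shape[1]) < 0.05                       # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmin([mape(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```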
Abstract:
This article addresses the effects of the prohibition against naked CDS buying implemented by the European Union in November 2012. Three aspects of market quality are analyzed: liquidity, volatility, and price informativeness. Overall, our results suggest that the ban had negative effects on liquidity and price informativeness. First, we find that in territories within the scope of the EU regulation, the bid–ask spreads on sovereign CDS contracts rose after the ban, but fell for countries outside its bounds. Open interest declined for both groups of CDS reference entities in our sample, but significantly more for the constrained group. Price delay increased more prominently for countries affected by the ban, whereas price precision decreased for these countries while increasing for CDSs written on other sovereign reference entities. Most notably, our findings indicate that these negative effects were more pronounced among reference entities exhibiting lower credit risk. With respect to volatility, the evidence suggests that the ban was successful in stabilizing the CDS market, in that volatility decreased, particularly for contracts written on riskier CDS entities.
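A stylised sketch of the kind of treated-versus-control comparison behind the liquidity result is shown below: a difference-in-differences regression of bid–ask spreads on a ban indicator, a post-period indicator and their interaction. The panel layout and column names are assumed for illustration and are not the paper's dataset.

```python
# Difference-in-differences illustration: effect of the naked-CDS ban on bid-ask spreads.
# File and column names (entity, bid_ask, banned, post) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("cds_spreads.csv")
did = smf.ols("bid_ask ~ banned * post", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["entity"]})
print(did.summary())   # the banned:post coefficient captures the ban's effect on spreads
```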