2 results for Robust estimates

in Aston University Research Archive


Relevance:

60.00%

Abstract:

This paper demonstrates that the conventional approach of using official liberalisation dates as the only existing breakdates can lead to inaccurate conclusions as to the effect of the underlying liberalisation policies. It also proposes an alternative paradigm for obtaining more robust estimates of volatility changes around official liberalisation dates and/or other important market events. By focusing on five East Asian emerging markets, all of which liberalised their financial markets in the late, and by using recent advances in the econometrics of structural change, it shows that (i) the detected breakdates in the volatility of stock market returns can be dramatically different to official liberalisation dates and (ii) the use of official liberalisation dates as breakdates can readily entail inaccurate inference. In contrast, the use of data-driven techniques for the detection of multiple structural changes yields a richer and more accurate pattern of volatility evolution than focussing on official liberalisation dates alone.
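The abstract's data-driven detection of volatility breaks can be illustrated with a minimal sketch. The paper uses multiple-break structural-change econometrics; the toy grid search below handles only a single variance break via Gaussian likelihood, and the simulated return series is a hypothetical example, not the paper's data:

```python
import numpy as np

def detect_variance_break(returns, trim=0.1):
    """Grid search for the single breakdate that maximises the
    Gaussian log-likelihood of a two-regime constant-variance model."""
    n = len(returns)
    lo, hi = int(n * trim), int(n * (1 - trim))  # trim ends to avoid degenerate segments
    best_k, best_ll = None, -np.inf
    for k in range(lo, hi):
        s1 = returns[:k].var()
        s2 = returns[k:].var()
        # profile log-likelihood (up to constants) of segment-wise variance
        ll = -0.5 * (k * np.log(s1) + (n - k) * np.log(s2))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# simulated returns: volatility jumps at observation 300
rng = np.random.default_rng(0)
r = np.concatenate([rng.normal(0, 1.0, 300), rng.normal(0, 2.5, 300)])
print(detect_variance_break(r))  # detected breakdate, near the true break at 300
```

The point mirrors the abstract: the detected breakdate comes from the data itself, and need not coincide with any officially announced event date.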

Relevance:

30.00%

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context, where observations are collected and reported by a network of sensors and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moments estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If these extreme data destabilise the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two-component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data.
We discuss the issue of whether to treat or ignore extreme values, making the distinction between the robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study, based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
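A minimal sketch of why a Huber-type robust estimator resists the sporadic extreme sensor values described above. The paper embeds the Huber function inside a REML/kriging likelihood; this standalone location M-estimate with hypothetical radiation readings is only illustrative of the mechanism (downweighting large residuals):

```python
import numpy as np

def huber_mean(x, c=1.345, iters=50):
    """Location M-estimate with the Huber psi function, fitted by
    iteratively reweighted least squares (IRLS)."""
    mu = np.median(x)                              # robust starting point
    scale = np.median(np.abs(x - mu)) / 0.6745     # MAD-based robust scale
    for _ in range(iters):
        u = (x - mu) / scale
        # Huber weights: 1 inside the threshold, c/|u| beyond it
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        mu = np.sum(w * x) / np.sum(w)
    return mu

# hypothetical network readings: routine background plus faulty-sensor spikes
rng = np.random.default_rng(1)
background = rng.normal(100.0, 2.0, 200)
spikes = np.full(10, 10_000.0)
readings = np.concatenate([background, spikes])

print(np.mean(readings))     # dragged far above the background level by the spikes
print(huber_mean(readings))  # stays near the true background level of ~100
```

In the covariance-estimation setting of the abstract, the same downweighting keeps a handful of malfunctioning sensors from destabilising the fitted variogram and the resulting uncertainty maps.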