928 results for "Sparse time-varying VAR models"


Relevance: 100.00%

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online. © 2013 Taylor and Francis Group, LLC.
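The latent-thresholding mechanism described above can be sketched in a few lines. This is a simplified illustration, not the paper's Bayesian machinery: a hypothetical time-varying coefficient follows an AR(1) latent process, and the effective coefficient is set exactly to zero whenever the latent value falls below a threshold `d` (all names and parameter values here are invented):

```python
import numpy as np

def latent_threshold_path(T=200, phi=0.98, sigma=0.05, d=0.15, seed=0):
    """Latent AR(1) coefficient path b_t, thresholded so the effective
    coefficient is exactly zero whenever |b_t| < d."""
    rng = np.random.default_rng(seed)
    b = np.empty(T)
    b[0] = 0.0
    for t in range(1, T):
        b[t] = phi * b[t - 1] + sigma * rng.standard_normal()
    effective = np.where(np.abs(b) >= d, b, 0.0)  # dynamic inclusion/selection
    return b, effective

latent, eff = latent_threshold_path()
sparsity = float(np.mean(eff == 0.0))  # fraction of time the variable is excluded
```

In the full model the threshold and latent process carry priors and are inferred by MCMC; the sketch only shows how thresholding induces dynamic sparsity.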

Abstract:

In this paper, a novel statistical test is introduced to compare two locally stationary time series. The proposed approach is a Wald test based on time-varying autoregressive modeling and function projections onto appropriate spaces. The covariance structure of the innovations may also be time-varying. To obtain functional estimators of the time-varying autoregressive parameters, we consider expansions in spline and wavelet bases. Simulation studies provide evidence that the proposed test performs well. We also assess its usefulness when applied to a financial time series.
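As a toy illustration of basis-expanded time-varying AR estimation (a polynomial basis stands in for the spline/wavelet bases used in the paper; all names and settings are invented), the AR(1) coefficient path a(t) can be written as a combination of known basis functions and the combination weights estimated by a single least squares fit:

```python
import numpy as np

def fit_tvar1(x, degree=3):
    """Estimate a time-varying AR(1) coefficient path a(t) by expanding it
    in a polynomial basis and solving one least squares problem."""
    T = len(x)
    t = np.linspace(0.0, 1.0, T)[1:]        # time of each target sample
    B = np.vander(t, degree + 1)            # basis functions evaluated at t
    D = B * x[:-1, None]                    # regressors: basis * lagged value
    c, *_ = np.linalg.lstsq(D, x[1:], rcond=None)
    return B @ c                            # estimated a(t) path

# simulate an AR(1) series whose coefficient drifts from 0.3 to 0.7
rng = np.random.default_rng(5)
T = 2000
a_true = 0.3 + 0.4 * np.linspace(0.0, 1.0, T)
x = np.zeros(T)
for i in range(1, T):
    x[i] = a_true[i] * x[i - 1] + 0.5 * rng.standard_normal()
a_hat = fit_tvar1(x)
```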

Abstract:

Time-varying linear prediction has been studied in the context of speech signals, in which the autoregressive (AR) coefficients of the system function are modeled as a linear combination of a set of known bases. Traditionally, least squares minimization is used to estimate the model parameters of the system. Motivated by the sparse nature of the excitation signal for voiced sounds, we explore time-varying linear prediction modeling of speech signals using sparsity constraints. Parameter estimation is posed as an ℓ0-norm minimization problem, and the re-weighted ℓ1-norm minimization technique is used to estimate the model parameters. We show that for sparsely excited time-varying systems, this formulation models the underlying system function better than the least squares error minimization approach. Evaluation with synthetic and real speech examples shows that the estimated model parameters track the formant trajectories more closely than the least squares approach.
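The re-weighting idea can be sketched as follows. This is not the paper's algorithm: the ℓ1 subproblem is itself approximated here by iteratively reweighted ridge steps, which is enough to show how re-weighting drives small coefficients toward zero (all names and values are invented):

```python
import numpy as np

def reweighted_sparse_fit(A, y, n_iter=20, eps=1e-3, lam=1e-2):
    """Iteratively reweighted fit: each pass penalises coefficient i with
    weight 1/(|x_i| + eps), so small coefficients shrink toward zero."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(n_iter):
        w = 1.0 / (np.abs(x) + eps)
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return x

# sparse ground truth, noiseless overdetermined system
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
y = A @ x_true
x_hat = reweighted_sparse_fit(A, y)
```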

Abstract:

In this thesis we are interested in financial risk, and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, sometimes controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular, we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs.
Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We use local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs also suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices (ASX 200, S&P 500 and FT 30), with results very similar to those provided by historical simulation.
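For context, the historical-simulation benchmark mentioned at the end of the abstract amounts to a rolling empirical percentile of past returns. A minimal sketch (window length, confidence level and all names are illustrative, not the thesis's GLD estimator):

```python
import numpy as np

def rolling_var(returns, window=250, level=0.01):
    """1-day VaR as the rolling empirical `level`-quantile of past returns,
    reported as a positive loss; NaN until a full window is available."""
    returns = np.asarray(returns, dtype=float)
    out = np.full(returns.shape, np.nan)
    for t in range(window, len(returns)):
        out[t] = -np.quantile(returns[t - window:t], level)
    return out

rng = np.random.default_rng(42)
r = 0.01 * rng.standard_normal(600)   # synthetic daily returns
var99 = rolling_var(r)                # 99% one-day VaR path
```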

Abstract:

This study compares Value-at-Risk (VaR) measures for Australian banks over a period that includes the Global Financial Crisis (GFC) to determine whether the methodology and parameter selection are important for the capital adequacy holdings that will ultimately support a bank in a crisis period. The VaR methodology promoted under Basel II was widely criticised during the GFC for its failure to capture downside risk. However, results from this study indicate that 1-year parametric and historical models produce better measures of VaR than models with longer time frames. VaR estimates produced using Monte Carlo simulations show a high percentage of violations, but with a lower average magnitude of a violation when one occurs. VaR estimates produced by the ARMA-GARCH model also show a relatively high percentage of violations; however, the average magnitude of a violation is quite low. Our findings support the design of the revised Basel II VaR methodology, which has also been adopted under Basel III.
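The comparison above rests on counting VaR violations and their magnitudes. A minimal sketch of such a backtest statistic (function names and the example numbers are invented for illustration):

```python
import numpy as np

def violation_stats(returns, var_forecasts):
    """Violation rate and mean violation magnitude: a violation occurs
    when the realised return falls below minus the VaR forecast."""
    returns = np.asarray(returns, dtype=float)
    var_forecasts = np.asarray(var_forecasts, dtype=float)
    hits = returns < -var_forecasts
    rate = float(hits.mean())
    mag = float(np.mean(-var_forecasts[hits] - returns[hits])) if hits.any() else 0.0
    return rate, mag

r = np.array([-0.03, 0.01, -0.005, -0.06, 0.02])
v = np.full(5, 0.02)                  # constant 2% VaR forecast
rate, mag = violation_stats(r, v)
```

Here two of five returns breach the 2% VaR, and the mean excess loss over the forecast is the average of 0.01 and 0.04.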

Abstract:

Reconstructing 3D motion data is highly under-constrained due to several common sources of data loss during measurement, such as projection, occlusion, or miscorrespondence. We present a statistical model of 3D motion data, based on the Kronecker structure of the spatiotemporal covariance of natural motion, as a prior on 3D motion. This prior is expressed as a matrix normal distribution, composed of separable and compact row and column covariances. We relate the marginals of the distribution to the shape, trajectory, and shape-trajectory models of prior art. When the marginal shape distribution is not available from training data, we show how placing a hierarchical prior over shapes results in a convex MAP solution in terms of the trace-norm. The matrix normal distribution, fit to a single sequence, outperforms state-of-the-art methods at reconstructing 3D motion data in the presence of significant data loss, while providing covariance estimates of the imputed points.
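The Kronecker structure referred to above is the identity cov(vec(X)) = V ⊗ U for a matrix normal X with row covariance U and column covariance V. A sampling sketch with invented covariances, which one can check empirically:

```python
import numpy as np

def sample_matrix_normal(M, U, V, rng):
    """Draw X ~ MN(M, U, V): X = M + A Z B^T with U = A A^T, V = B B^T,
    so that cov(vec(X)) = kron(V, U) for column-stacked vec."""
    A = np.linalg.cholesky(U)
    B = np.linalg.cholesky(V)
    Z = rng.standard_normal(M.shape)
    return M + A @ Z @ B.T

rng = np.random.default_rng(0)
U = np.array([[1.0, 0.5], [0.5, 1.0]])  # row (shape) covariance
V = np.array([[2.0, 0.3], [0.3, 1.0]])  # column (trajectory) covariance
X = sample_matrix_normal(np.zeros((2, 2)), U, V, rng)
```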

Abstract:

Accurately characterizing the time-varying interference caused to the primary users is essential in ensuring a successful deployment of cognitive radios (CR). We show that the aggregate interference at the primary receiver (PU-Rx) from multiple, randomly located cognitive users (CUs) is well modeled as a shifted lognormal random process, which is more accurate than the lognormal and the Gaussian process models considered in the literature, even for a relatively dense deployment of CUs. It also compares favorably with the asymptotically exact stable and symmetric truncated stable distribution models, except at high CU densities. Our model accounts for the effect of imperfect spectrum sensing, which depends on path-loss, shadowing, and small-scale fading of the link from the primary transmitter to the CU; the interweave and underlay modes of CR operation, which determine the transmit powers of the CUs; and time-correlated shadowing and fading of the links from the CUs to the PU-Rx. It leads to expressions for the probability distribution function, level crossing rate, and average exceedance duration. The impact of cooperative spectrum sensing is also characterized. We validate the model by applying it to redesign the primary exclusive zone to account for the time-varying nature of interference.
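Two of the quantities derived in the paper, the level crossing rate (LCR) and average exceedance duration (AED), have simple empirical counterparts on a sampled interference trace. A toy sketch (names and the example trace are invented):

```python
import numpy as np

def lcr_aed(x, level):
    """Empirical level crossing rate (upward crossings per sample) and
    average exceedance duration (samples per excursion above `level`)."""
    x = np.asarray(x, dtype=float)
    above = x > level
    up = int(np.count_nonzero(~above[:-1] & above[1:]))
    lcr = up / len(x)
    aed = above.mean() / lcr if up else float("inf")
    return lcr, aed

trace = np.array([0.0, 1.0, 0.0, 1.0, 1.0, 0.0])   # toy interference trace
lcr, aed = lcr_aed(trace, 0.5)
```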

Abstract:

We develop methods for performing filtering and smoothing in non-linear non-Gaussian dynamical models. The methods rely on a particle cloud representation of the filtering distribution which evolves through time using importance sampling and resampling ideas. In particular, novel techniques are presented for generation of random realisations from the joint smoothing distribution and for MAP estimation of the state sequence. Realisations of the smoothing distribution are generated in a forward-backward procedure, while the MAP estimation procedure can be performed in a single forward pass of the Viterbi algorithm applied to a discretised version of the state space. An application to spectral estimation for time-varying autoregressions is described.
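The particle-cloud representation can be illustrated with a minimal bootstrap particle filter on a toy linear-Gaussian state-space model. This sketch shows only the propagate/weight/resample cycle, not the paper's smoothing or Viterbi-based MAP procedures (all names and parameter values are invented):

```python
import numpy as np

def bootstrap_pf(y, n_particles=500, phi=0.9, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = phi x_{t-1} + N(0, q),
    y_t = x_t + N(0, r); returns the filtered mean at each step."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)
    means = []
    for obs in y:
        x = phi * x + np.sqrt(q) * rng.standard_normal(n_particles)  # propagate
        logw = -0.5 * (obs - x) ** 2 / r                             # weight
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))                           # filtered mean
        x = x[rng.choice(n_particles, size=n_particles, p=w)]        # resample
    return np.array(means)

# simulate from the same toy model and filter the observations
rng = np.random.default_rng(7)
T = 50
states = np.zeros(T)
for t in range(1, T):
    states[t] = 0.9 * states[t - 1] + rng.standard_normal()
y_obs = states + rng.standard_normal(T)
filt = bootstrap_pf(y_obs)
```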

Abstract:

We propose a new approach for modeling nonlinear multivariate interest rate processes based on time-varying copulas and reducible stochastic differential equations (SDEs). In the modeling of the marginal processes, we consider a class of nonlinear SDEs that are reducible to the Ornstein-Uhlenbeck (OU) process or the Cox, Ingersoll, and Ross (1985) (CIR) process. The reducibility is achieved via a nonlinear transformation function. The main advantage of this approach is that these SDEs can account for nonlinear features observed in short-term interest rate series, while at the same time leading to exact discretization and closed-form likelihood functions. Although a rich set of specifications may be entertained, our exposition focuses on a couple of nonlinear constant elasticity volatility (CEV) processes, denoted OU-CEV and CIR-CEV, respectively. These two processes encompass a number of existing models that have closed-form likelihood functions. The transition density, the conditional distribution function, and the steady-state density function are derived in closed form, as are the conditional and unconditional moments for both processes. In order to obtain a more flexible functional form over time, we allow the transformation function to be time-varying. Results from our study of U.S. and UK short-term interest rates suggest that the new models outperform existing parametric models with closed-form likelihood functions. We also find the time-varying effects in the transformation functions statistically significant. To examine the joint behavior of interest rate series, we propose flexible nonlinear multivariate models by joining univariate nonlinear processes via appropriate copulas. We study the conditional dependence structure of the two rates using Patton's (2006a) time-varying symmetrized Joe-Clayton copula. We find evidence of asymmetric dependence between the two rates, and that the level of dependence is positively related to the level of the two rates.
(JEL: C13, C32, G12) © The Author 2010. Published by Oxford University Press. All rights reserved.
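The exact-discretization property that reducibility exploits can be seen on the OU building block: its transition density is Gaussian in closed form, so simulation and likelihood evaluation need no Euler approximation. A sketch with invented parameter values:

```python
import numpy as np

def ou_exact_step(x, dt, kappa, theta, sigma, rng):
    """Exact (not Euler) one-step draw from the OU transition density."""
    mean = theta + (x - theta) * np.exp(-kappa * dt)
    var = sigma ** 2 * (1.0 - np.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
    return mean + np.sqrt(var) * rng.standard_normal()

def ou_transition_logpdf(x_next, x, dt, kappa, theta, sigma):
    """Closed-form Gaussian transition log-density of the OU process."""
    mean = theta + (x - theta) * np.exp(-kappa * dt)
    var = sigma ** 2 * (1.0 - np.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
    return -0.5 * (np.log(2.0 * np.pi * var) + (x_next - mean) ** 2 / var)

rng = np.random.default_rng(11)
kappa, theta, sigma, dt = 2.0, 0.05, 0.1, 0.1
path = np.empty(20000)
path[0] = theta                      # start at the long-run mean
for i in range(1, len(path)):
    path[i] = ou_exact_step(path[i - 1], dt, kappa, theta, sigma, rng)
```

The simulated path should match the stationary mean theta and standard deviation sigma/sqrt(2 kappa); a transformed (reducible) process would simply map this path through the transformation function.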

Abstract:

Reducible diffusions (RDs) are nonlinear transformations of analytically solvable Basic Diffusions (BDs). Hence, by construction, RDs are analytically tractable and flexible diffusion processes. The existing literature on RDs has mostly focused on time-homogeneous transformations, which to a significant extent fail to explore the full potential of RDs from both theoretical and practical points of view. In this paper, we propose flexible and economically justifiable time variations to the transformations of RDs. Concentrating on the Constant Elasticity of Variance (CEV) RDs, we consider nonlinear dynamics for our time-varying transformations with both deterministic and stochastic designs. Such time variations can greatly enhance the flexibility of RDs while maintaining sufficient tractability of the resulting models. At the same time, our modeling approach enjoys the benefits of classical inferential techniques such as Maximum Likelihood (ML). Our application to UK and US short-term interest rates suggests that, from an empirical point of view, time-varying transformations are highly relevant and statistically significant. We expect that the proposed models can more faithfully describe the dynamic time-varying behavior of economic and financial variables and potentially improve out-of-sample forecasts significantly.

Abstract:

Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to control completely the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
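The core of a simulation-based exact test is the rank-based Monte Carlo p-value: simulate the test statistic under the null and rank the observed value among the simulations. This toy sketch assumes a null with no nuisance parameters (the maximized MC tests of the paper handle the general case); all names are invented:

```python
import numpy as np

def mc_pvalue(stat_obs, simulate_stat, n_rep=99, rng=None):
    """Rank-based Monte Carlo p-value: with n_rep simulated statistics the
    test has exact level alpha whenever alpha * (n_rep + 1) is an integer."""
    if rng is None:
        rng = np.random.default_rng()
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + int(np.sum(sims >= stat_obs))) / (n_rep + 1)

# toy null: statistic = |mean| of 20 standard normal draws
sim = lambda r: abs(r.standard_normal(20).mean())
p_typical = mc_pvalue(0.0, sim, n_rep=99, rng=np.random.default_rng(3))
p_extreme = mc_pvalue(10.0, sim, n_rep=99, rng=np.random.default_rng(3))
```

An unremarkable observed statistic yields a large p-value, while one far beyond anything simulated under the null yields the smallest attainable p-value, 1/(n_rep + 1).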

Abstract:

Time series regression models are especially suitable in epidemiology for evaluating short-term effects of time-varying exposures on health. The problem is that the potential for confounding in time series regression is very high; thus, it is important that trend and seasonality are properly accounted for. Our paper reviews the statistical models commonly used in time-series regression methods, especially those allowing for serial correlation, which make them potentially useful for selected epidemiological purposes. In particular, we discuss the use of time-series regression for counts using a wide range of Generalised Linear Models as well as Generalised Additive Models. In addition, critical points in using statistical software for GAMs have recently been stressed, and reanalyses of time series data on air pollution and health were performed in order to update already published results. Applications are offered through an example of the relationship between asthma emergency admissions and photochemical air pollutants.
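The count regressions discussed above are typically Poisson GLMs with a log link. A self-contained sketch of the IRLS fitting step, with hypothetical covariates (a linear trend standing in for trend/seasonality terms, plus a pollutant level); in practice one would use an established GLM/GAM package rather than this toy fitter:

```python
import numpy as np

def poisson_glm_irls(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by iteratively reweighted least
    squares: weights mu, working response eta + (y - mu)/mu."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)                # mean under the log link
        z = eta + (y - mu) / mu         # working response
        W = mu                          # IRLS weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# hypothetical daily admission counts driven by a trend and a pollutant level
rng = np.random.default_rng(0)
n = 500
trend = np.linspace(0.0, 1.0, n)
pollutant = rng.uniform(size=n)
X = np.column_stack([np.ones(n), trend, pollutant])
beta_true = np.array([1.0, 0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_glm_irls(X, y)
```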

Abstract:

Numerous studies have documented the failure of the static and conditional capital asset pricing models to explain the difference in returns between value and growth stocks. This paper examines the post-1963 value premium by employing a model that captures the time-varying total risk of the value-minus-growth portfolios. Our results show that the time-series of value premia is strongly and positively correlated with its volatility. This conclusion is robust to the criterion used to sort stocks into value and growth portfolios and to the country under review (the US and the UK). Our paper is consistent with evidence on the possible role of idiosyncratic risk in explaining equity returns, and also with a separate strand of literature concerning the relative lack of reversibility of value firms' investment decisions.

Abstract:

A numerical model embodying the concepts of the Cowley-Lockwood paradigm (Cowley and Lockwood, 1992, 1997) has been used to produce a simple Cowley-Lockwood-type expanding flow pattern and to calculate the resulting change in ion temperature. Cross-correlation, fixed-threshold analysis and a threshold relative to peak are used to determine the phase speed of the change in the convection pattern in response to a change in applied reconnection. Each of these methods fails to fully recover the expansion of the onset of the convection response that is inherent in the simulations. The results of this study indicate that any expansion of the convection pattern will be best observed in time-series data using a threshold which is a fixed fraction of the peak response. We show that these methods used to determine the expansion velocity can be used to discriminate between the two main models for the convection response to a change in reconnection.
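The "threshold relative to peak" criterion recommended above has a simple form: for each response time series, the onset is the first sample exceeding a fixed fraction of that series' own peak. A toy sketch (names and the example trace are invented):

```python
import numpy as np

def onset_index(response, fraction=0.5):
    """Index of the first sample reaching `fraction` of the series' own
    peak response; -1 if the series never reaches the threshold."""
    response = np.asarray(response, dtype=float)
    threshold = fraction * response.max()
    above = np.nonzero(response >= threshold)[0]
    return int(above[0]) if above.size else -1

r = np.array([0.0, 0.1, 0.3, 0.8, 1.0, 0.9])  # toy convection response
onset = onset_index(r, fraction=0.5)          # first sample at >= half peak
```

Because the threshold scales with each series' peak, weak and strong responses are timed consistently, unlike a fixed absolute threshold.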