897 results for Autoregressive Processes


Relevance: 100.00%

Abstract:

In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
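
As a rough numerical illustration of the two-sided autoregression idea (a sketch only, not the paper's finite-sample procedure): for a stationary Gaussian AR(1) with coefficient phi, E[y_t | y_{t-1}, y_{t+1}] = phi/(1+phi^2) * (y_{t-1} + y_{t+1}), so regressing the even-indexed observations on the sum of their neighbours recovers a simple transform of phi. All names and settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, n = 0.6, 100_000

# Simulate a stationary Gaussian AR(1): y_t = phi * y_{t-1} + e_t
y = np.empty(n)
y[0] = rng.normal(scale=1.0 / np.sqrt(1.0 - phi**2))
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# Two-sided autoregression: regress y_t on (y_{t-1} + y_{t+1}) for the
# even-indexed t, which are conditionally independent given the
# odd-indexed observations (intercalary independence).
t_idx = np.arange(2, n - 1, 2)
X = (y[t_idx - 1] + y[t_idx + 1]).reshape(-1, 1)
beta, *_ = np.linalg.lstsq(X, y[t_idx], rcond=None)
b = beta[0]                      # estimates phi / (1 + phi^2)

# Invert b = phi / (1 + phi^2) on (-1, 1) to recover phi
phi_hat = (1.0 - np.sqrt(1.0 - 4.0 * b**2)) / (2.0 * b)
print(f"b = {b:.3f}, phi_hat = {phi_hat:.3f}")  # phi_hat close to 0.6
```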

Relevance: 100.00%

Abstract:

We introduce a new class of integer-valued self-exciting threshold models, which is based on the binomial autoregressive model of order one as introduced by McKenzie (Water Resour Bull 21:645–650, 1985. doi:10.1111/j.1752-1688.1985.tb05379.x). Basic probabilistic and statistical properties of this class of models are discussed. Moreover, parameter estimation and forecasting are addressed. Finally, the performance of these models is illustrated through a simulation study and an empirical application to a set of measles cases in Germany.
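
A minimal simulation sketch of a self-exciting threshold binomial AR(1), assuming the threshold switches the thinning probabilities according to the previous count; the paper's exact parametrization may differ, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def binomial_ar1_set(n_steps, N, params_low, params_high, R, x0):
    """Simulate a self-exciting threshold binomial AR(1) path.

    X_t = alpha o X_{t-1} + beta o (N - X_{t-1}), where "o" denotes binomial
    thinning; (alpha, beta) switch depending on whether X_{t-1} <= R.
    Illustrative sketch only; the paper's threshold specification may differ.
    """
    x = np.empty(n_steps, dtype=int)
    x[0] = x0
    for t in range(1, n_steps):
        alpha, beta = params_low if x[t - 1] <= R else params_high
        survivors = rng.binomial(x[t - 1], alpha)       # alpha o X_{t-1}
        recruits = rng.binomial(N - x[t - 1], beta)     # beta o (N - X_{t-1})
        x[t] = survivors + recruits
    return x

path = binomial_ar1_set(500, N=20, params_low=(0.5, 0.2),
                        params_high=(0.7, 0.1), R=8, x0=5)
print(path[:20])
```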

Relevance: 60.00%

Abstract:

We propose a low-complexity technique to generate amplitude-correlated time series with a Nakagami-m distribution and phase-correlated Gaussian-distributed time series, which is useful for the simulation of ionospheric scintillation effects in GNSS signals. To generate a complex scintillation process, the technique requires solely the knowledge of the parameters S4 (scintillation index) and σφ (phase standard deviation), besides the definition of models for the amplitude and phase power spectra. The concatenation of two nonlinear memoryless transformations is used to produce a Nakagami-distributed amplitude signal from a Gaussian autoregressive process.
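
A hedged sketch of the memoryless Gaussian-to-Nakagami transformation underlying this kind of generator; the actual technique additionally shapes the Gaussian autoregressive process so that the output matches prescribed amplitude and phase spectra, which is omitted here. The mapping m = 1/S4^2 is the usual relation between the Nakagami-m parameter and the intensity scintillation index; the driving AR(1) coefficient below is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

S4, omega, n = 0.5, 1.0, 50_000
m = 1.0 / S4**2          # Nakagami-m parameter implied by the S4 index

# Step 1: correlated Gaussian driver, here a simple AR(1)
# (the paper fits the AR model to a target power spectrum instead).
rho = 0.98
g = np.empty(n)
g[0] = rng.normal()
for t in range(1, n):
    g[t] = rho * g[t - 1] + np.sqrt(1.0 - rho**2) * rng.normal()

# Step 2: memoryless transformation Gaussian -> Nakagami-m: push the
# Gaussian marginal through its CDF, then through the inverse Nakagami
# CDF, preserving the rank correlation structure of the driver.
u = stats.norm.cdf(g)
amplitude = stats.nakagami.ppf(u, m, scale=np.sqrt(omega))

print(amplitude.mean(), amplitude.var())
```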

Relevance: 60.00%

Abstract:

We propose a low-complexity technique to generate amplitude-correlated time series with a Nakagami-m distribution and phase-correlated Gaussian-distributed time series, which is useful in the simulation of ionospheric scintillation effects during the transmission of GNSS signals. The method requires only the knowledge of the parameters S4 (scintillation index) and σΦ (phase standard deviation), besides the definition of models for the amplitude and phase power spectra. The Zhang algorithm is used to produce Nakagami-distributed signals from a set of Gaussian autoregressive processes.

Relevance: 60.00%

Abstract:

Conditional heteroskedasticity is an important feature of many macroeconomic and financial time series. Standard residual-based bootstrap procedures for dynamic regression models treat the regression error as i.i.d. These procedures are invalid in the presence of conditional heteroskedasticity. We establish the asymptotic validity of three easy-to-implement alternative bootstrap proposals for stationary autoregressive processes with martingale difference sequence (m.d.s.) errors subject to possible conditional heteroskedasticity of unknown form. These proposals are the fixed-design wild bootstrap, the recursive-design wild bootstrap and the pairwise bootstrap. In a simulation study, all three procedures tend to be more accurate in small samples than the conventional large-sample approximation based on robust standard errors. In contrast, standard residual-based bootstrap methods for models with i.i.d. errors may be very inaccurate if the i.i.d. assumption is violated. We conclude that in many empirical applications the proposed robust bootstrap procedures should routinely replace conventional bootstrap procedures for autoregressions based on the i.i.d. error assumption.
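
A minimal sketch of one of the three schemes, the recursive-design wild bootstrap, for an AR(1) with conditionally heteroskedastic errors; the data-generating process and the Rademacher multipliers below are illustrative choices, not a reproduction of the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_ar1(y):
    """OLS fit of y_t = c + rho * y_{t-1} + e_t; returns (c, rho, residuals)."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return coef[0], coef[1], y[1:] - X @ coef

def recursive_wild_bootstrap(y, n_boot=999):
    """Recursive-design wild bootstrap for the AR(1) slope.

    Residuals are multiplied by i.i.d. Rademacher draws, which preserves
    conditional heteroskedasticity of unknown form; the bootstrap sample is
    rebuilt recursively from the estimated coefficients.
    """
    c, rho, resid = fit_ar1(y)
    n = len(y)
    rho_star = np.empty(n_boot)
    for b in range(n_boot):
        eta = rng.choice([-1.0, 1.0], size=n - 1)      # Rademacher multipliers
        e_star = resid * eta
        y_star = np.empty(n)
        y_star[0] = y[0]
        for t in range(1, n):                          # recursive design
            y_star[t] = c + rho * y_star[t - 1] + e_star[t - 1]
        _, rho_star[b], _ = fit_ar1(y_star)
    return rho, rho_star

# Illustrative DGP: AR(1) whose error variance depends on the lagged level,
# i.e. conditional heteroskedasticity of unknown form.
n = 400
y = np.zeros(n)
for t in range(1, n):
    sigma = np.sqrt(0.1 + 0.3 * y[t - 1] ** 2)
    y[t] = 0.5 * y[t - 1] + sigma * rng.normal()

rho_hat, rho_star = recursive_wild_bootstrap(y)
ci = np.percentile(rho_star, [2.5, 97.5])
print(f"rho_hat = {rho_hat:.3f}, 95% bootstrap interval = {ci.round(3)}")
```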

Relevance: 60.00%

Abstract:

The objective of this thesis is to present multivariate time series models involving random vectors whose components are all non-negative. We consider the vMEM models (vector multiplicative error models with non-negative errors) presented by Cipollini, Engle and Gallo (2006) and Cipollini and Gallo (2010). These models generalize to the multivariate case the MEM models introduced by Engle (2002). They find applications notably in financial time series. vMEM models make it possible to model time series involving asset volumes, durations and conditional variances, to cite only these applications. It is also possible to carry out joint modelling and to study the dynamics present between the time series forming the system under study. In order to model multivariate time series with non-negative components, several specifications of the vector error term have been proposed in the literature. A first approach considers random vectors whose error-term distribution is such that each component is non-negative. However, finding a sufficiently flexible multivariate distribution defined on the positive support is rather difficult, at least for the applications cited above. As indicated by Cipollini, Engle and Gallo (2006), one possible candidate is a multivariate gamma distribution, which however imposes severe restrictions on the contemporaneous correlations between the variables. Given that the possibilities are limited, one possible approach is to use copula theory. Under this approach, marginal distributions (margins) with non-negative supports can be specified, and a copula function accounts for the dependence between the components. One possible estimation technique is the method of maximum likelihood. An alternative approach is the generalized method of moments (GMM). The latter method has the advantage of being semi-parametric in the sense that, unlike the approach imposing a multivariate law, it is not necessary to specify a multivariate distribution for the error term. In general, the estimation of vMEM models is complicated. Existing algorithms must handle the large number of parameters and the elaborate nature of the likelihood function. In the case of GMM estimation, the system to be solved also requires the use of solvers for non-linear systems. In this thesis, considerable effort was devoted to developing computer code (in the R language) to estimate the different parameters of the model. In the first chapter, we define stationary processes, autoregressive processes, autoregressive conditionally heteroskedastic (ARCH) processes and generalized ARCH (GARCH) processes. We also present the ACD duration models and the MEM models. In the second chapter, we present the copula theory required for our work, in the framework of the vector multiplicative error models with non-negative errors (vMEM). We also discuss possible estimation methods. In the third chapter, we discuss the simulation results for several estimation methods. In the last chapter, applications to financial series are presented. The R code is provided in an appendix. A conclusion completes the thesis.
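
As a hedged sketch of the vMEM structure the thesis builds on (notation ours), a first-order specification sets x_t = mu_t ⊙ eps_t with mu_t = omega + A x_{t-1} + B mu_{t-1} and a unit-mean non-negative error vector. The simulation below uses independent gamma margins instead of the copula-coupled errors studied in the thesis, and is written in Python rather than the R code the thesis provides.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_vmem(n, omega, A, B, shape=4.0):
    """Simulate a bivariate vMEM(1,1):

        x_t  = mu_t * eps_t                     (elementwise product)
        mu_t = omega + A @ x_{t-1} + B @ mu_{t-1}

    with independent unit-mean gamma errors (shape k, scale 1/k).  The
    thesis couples the error margins with a copula instead; this sketch
    keeps them independent for brevity.
    """
    d = len(omega)
    x = np.zeros((n, d))
    mu = np.linalg.solve(np.eye(d) - A - B, omega)   # unconditional mean as start
    x[0] = mu
    for t in range(1, n):
        mu = omega + A @ x[t - 1] + B @ mu
        eps = rng.gamma(shape, 1.0 / shape, size=d)  # E[eps] = 1 componentwise
        x[t] = mu * eps
    return x

omega = np.array([0.1, 0.2])
A = np.array([[0.20, 0.05], [0.05, 0.25]])
B = np.array([[0.60, 0.00], [0.00, 0.55]])
series = simulate_vmem(2000, omega, A, B)
print(series.mean(axis=0))   # close to (I - A - B)^{-1} omega
```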

Relevance: 60.00%

Abstract:

Glacier fluctuations exclusively due to internal variations in the climate system are simulated using downscaled integrations of the ECHAM4/OPYC coupled general circulation model (GCM). A process-based modeling approach is applied, using a mass balance model of intermediate complexity and a dynamic ice flow model that considers simple shearing flow and sliding. Multimillennia records of glacier length fluctuations for Nigardsbreen (Norway) and Rhonegletscher (Switzerland) are simulated using autoregressive processes determined by statistically downscaled GCM experiments. Return periods and probabilities of specific glacier length changes using GCM integrations excluding external forcings such as solar irradiation changes, volcanic, or anthropogenic effects are analyzed and compared to historical glacier length records. Preindustrial fluctuations of the glaciers, as far as observed or reconstructed, including their advance during the “Little Ice Age,” can be explained by internal variability in the climate system as represented by a GCM. However, fluctuations comparable to the present-day glacier retreat exceed any variation simulated by the GCM control experiments and must be caused by external forcing, with anthropogenic forcing being a likely candidate.

Relevance: 60.00%

Abstract:

This paper characterizes episodes of real appreciations and depreciations for a sample of 85 countries, approximately from 1960 to 1998. First, the equilibrium real exchange rate series are constructed for each country using the Goldfajn and Valdes (1999) methodology (cointegration with fundamentals). Then, departures from the equilibrium real exchange rate (misalignments) are obtained, and a Markov switching model is used to characterize the misalignment series as stochastic autoregressive processes governed by two states representing different means. We find three main results: first, for some countries no evidence of different misalignment regimes is found; second, some countries present one regime of no misalignment (tranquility) and another regime with misalignment (crisis); and third, for those countries with two misalignment regimes, the lower-mean misalignment regime (appreciated) has higher persistence than the higher-mean one (depreciated).
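
A small simulation sketch of the kind of two-state Markov-switching autoregression used here, with regime-dependent means for the misalignment series; the switching specification and all parameter values are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-state Markov-switching AR(1) for the misalignment series:
#   z_t - mu[s_t] = phi * (z_{t-1} - mu[s_{t-1}]) + e_t,
# where s_t is a first-order Markov chain over {tranquility, crisis}.
mu = np.array([0.02, 0.25])          # regime-dependent mean misalignment
P = np.array([[0.97, 0.03],          # transition matrix; rows sum to one
              [0.10, 0.90]])
phi, sigma, n = 0.8, 0.03, 1000

s = np.zeros(n, dtype=int)
z = np.zeros(n)
z[0] = mu[0]
for t in range(1, n):
    s[t] = rng.choice(2, p=P[s[t - 1]])
    z[t] = mu[s[t]] + phi * (z[t - 1] - mu[s[t - 1]]) + rng.normal(scale=sigma)

# Expected regime duration 1 / (1 - p_ii) measures the persistence of each regime
print("expected durations:", 1.0 / (1.0 - np.diag(P)))
```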

Relevance: 60.00%

Abstract:

This paper evaluates the impact that investigation and regulation of the UK petrol industry has had on the profitability of the companies. Using a gross margin for petrol, we estimate a series of variable-parameter autoregressive processes. The results demonstrate that the 1979 Monopolies and Mergers Commission investigation into the industry caused a long-term decline in profit margins in the industry, despite the fact that no recommendations or undertakings were made. The same cannot, however, be said of subsequent investigations.

Relevance: 60.00%

Abstract:

The objective of this thesis is the small area estimation of an economic security indicator. Economic security is a complex concept that carries a variety of meanings. In the literature there is no formal, unambiguous definition of economic security, and in this work we refer to the definition recently provided for its opposite, economic insecurity, as the “anxiety produced by the possible exposure to adverse economic events and by the anticipation of the difficulty to recover from them” (Bossert and D’Ambrosio, 2013). In the last decade interest in economic insecurity/security has grown constantly, especially since the financial crisis of 2008, and even more so in the last year, following the economic consequences of the Covid-19 pandemic. In this research, economic security is measured through a longitudinal indicator that takes into account the income levels of Italian households from 2014 to 2016. The target areas are groups of Italian provinces, for which the indicator is estimated using longitudinal data taken from the EU-SILC survey. We note that the sample size is too small to obtain reliable estimates for our target areas. Therefore we resort to Small Area Estimation strategies to improve the reliability of the results. In particular we consider small area models specified at the area level. Besides the basic Fay-Herriot area-level model, we propose to consider some longitudinal extensions, including time-specific random effects following an autoregressive process of order 1 (AR(1)) and a moving average process of order 1 (MA(1)). We find that all the small area models used show a significant efficiency gain, especially the MA(1) model.
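
A hedged sketch of the longitudinal area-level structure described above (notation ours): direct survey estimates equal the small-area indicator plus sampling error with known variance, and the area-by-time random effects follow an AR(1); the MA(1) variant replaces that recursion. The code simulates the model rather than fitting it, and all dimensions and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Longitudinal Fay-Herriot sketch: for area i and year t,
#   yhat_it = x_it' beta + u_it + e_it,
# with known sampling variances psi_it for e_it and area-by-time
# random effects u_it following an AR(1).  An MA(1) variant would set
# u_it = eta_it + theta * eta_{i,t-1} instead of the recursion below.
n_areas, n_years = 20, 3
beta = np.array([0.5, 1.2])
rho, sigma_u = 0.6, 0.15

x = rng.normal(size=(n_areas, n_years, 2))
psi = rng.uniform(0.05, 0.20, size=(n_areas, n_years))   # known sampling variances

u = np.zeros((n_areas, n_years))
u[:, 0] = rng.normal(scale=sigma_u / np.sqrt(1 - rho**2), size=n_areas)
for t in range(1, n_years):
    u[:, t] = rho * u[:, t - 1] + rng.normal(scale=sigma_u, size=n_areas)

theta_true = x @ beta + u                                 # small-area indicator
y_direct = theta_true + rng.normal(scale=np.sqrt(psi))    # direct survey estimates
print(y_direct.shape)
```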

Relevance: 40.00%

Abstract:

In this paper, a novel statistical test is introduced to compare two locally stationary time series. The proposed approach is a Wald test considering time-varying autoregressive modeling and function projections in adequate spaces. The covariance structure of the innovations may also be time-varying. In order to obtain function estimators for the time-varying autoregressive parameters, we consider function expansions in splines and wavelet bases. Simulation studies provide evidence that the proposed test has good performance. We also assess its usefulness when applied to a financial time series.
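
A rough sketch of the ingredients: expand the time-varying AR(1) coefficient in a basis of rescaled time, estimate by least squares for each series, and compare the coefficient vectors with a Wald statistic. A polynomial basis and homoscedastic innovations are simplifying assumptions; the paper uses spline and wavelet expansions and allows a time-varying innovation covariance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def fit_tvar1(y, degree=3):
    """Least-squares fit of a tvAR(1): y_t = a(t/n) * y_{t-1} + e_t, with the
    coefficient function a(.) expanded in a polynomial basis of rescaled time.
    Returns the basis coefficients and their (homoscedastic) covariance."""
    n = len(y)
    u = np.arange(1, n) / n
    basis = np.vander(u, degree + 1, increasing=True)      # 1, u, u^2, ...
    X = basis * y[:-1, None]
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    resid = y[1:] - X @ beta
    sigma2 = resid @ resid / (len(resid) - X.shape[1])
    return beta, sigma2 * np.linalg.inv(X.T @ X)

def simulate_tvar1(n, a_fun):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a_fun(t / n) * y[t - 1] + rng.normal()
    return y

y1 = simulate_tvar1(2000, lambda u: 0.3 + 0.4 * u)
y2 = simulate_tvar1(2000, lambda u: 0.3 + 0.4 * u)   # same dynamics under H0

b1, V1 = fit_tvar1(y1)
b2, V2 = fit_tvar1(y2)
d = b1 - b2
wald = d @ np.linalg.solve(V1 + V2, d)               # Wald statistic
pval = stats.chi2.sf(wald, df=len(d))
print(f"Wald = {wald:.2f}, p-value = {pval:.3f}")
```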

Relevance: 40.00%

Abstract:

2000 Mathematics Subject Classification: 60J80, 60K05.

Relevance: 30.00%

Abstract:

A bivariate semi-Pareto distribution is introduced and characterized using geometric minimization. Autoregressive minification models for bivariate random vectors with bivariate semi-Pareto and bivariate Pareto distributions are also discussed. Multivariate generalizations of the distributions and the processes are briefly indicated.
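
For intuition, a univariate sketch of one common Pareto(III) minification construction (the bivariate semi-Pareto processes in the paper couple such components through geometric minimization); the specific form below is an assumption chosen so that the stationary marginal is Pareto(III), not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(8)

def pareto3(size, alpha, scale=1.0):
    """Pareto(III) / log-logistic draws with survival 1 / (1 + (x/scale)^alpha)."""
    u = rng.uniform(size=size)
    return scale * (u / (1.0 - u)) ** (1.0 / alpha)

def pareto_minification(n, p, alpha):
    """Univariate Pareto(III) autoregressive minification process:

        X_t = p^{-1/alpha} * X_{t-1}                with probability p,
        X_t = p^{-1/alpha} * min(X_{t-1}, eps_t)    with probability 1 - p,

    where eps_t is Pareto(III) with scale p^{1/alpha}.  With this choice the
    stationary marginal is the standard Pareto(III) distribution.
    """
    k = p ** (-1.0 / alpha)
    x = np.empty(n)
    x[0] = pareto3(None, alpha)
    for t in range(1, n):
        if rng.uniform() < p:
            x[t] = k * x[t - 1]
        else:
            x[t] = k * min(x[t - 1], pareto3(None, alpha, scale=p ** (1.0 / alpha)))
    return x

x = pareto_minification(10_000, p=0.6, alpha=2.5)
print(np.median(x))   # the standard Pareto(III) median is 1
```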

Relevance: 30.00%

Abstract:

This paper proposes different estimators for the parameters of semi-Pareto and Pareto autoregressive minification processes. The asymptotic properties of the estimators are established by showing that the semi-Pareto process is α-mixing. Asymptotic variances of different moment and maximum likelihood estimators are compared.

Relevance: 30.00%

Abstract:

The classical methods of analysing time series by the Box-Jenkins approach assume that the observed series fluctuates around changing levels with constant variance. That is, the time series is assumed to be homoscedastic in nature. However, financial time series exhibit the presence of heteroscedasticity in the sense that they possess non-constant conditional variance given the past observations. So the analysis of financial time series requires the modelling of such variances, which may depend on some time-dependent factors or on their own past values. This led to the introduction of several classes of models to study the behaviour of financial time series. See Taylor (1986), Tsay (2005), Rachev et al. (2007). The class of models used to describe the evolution of conditional variances is referred to as stochastic volatility models. The stochastic models available to analyse the conditional variances are based on either normal or log-normal distributions. One of the objectives of the present study is to explore the possibility of employing some non-Gaussian distributions to model the volatility sequences and then study the behaviour of the resulting return series. This led us to work on the related problem of statistical inference, which is the main contribution of the thesis.
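
A minimal sketch of the standard log-normal stochastic volatility specification that such studies start from; the thesis explores non-Gaussian alternatives for the volatility sequence, which are not shown here. Parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Basic log-normal stochastic volatility model:
#   r_t = sigma_t * eps_t,                         eps_t ~ N(0, 1)
#   log(sigma_t^2) = mu + phi * (log(sigma_{t-1}^2) - mu) + eta_t,  eta_t ~ N(0, tau^2)
mu, phi, tau, n = -1.0, 0.95, 0.2, 5000

h = np.empty(n)                               # h_t = log(sigma_t^2)
h[0] = mu + rng.normal(scale=tau / np.sqrt(1 - phi**2))
for t in range(1, n):
    h[t] = mu + phi * (h[t - 1] - mu) + rng.normal(scale=tau)

returns = np.exp(h / 2.0) * rng.normal(size=n)
print(returns.std(), np.abs(returns).mean())
```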