928 results for Sparse time-varying VAR models


Relevance:

100.00%

Publisher:

Abstract:

The paper discusses how variations in the pattern of convective plasma flows should be included in self-consistent, time-dependent models of the coupled ionosphere-thermosphere system. The author shows how these variations depend upon the mechanism by which the solar wind flow excites the convection. The modelling of these effects is not just of relevance to the polar ionosphere, because the influence of convection is not confined to high latitudes: the resultant heating and composition changes in the thermosphere are communicated to lower latitudes by the winds, which are themselves greatly modified by the plasma convection. These thermospheric changes alter the global distribution of plasma by modulating the rates of the chemical reactions responsible for the loss of plasma. Hence the modelling of these high-latitude processes is relevant to the design and operation of HF communication, radar and navigation systems worldwide.

Relevance:

100.00%

Publisher:

Abstract:

We show how multivariate GARCH models can be used to generate a time-varying “information share” (Hasbrouck, 1995) to represent the changing patterns of price discovery in closely related securities. We find that time-varying information shares can improve credit spread predictions.
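
As a rough illustration of the idea (not the paper's exact model), the sketch below computes a time-varying Hasbrouck (1995) information share for two related securities from a time-varying covariance path. An EWMA filter stands in for the multivariate GARCH covariance, and the error-correction loadings alpha1/alpha2 are assumed inputs rather than estimates from a fitted VECM.

```python
# Rough sketch (not the paper's exact model): a time-varying Hasbrouck (1995)
# information share computed from a time-varying covariance path. An EWMA filter
# stands in for the multivariate GARCH covariance, and the error-correction
# loadings alpha1/alpha2 are assumed inputs rather than estimated from a VECM.
import numpy as np

def ewma_cov(innovations, lam=0.94):
    """RiskMetrics-style EWMA covariance path (stand-in for an MGARCH covariance)."""
    T, k = innovations.shape
    cov = np.empty((T, k, k))
    cov[0] = np.cov(innovations.T)
    for t in range(1, T):
        e = innovations[t - 1][:, None]
        cov[t] = lam * cov[t - 1] + (1 - lam) * (e @ e.T)
    return cov

def info_share(cov_t, alpha1, alpha2):
    """Hasbrouck information share of market 1 for a single covariance matrix."""
    gamma = np.array([alpha2, -alpha1]) / (alpha2 - alpha1)   # common-factor weights
    chol = np.linalg.cholesky(cov_t)                          # lower-triangular factor
    contrib = (gamma @ chol) ** 2
    return contrib[0] / contrib.sum()

rng = np.random.default_rng(0)
eps = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 2.0]], size=500)
covs = ewma_cov(eps)
is_path = [info_share(c, alpha1=-0.4, alpha2=0.1) for c in covs]
print(is_path[:5])      # time-varying information share of market 1
```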

Relevance:

100.00%

Publisher:

Abstract:

This Thesis is the result of my Master's degree studies at the Graduate School of Economics, Getúlio Vargas Foundation, from January 2004 to August 2006. I am indebted to my Thesis Advisor, Professor Luiz Renato Lima, who introduced me to the world of Econometrics. In this Thesis, we study time-varying quantile processes and develop two applications, presented here as Part I and Part II. Each part was turned into a paper, and both papers were submitted. Part I shows that asymmetric persistence induces ARCH effects, but the LM-ARCH test has power against it. On the other hand, the test for asymmetric dynamics proposed by Koenker and Xiao (2004) has correct size under the presence of ARCH errors. These results suggest that the LM-ARCH and Koenker-Xiao tests may be used in applied research as complementary tools. In Part II, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with ARCH effects dominates the methods that require distributional assumptions. In particular, we show that the non-robust methodologies are more likely to predict VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the fewest violations.
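
A minimal sketch of the quantile-regression VaR idea discussed in Part II, assuming an ARCH-type specification in which the conditional quantile of returns depends on the lagged absolute return; the simulated data, the 5% level and the variable names are illustrative, not the thesis's exact setup.

```python
# Minimal sketch of the quantile-regression VaR idea: the 5% conditional quantile
# of returns is modelled as a linear function of the lagged absolute return (an
# ARCH-type scale proxy). The simulated data and the specification are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 1500
r = np.zeros(T)
for t in range(1, T):                       # ARCH(1)-like returns with moving scale
    sigma2 = 0.05 + 0.3 * r[t - 1] ** 2
    r[t] = np.sqrt(sigma2) * rng.standard_normal()

y = r[1:]                                   # return at t
X = sm.add_constant(np.abs(r[:-1]))         # intercept + |return at t-1|

tau = 0.05
fit = sm.QuantReg(y, X).fit(q=tau)
var_t = fit.predict(X)                      # conditional 5% quantile (signed VaR)

print("violation rate:", float((y < var_t).mean()), "target:", tau)
```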

Relevance:

100.00%

Publisher:

Abstract:

This paper is a theoretical and empirical study of the relationship between indexing policy and feedback mechanisms in the inflationary adjustment process in Brazil. The focus of our study is on two policy issues: (1) did the Brazilian system of indexing of interest rates, the exchange rate, and wages make inflation so dependent on its own past values that it created a significant feedback process and inertia in the behaviour of inflation, and (2) was the feedback effect of past inflation upon itself so strong that it dominated the effect of monetary/fiscal variables upon current inflation? This paper develops a simple model designed to capture several "stylized facts" of Brazilian indexing policy. Separate rules of "backward indexing" for interest rates, the exchange rate, and wages, reflecting the evolution of policy changes in Brazil, are incorporated in a two-sector model of industrial and agricultural prices. A transfer function derived from this model shows inflation depending on three factors: (1) past values of inflation, (2) monetary and fiscal variables, and (3) supply-shock variables. The indexing rules for interest rates, the exchange rate, and wages place restrictions on the coefficients of the transfer function. Variations in the policy-determined parameters of the indexing rules imply changes in the coefficients of the transfer function for inflation. One implication of this model, in contrast to previous results derived in analytically simpler models of indexing, is that a higher degree of indexing does not make current inflation more responsive to current monetary shocks. The empirical section of this paper studies the central hypotheses of this model through estimation of the inflation transfer function with time-varying parameters. The results show a systematic, non-random variation of the transfer function coefficients closely synchronized with changes in the observed values of the wage-indexing parameters. Non-parametric tests show the variation of the transfer function coefficients to be statistically significant at the time of the changes in wage-indexing rules in Brazil. As the degree of indexing increased, the inflation feedback coefficients increased, the effect of external price and agricultural shocks progressively increased, and monetary effects progressively decreased.
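
To illustrate the kind of time-varying-parameter estimation used in the empirical section (under assumed, simplified dynamics, not the paper's actual transfer function), the sketch below filters random-walk regression coefficients with a basic Kalman filter and tracks how an inflation-feedback coefficient drifts after a simulated change in the indexing rule.

```python
# Illustrative sketch (not the paper's exact specification): a transfer-function
# style regression pi_t = b1_t * pi_{t-1} + b2_t * money_t + e_t whose coefficients
# follow random walks, estimated with a basic Kalman filter. The simulated regime
# change stands in for a change in the wage-indexing rule.
import numpy as np

def kalman_tvp(y, X, obs_var=1.0, state_var=0.01):
    """Filtered paths of random-walk regression coefficients."""
    T, k = X.shape
    beta, P = np.zeros(k), np.eye(k) * 10.0     # vague prior on the coefficients
    Q = np.eye(k) * state_var
    path = np.empty((T, k))
    for t in range(T):
        P = P + Q                               # predict step (random-walk states)
        x = X[t]
        S = x @ P @ x + obs_var                 # innovation variance
        K = P @ x / S                           # Kalman gain
        beta = beta + K * (y[t] - x @ beta)     # update step
        P = P - np.outer(K, x @ P)
        path[t] = beta
    return path

rng = np.random.default_rng(2)
T = 300
money = rng.standard_normal(T)
feedback = np.where(np.arange(T) < 150, 0.3, 0.8)    # indexing regime change at t=150
pi = np.zeros(T)
for t in range(1, T):
    pi[t] = feedback[t] * pi[t - 1] + 0.5 * money[t] + 0.3 * rng.standard_normal()

X = np.column_stack([pi[:-1], money[1:]])
paths = kalman_tvp(pi[1:], X)
print(paths[10], paths[290])    # the feedback coefficient drifts up after the break
```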

Relevance:

100.00%

Publisher:

Abstract:

Background: Several models have been designed to predict survival of patients with heart failure. These models, while available and widely used both for stratifying patients and for deciding among treatment options at the individual level, have several limitations. In particular, some clinical variables that may influence prognosis may have an influence that changes over time. Statistical models that accommodate such characteristics may help in evaluating prognosis. The aim of the present study was to analyze and quantify the impact of modeling heart failure survival allowing for covariates with time-varying effects known to be independent predictors of overall mortality in this clinical setting. Methodology: Survival data from an inception cohort of five hundred patients diagnosed with heart failure functional class III and IV between 2002 and 2004, and followed up to 2006, were analyzed using the Cox proportional hazards model, variations of the Cox model, and the Aalen additive model. Principal Findings: One hundred and eighty-eight (188) patients died during follow-up. For the patients under study, age, serum sodium, hemoglobin, serum creatinine, and left ventricular ejection fraction were significantly associated with mortality, and evidence of a time-varying effect was suggested for the last three. Both high hemoglobin and high LV ejection fraction were associated with a reduced risk of dying, with a stronger initial effect. High creatinine, associated with an increased risk of dying, also showed a stronger initial effect. The impacts of age and sodium were constant over time. Conclusions: The current study points to the importance of evaluating covariates with time-varying effects in heart failure models. The analysis performed suggests that variations of the Cox and Aalen models constitute a valuable tool for identifying these variables. Incorporating covariates with time-varying effects into heart failure prognostication models may reduce bias and increase the specificity of such models.
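
A minimal sketch of checking for time-varying covariate effects with the Aalen additive model using the lifelines package; the simulated cohort, column names and penalizer value are illustrative assumptions, not the study's data or exact models.

```python
# Minimal, hypothetical sketch of checking for time-varying covariate effects with
# the Aalen additive model (lifelines). The simulated cohort, column names and
# penalizer value are illustrative assumptions, not the study's data or models.
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "creatinine": rng.lognormal(0.2, 0.3, n),
    "ejection_fraction": rng.normal(35, 8, n),
})
# Simulated survival: higher creatinine shortens survival, more so early on.
baseline = rng.exponential(4.0, n)
df["time"] = baseline / (1 + 0.4 * (df["creatinine"] - 1).clip(lower=0))
df["event"] = (rng.uniform(size=n) < 0.6).astype(int)    # ~40% censored

aaf = AalenAdditiveFitter(coef_penalizer=0.1)
aaf.fit(df, duration_col="time", event_col="event")

# The slope of each cumulative coefficient curve over time indicates whether an
# effect is roughly constant (straight line) or time-varying (changing slope).
print(aaf.cumulative_hazards_.head())
```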

Relevance:

100.00%

Publisher:

Abstract:

In environmental epidemiology, exposure X and health outcome Y vary in space and time. We present a method to diagnose the possible influence of unmeasured confounders U on the estimated effect of X on Y, and we propose several approaches to robust estimation. The idea is to use space and time as proxy measures for the unmeasured factors U. We start with the time series case, where X and Y are continuous variables at equally spaced times, and assume a linear model. We define matching estimators b(u) that correspond to pairs of observations with specific lag u. Controlling for a smooth function of time, S_t, using a kernel estimator is roughly equivalent to estimating the association with a linear combination of the b(u), with weights that involve two components: the assumptions about the smoothness of S_t and the normalized variogram of the X process. When an unmeasured confounder U exists, but the model otherwise correctly controls for measured confounders, excess variation in the b(u) is evidence of confounding by U. We use the plot of b(u) versus lag u, the lagged-estimator plot (LEP), to diagnose the influence of U on the effect of X on Y. We use appropriate linear combinations of the b(u), or extrapolate to b(0), to obtain novel estimators that are more robust to the influence of a smooth U. The methods are extended to time series log-linear models and to spatial analyses. The LEP gives a direct view of the magnitude of the estimators for each lag u and provides evidence when a model does not adequately describe the data.
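
The sketch below illustrates a lagged-estimator plot on simulated data with a smooth unmeasured confounder. The pairwise-difference slope used for b(u) is one plausible matching estimator and is an assumption here; the paper's exact definition is not reproduced.

```python
# Hedged sketch of a lagged-estimator plot (LEP) on simulated data. The pairwise-
# difference slope used for b(u) is one plausible matching estimator and an
# assumption here; the paper's exact definition is not reproduced.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
T = 2000
t = np.arange(T)
U = np.sin(2 * np.pi * t / 365)                  # smooth unmeasured confounder
X = 0.5 * U + rng.standard_normal(T)             # exposure
Y = 1.0 * X + 2.0 * U + rng.standard_normal(T)   # outcome; true effect of X is 1.0

def b(u):
    """Slope from pairs of observations separated by lag u."""
    dX, dY = X[u:] - X[:-u], Y[u:] - Y[:-u]
    return np.sum(dX * dY) / np.sum(dX ** 2)

lags = np.arange(1, 60)
estimates = np.array([b(u) for u in lags])

plt.plot(lags, estimates, marker="o")
plt.axhline(1.0, linestyle="--", label="true effect of X")
plt.xlabel("lag u"); plt.ylabel("b(u)")
plt.legend()
plt.title("LEP: excess variation in b(u) suggests a smooth confounder")
plt.show()
```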

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we develop a new generative model of social networks belonging to the family of Time-Varying Networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks, and the empirical evidence of a temporal dimension in networks prompted the switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modelling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes the fundamental mechanisms shaping the topology and temporal structure of social networks: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We then check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two. Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks, inspired by Kauffman's theory of the adjacent possible and based on a generalized version of Pólya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
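
A toy sketch of an activity-driven network with memory and burstiness, under assumed functional forms (a power-law waiting time and a reinforcement probability of the form (1 + k/c)^(-beta)); these choices are illustrative and not the thesis's calibrated, data-driven model.

```python
# Hypothetical sketch of an activity-driven network with memory ("social capital
# allocation") and bursty activations. The reinforcement probability
# (1 + k/c)**(-0.5) and the Pareto waiting times are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
N, T_max, c = 200, 2000.0, 1.0

activity = 10 ** rng.uniform(-3, -1, N)        # heterogeneous activity rates
neighbours = [set() for _ in range(N)]         # each node's set of known contacts
next_fire = rng.pareto(1.5, N) / activity      # heavy-tailed first activation times
events = []

t = float(next_fire.min())
while t < T_max:
    i = int(np.argmin(next_fire))              # node activating now
    k = len(neighbours[i])
    p_new = (1.0 + k / c) ** -0.5              # chance of exploring a new contact
    if k == 0 or rng.uniform() < p_new:
        j = int(rng.integers(N))
        while j == i:
            j = int(rng.integers(N))           # pick a fresh random partner
    else:
        j = int(rng.choice(list(neighbours[i])))   # reinforce an existing tie
    neighbours[i].add(j)
    neighbours[j].add(i)
    events.append((t, i, j))
    next_fire[i] = t + rng.pareto(1.5) / activity[i]   # bursty inter-event time
    t = float(next_fire.min())

degrees = np.array([len(s) for s in neighbours])
print(len(events), "contact events; mean degree:", degrees.mean())
```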

Relevance:

100.00%

Publisher:

Abstract:

This paper reviews some basic issues and methods involved in using neural networks to respond in a desired fashion to a temporally varying environment. Some popular network models and training methods are introduced. A speech recognition example is then used to illustrate the central difficulty of temporal data processing: learning to notice and remember relevant contextual information. Feedforward network methods are applicable to cases where this problem is not severe. The application of these methods is explained, and applications are discussed in the areas of pure mathematics, chemical and physical systems, and economic systems. A more powerful but less practical algorithm for temporal problems, the moving-targets algorithm, is sketched and discussed. For completeness, a few remarks are made on reinforcement learning.
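
As a small illustration of the windowed feedforward approach mentioned for mild context problems, the sketch below trains a time-delay network by stacking the last W samples into each input vector; the window length, architecture and simulated task are arbitrary assumptions, not an example from the paper.

```python
# Sketch of the "feedforward network with a sliding context window" idea: each
# input vector stacks the W most recent samples so a plain feedforward net can
# exploit short-range temporal context. Window size, architecture and the
# simulated task are arbitrary assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
T, W = 3000, 8
x = rng.standard_normal(T)
# Target depends (causally) on a short window of recent inputs -- the context
# the network has to "remember".
y = np.convolve(x, [0.5, -0.3, 0.2, 0.1], mode="full")[:T] ** 2

# Windowed design matrix: the row for time t holds x[t], x[t-1], ..., x[t-W+1].
X = np.column_stack([x[W - 1 - k: T - k] for k in range(W)])
target = y[W - 1:]

net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
net.fit(X[:2000], target[:2000])
print("held-out R^2:", net.score(X[2000:], target[2000:]))
```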

Relevance:

100.00%

Publisher:

Abstract:

Simple models of time-varying risk premia are used to measure the risk premia in long-term UK government bonds. The parameters of the models can be estimated using nonlinear seemingly unrelated regression (NL-SUR), which permits efficient use of information across the entire yield curve and facilitates the testing of various cross-sectional restrictions. The estimated time-varying premia are found to be substantially different to those estimated using models that assume constant risk premia. © 2004 Taylor and Francis Ltd.
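
The sketch below conveys the joint-estimation idea behind (NL-)SUR in a deliberately simplified, linear form: excess returns at several maturities are regressed on a common premium proxy by feasible GLS using the cross-equation residual covariance. The data, maturities and linearity are assumptions; the paper's model is nonlinear and imposes cross-sectional restrictions not reproduced here.

```python
# Hedged sketch of the joint-estimation idea behind (NL-)SUR in a deliberately
# linear, simplified form: excess holding-period returns at several maturities are
# regressed on a common premium proxy by feasible GLS, using the cross-equation
# residual covariance. Data, maturities and linearity are assumptions here.
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(7)
T, maturities = 400, [2, 5, 10]
spread = rng.standard_normal(T)              # common premium proxy (illustrative)
common_shock = rng.standard_normal(T)
ys, Xs = [], []
for m in maturities:
    e = 0.8 * common_shock + 0.6 * rng.standard_normal(T)    # correlated errors
    ys.append(0.1 * m + 0.3 * m * spread + e)                # excess return, maturity m
    Xs.append(np.column_stack([np.ones(T), spread]))

# Step 1: equation-by-equation OLS residuals give the residual covariance Sigma.
betas_ols = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)]
resid = np.column_stack([y - X @ b for X, y, b in zip(Xs, ys, betas_ols)])
Sigma = resid.T @ resid / T

# Step 2: feasible GLS on the stacked (equation-major) system -- the SUR estimator.
X_big = block_diag(*Xs)
y_big = np.concatenate(ys)
weight = np.kron(np.linalg.inv(Sigma), np.eye(T))     # Sigma^{-1} (x) I_T
beta_sur = np.linalg.solve(X_big.T @ weight @ X_big, X_big.T @ weight @ y_big)
print(beta_sur.reshape(len(maturities), 2))           # [intercept, slope] per maturity
```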

Relevance:

100.00%

Publisher:

Abstract:

Are persistent marketing effects most likely to appear right after the introduction of a product? The authors give an affirmative answer to this question by developing a model that explicitly reports how persistent and transient marketing effects evolve over time. The proposed model provides managers with a valuable tool to evaluate their allocation of marketing expenditures over time. An application of the model to many pharmaceutical products, estimated through (exact initial) Kalman filtering, indicates that both persistent and transient effects occur predominantly immediately after a brand's introduction. Subsequently, the size of the effects declines. The authors theoretically and empirically compare their methodology with methodology based on unit root testing and demonstrate that the need for unit root tests creates difficulties in applying conventional persistence modeling. The authors recommend that marketing models should either accommodate persistent effects that change over time or be applied to mature brands or limited time windows only.
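
A hypothetical sketch of the state-space flavour of this approach: "sales" are decomposed into a persistent stochastic-trend component and a transient regression effect of an advertising variable, filtered and smoothed with the Kalman filter via statsmodels. The simulated series, variable names and local-level specification are illustrative, not the authors' model.

```python
# Hypothetical sketch: a state-space decomposition of "sales" into a persistent
# stochastic trend and a transient advertising effect, filtered/smoothed with the
# Kalman filter via statsmodels. The simulated series and the local-level choice
# are illustrative, not the authors' exact specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
T = 250
adspend = np.clip(rng.normal(1.0, 0.3, T), 0, None)
# Persistent component: a random walk that absorbs part of early advertising.
level = np.cumsum(0.15 * adspend * (np.arange(T) < 50) + 0.2 * rng.standard_normal(T))
sales = 10 + level + 0.8 * adspend + 0.5 * rng.standard_normal(T)

model = sm.tsa.UnobservedComponents(sales, level="local level", exog=adspend)
res = model.fit(disp=False)

print(res.params)                   # transient adspend coefficient and variances
persistent = res.smoothed_state[0]  # smoothed stochastic-trend (persistent) path
print(persistent[:5])
```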

Relevance:

100.00%

Publisher:

Abstract:

This paper aims to help supply chain managers determine the value of retailer-supplier partnership initiatives beyond information sharing (IS), according to their specific business environment, under time-varying demand conditions. For this purpose, we use integer linear programming models to quantify the benefits that can be accrued by a retailer, a supplier and the system as a whole from a shift in inventory ownership and a shift in decision-making power, relative to IS alone. The results of a detailed numerical study pertaining to a static time horizon reveal that the shift in inventory ownership provides system-wide cost benefits in specific settings, particularly when it induces the retailer to order larger quantities and the supplier also prefers such orders because of significantly high setup and shipment costs. We observe that the relative benefits of the shift in decision-making power are always higher than those of the shift in inventory ownership under all conditions. The value of the shift in decision-making power exceeds that of IS particularly when the variability of the underlying demand is low and the time-dependent variation in production cost is high. However, when the shipment cost is negligible and the order-issuing efficiency of the supplier is low, the cost benefits of the shift in decision-making power beyond IS are not significant. © 2012 Taylor & Francis.
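
A much-simplified, hypothetical sketch of the integer-linear-programming flavour of the analysis: a single-item lot-sizing model under time-varying demand, written with PuLP. The cost figures and demand vector are invented, and the paper's richer retailer-supplier settings are not reproduced.

```python
# A much-simplified, hypothetical sketch of the integer-linear-programming flavour
# of the analysis: single-item lot sizing under time-varying demand with setup and
# holding costs, written with PuLP. Cost figures and the demand vector are invented.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

demand = [40, 10, 80, 20, 60, 5]            # time-varying demand per period
T = len(demand)
setup_cost, holding_cost = 100.0, 1.5
big_m = sum(demand)                         # upper bound on any single order

prob = LpProblem("lot_sizing", LpMinimize)
q = LpVariable.dicts("order_qty", range(T), lowBound=0)        # order quantity
y = LpVariable.dicts("order_placed", range(T), cat=LpBinary)   # setup indicator
inv = LpVariable.dicts("inventory", range(T), lowBound=0)      # end-of-period stock

prob += lpSum(setup_cost * y[t] + holding_cost * inv[t] for t in range(T))
for t in range(T):
    prev = inv[t - 1] if t > 0 else 0
    prob += prev + q[t] - demand[t] == inv[t]    # inventory balance
    prob += q[t] <= big_m * y[t]                 # ordering requires paying the setup

prob.solve()
print("orders:", [value(q[t]) for t in range(T)])
print("total cost:", value(prob.objective))
```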

Relevance:

100.00%

Publisher:

Abstract:

We examine how the most prevalent stochastic properties of key financial time series have been affected during the recent financial crises. In particular, we focus on changes associated with the remarkable economic events of the last two decades in the volatility dynamics, including the underlying volatility persistence and volatility spillover structure. Using daily data from several key stock market indices, the results of our bivariate GARCH models show the existence of time-varying correlations as well as time-varying shock and volatility spillovers between the returns of the FTSE and DAX, and those of the NIKKEI and Hang Seng, which became more prominent during the recent financial crisis. Our theoretical considerations on the time-varying model, which provides the platform upon which we integrate our multifaceted empirical approaches, are also of independent interest. In particular, we provide the general solution for time-varying asymmetric GARCH specifications, a long-standing research topic. This enables us to characterize these models by deriving, first, their multistep-ahead predictors, second, the first two time-varying unconditional moments, and third, their covariance structure.
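
As a rough, hedged stand-in for the paper's bivariate GARCH analysis, the sketch below fits univariate GARCH(1,1) models with the arch package and tracks a rolling correlation of the standardized residuals as a crude time-varying-correlation proxy; simulated return series merely stand in for the FTSE/DAX data and the rolling window is arbitrary.

```python
# Rough, hedged stand-in for the bivariate GARCH analysis: fit univariate
# GARCH(1,1) models with the arch package and track a rolling correlation of the
# standardized residuals as a crude time-varying-correlation proxy. Simulated
# series stand in for the FTSE/DAX returns; the 250-day window is arbitrary.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(9)
T = 1500
common = rng.standard_normal(T)
r1 = 0.6 * common + rng.standard_normal(T)     # "FTSE-like" simulated returns
r2 = 0.6 * common + rng.standard_normal(T)     # "DAX-like" simulated returns

def standardized_residuals(returns):
    """GARCH(1,1) fit; residuals divided by the fitted conditional volatility."""
    res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
    return res.resid / res.conditional_volatility

z = pd.DataFrame({"idx1": standardized_residuals(r1),
                  "idx2": standardized_residuals(r2)})
rolling_corr = z["idx1"].rolling(window=250).corr(z["idx2"])
print(rolling_corr.dropna().describe())        # summary of the correlation path
```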
