999 results for VAR models


Relevance: 30.00%

Publisher:

Abstract:

The study aims to assess the empirical adherence of the permanent income theory and the consumption smoothing view in Latin America. Two present value models are considered, one describing household behavior and the other open economy macroeconomics. Following the methodology developed in Campbell and Shiller (1987), bivariate vector autoregressions are estimated for the saving ratio and the real growth rate of income concerning the household behavior model, and for the current account and the change in national cash flow regarding the open economy model. The countries in the sample are considered separately in the estimation process (individual system estimation) as well as jointly (joint system estimation). Ordinary Least Squares (OLS) and Seemingly Unrelated Regressions (SURE) estimates of the coefficients are generated. Wald tests are then conducted to verify whether the VAR coefficient estimates conform to those predicted by the theory. While the empirical results are sensitive to the estimation method and discount factors used, there is only weak evidence in favor of the permanent income theory and the consumption smoothing view in the group of countries analyzed.
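The core of the Campbell–Shiller procedure — estimating a bivariate VAR by OLS and Wald-testing linear restrictions on its coefficients — can be sketched in a few lines. This is a minimal illustration on simulated data, not the paper's saving/income system: the DGP, the tested restriction, and all names are invented for the example.

```python
import numpy as np

def fit_var1(y):
    """OLS estimate of a VAR(1): y_t = c + A y_{t-1} + e_t.

    y: (T, k) array. Returns the (1+k, k) coefficient matrix B
    (column j holds equation j's constant and lag coefficients),
    the residuals, and the regressor matrix X.
    """
    T, k = y.shape
    X = np.hstack([np.ones((T - 1, 1)), y[:-1]])   # constant + first lag
    B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    resid = y[1:] - X @ B
    return B, resid, X

def wald_stat(theta, R, r, cov):
    """Wald statistic for H0: R theta = r, given an estimate of Cov(theta)."""
    d = R @ theta - r
    return float(d @ np.linalg.solve(R @ cov @ R.T, d))

rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1], [0.0, 0.3]])        # stable bivariate VAR(1)
y = np.zeros((500, 2))
for t in range(1, 500):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(2)

B, resid, X = fit_var1(y)
theta = B.flatten(order="F")                       # [c1, A11, A12, c2, A21, A22]

# Cov(vec(B)) = Sigma_e kron (X'X)^{-1} under homoskedastic errors
Sigma_e = resid.T @ resid / (len(resid) - X.shape[1])
cov = np.kron(Sigma_e, np.linalg.inv(X.T @ X))

# H0: A21 = 0 (true in this simulated DGP); W is asymptotically chi2(1)
R = np.zeros((1, 6)); R[0, 4] = 1.0
W = wald_stat(theta, R, np.zeros(1), cov)
print(W)
```

Under the null, W is asymptotically chi-squared with one degree of freedom, so values above roughly 3.84 would reject at the 5% level.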

Relevance: 30.00%

Publisher:

Abstract:

Despite the commonly held belief that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of this feature of the data. We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions. This is due to possible differences in the lag-lengths chosen by model selection criteria for the two alternative models. Second, we show that the costs of ignoring common cyclical features in vector autoregressive modelling can be high, both in terms of forecast accuracy and efficient estimation of variance decomposition coefficients. Third, we find that the Hannan-Quinn criterion performs best among model selection criteria in simultaneously selecting the lag-length and rank of vector autoregressions.
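The lag-selection comparison rests on information criteria of the form log det(Sigma_p) + c_T · (number of parameters)/T, with c_T = 2/T for AIC, log(T)/T for Schwarz/BIC, and 2 log log(T)/T for Hannan-Quinn. A minimal sketch on a simulated bivariate VAR(2) — the penalties are the standard textbook ones, not the paper's code:

```python
import numpy as np

def var_ic(y, pmax):
    """Select the VAR lag length by AIC, Schwarz (BIC) and Hannan-Quinn.

    All candidate orders p = 1..pmax are estimated on the same sample
    (first pmax observations reserved as initial conditions).
    """
    T_full, k = y.shape
    out = {}
    for p in range(1, pmax + 1):
        Y = y[pmax:]
        X = np.hstack([np.ones((T_full - pmax, 1))] +
                      [y[pmax - j:T_full - j] for j in range(1, p + 1)])
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        E = Y - X @ B
        T = len(Y)
        Sigma = E.T @ E / T
        n_par = k * (1 + k * p)                    # parameters in the system
        ld = np.log(np.linalg.det(Sigma))
        out[p] = {"AIC": ld + 2 * n_par / T,
                  "BIC": ld + np.log(T) * n_par / T,
                  "HQ":  ld + 2 * np.log(np.log(T)) * n_par / T}
    # chosen lag = argmin of each criterion
    return {c: min(out, key=lambda p: out[p][c]) for c in ("AIC", "BIC", "HQ")}

rng = np.random.default_rng(1)
A1 = np.array([[0.4, 0.1], [0.0, 0.3]])
A2 = np.array([[0.2, 0.0], [0.1, 0.2]])
y = np.zeros((400, 2))
for t in range(2, 400):
    y[t] = A1 @ y[t - 1] + A2 @ y[t - 2] + 0.1 * rng.standard_normal(2)

sel = var_ic(y, pmax=6)
print(sel)
```

In large samples AIC tends to pick the longest lags and BIC the shortest, with Hannan-Quinn in between — the ordering the abstract's simulations exploit.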

Relevance: 30.00%

Publisher:

Abstract:

Despite the belief, supported by recent applied research, that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of these data "features." We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive and error correction models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions, due to the use of information criteria for choosing the lag order of the two alternative models. Second, we show that the costs of ignoring common-cyclical features in VAR analysis may be high in terms of forecasting accuracy and efficiency of estimates of variance decomposition coefficients. Although these costs are more pronounced when the lag order of VAR models is known, they are also non-trivial when it is selected using the conventional tools available to applied researchers. Third, we find that if the data have common-cyclical features and the researcher wants to use an information criterion to select the lag length, the Hannan-Quinn criterion is the most appropriate, since the Akaike and the Schwarz criteria have a tendency to over- and under-predict the lag length, respectively, in our simulations.

Relevance: 30.00%

Publisher:

Abstract:

This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
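The basic mechanics of IV estimation that the paper builds on — an instrument correlated with the endogenous regressor but not with the error — can be sketched with plain two-stage least squares. This is a generic illustration on simulated data, not the paper's optimal-instrument construction; the DGP and all names are invented.

```python
import numpy as np

def tsls(y, X, Z):
    """Two-stage least squares: project X on the instruments Z,
    then regress y on the first-stage fitted values."""
    Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    b, *_ = np.linalg.lstsq(Xhat, y, rcond=None)
    return b

rng = np.random.default_rng(2)
n = 2000
z = rng.standard_normal(n)                 # valid instrument (e.g. a lagged variable)
u = rng.standard_normal(n)                 # structural error
x = 0.8 * z + 0.5 * u + 0.3 * rng.standard_normal(n)   # endogenous regressor
y = 1.0 + 2.0 * x + u                      # true slope is 2; u correlated with x

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
b_iv = tsls(y, X, Z)
print(b_ols[1], b_iv[1])   # OLS slope is biased upward; IV is close to 2
```

The asymptotic-efficiency gains the abstract describes come from enlarging Z with valid lagged endogenous variables, which shrinks the IV estimator's variance without reintroducing bias.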

Relevance: 30.00%

Publisher:

Abstract:

The aim of this study is to propose the implementation of a statistical model for volatility estimation that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models usually employed for risk measurement. Daily Ibovespa quotes from January 2009 to December 2014 are used to estimate the parameters, and the empirical accuracy of the models is assessed through out-of-sample tests comparing the VaR estimates obtained for January to December 2014. Explanatory variables were introduced in an attempt to improve the models, and the American counterpart of the Ibovespa, the Dow Jones index, was chosen because it exhibited properties such as high correlation, Granger causality, and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not work directly with the variance but with its reciprocal, called the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of the financial series, and the results favored its use; the model is thus an efficient and parsimonious specification for estimating and forecasting volatility, since it has only one parameter to be estimated, which represents a paradigm shift relative to conditional heteroskedasticity models.
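A one-parameter precision recursion of this kind produces filtered variances that behave like an exponentially weighted average of squared returns. The sketch below uses that EWMA-style simplification — `lam` is an illustrative smoothing parameter, not the LSM discount estimated in the study — to produce a one-step-ahead 99% VaR under a normality assumption:

```python
import numpy as np
from scipy.stats import norm

def ewma_vol_var(returns, lam=0.94, alpha=0.99):
    """One-parameter volatility filter and one-step-ahead VaR.

    A simplified stand-in for the local scale model: the filtered
    variance is an exponentially weighted average of squared returns,
    governed by the single parameter `lam`.
    """
    s2 = np.empty_like(returns)
    s2[0] = np.var(returns)
    for t in range(1, len(returns)):
        s2[t] = lam * s2[t - 1] + (1 - lam) * returns[t - 1] ** 2
    var_next = lam * s2[-1] + (1 - lam) * returns[-1] ** 2
    # VaR reported as a positive loss at confidence level alpha
    return s2, -norm.ppf(1 - alpha) * np.sqrt(var_next)

rng = np.random.default_rng(3)
r = 0.01 * rng.standard_normal(1000)        # illustrative daily returns
s2, var99 = ewma_vol_var(r)
print(var99)
```

Out-of-sample backtesting as in the study would then compare the realized return each day against the VaR forecast made the day before.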

Relevance: 30.00%

Publisher:

Abstract:

Farmers are interested in producing popcorn under organic production systems, and propane flaming could be a significant component of an integrated weed management program. The objective of this study was to collect baseline information on popcorn tolerance to broadcast flaming as influenced by propane dose and crop growth stage at the time of flaming. Field experiments were conducted at the Haskell Agricultural Laboratory of the University of Nebraska, Concord, NE in 2008 and 2009 using five propane doses (0, 13, 24, 44 and 85 kg ha(-1)) applied at the 2-leaf, 5-leaf and 7-leaf growth stages. Propane was applied using a custom-built research flamer driven at a constant speed of 6.4 km h(-1). Crop response to propane dose was described by log-logistic models on the basis of visual estimates of crop injury, yield components (plants m(-2), ears plant(-1), kernels cob(-1) and 100-kernel weight) and grain yield. Popcorn response to flaming was influenced by crop growth stage and propane dose. Based on the parameters evaluated, popcorn flamed at the 5-leaf stage showed the highest tolerance, while the 2-leaf stage was the most susceptible. The maximum yield reductions were 45%, 9% and 16% for the 2-leaf, 5-leaf and 7-leaf stages, respectively. In addition, the propane doses that resulted in a 5% yield loss were 23 kg ha(-1) for the 2-leaf and 7-leaf stages and 30 kg ha(-1) for the 5-leaf stage. Flaming has the potential to be used effectively in organic popcorn production if properly applied. (C) 2010 Elsevier Ltd. All rights reserved.
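A three-parameter log-logistic dose-response curve of the kind used for the yield data can be fitted with `scipy.optimize.curve_fit`. The numbers below are invented to resemble the design (five doses, relative yield), not the study's measurements, and the zero dose is replaced by a small positive value to keep the power term defined:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, d, e, b):
    """Three-parameter log-logistic dose-response:
    upper limit d, ED50 e, slope b (lower limit fixed at 0)."""
    return d / (1.0 + (dose / e) ** b)

# illustrative data: relative yield at doses similar to the study's design
dose = np.array([0.1, 13.0, 24.0, 44.0, 85.0])   # 0 replaced by 0.1 kg/ha
yld = np.array([100.0, 95.0, 88.0, 70.0, 55.0])

(d, e, b), _ = curve_fit(log_logistic, dose, yld, p0=[100.0, 50.0, 1.5])

# dose producing a 5% drop from the fitted upper limit (cf. the 5% yield-loss doses)
ed5 = e * (1.0 / 0.95 - 1.0) ** (1.0 / b)
print(d, e, ed5)
```

Solving d / (1 + (x/e)^b) = 0.95 d for x gives the ED5 expression used above.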

Relevance: 30.00%

Publisher:

Abstract:

A combined methodology consisting of successive linear programming (SLP) and a simple genetic algorithm (SGA) solves the reactive planning problem. The problem is divided into operating and planning subproblems; the operating subproblem, which is a nonlinear, ill-conditioned and nonconvex problem, consists of determining the voltage control and the adjustment of reactive sources. The planning subproblem consists of obtaining the optimal reactive source expansion considering operational, economical and physical characteristics of the system. SLP solves the optimal reactive dispatch problem related to real variables, while SGA is used to determine the necessary adjustments of both the binary and discrete variables existing in the modelling problem. Once the set of candidate busbars has been defined, the program implemented gives the location and size of the reactive sources needed, if any, to maintain the operating and security constraints.
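A simple genetic algorithm of the kind used here for the binary and discrete planning variables can be sketched in a few lines. The objective below — a net benefit per candidate busbar — is a toy stand-in for the actual planning subproblem, and all numbers are illustrative:

```python
import numpy as np

def sga(fitness, n_bits, pop_size=40, gens=60, p_cx=0.8, p_mut=0.02, seed=4):
    """Minimal simple genetic algorithm over binary strings:
    tournament selection, one-point crossover, bit-flip mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(fit[i] > fit[j], i, j)]     # tournament selection
        child = parents.copy()
        for a in range(0, pop_size - 1, 2):                # one-point crossover
            if rng.random() < p_cx:
                cut = rng.integers(1, n_bits)
                child[a, cut:], child[a + 1, cut:] = (
                    parents[a + 1, cut:].copy(), parents[a, cut:].copy())
        flip = rng.random(child.shape) < p_mut             # bit-flip mutation
        pop = np.where(flip, 1 - child, child)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmax()], float(fit.max())

# toy planning objective: each candidate bus has a cost and a benefit;
# choose the subset of buses maximizing net benefit (illustrative numbers)
cost = np.array([3.0, 2.0, 4.0, 1.0, 5.0, 2.0, 3.0, 1.0])
benefit = np.array([5.0, 1.0, 6.0, 2.0, 4.0, 5.0, 2.0, 3.0])
best, best_fit = sga(lambda x: float(((benefit - cost) * x).sum()), n_bits=8)
print(best, best_fit)
```

In the combined methodology, each fitness evaluation would instead invoke the SLP operating subproblem to price the candidate expansion against the operational constraints.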

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a new methodology for solving the optimal VAr planning problem in multi-area electric power systems, using the Dantzig-Wolfe decomposition. The original multi-area problem is decomposed into subproblems (one for each area) and a master problem (coordinator). The solution of the VAr planning problem in each area is based on the application of successive linear programming, and the coordination scheme is based on the reactive power marginal costs in the border bus. The aim of the model is to provide coordinated mechanisms to carry out the VAr planning studies maximizing autonomy and confidentiality for each area, assuring global economy to the whole system. Using the mathematical model and computational implementation of the proposed methodology, numerical results are presented for two interconnected systems, each of them composed of three equal subsystems formed by IEEE30 and IEEE118 test systems. © 2011 IEEE.

Relevance: 30.00%

Publisher:

Abstract:

The goal of this dissertation is to use statistical tools to analyze specific financial risks that have played dominant roles in the US financial crisis of 2008-2009. The first risk relates to the level of aggregate stress in the financial markets. I estimate the impact of financial stress on economic activity and monetary policy using structural VAR analysis. The second set of risks concerns the US housing market. There are in fact two prominent risks associated with a US mortgage, as borrowers can both prepay or default on a mortgage. I test the existence of unobservable heterogeneity in the borrower's decision to default or prepay on his mortgage by estimating a multinomial logit model with borrower-specific random coefficients.

Relevance: 30.00%

Publisher:

Abstract:

This paper uses Bayesian vector autoregressive models to examine the usefulness of leading indicators in predicting US home sales. The benchmark Bayesian model includes home sales, the price of homes, the mortgage rate, real personal disposable income, and the unemployment rate. We evaluate the forecasting performance of six alternative leading indicators by adding each, in turn, to the benchmark model. Out-of-sample forecast performance over three periods shows that the model that includes building permits authorized consistently produces the most accurate forecasts. Thus, the intention to build in the future provides good information with which to predict home sales. Another finding suggests that leading indicators with longer leads outperform the short-leading indicators.
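The out-of-sample comparison logic — forecast home sales with and without a candidate leading indicator and compare accuracy — can be sketched with expanding-window one-step-ahead OLS forecasts. The Bayesian estimation used in the paper is omitted here, and the data and coefficients are simulated so that the "permits" series genuinely leads the "sales" series:

```python
import numpy as np

def oos_rmse(y, X, start):
    """Expanding-window one-step-ahead OLS forecasts; out-of-sample RMSE."""
    errs = []
    for t in range(start, len(y)):
        b, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
        errs.append(y[t] - X[t] @ b)
    return float(np.sqrt(np.mean(np.square(errs))))

rng = np.random.default_rng(5)
n = 300
permits = rng.standard_normal(n)                  # illustrative leading indicator
sales = np.zeros(n)
for t in range(1, n):                             # permits lead sales by one period
    sales[t] = 0.5 * sales[t - 1] + 0.8 * permits[t - 1] + 0.5 * rng.standard_normal()

y = sales[1:]
X_bench = np.column_stack([np.ones(n - 1), sales[:-1]])               # AR(1) benchmark
X_lead = np.column_stack([np.ones(n - 1), sales[:-1], permits[:-1]])  # + indicator

r_bench = oos_rmse(y, X_bench, start=100)
r_lead = oos_rmse(y, X_lead, start=100)
print(r_bench, r_lead)
```

When the indicator truly leads the target, the augmented model's out-of-sample RMSE falls below the benchmark's — the pattern the paper reports for building permits.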

Relevance: 30.00%

Publisher:

Abstract:

The magnitude and the chronology of anthropogenic impregnation by Hg and other trace metals of environmental concern (V, Cr, Ni, Cu, Zn, Ag, Cd and Pb, including its stable isotopes) in the sediments are determined at the DYFAMED station, a site in the Ligurian Sea (Northwestern Mediterranean) chosen for its supposed open-sea characteristics. The DYFAMED site (VD) is located on the right levee of the Var Canyon turbidite system, at the end of the Middle Valley. In order to trace the influence of the gravity current coming from the canyon on trace metal distribution in the sediment, we studied an additional sediment core (VA) from a terrace of the Var Canyon, and material collected in sediment traps at both sites at 20 m above the sea bottom. The patterns of Hg and other trace element distribution profiles are interpreted using stable Pb isotope ratios as proxies for their sources, taking into account the sedimentary context (turbidites, redox conditions, and sedimentation rates). Major element distributions, coupled with the stratigraphic examination of the sediment cores, point to the high heterogeneity of the deposits at VA and major turbiditic events at both sites. At the DYFAMED site, we observed direct anthropogenic influence in the upper sediment layer (<2 cm), while at the Var Canyon site (VA), the anthropization concerns the whole sedimentary column sampled (19 cm). Turbiditic events superimpose their specific signature on trace metal distributions. According to the 210Pbxs-derived sedimentation rate at the DYFAMED site (0.4 mm yr-1), the Hg-enriched layer of the top core corresponds to the sediment accumulation of the last 50 years, which is the period of the highest increase in Hg deposition on a global scale. Under the hypothesis of no significant post-depositional redistribution of Hg, the change in the Hg/C-org ratio between the surface and the layers below is used to estimate the anthropogenic contribution to the Hg flux accumulated in the sediment. The Hg enrichment from pre-industrial times to the present is calculated to be around 60%, consistent with estimates from global Hg models. However, based on the chemical composition of the trapped material collected in sediment traps, we calculated that epibenthic mobilization of Hg would reach 73%. Conversely, the Cd/C-org ratio decreases in the upper 5 cm, which may reflect the recent decrease in atmospheric Cd inputs or losses due to diagenetic processes.

Relevance: 30.00%

Publisher:

Abstract:

Regular vine copulas are multivariate dependence models constructed from pair-copulas (bivariate copulas). In this paper, we allow the dependence parameters of the pair-copulas in a D-vine decomposition to be potentially time-varying, following a nonlinear restricted ARMA(1,m) process, in order to obtain a very flexible dependence model for applications to multivariate financial return data. We investigate the dependence among the broad stock market indexes from Germany (DAX), France (CAC 40), Britain (FTSE 100), the United States (S&P 500) and Brazil (IBOVESPA) both in a crisis and in a non-crisis period. We find evidence of stronger dependence among the indexes in bear markets. Surprisingly, though, the dynamic D-vine copula indicates the occurrence of a sharp decrease in dependence between the indexes FTSE and CAC in the beginning of 2011, and also between CAC and DAX during mid-2011 and in the beginning of 2008, suggesting the absence of contagion in these cases. We also evaluate the dynamic D-vine copula with respect to Value-at-Risk (VaR) forecasting accuracy in crisis periods. The dynamic D-vine outperforms the static D-vine in terms of predictive accuracy for our real data sets.
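Before fitting a parametric dynamic copula, a model-free check on time-varying dependence is a rolling-window Kendall's tau. The sketch below uses simulated data with a dependence break standing in for a crisis period; the regime dates, correlations, and window length are invented, and the paper's restricted ARMA(1,m) evolution is not implemented here:

```python
import numpy as np
from scipy.stats import kendalltau

def rolling_tau(u, v, window=250):
    """Rolling-window Kendall's tau between two series: a model-free
    look at time-varying dependence."""
    taus = [kendalltau(u[t - window:t], v[t - window:t])[0]
            for t in range(window, len(u) + 1)]
    return np.array(taus)

rng = np.random.default_rng(6)
n = 750
z = rng.standard_normal(n)
# dependence strengthens in the second half of the sample (a "crisis" regime)
w = np.where(np.arange(n) < n // 2, 0.3, 0.8)
x = w * z + np.sqrt(1 - w ** 2) * rng.standard_normal(n)

taus = rolling_tau(z, x)
print(taus[0], taus[-1])   # tau rises as the high-dependence regime dominates
```

A dynamic D-vine formalizes the same idea: each pair-copula parameter is allowed to evolve over time instead of being read off a fixed window.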

Relevance: 30.00%

Publisher:

Abstract:

Analysis of risk measures associated with price series data movements and their prediction is of strategic importance in the financial markets as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distributions of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers and satisfying the Gauss-Markov assumptions. However, often in practical situations, the LSE-based linear regression models fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow distributions with fat tails and the error terms possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out the risk analysis of Iranian crude oil price data based on the Lp-norm regression models and have noted that the LSE-based models do not always perform the best. We discuss results from the L1, L2 and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
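The L1, L2 and L∞ criteria can be compared directly on data with a fat tail. A sketch using a numerical optimizer for the non-L2 norms — the DGP and outlier pattern are illustrative, not the Iranian oil-price data:

```python
import numpy as np
from scipy.optimize import minimize

def lp_fit(X, y, p):
    """Linear regression under an Lp criterion:
    p=1 (LAD), p=2 (LSE, closed form) or p=inf (minimax/Chebyshev)."""
    b0, *_ = np.linalg.lstsq(X, y, rcond=None)   # L2 solution as start value
    if p == 2:
        return b0
    def loss(b):
        r = y - X @ b
        return np.abs(r).max() if np.isinf(p) else np.abs(r).sum()
    res = minimize(loss, b0, method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 20000})
    return res.x

rng = np.random.default_rng(7)
n = 200
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.standard_normal(n)
y[:5] += 15.0                                  # a few large outliers (fat tail)

b1 = lp_fit(X, y, 1)
b2 = lp_fit(X, y, 2)
binf = lp_fit(X, y, np.inf)
print(b1, b2, binf)
```

The L1 fit is barely moved by the outliers, L2 is pulled moderately, and L∞ chases the extremes — which is why the choice of norm matters when the object of interest is the tail.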

Relevance: 20.00%

Publisher:

Abstract:

Prosopis rubriflora and Prosopis ruscifolia are important species in the Chaquenian regions of Brazil. Because of the restriction and frequency of their physiognomy, they are excellent models for conservation genetics studies. The use of microsatellite markers (Simple Sequence Repeats, SSRs) has become increasingly important in recent years and has proven to be a powerful tool for both ecological and molecular studies. In this study, we present the development and characterization of 10 new markers for P. rubriflora and 13 new markers for P. ruscifolia. The genotyping was performed using 40 P. rubriflora samples and 48 P. ruscifolia samples from the Chaquenian remnants in Brazil. The polymorphism information content (PIC) of the P. rubriflora markers ranged from 0.073 to 0.791, and no null alleles or deviation from Hardy-Weinberg equilibrium (HW) were detected. The PIC values for the P. ruscifolia markers ranged from 0.289 to 0.883, but a departure from HW and null alleles were detected for certain loci; however, this departure may have resulted from anthropogenic activities, such as the presence of livestock, which is very common in the remnant areas. In this study, we describe novel SSR polymorphic markers that may be helpful in future genetic studies of P. rubriflora and P. ruscifolia.
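The PIC statistic reported for each marker is computed from allele frequencies with the Botstein et al. (1980) formula; a minimal sketch with example frequencies (not the paper's loci):

```python
import numpy as np

def pic(freqs):
    """Polymorphism information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 p_i^2 p_j^2."""
    p = np.asarray(freqs, dtype=float)
    assert np.isclose(p.sum(), 1.0), "allele frequencies must sum to 1"
    s2 = np.sum(p ** 2)
    cross = (s2 ** 2 - np.sum(p ** 4)) / 2.0   # sum over i<j of p_i^2 p_j^2
    return 1.0 - s2 - 2.0 * cross

# a biallelic locus with equal frequencies gives the classic value 0.375
print(pic([0.5, 0.5]))        # -> 0.375
print(pic([0.25] * 4))        # more, equally frequent alleles => higher PIC
```

Loci with PIC above roughly 0.5 are conventionally considered highly informative, which puts most of the reported markers in the useful range.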

Relevance: 20.00%

Publisher:

Abstract:

In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits depending on the quantification assays. A complication arises when these continuous repeated measures have a heavy-tailed behavior. For such data structures, we propose a robust structure for a censored linear model based on the multivariate Student's t-distribution. To compensate for the autocorrelation existing among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation maximization type algorithm is developed for computing the maximum likelihood estimates, obtaining as a by-product the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus-AIDS (HIV-AIDS) study and several simulation studies.
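The censored Student-t linear model can be illustrated by direct maximum likelihood on independent data: observed points contribute the t density, censored points the t CDF at the detection limit. The paper instead uses an EM algorithm with truncated-t moments and a damped exponential correlation structure, both omitted here; the data, detection limit, and degrees of freedom below are simulated assumptions.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def censored_t_loglik(params, X, y, lower, nu=4.0):
    """Log-likelihood of a linear model with Student-t errors and
    left-censoring at `lower` (censored values recorded as `lower`)."""
    *beta, log_s = params
    mu = X @ np.asarray(beta)
    s = np.exp(log_s)                       # scale, parameterized on the log scale
    obs = y > lower
    ll = stats.t.logpdf(y[obs], df=nu, loc=mu[obs], scale=s).sum()
    ll += stats.t.logcdf(lower, df=nu, loc=mu[~obs], scale=s).sum()
    return ll

rng = np.random.default_rng(8)
n = 500
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y_star = 1.0 + 2.0 * x + 0.5 * stats.t.rvs(df=4, size=n, random_state=rng)
lower = 0.0
y = np.maximum(y_star, lower)               # detection-limit censoring

res = minimize(lambda p: -censored_t_loglik(p, X, y, lower),
               x0=[0.0, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
print(res.x[:2])   # intercept and slope estimates, close to the true (1, 2)
```

Naively treating the censored values as exact observations would bias the intercept upward; the censored likelihood avoids that.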