967 results for Time series studies
Abstract:
It is well known that cointegration between the levels of two variables (e.g. prices and dividends) is a necessary condition for the empirical validity of a present-value model (PVM) linking them. The work on cointegration, namely on long-run co-movements, has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. This amounts to investigating whether short-run co-movements stemming from common cyclical feature restrictions are also present in such a system. In this paper we test for the presence of such co-movements in long- and short-term interest rates and in prices and dividends for the U.S. economy. We focus on the potential improvement in forecasting accuracy when imposing these two types of restrictions coming from economic theory.
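For reference, the two necessary conditions described in this abstract can be written compactly in standard Campbell-Shiller style notation; the symbols y_t, x_t, \theta and \delta below are generic placeholders, not taken from the paper:

    y_t = \theta (1 - \delta) \sum_{i=0}^{\infty} \delta^i \, E_t[x_{t+i}]

(1) Long run: the spread s_t = y_t - \theta x_t is stationary, i.e. y_t and x_t are cointegrated with cointegrating vector (1, -\theta).

(2) Short run: the model's one-step-ahead forecast error
    u_{t+1} = s_{t+1} + \theta \, \Delta x_{t+1} - \delta^{-1} s_t
    must satisfy E_t[u_{t+1}] = 0, i.e. be orthogonal to everything dated t or earlier; this is the common cyclical feature (reduced-rank) restriction on the short-run dynamics.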
Abstract:
This paper makes two original contributions. First, we show that the present-value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of its vector error-correction representation (Vahid and Engle, 1993); something that has already been investigated in the VECM context by Johansen and Swensen (1999, 2011) but has not been discussed before with this emphasis. We also provide the present-value reduced-rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced-rank restrictions. The reason why appropriate common cyclical feature restrictions might improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters which would otherwise increase forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present-value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the levels of the real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.
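As a rough illustration of the kind of restricted model the second contribution refers to, the sketch below fits a rank-one VECM (the long-run present-value restriction) to a two-variable system and produces forecasts. The DataFrame name `data`, its column names, the lag order and the hold-out length are assumptions for illustration; the additional short-run (common cyclical feature) reduced-rank restrictions discussed in the abstract are not imposed here.

# Sketch: fit a VECM with cointegration rank 1 (the long-run PVM restriction)
# to a two-variable system such as (log price, log dividend) and forecast.
# Assumes a pandas DataFrame `data` with columns "log_price" and "log_dividend".
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

train, test = data.iloc[:-12], data.iloc[-12:]

# coint_rank=1 imposes a single cointegrating (present-value) relation.
vecm = VECM(train, k_ar_diff=2, coint_rank=1, deterministic="co").fit()
forecast = vecm.predict(steps=len(test))

rmse = np.sqrt(((forecast - test.values) ** 2).mean(axis=0))
print(dict(zip(test.columns, rmse)))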
Abstract:
Using a sequence of nested multivariate models that are VAR-based, we discuss different layers of restrictions imposed by present-value models (PVM hereafter) on the VAR in levels for series that are subject to present-value restrictions. Our focus is novel: we are interested in the short-run restrictions entailed by PVMs (Vahid and Engle, 1993, 1997) and their implications for forecasting. Using a well-known database, kept by Robert Shiller, we implement a forecasting competition that imposes different layers of PVM restrictions. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to the unrestricted VAR. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.
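A minimal sketch of the forecasting-competition logic described above: a pseudo-out-of-sample comparison between an unrestricted VAR in levels and a cointegration-restricted VECM. The rolling origin, lag orders and the `data` object are illustrative assumptions, not the paper's design:

# Sketch: rolling pseudo-out-of-sample forecast comparison.
# Assumes a pandas DataFrame `data` holding the series in levels.
import numpy as np
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import VECM

h, window = 1, 100
errors = {"var_levels": [], "vecm_rank1": []}
for origin in range(window, len(data) - h):
    train = data.iloc[:origin]
    actual = data.iloc[origin + h - 1].values

    var_fc = VAR(train).fit(maxlags=3).forecast(train.values[-3:], steps=h)[-1]
    vecm_fc = VECM(train, k_ar_diff=2, coint_rank=1).fit().predict(steps=h)[-1]

    errors["var_levels"].append(actual - var_fc)
    errors["vecm_rank1"].append(actual - vecm_fc)

for name, e in errors.items():
    print(name, np.sqrt((np.array(e) ** 2).mean(axis=0)))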
Abstract:
Researchers often rely on the t-statistic to make inference on parameters in statistical models. It is common practice to obtain critical values by simulation techniques. This paper proposes a novel numerical method to obtain an approximately similar test. This test rejects the null hypothesis when the test statistic is larger than a critical value function (CVF) of the data. We illustrate this procedure when regressors are highly persistent, a case in which commonly used simulation methods encounter difficulties in controlling size uniformly. Our approach works satisfactorily, controls size, and yields a test which outperforms the two other known similar tests.
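A minimal sketch of the general idea behind a critical value function: simulate the null distribution of the t-statistic over a grid of the persistence (nuisance) parameter and interpolate a data-dependent critical value. This illustrates the concept only; it is not the numerical method proposed in the paper:

# Sketch: critical value as a function of the regressor's AR(1) persistence.
import numpy as np

rng = np.random.default_rng(0)
T, reps, alpha = 200, 2000, 0.05

def t_stat_null(rho):
    """One simulated t-statistic for beta = 0 with an AR(1) regressor x_t."""
    e = rng.standard_normal(T)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    y = e                               # null hypothesis: y unrelated to x
    beta = x @ y / (x @ x)
    resid = y - beta * x
    se = np.sqrt(resid @ resid / (T - 1) / (x @ x))
    return beta / se

grid = np.linspace(0.8, 1.0, 11)
cvf = [np.quantile([t_stat_null(r) for _ in range(reps)], 1 - alpha) for r in grid]

# In practice the critical value would be evaluated at an estimate of rho:
rho_hat = 0.97
print(np.interp(rho_hat, grid, cvf))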
Abstract:
Energy policies and technological progress in the development of wind turbines have made wind power the fastest growing renewable power source worldwide. The inherent variability of this resource requires special attention when analyzing the impacts of high penetration on the distribution network. A time-series steady-state analysis is proposed that assesses technical issues such as energy export, losses, and short-circuit levels. A multiobjective programming approach based on the nondominated sorting genetic algorithm (NSGA) is applied in order to find configurations that maximize the integration of distributed wind power generation (DWPG) while satisfying voltage and thermal limits. The approach has been applied to a medium-voltage distribution network considering hourly demand and wind profiles for part of the U.K. The Pareto optimal solutions obtained highlight the drawbacks of using a single demand and generation scenario, and indicate the importance of appropriate substation voltage settings for maximizing the connection of DWPG.
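The sketch below illustrates only the Pareto-dominance screening that underlies an NSGA-style approach: candidate DWPG configurations are checked against operating limits and kept if no other feasible configuration is at least as good on both objectives. The power-flow evaluation and the configuration fields are placeholder assumptions, not the paper's model:

# Sketch: feasibility check and Pareto-front extraction for DWPG configurations.
def feasible(config):
    """Placeholder: would run an hourly time-series power flow and check
    voltage and thermal limits for every hour of the demand/wind profiles."""
    return True

def objectives(config):
    """Placeholder: (connected capacity in MW to maximize, annual losses in MWh to minimize)."""
    return config["capacity_mw"], config["losses_mwh"]

def pareto_front(configs):
    scored = [(c, objectives(c)) for c in configs if feasible(c)]
    front = []
    for c, (cap, loss) in scored:
        dominated = any(cap2 >= cap and loss2 <= loss and (cap2, loss2) != (cap, loss)
                        for _, (cap2, loss2) in scored)
        if not dominated:
            front.append(c)
    return front

# Example with two hypothetical candidate configurations:
candidates = [{"capacity_mw": 10, "losses_mwh": 500}, {"capacity_mw": 12, "losses_mwh": 650}]
print(pareto_front(candidates))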
Abstract:
PURPOSE: To evaluate the efficacy of surgical treatment for esophageal perforation. METHODS: A systematic review of the literature was performed. We conducted a search strategy in the main electronic databases, such as PubMed, Embase and Lilacs, to identify all case series. RESULTS: Thirty-three case series met the inclusion criteria, with a total of 1417 participants. The predominant etiology was iatrogenic (54.2%), followed by spontaneous causes (20.4%), and in 66.1% the localization was thoracic. Surgical and conservative therapy were considered the first choice in 65.4% and 33.4% of cases, respectively. There was a statistically significant difference in mortality rates favoring the surgical group (16.3%) versus conservative treatment (21.2%) (p<0.05). CONCLUSION: Surgical treatment was more effective and safer than conservative treatment concerning mortality rates, although the possibility of bias due to clinical and methodological heterogeneity among the included studies and the level of evidence cannot be ruled out.
Abstract:
The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model, employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, low-dimensional chaos can be found in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear and (unit-root) stationary.
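For context, a minimal minority game simulation of the standard textbook form is sketched below; it produces the kind of aggregate "attendance" series whose evolution is analyzed in studies like this one. The parameter values are illustrative defaults, not those of the paper:

# Sketch: minimal minority game producing an attendance time series.
import numpy as np

rng = np.random.default_rng(1)
N, m, s, T = 301, 5, 2, 2000          # odd N so a strict minority always exists
P = 2 ** m                            # number of possible histories

strategies = rng.choice([-1, 1], size=(N, s, P))   # each strategy: history -> action
scores = np.zeros((N, s))
history = rng.integers(P)                          # encoded last m outcomes
attendance = []

for t in range(T):
    best = scores.argmax(axis=1)                               # each agent's best-scoring strategy
    actions = strategies[np.arange(N), best, history]
    A = actions.sum()                                          # aggregate "attendance"
    attendance.append(A)
    minority = -np.sign(A)                                     # minority side wins
    scores += (strategies[:, :, history] == minority)          # reward strategies that predicted it
    history = ((history << 1) | (minority > 0)) % P            # append outcome bit to the history

attendance = np.array(attendance)   # this series plays the role of price/index changes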
Abstract:
In this work we compared the estimates of the parameters of ARCH models using a complete Bayesian method and an empirical Bayesian method, in which we adopted a non-informative prior distribution and an informative prior distribution, respectively. We also considered a reparameterization of those models in order to map the parameter space into the real space. This procedure permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated using the Telebras series from the Brazilian financial market. The results show that the two methods are able to fit ARCH models with different numbers of parameters. The empirical Bayesian method provided a more parsimonious model for the data and a better fit than the complete Bayesian method.
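A minimal sketch of the reparameterization idea for an ARCH(1) model: the positivity and stationarity constraints are mapped to the real line via omega = exp(phi1) and alpha = logistic(phi2), normal priors are placed on (phi1, phi2), and a random-walk Metropolis sampler draws from the posterior. The priors and sampler settings are illustrative, not the paper's exact specification:

# Sketch: Bayesian ARCH(1) via reparameterization and random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(0)

def log_post(phi, r, prior_mean=np.zeros(2), prior_sd=np.array([10.0, 10.0])):
    omega = np.exp(phi[0])                         # omega > 0 guaranteed
    alpha = 1.0 / (1.0 + np.exp(-phi[1]))          # 0 < alpha < 1 guaranteed
    sigma2 = omega + alpha * r[:-1] ** 2           # conditional variances for t = 2..T
    loglik = -0.5 * np.sum(np.log(sigma2) + r[1:] ** 2 / sigma2)
    logprior = -0.5 * np.sum(((phi - prior_mean) / prior_sd) ** 2)
    return loglik + logprior

def metropolis(r, n_iter=20000, step=0.05):
    phi = np.zeros(2)
    draws = np.empty((n_iter, 2))
    lp = log_post(phi, r)
    for i in range(n_iter):
        prop = phi + step * rng.standard_normal(2)
        lp_prop = log_post(prop, r)
        if np.log(rng.uniform()) < lp_prop - lp:
            phi, lp = prop, lp_prop
        draws[i] = phi
    # back-transform to (omega, alpha) for posterior summaries
    return np.column_stack([np.exp(draws[:, 0]), 1.0 / (1.0 + np.exp(-draws[:, 1]))])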
Abstract:
OBJECTIVE: To determine the current status of the literature regarding the clinical efficacy and complication rates of cryoablation versus radiofrequency ablation in the treatment of small renal tumours. METHODS: A review of the literature was conducted, with no language restriction. Studies were obtained from the following sources: MEDLINE, EMBASE and LILACS. Inclusion criteria were (i) case series design with more than one case reported, (ii) use of cryoablation or radiofrequency ablation, (iii) patients with renal cell carcinoma and (iv) outcome reported as clinical efficacy. When available, we also quantified the complication rates from each included study. Proportional meta-analysis was performed on both outcomes with a random-effects model, and 95% confidence intervals were calculated. RESULTS: Thirty-one case series (20 cryoablation, 11 radiofrequency ablation) met all inclusion criteria. The pooled proportion of clinical efficacy was 89% for cryoablation therapy, from a total of 457 cases; there was statistically significant heterogeneity between these studies, reflecting inconsistency in clinical and methodological aspects. The pooled proportion of clinical efficacy was 90% for radiofrequency ablation therapy, from a total of 426 cases, with no statistically significant heterogeneity between these studies. There was no statistically significant difference in complication rates between cryoablation and radiofrequency ablation. CONCLUSIONS: This review shows that both ablation therapies have similar efficacy and complication rates. Clinical trials with long-term data are urgently needed to establish which intervention is most suitable for the treatment of small renal masses.
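For readers unfamiliar with proportional meta-analysis, the sketch below pools logit-transformed proportions with a DerSimonian-Laird random-effects model and back-transforms the result with a 95% confidence interval. The study counts in the example are made up; this is a generic illustration, not the review's data or exact method:

# Sketch: random-effects pooling of proportions (DerSimonian-Laird on the logit scale).
import numpy as np
from scipy import stats

def pooled_proportion(events, totals):
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    # logit transform with a 0.5 continuity correction for boundary cells
    e, n = events + 0.5, totals + 1.0
    y = np.log(e / (n - e))
    v = 1.0 / e + 1.0 / (n - e)

    w = 1.0 / v                                   # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)
    df = len(y) - 1
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - df) / c)                 # between-study variance

    w_star = 1.0 / (v + tau2)                     # random-effects weights
    mu = np.sum(w_star * y) / w_star.sum()
    se = np.sqrt(1.0 / w_star.sum())
    inv_logit = lambda x: 1.0 / (1.0 + np.exp(-x))
    ci = (inv_logit(mu - 1.96 * se), inv_logit(mu + 1.96 * se))
    return inv_logit(mu), ci, stats.chi2.sf(q, df)   # pooled proportion, 95% CI, heterogeneity p-value

# Example with made-up study counts (successes, sample sizes):
print(pooled_proportion([18, 40, 25], [20, 45, 28]))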