629 results for predictability


Relevance:

20.00%

Publisher:

Abstract:

This letter discusses blind separability based on temporal predictability (Stone, 2001; Xie, He, & Fu, 2005). Our results show that the sources are separable using the temporal predictability method if and only if they have different temporal structures (i.e., autocorrelations). Consequently, the applicability and limitations of the temporal predictability method are clarified. In addition, instead of using generalized eigendecomposition, we suggest using joint approximate diagonalization algorithms to improve the robustness of the method. A new criterion is presented to evaluate the separation results. Numerical simulations are performed to demonstrate the validity of the theoretical results.
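The core of the temporal predictability method can be illustrated with a small simulation: two sources with different autocorrelations are mixed, and a generalized eigendecomposition of long-term versus short-term prediction-error covariances recovers them. This is only a sketch of the idea (the AR coefficients, mixing matrix, and smoothing half-lives are assumptions for the demo, and the letter's joint diagonalization refinement is not implemented):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
T = 5000

def ar1(phi, n):
    """AR(1) source; phi sets the temporal structure (autocorrelation)."""
    s = np.zeros(n)
    e = rng.standard_normal(n)
    for t in range(1, n):
        s[t] = phi * s[t - 1] + e[t]
    return s

S = np.vstack([ar1(0.95, T), ar1(0.1, T)])   # sources with distinct autocorrelations
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # assumed mixing matrix (for the demo)
X = A @ S                                    # observed mixtures

def ema_predict(x, half_life):
    """One-step exponential moving-average prediction of each row of x."""
    lam = 2.0 ** (-1.0 / half_life)
    p = np.zeros_like(x)
    for t in range(1, x.shape[1]):
        p[:, t] = lam * p[:, t - 1] + (1 - lam) * x[:, t - 1]
    return p

# Covariances of long-term and short-term prediction errors
e_long = X - ema_predict(X, 100.0)
e_short = X - ema_predict(X, 1.0)
C_long = e_long @ e_long.T / T
C_short = e_short @ e_short.T / T

# Generalized eigendecomposition: extrema of w' C_long w / w' C_short w
vals, W = eigh(C_long, C_short)
Y = W.T @ X                                  # recovered sources (up to scale/order)

# Absolute correlations between recovered and true sources
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

With distinct autocorrelations each recovered component lines up with one true source; if the sources shared the same temporal structure, the generalized eigenvalues would coincide and separation would fail, which is exactly the separability condition the letter establishes.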

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a simple panel data test for stock return predictability that is flexible enough to accommodate three salient features of the data: predictor persistence, predictor endogeneity, and cross-sectional dependence. Using a large panel of Chinese stock market data comprising more than one million observations, we show that most financial and macroeconomic predictors are in fact able to predict returns. We also show how the extent of the predictability varies across industries and firm sizes.

Relevance:

20.00%

Publisher:

Abstract:

A common explanation for the inability of the monetary model to beat the random walk in forecasting future exchange rates is that conventional time series tests may have low power, and that panel data should generate more powerful tests. This paper provides an extensive evaluation of this power argument for the use of panel data in the forecasting context. In particular, simulations show that although pooling the individual prediction tests can lead to substantial power gains, pooling only the parameters of the forecasting equation, as has been suggested in the previous literature, does not seem to generate more powerful tests. The simulation results are illustrated through an empirical application. Copyright © 2007 John Wiley & Sons, Ltd.

Relevance:

20.00%

Publisher:

Abstract:

© The Author, 2014. Most studies of the predictability of returns are based on time series data, and whenever panel data are used, the testing is almost always conducted in an unrestricted unit-by-unit fashion, which makes for a very heavy parametrization of the model. On the other hand, the few panel tests that exist are too restrictive in the sense that they are based on homogeneity assumptions that might not be true. As a response to this, the current study proposes new predictability tests in the context of a random coefficient panel data model, in which the null of no predictability corresponds to the joint restriction that the predictive slope has zero mean and zero variance. The tests are applied to a large panel of stocks listed on the New York Stock Exchange. The results suggest that while the predictive slopes tend to average to zero, in the case of book-to-market and cash flow-to-price the variance of the slopes is positive, which we take as evidence of predictability.
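The intuition behind testing both the mean and the variance of the predictive slopes can be seen in a small simulation: slopes that average to zero but disperse more than sampling noise alone would explain still signal predictability. The following is a sketch under assumed data-generating values, not the paper's actual test statistic:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 200, 120                       # assumed panel dimensions (stocks x months)

# Random-coefficient DGP: slopes average to zero but have positive variance
beta = rng.normal(0.0, 0.5, N)
x = rng.standard_normal((N, T))                       # predictor
r = beta[:, None] * x + rng.standard_normal((N, T))   # returns

# Unit-by-unit OLS slopes
b_hat = (x * r).sum(axis=1) / (x ** 2).sum(axis=1)

mean_slope = b_hat.mean()             # close to zero: no "average" predictability
var_slope = b_hat.var(ddof=1)
# Average sampling variance of b_hat if all true slopes were zero (unit noise)
noise_var = (1.0 / (x ** 2).sum(axis=1)).mean()
excess_dispersion = var_slope - noise_var   # > 0: slopes vary beyond noise
```

A unit-by-unit mean test alone would miss this panel's predictability; the excess dispersion of the slopes is what reveals it, which is the motivation for testing the variance restriction jointly with the mean.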

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses the prospects for economic regulation in Brazil. To that end, it first presents the historical evolution of regulation in the country, discussing the main issues related to the federal regulatory agencies. Second, the regulatory frameworks of five different sectors (telecommunications, electricity, basic sanitation, oil, and natural gas) are analyzed. Third, the issue of financing infrastructure investments is addressed, emphasizing the role of public-private partnerships (PPPs). A final section contains a possible agenda for regulation in Brazil.

Relevance:

20.00%

Publisher:

Abstract:

In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies in the same vein, which evaluate the original series of each stock, we evaluate synthetic series created on the basis of linear models of stocks. Following Burgess (1999), we use stepwise regression to build a model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we apply White's (2000) Reality Check in order to verify the existence of positive out-of-sample returns. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even after accounting for transaction costs.
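The variance ratio statistic at the heart of the profile compares the variance of q-period returns with q times the one-period variance: it stays near 1 for a random walk and falls below 1 for a mean-reverting series. A minimal sketch on simulated series (illustrative processes, not the study's stock models):

```python
import numpy as np

def variance_ratio(prices, q):
    """Var of overlapping q-period log returns over q times the 1-period variance."""
    r1 = np.diff(np.log(prices))
    rq = np.log(prices[q:]) - np.log(prices[:-q])
    return rq.var(ddof=1) / (q * r1.var(ddof=1))

rng = np.random.default_rng(42)
n = 20000
rw = np.exp(np.cumsum(0.01 * rng.standard_normal(n)))   # random-walk log price

lvl = np.zeros(n)                                       # mean-reverting log price
e = 0.01 * rng.standard_normal(n)
for t in range(1, n):
    lvl[t] = 0.7 * lvl[t - 1] + e[t]
mr = np.exp(lvl)

qs = (2, 4, 8, 16)
profile_rw = [variance_ratio(rw, q) for q in qs]   # stays near 1 at every horizon
profile_mr = [variance_ratio(mr, q) for q in qs]   # decays below 1 as q grows
```

A profile that departs systematically from 1 across horizons is the kind of signature used to flag models with potential predictability before the out-of-sample tests.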

Relevance:

20.00%

Publisher:

Abstract:

We evaluate the forecasting performance of a number of systems models of US short- and long-term interest rates. Non-linearities, including asymmetries in the adjustment to equilibrium, are shown to result in more accurate short-horizon forecasts. We find that both long and short rates respond to disequilibria in the spread in certain circumstances, which would not be evident from linear representations or from single-equation analyses of the short-term interest rate.

Relevance:

20.00%

Publisher:

Abstract:

This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies concluding that macroeconomic models can explain short-run exchange rates, and also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this paper presents its own evidence by implementing the specification with the best predictive record in Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To that end, we use a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted floating exchange rates and inflation targeting, but we include currencies from both developed and developing countries. Our results corroborate Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are needed to evaluate the model properly. After finding that we cannot claim the implemented model produces more accurate forecasts than a random walk, we assess whether the model is at least capable of generating "rational", or "consistent", forecasts. For this we use the framework defined and implemented by Cheung and Chinn (1998), and conclude that the Taylor rule model's forecasts are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
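The Granger causality step above, checking whether lagged model-predicted returns explain realized returns, amounts to an F-test for excluding those lags from an autoregression. A minimal one-lag sketch on synthetic data (illustrative variables, not the study's currency series):

```python
import numpy as np

def granger_f(y, x, p=1):
    """F-statistic for H0: lags of x add nothing to an AR(p) model of y."""
    T = len(y)
    Y = y[p:]
    # Restricted model: constant + lags of y; unrestricted adds lags of x
    Xr = np.column_stack([np.ones(T - p)] + [y[p - i - 1:T - i - 1] for i in range(p)])
    Xu = np.column_stack([Xr] + [x[p - i - 1:T - i - 1] for i in range(p)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_u = (T - p) - Xu.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df_u)

rng = np.random.default_rng(1)
T = 2000
x = rng.standard_normal(T)            # a predictor that truly leads y
z = rng.standard_normal(T)            # an unrelated series
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.5 * x[t - 1] + 0.3 * rng.standard_normal()

f_x = granger_f(y, x)                 # large: x Granger-causes y
f_z = granger_f(y, z)                 # small: z does not
```

Comparing the F-statistic against its F(p, df) critical value gives the Granger causality verdict; in the study's application the analogue of f_z is what was found for the structural model's predicted returns.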

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a novel method to calculate tail risks that incorporates risk-neutral information without depending on options data. Proceeding via a nonparametric approach, we derive a stochastic discount factor that correctly prices a chosen panel of stock returns. Under the assumption that state probabilities are homogeneous, we back out the risk-neutral distribution and calculate five primitive tail risk measures, all extracted from this risk-neutral probability. The final measure is then set as the first principal component of the preliminary measures. Using six Fama-French size and book-to-market portfolios to calculate our tail risk, we find that it has significant predictive power when forecasting market returns one month ahead, aggregate U.S. consumption and GDP one quarter ahead, and macroeconomic activity indexes. Conditional Fama-MacBeth two-pass cross-sectional regressions reveal that our factor carries a positive risk premium when controlling for traditional factors.
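The aggregation step, standardizing the preliminary measures and keeping the first principal component, can be sketched as follows (the panel here is synthetic, generated from one assumed common component, not the paper's measures):

```python
import numpy as np

rng = np.random.default_rng(7)
months, k = 120, 5                    # assumed sample length and number of measures

# Synthetic preliminary tail risk measures sharing one common component
common = rng.standard_normal(months)
loadings = rng.uniform(0.5, 1.5, k)
M = common[:, None] * loadings + 0.3 * rng.standard_normal((months, k))

# Standardize columns, then take the first principal component via SVD
Z = (M - M.mean(axis=0)) / M.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
tail_risk = Z @ Vt[0]                 # final measure: scores on the first PC
explained = s[0] ** 2 / (s ** 2).sum()

# Sanity check: the PC should track the common component (its sign is arbitrary)
corr_common = abs(float(np.corrcoef(tail_risk, common)[0, 1]))
```

When the five primitives share a strong common component, the first PC captures most of their joint variation while averaging out measure-specific noise, which is the usual rationale for this aggregation.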

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this study was to compare, by means of McNamara's and Legan and Burstone's cephalometric analyses, both manual and computerized (Dentofacial Planner Plus and Dolphin Image software) prediction tracings to post-surgical results. METHODS: Pre- and post-surgical teleradiographs (6 months) of 25 long-face patients subjected to combined orthognathic surgery were selected. Manual and computerized prediction tracings were performed for each patient and compared cephalometrically to the post-surgical outcomes. This protocol was repeated in order to evaluate the method error, and statistical evaluation was conducted by means of analysis of variance and Tukey's test. RESULTS: The manual method showed the highest frequency of cephalometric variables that did not differ statistically from the actual post-surgical results. It was followed by DFPlus and Dolphin software, which yielded similar cephalometric values for most variables. CONCLUSION: The manual method seemed more reliable, although the predictability of all the evaluated methods (computerized and manual) proved to be reasonably satisfactory and similar.

Relevance:

20.00%

Publisher:

Abstract:

Introduction: In clinical situations where severe bone resorption has occurred following tooth loss, implant treatment options may comprise either prior bone reconstruction or the use of short implants alone. Objective: This non-systematic review summarizes and discusses aspects of the use of short implants, such as biomechanical considerations, success rate, longevity, and surgical-prosthetic planning. Literature review: Current and relevant references were selected in order to compare short dental implants to conventional ones. Several studies have highlighted the importance of wide-diameter implants. Short dental implants have shown high predictability and success rates when certain biomechanical aspects are taken into consideration. Conclusion: Placement of short dental implants is a viable treatment method for patients with decreased bone height.

Relevance:

20.00%

Publisher:

Abstract:

The classification of texts has become a major endeavor with so much electronic material available, for it is an essential task in several applications, including search engines and information retrieval. There are different ways to define similarity for grouping similar texts into clusters, as the concept of similarity may depend on the purpose of the task. For instance, in topic extraction similar texts are those within the same semantic field, whereas in author recognition stylistic features should be considered. In this study, we introduce ways to classify texts employing concepts of complex networks, which may be able to capture syntactic, semantic and even pragmatic features. The interplay between various metrics of the complex networks is analyzed with three applications, namely identification of machine translation (MT) systems, evaluation of the quality of machine-translated texts, and authorship recognition. We show that topological features of the networks representing texts can enhance the ability to identify MT systems in particular cases. For evaluating the quality of MT texts, on the other hand, high correlation was obtained with methods capable of capturing the semantics. This was expected because the gold standards used are themselves based on word co-occurrence. Notwithstanding, the Katz similarity, which combines semantics and structure in the comparison of texts, achieved the highest correlation with the NIST measure, indicating that in some cases the combination of both approaches can improve the ability to quantify quality in MT. In authorship recognition, again the topological features were relevant in some contexts, though for the books and authors analyzed good results were obtained with semantic features as well.
Because hybrid approaches encompassing semantic and topological features have not been extensively used, we believe that the methodology proposed here may be useful to enhance text classification considerably, as it combines well-established strategies. (c) 2012 Elsevier B.V. All rights reserved.
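A minimal example of the underlying representation, a text as a word co-occurrence network from which topological metrics such as average degree can be computed (a toy sentence and adjacent-word edges only; the study's networks and metrics are considerably richer):

```python
from collections import defaultdict

def cooccurrence_network(text, window=1):
    """Undirected word co-occurrence network: edges link words within `window`."""
    words = text.lower().split()
    adj = defaultdict(set)
    for i in range(len(words) - window):
        for j in range(1, window + 1):
            a, b = words[i], words[i + j]
            if a != b:
                adj[a].add(b)
                adj[b].add(a)
    return adj

def avg_degree(adj):
    """Average number of distinct neighbours per node."""
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

text = "the cat sat on the mat and the cat saw the dog"
net = cooccurrence_network(text)
```

Metrics computed on such networks (degree distributions, clustering, path-based similarities like Katz) are purely topological, which is why they can complement semantic measures in the hybrid approaches the paragraph describes.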

Relevance:

20.00%

Publisher:

Abstract:

The subject of this work is the study of the immigration phenomenon, with emphasis on aspects related to the integration of an immigrant population into a host population. The aim of this work is to show the forecasting ability of a recent finding in which the behavior of integration quantifiers was analyzed and investigated with a mathematical model with origins in statistical physics (a generalization of the monomer-dimer model). After providing a detailed literature review of the model, we show that not only is such a model able to identify the social mechanism that drives a particular integration process, but it also provides correct forecasts. The research reported here shows that the proposed model of integration and its forecasting framework are simple and effective tools for reducing uncertainty about how integration phenomena emerge and how they are likely to develop in response to increased migration levels in the future.