561 results for predictability
Abstract:
The first multi-model study to estimate the predictability of a boreal Sudden Stratospheric Warming (SSW) is performed using five NWP systems. During the 2012-2013 boreal winter, anomalous upward-propagating planetary wave activity was observed towards the end of December, which was followed by a rapid deceleration of the westerly circulation around 2 January 2013; on 7 January 2013 the zonal mean zonal wind at 60°N and 10 hPa reversed to easterly. This stratospheric dynamical activity was followed by an equatorward shift of the tropospheric jet stream and by a high pressure anomaly over the North Atlantic, which resulted in severe cold conditions in the UK and Northern Europe. In most of the five models, the SSW event was predicted 10 days in advance. However, only some ensemble members in most of the models predicted a weakening of the westerly wind when the models were initialized 15 days in advance of the SSW. Further dynamical analysis shows that this event was characterized by anomalous planetary wave-1 amplification followed by anomalous wave-2 amplification in the stratosphere, which resulted in a vortex split between 6 and 8 January 2013. The models have some success in reproducing the wave-1 activity when initialized 15 days in advance, but they generally failed to reproduce the wave-2 activity during the final days of the event. Detailed analysis shows that the models have reasonably good skill in forecasting the tropospheric blocking features that stimulate wave-2 amplification in the troposphere, but limited skill in reproducing the wave-2 amplification in the stratosphere.
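The wave-1 and wave-2 diagnostics mentioned in this abstract are commonly obtained from a Fourier decomposition of a field along a latitude circle. The sketch below is a minimal illustration of that standard diagnostic, not the paper's own code; the choice of field (e.g. 10 hPa geopotential height at 60°N) and the regular longitude grid are assumptions made for illustration.

```python
import numpy as np

def zonal_wave_amplitudes(field_along_lon, wavenumbers=(1, 2)):
    """Amplitudes of planetary zonal wavenumbers from a field sampled on a
    regular longitude grid around a latitude circle (e.g. 10 hPa geopotential
    height at 60N).  For a pure wave A*cos(k*lon + phase) this returns A at k."""
    z = np.asarray(field_along_lon, dtype=float)
    coeffs = np.fft.rfft(z - z.mean())
    return {k: 2.0 * np.abs(coeffs[k]) / z.size for k in wavenumbers}
```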
Abstract:
The use of kilometre-scale ensembles in operational forecasting provides new challenges for forecast interpretation and evaluation to account for uncertainty on the convective scale. A new neighbourhood-based method is presented for evaluating and characterising the local predictability variations from convective-scale ensembles. Spatial scales over which ensemble forecasts agree (agreement scales, S^A) are calculated at each grid point ij, providing a map of the spatial agreement between forecasts. By comparing the average agreement scale obtained from ensemble member pairs (S^A(mm)_ij) with that between members and radar observations (S^A(mo)_ij), this approach allows the location-dependent spatial spread-skill relationship of the ensemble to be assessed. The properties of the agreement scales are demonstrated using an idealised experiment. To demonstrate the method in an operational context, S^A(mm)_ij and S^A(mo)_ij are calculated for six convective cases run with the Met Office UK Ensemble Prediction System. The S^A(mm)_ij maps highlight predictability differences between cases, which can be linked to physical processes, and are found to summarise the spatial predictability in a compact and physically meaningful manner that is useful for forecasting and for model interpretation. Comparison of S^A(mm)_ij and S^A(mo)_ij demonstrates the case-by-case and temporal variability of the spatial spread-skill, which can again be linked to physical processes.
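A rough illustration of how such agreement scales can be computed is sketched below. The agreement criterion used here (relative difference of neighbourhood means below a tolerance), the maximum scale and the tolerance value are assumptions made for illustration; the paper's exact definition of S^A may differ.

```python
import numpy as np
from itertools import combinations

def neighbourhood_mean(field, i, j, s):
    """Mean of `field` over the (2s+1)x(2s+1) square centred on (i, j), clipped at the edges."""
    i0, i1 = max(i - s, 0), min(i + s + 1, field.shape[0])
    j0, j1 = max(j - s, 0), min(j + s + 1, field.shape[1])
    return field[i0:i1, j0:j1].mean()

def agreement_scale(f1, f2, s_max=20, tol=0.5):
    """Smallest neighbourhood half-width at which two fields agree at each grid point ij.
    Agreement is declared when the relative difference of the neighbourhood means falls
    below `tol`; points that never agree are assigned `s_max`."""
    sa = np.full(f1.shape, s_max, dtype=int)
    for i in range(f1.shape[0]):
        for j in range(f1.shape[1]):
            for s in range(s_max + 1):
                m1 = neighbourhood_mean(f1, i, j, s)
                m2 = neighbourhood_mean(f2, i, j, s)
                denom = 0.5 * (m1 + m2)
                if denom == 0 or abs(m1 - m2) / denom <= tol:
                    sa[i, j] = s
                    break
    return sa

# S^A(mm): average agreement scale over all ensemble member pairs (spread);
# S^A(mo): average agreement scale between each member and the observations (skill).
def mean_pairwise_agreement_scale(members, **kw):
    return np.mean([agreement_scale(a, b, **kw) for a, b in combinations(members, 2)], axis=0)

def mean_member_obs_agreement_scale(members, obs, **kw):
    return np.mean([agreement_scale(m, obs, **kw) for m in members], axis=0)
```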
Abstract:
Decadal predictions on timescales from one year to one decade are gaining importance, since this time frame falls within the planning horizon of politics, the economy and society. The present study examines the decadal predictability of regional wind speed and wind energy potentials in three generations of the MiKlip (‘Mittelfristige Klimaprognosen’) decadal prediction system. The system is based on the global Max-Planck-Institute Earth System Model (MPI-ESM), and the three generations differ primarily in the ocean initialisation. Ensembles of uninitialised historical and yearly initialised hindcast experiments are used to assess the forecast skill for 10 m wind speeds and wind energy output (Eout) over Central Europe with lead times from one year to one decade. With this aim, a statistical-dynamical downscaling (SDD) approach is used for the regionalisation. Its added value is evaluated by comparing skill scores for MPI-ESM large-scale wind speeds and SDD-simulated regional wind speeds. All three MPI-ESM ensemble generations show some forecast skill for annual mean wind speed and Eout over Central Europe on yearly and multi-yearly time scales. This forecast skill is mostly limited to the first years after initialisation. Differences between the three ensemble generations are generally small. The regionalisation preserves and sometimes increases the forecast skill of the global runs, but results depend on lead time and ensemble generation. Moreover, regionalisation often improves the ensemble spread. Seasonal Eout skill is generally lower than for annual means; skill scores are lowest in summer and persist longest in autumn. A large-scale westerly weather type with strong pressure gradients over Central Europe is identified as a potential source of the skill for wind energy potentials, showing a similar forecast skill and a high correlation with Eout anomalies. These results are promising for the establishment of a decadal prediction system for wind energy applications over Central Europe.
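The abstract does not state which skill score is used; one common choice for initialised decadal hindcasts is a mean-squared-error skill score against the uninitialised historical ensemble, sketched below purely as an illustration of the comparison being described.

```python
import numpy as np

def msess(hindcast, reference, observed):
    """Mean-squared-error skill score of initialised hindcasts relative to a
    reference forecast (e.g. the uninitialised historical ensemble mean).
    1 means a perfect forecast; values <= 0 mean no added skill over the reference."""
    hindcast, reference, observed = map(np.asarray, (hindcast, reference, observed))
    mse_hindcast = np.mean((hindcast - observed) ** 2)
    mse_reference = np.mean((reference - observed) ** 2)
    return 1.0 - mse_hindcast / mse_reference
```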
Abstract:
Skillful sea ice forecasts from days to years ahead are becoming increasingly important for the operation and planning of human activities in the Arctic. Here we analyze the potential predictability of the Arctic sea ice edge in six climate models. We introduce the integrated ice-edge error (IIEE), a user-relevant verification metric defined as the area where the forecast and the “truth” disagree on the ice concentration being above or below 15%. The IIEE lends itself to decomposition into an absolute extent error, corresponding to the common sea ice extent error, and a misplacement error. We find that the often-neglected misplacement error makes up more than half of the climatological IIEE. In idealized forecast ensembles initialized on 1 July, the IIEE grows faster than the absolute extent error. This means that the Arctic sea ice edge is less predictable than sea ice extent, particularly in September, with implications for the potential skill of end-user relevant forecasts.
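A small sketch of the IIEE and the decomposition described above, for two sea-ice concentration fields on a grid with known cell areas, is given below; the function and variable names are illustrative.

```python
import numpy as np

def iiee_decomposition(sic_forecast, sic_truth, cell_area, threshold=0.15):
    """Integrated ice-edge error (IIEE): total area where forecast and 'truth'
    disagree on the sea ice concentration being above or below 15%.
    Decomposed into the absolute extent error (the usual sea ice extent error)
    and the remaining misplacement error."""
    ice_f = np.asarray(sic_forecast) >= threshold
    ice_t = np.asarray(sic_truth) >= threshold
    area = np.asarray(cell_area)
    overforecast = np.sum(area * (ice_f & ~ice_t))     # ice forecast where none observed
    underforecast = np.sum(area * (~ice_f & ice_t))    # ice observed where none forecast
    iiee = overforecast + underforecast
    absolute_extent_error = abs(overforecast - underforecast)
    misplacement_error = iiee - absolute_extent_error  # = 2 * min(over, under)
    return iiee, absolute_extent_error, misplacement_error
```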
Abstract:
The influence of visual stimulus intensity on manual reaction time (RT) was investigated under two different attentional settings: high (Experiment 1) and low (Experiment 2) stimulus location predictability. These two experiments were also run under both binocular and monocular viewing conditions. We observed that RT decreased as stimulus intensity increased. It also decreased when the viewing condition was changed from monocular to binocular and when the location predictability shifted from low to high. A significant interaction was found between stimulus intensity and viewing condition, but no interaction was observed between either of these factors and location predictability. These findings support the idea that the stimulus intensity effect arises from purely sensory, pre-attentive mechanisms rather than from more efficient attentional capture. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Previous studies have documented a subjective temporal attraction between actions and their effects. This finding, named intentional binding, is thought to be the result of a cognitive function that links actions to their consequences. Although several studies have tried to outline the necessary and sufficient conditions for intentional binding, a quantitative comparison between the roles of temporal contiguity, predictability and voluntary action, and the evaluation of their interactions, is difficult due to the high variability of temporal binding measurements. In the present study, we used a novel methodology to investigate the properties of intentional binding. Subjects judged whether an auditory stimulus, which could either be triggered by a voluntary finger lift or be presented after a visual temporal marker unrelated to any action, was presented synchronously with a reference stimulus. In three experiments, the predictability, the interval between action and consequence, and the presence of the action itself were manipulated. The results indicate that (1) action is a necessary condition for temporal binding; (2) a fixed interval between the two events is not sufficient to cause the effect; and (3) only in the presence of voluntary action do temporal predictability and contiguity play a significant role in modulating the effect. These findings are discussed in the context of the relationship between intentional binding and temporal expectation. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Most studies involving statistical time series analysis rely on assumptions of linearity, whose simplicity facilitates parameter interpretation and estimation. However, the linearity assumption may be too restrictive for many practical applications. The implementation of nonlinear models in time series analysis involves the estimation of a large set of parameters, frequently leading to overfitting problems. In this article, a predictability coefficient is estimated using a combination of nonlinear autoregressive models, and the use of support vector regression within this framework is explored. We illustrate the usefulness and interpretability of the results using electroencephalographic records from an epileptic patient.
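The abstract does not define the coefficient itself; the sketch below uses the out-of-sample R² of a support-vector autoregression as a generic stand-in, with scikit-learn assumed to be available and all hyperparameters chosen arbitrarily for illustration.

```python
import numpy as np
from sklearn.svm import SVR

def lagged_design(x, p):
    """Design matrix for an order-p autoregression: row t holds x_{t-1}, ..., x_{t-p}."""
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    return X, x[p:]

def predictability_coefficient(x, p=5, train_frac=0.7):
    """Out-of-sample R^2 of a support-vector autoregression, used here as a generic
    stand-in for a predictability coefficient: 1 for a perfectly predictable series,
    values near or below 0 for an unpredictable one."""
    X, y = lagged_design(np.asarray(x, dtype=float), p)
    n_train = int(train_frac * len(y))
    model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:n_train], y[:n_train])
    resid = y[n_train:] - model.predict(X[n_train:])
    return 1.0 - resid.var() / y[n_train:].var()
```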
Abstract:
This paper discusses the prospects for economic regulation in Brazil. To that end, it first presents the historical evolution of regulation in the country, discussing the main issues related to the federal regulatory agencies. Second, the regulatory frameworks of five different sectors (telecommunications, electricity, basic sanitation, oil and natural gas) are analyzed. Third, the issue of financing infrastructure investment is addressed, emphasizing the role of public-private partnerships (PPPs). A final section contains a possible agenda for regulation in Brazil.
Abstract:
In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies along the same lines, which evaluate the original series of each stock, we evaluate synthetic series created from linear models of stocks. Following Burgess (1999), we use the “stepwise regression” model to form the model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White’s (2000) Reality Check in order to verify the existence of positive returns in the out-of-sample period. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even when transaction costs are included.
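A minimal sketch of the variance ratio profile and of a Monte Carlo band under the random-walk null, in the spirit of the selection step described above, is given below; the profile length, percentiles and the i.i.d. Gaussian null are illustrative assumptions.

```python
import numpy as np

def variance_ratio_profile(returns, max_q=20):
    """Variance ratio VR(q) = Var(q-period return) / (q * Var(1-period return)),
    for q = 2..max_q.  VR stays near 1 for a random walk; systematic departures
    across the profile are taken as a sign of potential predictability."""
    r = np.asarray(returns, dtype=float)
    var1 = r.var(ddof=1)
    profile = []
    for q in range(2, max_q + 1):
        rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period returns
        profile.append(rq.var(ddof=1) / (q * var1))
    return np.array(profile)

def monte_carlo_band(n_obs, max_q=20, n_sims=1000, seed=0):
    """Pointwise 95% band of the profile under an i.i.d. Gaussian random-walk null,
    against which the profile of a candidate synthetic series can be compared."""
    rng = np.random.default_rng(seed)
    sims = np.array([variance_ratio_profile(rng.standard_normal(n_obs), max_q)
                     for _ in range(n_sims)])
    return np.percentile(sims, [2.5, 97.5], axis=0)
```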
Abstract:
We evaluate the forecasting performance of a number of systems models of US short- and long-term interest rates. Non-linearities, including asymmetries in the adjustment to equilibrium, are shown to result in more accurate short-horizon forecasts. We find that both long and short rates respond to disequilibria in the spread in certain circumstances, which would not be evident from linear representations or from single-equation analyses of the short-term interest rate.
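As one concrete example of the kind of asymmetric adjustment referred to above, the sketch below estimates a two-regime (threshold) error-correction equation for each rate, with the spread as the disequilibrium term; the specification and the threshold are illustrative assumptions, not the paper's model.

```python
import numpy as np

def fit_threshold_ecm(short_rate, long_rate, threshold=0.0):
    """Asymmetric error-correction sketch: the change in each rate is regressed on the
    lagged spread, with separate adjustment speeds when the spread is above or below
    `threshold`.  Returns (constant, adjustment above, adjustment below) for each rate."""
    spread = np.asarray(long_rate, dtype=float) - np.asarray(short_rate, dtype=float)
    z = spread[:-1]                                  # lagged disequilibrium
    above = (z > threshold).astype(float)
    X = np.column_stack([np.ones_like(z), z * above, z * (1.0 - above)])
    beta_short, *_ = np.linalg.lstsq(X, np.diff(short_rate), rcond=None)
    beta_long, *_ = np.linalg.lstsq(X, np.diff(long_rate), rcond=None)
    return beta_short, beta_long
```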
Abstract:
This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies that conclude that macroeconomic models can explain the short-run exchange rate, and we also present studies that are sceptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the “symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant”. To this end, we use a sample of 14 currencies against the US dollar, which allowed the generation of monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we choose currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008), finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that it is not possible to claim that the implemented model yields more accurate forecasts than a random walk, we assess whether the model is at least capable of generating “rational”, or “consistent”, forecasts. To do so, we use the theoretical and empirical framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are “inconsistent”. Finally, we perform Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate the realized returns.
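A schematic version of a one-month-ahead Taylor-rule-fundamentals forecast is sketched below; the regressor set (inflation and output-gap differentials plus lagged interest rates to mimic smoothing) only approximates the Molodtsova-Papell specification, and all names are illustrative assumptions.

```python
import numpy as np

def taylor_rule_forecast(ds, infl_home, infl_foreign, gap_home, gap_foreign,
                         i_home, i_foreign):
    """One-month-ahead out-of-sample forecast of the log exchange-rate change from a
    Taylor-rule-fundamentals regression.  ds[t] is the change realised at month t;
    all regressors are dated t-1 (lagged interest rates mimic the smoothing term)."""
    X = np.column_stack([np.ones(len(ds) - 1),
                         np.asarray(infl_home)[:-1], np.asarray(infl_foreign)[:-1],
                         np.asarray(gap_home)[:-1], np.asarray(gap_foreign)[:-1],
                         np.asarray(i_home)[:-1], np.asarray(i_foreign)[:-1]])
    y = np.asarray(ds)[1:]
    beta, *_ = np.linalg.lstsq(X[:-1], y[:-1], rcond=None)  # estimate on all but the last month
    return X[-1] @ beta                                      # forecast for the final month
```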
Abstract:
This paper proposes a novel way to calculate tail risk, incorporating risk-neutral information without depending on options data. Proceeding via a nonparametric approach, we derive a stochastic discount factor that correctly prices a chosen panel of stock returns. Under the assumption that state probabilities are homogeneous, we back out the risk-neutral distribution and calculate five primitive tail risk measures, all extracted from this risk-neutral probability. The final measure is then set as the first principal component of the preliminary measures. Using six Fama-French size and book-to-market portfolios to calculate our tail risk measure, we find that it has significant predictive power for market returns one month ahead, for aggregate U.S. consumption and GDP one quarter ahead, and for macroeconomic activity indexes. Conditional Fama-MacBeth two-pass cross-sectional regressions reveal that our factor carries a positive risk premium when controlling for traditional factors.
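Taking the backed-out risk-neutral distribution as given, the final aggregation step can be sketched as below; the particular primitive measures shown (left-tail probabilities at a few cutoffs and a risk-neutral expected shortfall) are illustrative assumptions, since the abstract does not list the five primitives.

```python
import numpy as np

def tail_measures(q, returns, cuts=(-0.20, -0.15, -0.10, -0.05), es_level=0.05):
    """Primitive tail-risk measures extracted from a risk-neutral distribution over
    historical return states: left-tail probabilities at a few cutoffs plus a
    risk-neutral expected shortfall.  `q` holds the risk-neutral state probabilities."""
    r = np.asarray(returns, dtype=float)
    q = np.asarray(q, dtype=float)
    probs = [q[r <= c].sum() for c in cuts]
    order = np.argsort(r)
    in_tail = order[np.cumsum(q[order]) <= es_level]
    es = (q[in_tail] @ r[in_tail]) / q[in_tail].sum() if in_tail.size else r.min()
    return np.array(probs + [es])

def tail_risk_index(measures_over_time):
    """Final measure: first principal component of the (months x measures) panel."""
    X = np.asarray(measures_over_time, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]
```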
Abstract:
OBJECTIVE: The aim of this study was to compare manual and digitized (Dentofacial Planner Plus and Dolphin Image software) prediction tracings with post-surgical results, by means of the McNamara and the Legan and Burstone cephalometric analyses. METHODS: Pre- and post-surgical (6-month) teleradiographs of 25 long-face patients who underwent combined orthognathic surgery were selected. Manual and computerized prediction tracings of each patient were performed and cephalometrically compared to the post-surgical outcomes. This protocol was repeated to evaluate the method error, and statistical evaluation was conducted by means of analysis of variance and Tukey's test. RESULTS: The manual method showed the highest frequency of cephalometric variables that were not statistically different from the actual post-surgical results, followed by the DFPlus and Dolphin software, which yielded similar cephalometric values for most variables. CONCLUSION: It was concluded that the manual method seemed more reliable, although the predictability of the evaluated methods (computerized and manual) proved to be reasonably satisfactory and similar.
Abstract:
Introduction: In clinical situations where severe bone resorption has occurred following tooth loss, implant treatment options may comprise either prior bone reconstruction or the use of short implants alone. Objective: This non-systematic review summarizes and discusses some aspects of the use of short implants, such as biomechanical considerations, success rate, longevity and surgical-prosthetic planning. Literature review: Current and relevant references were selected in order to compare short dental implants to conventional ones. Several studies have highlighted the great importance of wide-diameter implants. Short dental implants have shown high predictability and success rates when certain biomechanical aspects are taken into consideration. Conclusion: Placement of short dental implants is a viable treatment method for patients with decreased bone height.
Abstract:
The classification of texts has become a major endeavor with so much electronic material available, for it is an essential task in several applications, including search engines and information retrieval. There are different ways to define similarity for grouping similar texts into clusters, as the concept of similarity may depend on the purpose of the task. For instance, in topic extraction similar texts mean those within the same semantic field, whereas in author recognition stylistic features should be considered. In this study, we introduce ways to classify texts employing concepts of complex networks, which may be able to capture syntactic, semantic and even pragmatic features. The interplay between various metrics of the complex networks is analyzed with three applications, namely identification of machine translation (MT) systems, evaluation of the quality of machine-translated texts, and authorship recognition. We show that topological features of the networks representing texts can enhance the ability to identify MT systems in particular cases. For evaluating the quality of MT texts, on the other hand, high correlation was obtained with methods capable of capturing the semantics. This was expected because the gold standards used are themselves based on word co-occurrence. Notwithstanding, the Katz similarity, which involves semantics and structure in the comparison of texts, achieved the highest correlation with the NIST measurement, indicating that in some cases the combination of both approaches can improve the ability to quantify quality in MT. In authorship recognition, the topological features were again relevant in some contexts, though for the books and authors analyzed good results were obtained with semantic features as well. Because hybrid approaches encompassing semantic and topological features have not been extensively used, we believe that the methodology proposed here may be useful to enhance text classification considerably, as it combines well-established strategies. (c) 2012 Elsevier B.V. All rights reserved.
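As a minimal illustration of the network representation described above, the sketch below builds a word adjacency network with networkx (assumed to be available) and extracts a few global topological metrics that could serve as classification features; the particular metrics are illustrative choices, not necessarily those used in the paper.

```python
import networkx as nx

def text_to_network(text):
    """Word adjacency network: nodes are word types, edges link consecutive words,
    weighted by how often the pair co-occurs."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    g = nx.Graph()
    for a, b in zip(words, words[1:]):
        if g.has_edge(a, b):
            g[a][b]["weight"] += 1
        else:
            g.add_edge(a, b, weight=1)
    return g

def topological_features(g):
    """A few global metrics usable as features for MT-system identification or
    authorship recognition."""
    degrees = [d for _, d in g.degree()]
    return {
        "n_nodes": g.number_of_nodes(),
        "mean_degree": sum(degrees) / max(len(degrees), 1),
        "clustering": nx.average_clustering(g),
        "assortativity": nx.degree_assortativity_coefficient(g),
    }
```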