967 results for forecasting models
Abstract:
We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or deteriorate point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation.
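A standard way to check whether a normal predictive density fits out-of-sample realizations is the probability integral transform (PIT), whose values should be i.i.d. uniform under correct specification. The abstract does not name its exact tests, so this is only an illustrative sketch with made-up forecast numbers:

```python
from statistics import NormalDist

def pit_values(realizations, means, stds):
    """Probability integral transform: under a correctly specified
    normal predictive density, these values are i.i.d. uniform(0, 1)."""
    return [NormalDist(mu, sd).cdf(y)
            for y, mu, sd in zip(realizations, means, stds)]

# Hypothetical forecasts and outcomes (illustrative numbers only).
y = [1.8, 2.4, 0.9]
mu = [2.0, 2.0, 1.5]
sd = [1.0, 1.0, 1.0]
z = pit_values(y, mu, sd)
assert all(0.0 < v < 1.0 for v in z)
```

In practice one would test the resulting values for uniformity (e.g. with a histogram or a Kolmogorov-Smirnov-type statistic).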
Abstract:
This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack the power to detect predictive ability and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
Abstract:
Bacterial transcription activators of the XylR/DmpR subfamily exert their expression control via σ(54)-dependent RNA polymerase upon stimulation by a chemical effector, typically an aromatic compound. Where the chemical effector interacts with the transcription regulator protein to achieve activation is still largely unknown. Here we focus on the HbpR protein from Pseudomonas azelaica, which is a member of the XylR/DmpR subfamily and responds to biaromatic effectors such as 2-hydroxybiphenyl. We use protein structure modeling to predict folding of the effector recognition domain of HbpR and molecular docking to identify the region where 2-hydroxybiphenyl may interact with HbpR. A large number of site-directed HbpR mutants of residues inside and outside the predicted interaction area were created, and their potential to induce reporter gene expression in Escherichia coli from the cognate P(C) promoter upon activation with 2-hydroxybiphenyl was studied. Mutant proteins were purified to study their conformation. Critical residues for effector stimulation indeed grouped near the predicted area, some of which are conserved among XylR/DmpR subfamily members in spite of displaying different effector specificities. This suggests that they are important for the process of effector activation, but not necessarily for effector specificity recognition.
Abstract:
The loss of autonomy at advanced ages is associated not only with ageing but also with the characteristics of the physical and social environment. Recent investigations have shown that social networks, social engagement and participation act as predictors of disability among the elderly. The aim of this study is to determine whether social networks are related to the development and progression of disability in the early years of old age. The source of data is the first wave of the survey "Processes of Vulnerability among Spanish Elderly", carried out in 2005 on a sample of 1 244 individuals. The population under study is the non-institutionalized cohort aged 70 to 74 years living in metropolitan areas (Madrid and Barcelona). Disability is measured through basic activities of daily living (ADL) and instrumental activities of daily living (IADL). The structural aspects of social relationships are measured through the diversity of social networks and participation, using the social network index (SNI). For each point on the SNI, the risk of developing any type of disability decreased by 49% (HR = 0.51, 95%CI = 0.31-0.82). The SNI was a decisive factor in all the forecasting models constructed, with hazard ratios (HR) ranging from 0.29 (95%CI = 0.14-0.59) in the first model to 0.43 (95%CI = 0.20-0.90) in the full model. The results of the present study showed a strong association between disability and an active social life and the emotional support provided by friends and confidants. These findings suggest a protective effect of social networks on disability. The results also indicate that some family and emotional ties have a significant effect on both the prevalence and the incidence of disability.
Abstract:
This master's thesis examines materials management from strategic perspectives, covering strategic planning and management, procurement, materials management through forecasting and warehousing, and general operational development. Together these areas form the concept of strategic materials management, which this thesis uses to address the materials management challenges of the case company. Strategic materials management planning must begin by mapping one's own environment, for example using portfolio analysis and an Ishikawa fishbone diagram. After that, one's own materials management capabilities can be analysed, for example with a SWOT analysis, which charts strengths and weaknesses as well as threats and opportunities. Only once the environment and one's own capabilities have been analysed can goals and objectives be set to support the business through strategic materials management. Progress towards these goals and objectives must also be monitored and measured. Strategic materials management can be optimized in various ways, for example with different procurement, warehousing and forecasting models. ABC analysis can also be used to steer the materials management of the different ABC classes. Strategic materials management thus aims to support business objectives strategically while meeting customer demand at a specified service level at minimal total cost.
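The ABC classification mentioned above can be sketched in a few lines: items are ranked by annual consumption value and assigned to class A, B or C by cumulative share. The 80%/95% cut-offs and the item names below are illustrative assumptions, not figures from the thesis:

```python
def abc_classify(annual_values, a_cut=0.8, b_cut=0.95):
    """Classify items by cumulative share of annual consumption value:
    roughly the top 80% of value -> A, the next 15% -> B, the rest -> C."""
    total = sum(annual_values.values())
    ranked = sorted(annual_values.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for item, value in ranked:
        cumulative += value / total
        classes[item] = "A" if cumulative <= a_cut else ("B" if cumulative <= b_cut else "C")
    return classes

# Hypothetical annual consumption values per item.
print(abc_classify({"pump": 700, "valve": 200, "gasket": 100}))
```

Class A items would then get the tightest forecasting and inventory control, class C the lightest.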
Abstract:
This paper develops and estimates a game-theoretical model of inflation targeting where the central banker's preferences are asymmetric around the targeted rate. In particular, positive deviations from the target can be weighted more, or less, severely than negative ones in the central banker's loss function. It is shown that some of the previous results derived under the assumption of symmetry are not robust to the generalization of preferences. Estimates of the central banker's preference parameters for Canada, Sweden, and the United Kingdom are statistically different from the ones implied by the commonly used quadratic loss function. Econometric results are robust to different forecasting models for the rate of unemployment but not to the use of measures of inflation broader than the one targeted.
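A common way to formalize such asymmetric central-bank preferences in this literature is the linex loss, under which deviations of one sign are penalized more heavily than the other. The paper's exact specification may differ, so this is only an illustrative sketch:

```python
import math

def linex_loss(deviation, a):
    """Linex loss: exp(a*x) - a*x - 1, where x is the deviation of
    inflation from target. For a > 0, positive deviations are penalized
    more severely than negative ones; a -> 0 recovers (a**2/2)*x**2,
    i.e. a scaled quadratic (symmetric) loss."""
    return math.exp(a * deviation) - a * deviation - 1.0

# With a > 0, overshooting the target by 1 point costs more than
# undershooting it by 1 point.
assert linex_loss(1.0, 1.5) > linex_loss(-1.0, 1.5)
```

Estimating the asymmetry parameter `a` and testing `a = 0` is one way to test the quadratic-loss benchmark mentioned in the abstract.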
Abstract:
This study is concerned with Autoregressive Moving Average (ARMA) models of time series. ARMA models form a subclass of the class of general linear models that represent stationary time series, a phenomenon encountered most often in practice by engineers, scientists and economists. It is always desirable to employ models that use parameters parsimoniously; ARMA models achieve parsimony because they have only a finite number of parameters. Even though the discussion is primarily concerned with stationary time series, we later take up the case of homogeneous non-stationary time series, which can be transformed to stationary time series. Time series models, obtained with the help of present and past data, are used for forecasting future values. The physical sciences as well as the social sciences benefit from forecasting models. The role of forecasting cuts across all fields of management: finance, marketing, production and business economics, as well as signal processing, communication engineering, chemical processes, electronics, etc. This wide applicability of time series is the motivation for this study.
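As a minimal illustration of fitting a parsimonious model to present and past data and forecasting future values, here is a least-squares fit of an AR(1), the simplest ARMA special case; the series is synthetic and noiseless so the fit is exact:

```python
def fit_ar1(series):
    """Least-squares estimate of y_t = c + phi * y_{t-1} + e_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def forecast_ar1(series, c, phi, steps=1):
    """Iterate the fitted recursion forward to forecast future values."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Synthetic data generated by y_t = 1 + 0.5 * y_{t-1} (no noise).
series = [0.0]
for _ in range(6):
    series.append(1.0 + 0.5 * series[-1])
c, phi = fit_ar1(series)
```

A full ARMA(p, q) fit additionally estimates moving-average terms, which requires an iterative (e.g. maximum-likelihood) procedure rather than this closed-form regression.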
Abstract:
Thermodynamic parameters of the atmosphere form part of the input to numerical forecasting models. Usually these parameters are evaluated from a thermodynamic diagram. Here, a technique is developed to evaluate these parameters quickly and accurately using a Fortran program. This technique is tested with four sets of randomly selected data, and the results are in agreement with the results from the conventional method. This technique is superior to the conventional method in three respects: more accuracy, less computation time, and evaluation of additional parameters. The computation time for all the parameters on a PC AT 286 machine is 11 sec. This software, with appropriate modifications, can be used for verifying various lines on a thermodynamic diagram.
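As one example of computing a thermodynamic parameter directly instead of reading it off a diagram, potential temperature follows from Poisson's equation. This Python sketch only stands in for the kind of calculation the paper's Fortran program performs; it is not the paper's code:

```python
def potential_temperature(t_kelvin, p_hpa, p0=1000.0, kappa=0.2854):
    """Poisson's equation: theta = T * (p0 / p) ** kappa, where
    kappa = R_d / c_p for dry air and p0 is the 1000 hPa reference level."""
    return t_kelvin * (p0 / p_hpa) ** kappa

# At the reference pressure, theta equals the temperature itself;
# an air parcel lifted to lower pressure has theta above its temperature.
assert abs(potential_temperature(288.15, 1000.0) - 288.15) < 1e-9
```

Other diagram-derived quantities (lifting condensation level, equivalent potential temperature, CAPE) can be computed with similar closed-form or iterative formulas.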
Abstract:
In this study, the oceanic regions that are associated with anomalous Ethiopian summer rains were identified and the teleconnection mechanisms that give rise to these associations were investigated. Because of the complexity of the rainfall climate in the Horn of Africa, Ethiopia was subdivided into six homogeneous rainfall zones, and the influence of SST anomalies was analysed separately for each zone. The investigation made use of composite analysis and modelling experiments. Two sets of composites of atmospheric fields were generated, one based on excess/deficit rainfall anomalies and the other based on warm/cold SST anomalies in specific oceanic regions. The aim of the composite analysis was to determine the link between SST and rainfall in terms of large-scale features. The modelling experiments were intended to explore the causality of these linkages. The results show that the equatorial Pacific, the midlatitude northwest Pacific and the Gulf of Guinea all exert an influence on summer rainfall in various parts of the country. The results demonstrate that different mechanisms linked to sea surface temperature control variations in rainfall in different parts of Ethiopia. This has important consequences for seasonal forecasting models which are based on statistical correlations between SST and seasonal rainfall totals. It is clear that such statistical models should take account of the local variations in teleconnections.
Abstract:
Currently, most operational forecasting models use latitude-longitude grids, whose convergence of meridians towards the poles limits parallel scaling. Quasi-uniform grids might avoid this limitation. Thuburn et al. (JCP, 2009) and Ringler et al. (JCP, 2010) have developed a method for arbitrarily structured, orthogonal C-grids (TRiSK), which has many of the desirable properties of the C-grid on latitude-longitude grids but works on a variety of quasi-uniform grids. Here, five quasi-uniform, orthogonal grids of the sphere are investigated using TRiSK to solve the shallow-water equations. We demonstrate some of the advantages and disadvantages of the hexagonal and triangular icosahedra, a Voronoi-ised cubed sphere, a Voronoi-ised skipped latitude-longitude grid and a grid of kites in comparison to a full latitude-longitude grid. We show that the hexagonal icosahedron gives the most accurate results for the least computational cost. All of the grids suffer from spurious computational modes; this is especially true of the kite grid, despite it having exactly twice as many velocity degrees of freedom as height degrees of freedom. However, the computational modes are easiest to control on the hexagonal icosahedron, since they consist of vorticity oscillations on the dual grid which can be controlled using a diffusive advection scheme for potential vorticity.
Abstract:
There are several scoring rules that one can choose from in order to score probabilistic forecasting models or estimate model parameters. Whilst it is generally agreed that proper scoring rules are preferable, there is no clear criterion for preferring one proper scoring rule above another. This manuscript compares and contrasts some commonly used proper scoring rules and provides guidance on scoring rule selection. In particular, it is shown that the logarithmic scoring rule prefers erring with more uncertainty, the spherical scoring rule prefers erring with lower uncertainty, whereas the other scoring rules are indifferent to either option.
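The three proper scoring rules discussed can be written down in a few lines for a categorical forecast. The function names are my own, and each score is oriented so that lower is better, to make them directly comparable:

```python
import math

def logarithmic_score(probs, outcome):
    """Negative log probability assigned to the realized category."""
    return -math.log(probs[outcome])

def quadratic_score(probs, outcome):
    """Brier-type score: squared distance between the forecast vector
    and the indicator vector of the realized category."""
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(probs))

def spherical_score(probs, outcome):
    """Probability of the realized category, normalized by the
    Euclidean norm of the forecast vector (negated: lower is better)."""
    norm = math.sqrt(sum(p * p for p in probs))
    return -probs[outcome] / norm
```

All three are proper, i.e. expected score is minimized by reporting one's true probabilities; the abstract's point is that they rank *wrong* forecasts differently depending on how much uncertainty those forecasts carry.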
Abstract:
We consider tests of forecast encompassing for probability forecasts, for both quadratic and logarithmic scoring rules. We propose test statistics for the null of forecast encompassing, present the limiting distributions of the test statistics, and investigate the impact of estimating the forecasting models' parameters on these distributions. The small-sample performance is investigated, in terms of small numbers of forecasts and model estimation sample sizes. We show the usefulness of the tests for the evaluation of recession probability forecasts from logit models with different leading indicators as explanatory variables, and for evaluating survey-based probability forecasts.
Abstract:
We consider forecasting using a combination, when no model coincides with a non-constant data generation process (DGP). Practical experience suggests that combining forecasts adds value, and can even dominate the best individual device. We show why this can occur when forecasting models are differentially mis-specified, and is likely to occur when the DGP is subject to location shifts. Moreover, averaging may then dominate over estimated weights in the combination. Finally, it cannot be proved that only non-encompassed devices should be retained in the combination. Empirical and Monte Carlo illustrations confirm the analysis.
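A toy Monte Carlo, with all parameters assumed for illustration, shows why an equal-weight average can dominate each mis-specified device when the DGP undergoes a location shift:

```python
import random

random.seed(0)

# DGP with a location shift halfway through the evaluation sample.
y = [(1.0 if t < 50 else 3.0) + random.gauss(0.0, 1.0) for t in range(100)]

# Two deliberately mis-specified constant forecasts, anchored to the
# pre-shift and post-shift means respectively (hypothetical devices).
f1 = [1.0] * 100
f2 = [3.0] * 100
avg = [(a + b) / 2.0 for a, b in zip(f1, f2)]  # equal-weight combination

def mse(forecasts, actuals):
    """Mean squared forecast error over the evaluation sample."""
    return sum((f - a) ** 2 for f, a in zip(forecasts, actuals)) / len(actuals)
```

Each individual device is badly wrong in one regime; the average is moderately wrong in both, which gives it the lower overall squared-error loss in expectation.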
Abstract:
Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts in volatility.
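The two ways of turning a volatility forecast into a quantile forecast can be sketched as follows; function names and numbers are illustrative, not the paper's:

```python
from statistics import NormalDist

def var_normal(sigma, alpha=0.01):
    """Quantile forecast under a normal distributional assumption for
    future daily returns: sigma times the standard normal alpha-quantile."""
    return sigma * NormalDist().inv_cdf(alpha)

def var_empirical(std_returns, sigma, alpha=0.01):
    """Quantile forecast from the empirical distribution of predicted
    standardized returns, scaled by the volatility forecast."""
    s = sorted(std_returns)
    k = max(0, min(len(s) - 1, int(alpha * len(s))))
    return sigma * s[k]
```

Under the empirical approach the standardized-return sample would be built on a rolling or recursive window, which is exactly the design choice the paper evaluates.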
Abstract:
This paper explores a number of statistical models for predicting the daily stock return volatility of an aggregate of all stocks traded on the NYSE. An application of linear and non-linear Granger causality tests highlights evidence of bidirectional causality, although the relationship is stronger from volatility to volume than the other way around. The out-of-sample forecasting performance of various linear, GARCH, EGARCH, GJR and neural network models of volatility is evaluated and compared. The models are also augmented by the addition of a measure of lagged volume to form more general ex-ante forecasting models. The results indicate that augmenting models of volatility with measures of lagged volume leads only to very modest improvements, if any, in forecasting performance.
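The GARCH(1,1) recursion that underlies several of the compared models can be sketched as follows; augmenting it with lagged volume, as the paper does, would simply add a term to the recursion. Parameters here are illustrative:

```python
def garch11_path(returns, omega, alpha, beta):
    """One-step-ahead conditional variance recursion:
        sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t,
    assuming stationarity (alpha + beta < 1) so the path can be
    initialized at the unconditional variance omega / (1 - alpha - beta)."""
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

A large squared return feeds into the next period's variance through `alpha`, while `beta` governs how slowly that shock decays; EGARCH and GJR modify the recursion to let negative returns raise variance more than positive ones.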