984 results for forecast deviation
Abstract:
This paper investigates the usefulness of switching Gaussian state space models as a tool for implementing dynamic model selection (DMS) or dynamic model averaging (DMA) in time-varying parameter regression models. DMS methods allow for model switching, where a different model can be chosen at each point in time; thus, they allow the explanatory variables in the time-varying parameter regression model to change over time. DMA carries out model averaging in a time-varying manner. We compare our exact approach to DMA/DMS with a popular existing procedure that relies on forgetting factor approximations. In an application, we use DMS to select different predictors for inflation forecasting. We also compare different ways of implementing DMA/DMS and investigate whether they lead to similar results.
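A minimal sketch, in Python, of the forgetting-factor approximation to DMA/DMS that the paper uses as its comparison benchmark. It assumes the one-step-ahead predictive densities of each candidate model have already been computed; the function and variable names are illustrative, not taken from the paper.

import numpy as np

def dma_weights(pred_densities, alpha=0.99):
    # pred_densities: (T, K) array of p(y_t | y_{1:t-1}, model k), precomputed.
    # alpha: forgetting factor in (0, 1]; alpha = 1 reduces to static Bayesian model averaging.
    T, K = pred_densities.shape
    w = np.full(K, 1.0 / K)              # equal prior model probabilities
    weights = np.zeros((T, K))
    for t in range(T):
        w_pred = w ** alpha              # prediction step: flatten last period's weights via forgetting
        w_pred /= w_pred.sum()
        w = w_pred * pred_densities[t]   # update step: reward models with good predictive fit
        w /= w.sum()
        weights[t] = w
    dms_choice = weights.argmax(axis=1)  # DMS picks the highest-weight model each period
    return weights, dms_choice           # DMA uses the full weight vector

Forecasts under DMA would then be weighted averages of the individual models' forecasts, with the weights recomputed every period.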
Abstract:
We study the impact of anticipated fiscal policy changes in a Ramsey economy where agents form long-horizon expectations using adaptive learning. We extend the existing framework by introducing distortionary taxes as well as elastic labour supply, which makes agents' decisions non-predetermined but more realistic. We find that the dynamic responses to anticipated tax changes under learning exhibit oscillatory behaviour that can be interpreted as self-fulfilling waves of optimism and pessimism emerging from systematic forecast errors. Moreover, we demonstrate that these waves can have important implications for the welfare consequences of fiscal reforms. (JEL: E32, E62, D84)
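The abstract does not spell out the learning algorithm, but adaptive learning of this kind is commonly formalized as constant-gain recursive least squares; the sketch below shows one generic belief-updating step, with all names and the gain value chosen purely for illustration.

import numpy as np

def constant_gain_update(phi, R, x, y, gain=0.02):
    # phi: current belief coefficients; R: second-moment matrix of the regressors;
    # x: regressor vector observed this period; y: realized outcome.
    x = np.asarray(x, float)
    R = R + gain * (np.outer(x, x) - R)               # update the moment matrix
    forecast_error = y - x @ phi                      # systematic errors feed back here
    phi = phi + gain * np.linalg.solve(R, x) * forecast_error
    return phi, R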
Abstract:
OBJECTIVE: To test the accuracy of a new pulse oximeter sensor based on transmittance and reflectance. This sensor makes transillumination of tissue unnecessary and allows measurements on the hand, forearm, foot, and lower limb. DESIGN: Prospective, open, nonrandomized criterion standard study. SETTING: Neonatal intensive care unit, tertiary care center. PATIENTS: Sequential sample of 54 critically ill neonates (gestational age 27 to 42 wks; postnatal age 1 to 28 days) with arterial catheters in place. MEASUREMENTS AND MAIN RESULTS: A total of 99 comparisons between pulse oximetry and arterial saturation were obtained. Comparison of femoral or umbilical arterial blood with transcutaneous measurements on the lower limb (n = 66) demonstrated an excellent correlation (r2 = .96). The mean difference was +1.44% +/- 3.51 (SD) % (range -11% to +8%). Comparison of the transcutaneous values with the radial artery saturation from the corresponding upper limb (n = 33) revealed a correlation coefficient of 0.94 with a mean error of +0.66% +/- 3.34% (range -6% to +7%). The mean difference between noninvasive and invasive measurements was least with the test sensor on the hand, intermediate on the calf and arm, and greatest on the foot. The mean error and its standard deviation were slightly larger for arterial saturation values < 90% than for values > or = 90%. CONCLUSION: Accurate pulse oximetry saturation can be acquired from the hand, forearm, foot, and calf of critically ill newborns using this new sensor.
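For reference, the agreement statistics quoted in the abstract (mean difference, its standard deviation, and the squared correlation) are the standard Bland-Altman-style summaries; a generic computation, not the study's own code, would look like this:

import numpy as np

def agreement_stats(spo2, sao2):
    # spo2: pulse oximetry readings (%); sao2: arterial co-oximetry values (%).
    spo2, sao2 = np.asarray(spo2, float), np.asarray(sao2, float)
    diff = spo2 - sao2                        # noninvasive minus invasive
    bias = diff.mean()                        # mean difference, e.g. +1.44%
    sd = diff.std(ddof=1)                     # its standard deviation, e.g. 3.51%
    r2 = np.corrcoef(spo2, sao2)[0, 1] ** 2   # squared correlation, e.g. .96
    return bias, sd, r2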
Abstract:
An important disconnect in the news-driven view of the business cycle formalized by Beaudry and Portier (2004) is the lack of agreement between the VAR and DSGE methodologies over the empirical plausibility of this view. We argue that this disconnect can be largely resolved once we augment a standard DSGE model with a financial channel that provides amplification to news shocks. Both methodologies suggest that news shocks to the future growth prospects of the economy are significant drivers of U.S. business cycles in the post-Greenspan era (1990-2011), explaining as much as 50% of the forecast error variance in hours worked at cyclical frequencies.
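The forecast error variance statement refers to a variance decomposition of the VAR's forecast errors. As an illustration only, the sketch below computes the share of the h-step forecast error variance of one variable attributable to one orthogonalized shock in a VAR(1); it uses a simple Cholesky ordering, not the identification scheme for news shocks used in the paper.

import numpy as np

def fevd_share(A, Sigma, shock, var, horizon):
    # A: VAR(1) coefficient matrix; Sigma: covariance of the reduced-form errors.
    P = np.linalg.cholesky(Sigma)       # orthogonalize the shocks
    n = A.shape[0]
    Phi = np.eye(n)                     # moving-average coefficient at lag 0
    num = den = 0.0
    for _ in range(horizon):
        theta = Phi @ P                 # orthogonalized impulse responses at this lag
        num += theta[var, shock] ** 2
        den += (theta[var, :] ** 2).sum()
        Phi = A @ Phi                   # next moving-average coefficient for a VAR(1)
    return num / den                    # fraction of forecast error variance explained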
Abstract:
We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in the estimation of the coefficients and uncertainty about the precise degree of their variability as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
Abstract:
We study the asymmetric and dynamic dependence between financial assets and demonstrate, from the perspective of risk management, the economic significance of dynamic copula models. First, we construct stock and currency portfolios sorted on different characteristics (ex ante beta, coskewness, cokurtosis and order flows), and find substantial evidence of dynamic evolution between the high beta (respectively, coskewness, cokurtosis and order flow) portfolios and the low beta (coskewness, cokurtosis and order flow) portfolios. Second, using three different dependence measures, we show the presence of asymmetric dependence between these characteristic-sorted portfolios. Third, we use a dynamic copula framework based on Creal et al. (2013) and Patton (2012) to forecast the portfolio Value-at-Risk of long-short (high minus low) equity and FX portfolios. We use several widely used univariate and multivariate VaR models for the purpose of comparison. Backtesting our methodology, we find that the asymmetric dynamic copula models provide more accurate forecasts, in general, and, in particular, perform much better during the recent financial crises, indicating the economic significance of incorporating dynamic and asymmetric dependence in risk management.
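One standard ingredient of such a backtest is the Kupiec proportion-of-failures test, which checks whether the observed VaR violation rate matches the nominal level; the sketch below is a generic version of that test, not the paper's full backtesting battery.

import numpy as np
from scipy.special import xlogy
from scipy.stats import chi2

def kupiec_pof(returns, var_forecasts, alpha=0.01):
    # var_forecasts are positive VaR numbers at level alpha, so a violation
    # occurs when the realized loss exceeds the forecast.
    losses = -np.asarray(returns, float)
    hits = losses > np.asarray(var_forecasts, float)
    n, x = hits.size, hits.sum()
    lr = -2 * (xlogy(x, alpha) + xlogy(n - x, 1 - alpha)
               - xlogy(x, x / n) - xlogy(n - x, 1 - x / n))
    return lr, chi2.sf(lr, df=1)        # test statistic and p-value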
Abstract:
One of the cornerstones of financial anomalies is that money-making opportunities exist. Shiller's excess volatility theory is re-investigated from the perspective of a trading strategy in which the present value is computed using a series of simple econometric models. The results show that the excess volatility may not be exploitable given the data available up to time t. However, when learning is introduced empirically, the simple trading strategy may offer profits, which are likely to disappear once transaction costs are considered.
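As a bare-bones illustration of the present-value calculation behind such a strategy (the forecasting models themselves are not specified in the abstract, so the inputs here are assumptions):

import numpy as np

def present_value(dividend_forecasts, discount_rate):
    # dividend_forecasts: forecast dividends for horizons 1, 2, ..., H,
    # produced by some econometric model using data available up to time t.
    d = np.asarray(dividend_forecasts, float)
    discounts = (1.0 + discount_rate) ** -np.arange(1, d.size + 1)
    return float(d @ discounts)          # ex-ante present value at time t

A trading rule would then compare this ex-ante value with the observed price and trade on the gap.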
Abstract:
In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. Because time-varying parameter models are parameter-rich and the time span of our data is relatively short, some form of shrinkage is desirable. In constant coefficient regression models, the Bayesian Lasso is gaining popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be (i) time-varying, (ii) constant over time, or (iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
Abstract:
We re-examine the dynamics of returns and dividend growth within the present-value framework of stock prices. We find that the finite-sample order of integration of returns is approximately equal to the order of integration of the first-differenced price-dividend ratio. As such, the traditional return forecasting regressions based on the price-dividend ratio are invalid. Moreover, the nonstationary long memory behaviour of the price-dividend ratio induces antipersistence in returns. This suggests that expected returns should be modelled as an ARFIMA process, and we show that this improves the forecasting ability of the present-value model both in-sample and out-of-sample.
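For readers unfamiliar with ARFIMA processes, the fractional difference operator (1 - L)^d that generates long memory and antipersistence has an explicit binomial expansion; the helper below computes its coefficients and is purely illustrative, not the paper's estimation code.

import numpy as np

def frac_diff_weights(d, n_lags):
    # Coefficients of (1 - L)^d truncated at n_lags; d in (-0.5, 0) gives
    # the antipersistent behaviour referred to in the abstract.
    w = np.empty(n_lags + 1)
    w[0] = 1.0
    for k in range(1, n_lags + 1):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w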
Abstract:
A large bibliographic survey provided data on Trypanosoma cruzi serology covering the period 1948-1984. Epidemiological-demographic methods provided an estimate of 11% for the prevalence of positive serology in Brazil by 1984. Significant temporal trends were observed for most of the Brazilian geographical regions as well as for Brazil as a whole. The parabolic curve that fit best for the entire country indicates that by 1991 the incidence of new positive serology would be close to zero. This conclusion needs further refinement, since the forecast point is somewhat distant from the measured period.
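The parabolic trend mentioned above is an ordinary quadratic fit to the annual series; a generic re-creation, with placeholder data rather than the survey's figures, is:

import numpy as np

years = np.arange(1948, 1985)
rate = np.linspace(18.0, 6.0, years.size)     # placeholder series, not the survey's data

coeffs = np.polyfit(years, rate, deg=2)       # quadratic (parabolic) trend
trend = np.poly1d(coeffs)
print(trend(1991))                            # extrapolation to the forecast year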
Abstract:
Colistin is a last-resort antibacterial treatment in critically ill patients with multi-drug resistant Gram-negative infections. As appropriate colistin exposure is key to maximizing efficacy while minimizing toxicity, individualized dosing optimization guided by therapeutic drug monitoring is a top clinical priority. The objective of the present work was to develop a rapid and robust HPLC-MS/MS assay for the quantification of colistin plasma concentrations. This novel methodology, validated according to international standards, simultaneously quantifies the microbiologically active compounds colistin A and B, plus the pro-drug colistin methanesulfonate (colistimethate, CMS). 96-well micro-elution SPE on Oasis Hydrophilic-Lipophilic-Balanced (HLB), followed by direct analysis by Hydrophilic Interaction Liquid Chromatography (HILIC) on an Ethylene Bridged Hybrid (BEH) Amide column coupled to tandem mass spectrometry, allows high throughput with no significant matrix effect. The technique is highly sensitive (limits of quantification 0.014 and 0.006 μg/mL for colistin A and B), precise (intra-/inter-assay CV 0.6-8.4%) and accurate (intra-/inter-assay deviation from nominal concentrations -4.4 to +6.3%) over the clinically relevant analytical range 0.05-20 μg/mL. Colistin A and B in plasma and whole blood samples are reliably quantified over 48 h at room temperature and at +4°C (<6% deviation from nominal values) and after three freeze-thaw cycles. Colistimethate acidic hydrolysis (1 M H2SO4) to colistin A and B in plasma was completed in vitro after 15 min of sonication, while the pro-drug hydrolyzed spontaneously in plasma ex vivo after 4 h at room temperature: this information is of utmost importance for the interpretation of analytical results. Quantification is precise and accurate when using serum, citrated or EDTA plasma as the biological matrix, while the use of heparin plasma is not appropriate. This new analytical technique, providing optimized quantification in real-life conditions of the microbiologically active compounds colistin A and B, offers a highly efficient tool for routine therapeutic drug monitoring aimed at individualizing drug dosing against life-threatening infections.
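The precision and accuracy figures quoted above follow the usual bioanalytical validation arithmetic (coefficient of variation of replicate QC measurements and their percentage deviation from the nominal concentration); a generic sketch, not the laboratory's own software:

import numpy as np

def assay_precision_accuracy(measured, nominal):
    # measured: replicate QC results at one concentration level (same units as nominal).
    m = np.asarray(measured, float)
    cv = 100.0 * m.std(ddof=1) / m.mean()            # precision, CV%
    bias = 100.0 * (m.mean() - nominal) / nominal    # accuracy, % deviation from nominal
    return cv, bias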
Abstract:
Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the chiral quantification of (R,S)-methadone in plasma is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process covering 52 series of routine analysis was established using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples were generally within the +/-15% variability around the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated in method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
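The two acceptance rules used for the control chart (the FDA +/-15% window around the target value and the two-standard-deviation interval built from intermediate precision) reduce to simple comparisons; a generic sketch of that check, not the laboratory's actual software:

def qc_within_limits(qc_value, target, sd_intermediate):
    # Returns whether a routine QC result passes each of the two rules.
    within_fda = abs(qc_value - target) <= 0.15 * target
    within_2sd = abs(qc_value - target) <= 2.0 * sd_intermediate
    return within_fda, within_2sd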
Abstract:
Traffic forecasts provide essential input for the appraisal of transport investment projects. However, according to recent empirical evidence, long-term predictions are subject to high levels of uncertainty. This paper quantifies uncertainty in traffic forecasts for the tolled motorway network in Spain. Uncertainty is quantified in the form of a confidence interval for the traffic forecast that includes both model uncertainty and input uncertainty. We apply a stochastic simulation process based on bootstrapping techniques. Furthermore, the paper proposes a new methodology to account for capacity constraints in long-term traffic forecasts. Specifically, we suggest a dynamic model in which the speed of adjustment is related to the ratio between the actual traffic flow and the maximum capacity of the motorway. This methodology is applied to a specific public policy that consists of removing the toll on a certain motorway section before the concession expires.
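A minimal sketch of the bootstrap idea behind such confidence intervals, assuming model residuals are available; it ignores input uncertainty and the capacity-constraint dynamics, which the paper treats separately.

import numpy as np

def bootstrap_forecast_interval(residuals, point_forecast, n_boot=10000,
                                level=0.95, seed=0):
    # Percentile interval obtained by resampling residuals around the point forecast.
    rng = np.random.default_rng(seed)
    draws = point_forecast + rng.choice(np.asarray(residuals, float),
                                        size=n_boot, replace=True)
    lo, hi = np.quantile(draws, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi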
Abstract:
A large influenza epidemic took place in Havana during the winter of 1988. The epidemiologic surveillance unit of the Pedro Kouri Institute of Tropical Medicine detected the beginning of the epidemic wave. The Rvachev-Baroyan mathematical model of the geographic spread of an epidemic was used to forecast this epidemic under routine conditions of the public health system. The expected number of individuals who would attend outpatient services because of influenza-like illness was calculated and communicated to the health authorities early enough to permit the introduction of available control measures. The approximate date of the epidemic peak, the daily expected number of individuals attending medical services, and the approximate time of the end of the epidemic wave were estimated. The prediction error was 12%. The model was sufficiently accurate to warrant its use as a practical forecasting tool in the Cuban public health system.
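The Rvachev-Baroyan model itself couples cities through transport flows and tracks infectives by day of illness; that structure is not reproduced here. The sketch below is only a simplified single-city, discrete SIR-type simulation showing the kind of epidemic curve (peak date, daily attendances, end of the wave) that such forecasts target.

import numpy as np

def simple_epidemic_curve(population, initial_cases, beta, gamma, days):
    # beta: transmission rate per day; gamma: recovery rate per day.
    s, i = population - initial_cases, float(initial_cases)
    new_cases = []
    for _ in range(days):
        infections = beta * s * i / population   # new cases today
        recoveries = gamma * i
        s, i = s - infections, i + infections - recoveries
        new_cases.append(infections)
    return np.array(new_cases)                   # daily curve; its argmax gives the peak day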