948 results for Bayesian
Abstract:
We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in the estimation of the coefficients, and uncertainty about the precise degree of the coefficients' variability, as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
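As a rough illustration of the out-of-sample exercise described here, the following sketch (purely hypothetical data and a generic rolling regression, not the authors' model) compares one-step-ahead regression forecasts of an exchange rate with the driftless random walk via their mean squared prediction errors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly log exchange rate and one predictor (say, an interest
# rate differential); the data-generating process is purely illustrative.
T = 240
x = rng.normal(size=T)
s = np.cumsum(0.01 * x + rng.normal(scale=0.02, size=T))  # log exchange rate

window = 120
errs_model, errs_rw = [], []
for t in range(window, T - 1):
    # Rolling OLS of the one-step change on the lagged predictor.
    ds = np.diff(s[t - window:t + 1])           # changes s_{u+1} - s_u
    X = np.column_stack([np.ones(window), x[t - window:t]])
    beta, *_ = np.linalg.lstsq(X, ds, rcond=None)
    fc = s[t] + beta[0] + beta[1] * x[t]        # model forecast of s_{t+1}
    errs_model.append((s[t + 1] - fc) ** 2)
    errs_rw.append((s[t + 1] - s[t]) ** 2)      # random walk: no change

ratio = np.mean(errs_model) / np.mean(errs_rw)
print(f"out-of-sample MSPE ratio (model / random walk): {ratio:.3f}")
```

A ratio below one would indicate that the regression beats the random walk benchmark, the comparison the abstract refers to.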
Abstract:
Time-lapse crosshole ground-penetrating radar (GPR) data, collected while infiltration occurs, can provide valuable information regarding the hydraulic properties of the unsaturated zone. In particular, the stochastic inversion of such data provides estimates of parameter uncertainties, which are necessary for hydrological prediction and decision making. Here, we investigate the effect of different infiltration conditions on the stochastic inversion of time-lapse, zero-offset-profile GPR data. Inversions are performed using a Bayesian Markov chain Monte Carlo methodology. Our results clearly indicate that considering data collected during a forced infiltration test helps to better refine soil hydraulic properties compared to data collected under natural infiltration conditions.
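A minimal sketch of the Metropolis-Hastings sampler at the heart of such a Markov chain Monte Carlo inversion, with a toy two-parameter forward model standing in for the authors' GPR petrophysical model (all functions and numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "forward model": predicted travel times as a function of two
# hypothetical hydraulic parameters theta = (a, b). The real GPR forward
# model is far more involved; this only demonstrates the sampling loop.
def forward(theta, depths):
    a, b = theta
    return a * depths + b * np.sqrt(depths)

depths = np.linspace(0.5, 5.0, 20)
theta_true = np.array([2.0, 0.5])
data = forward(theta_true, depths) + rng.normal(scale=0.1, size=depths.size)

def log_post(theta):
    # Gaussian likelihood plus a wide Gaussian prior on both parameters.
    resid = data - forward(theta, depths)
    return -0.5 * np.sum((resid / 0.1) ** 2) - 0.5 * np.sum((theta / 10.0) ** 2)

theta = np.array([1.0, 1.0])
lp = log_post(theta)
samples = []
for it in range(20000):
    prop = theta + rng.normal(scale=0.05, size=2)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])                      # discard burn-in
print("posterior mean:", post.mean(axis=0), " true:", theta_true)
```

The spread of the retained samples is what delivers the parameter uncertainties the abstract emphasizes.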
Abstract:
We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule), after allowing for Markov switching in policy maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro-area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.
Abstract:
Time-varying parameter (TVP) models have enjoyed increasing popularity in empirical macroeconomics. However, TVP models are parameter-rich and risk over-fitting unless the dimension of the model is small. Motivated by this concern, this paper proposes several time-varying dimension (TVD) models, in which the dimension of the model can change over time, allowing the model to automatically choose a more parsimonious TVP representation, or to switch between different parsimonious representations. Our TVD models all fall in the category of dynamic mixture models. We discuss the properties of these models and present methods for Bayesian inference. An application involving US inflation forecasting illustrates and compares the different TVD models. We find that our TVD approaches exhibit better forecasting performance than several standard benchmarks and shrink towards parsimonious specifications.
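To make the idea concrete, here is a minimal simulation (entirely hypothetical, not the paper's inference procedure) of a regression whose dimension changes over time: one coefficient drifts as a random walk while a second predictor enters and exits the model via an inclusion indicator:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a TVP regression in which the second predictor only "enters" the
# model in the middle of the sample: a time-varying dimension in miniature.
T = 200
x1, x2 = rng.normal(size=T), rng.normal(size=T)
beta1 = np.cumsum(rng.normal(scale=0.05, size=T))   # random-walk coefficient
s_t = (np.arange(T) >= 80) & (np.arange(T) < 150)   # inclusion indicator
beta2 = np.where(s_t, 1.5, 0.0)                     # active only when s_t = 1
y = beta1 * x1 + beta2 * x2 + rng.normal(scale=0.3, size=T)

# At each t, the "dimension" of the model is the number of active predictors.
dim = 1 + s_t.astype(int)
print("model dimension over time:", np.unique(dim, return_counts=True))
```

In the paper's dynamic mixture setup, indicators like s_t are latent and inferred; here they are fixed purely for illustration.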
Abstract:
In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. Because time-varying parameter models are parameter-rich and the time span of our data is relatively short, shrinkage is desirable. In constant coefficient regression models, the Bayesian Lasso is gaining popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be: (i) time-varying, (ii) constant over time, or (iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
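A standard Gibbs sampler for the constant-coefficient Bayesian Lasso (the Park-Casella hierarchy), which is the starting point the paper extends to TVP models, can be sketched as follows (simulated data; the penalty lam is fixed here, though it can also be sampled):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated data: only 2 of 8 coefficients are nonzero.
n, p = 100, 8
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -1.0, 0, 0, 0, 0, 0, 0], dtype=float)
y = X @ beta_true + rng.normal(scale=0.5, size=n)

lam = 1.0                       # fixed Lasso penalty
tau2 = np.ones(p)               # local shrinkage scales
sigma2 = 1.0
draws = []
for it in range(3000):
    # beta | rest ~ N(A^{-1} X'y, sigma^2 A^{-1}),  A = X'X + diag(1/tau2)
    A = X.T @ X + np.diag(1.0 / tau2)
    A_inv = np.linalg.inv(A)
    beta = rng.multivariate_normal(A_inv @ X.T @ y, sigma2 * A_inv)
    # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam^2 sigma^2 / beta_j^2), lam^2)
    mu = np.sqrt(lam**2 * sigma2 / beta**2)
    tau2 = 1.0 / rng.wald(mu, lam**2)
    # sigma^2 | rest ~ inverse gamma
    resid = y - X @ beta
    shape = (n - 1) / 2 + p / 2
    rate = resid @ resid / 2 + beta @ (beta / tau2) / 2
    sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)
    if it >= 1000:
        draws.append(beta)

print("posterior means:", np.round(np.mean(draws, axis=0), 2))
```

The small coefficients are pulled towards zero by the local scales tau2, which is the shrinkage behaviour the paper carries over to the time-varying setting.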
Abstract:
This paper extends the Nelson-Siegel linear factor model by developing a flexible macro-finance framework for modeling and forecasting the term structure of US interest rates. Our approach is robust to parameter uncertainty and structural change, as we consider instabilities in parameters and volatilities, and our model averaging method allows for investors' model uncertainty over time. Our time-varying parameter Nelson-Siegel Dynamic Model Averaging (NS-DMA) model predicts yields better than standard benchmarks and successfully captures plausible time-varying term premia in real time. The proposed model has significant in-sample and out-of-sample predictive power for excess bond returns, and the predictability is of economic value.
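Dynamic model averaging updates model probabilities recursively with a forgetting factor; a minimal sketch with two toy models and made-up data (the constants and forecasts are illustrative, not the NS-DMA specification):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Two hypothetical forecasting models; model 0 is accurate early on, model 1
# later. DMA tracks this by applying a forgetting factor alpha to the model
# probabilities before each predictive-likelihood update.
T, alpha = 100, 0.95
y = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(2.0, 1.0, 50)])
means = np.array([0.0, 2.0])          # each model's (fixed, toy) point forecast

w = np.array([0.5, 0.5])              # initial model probabilities
weights = []
for t in range(T):
    w = w**alpha / np.sum(w**alpha)   # forgetting step: discounts old evidence
    like = norm.pdf(y[t], loc=means, scale=1.0)
    w = w * like / np.sum(w * like)   # Bayesian update with predictive density
    weights.append(w.copy())

weights = np.array(weights)
print("P(model 1) at t=25, 75:", weights[25, 1].round(2), weights[75, 1].round(2))
```

The forgetting factor is what lets the averaging weights, and hence "investors' model uncertainty", evolve over time rather than converge once and for all.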
Abstract:
Bayesian model averaging (BMA) methods are regularly used to deal with model uncertainty in regression models. This paper shows how to introduce Bayesian model averaging methods into quantile regressions, allowing different predictors to affect different quantiles of the dependent variable. I show that quantile regression BMA methods can help reduce uncertainty regarding outcomes of future inflation by providing superior predictive densities compared to mean regression models with and without BMA.
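As background for the quantile side of the method, the following sketch fits separate quantiles of a toy heteroskedastic regression by minimizing the pinball (check) loss; the paper's Bayesian averaging over quantile regression models is not shown:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Toy data with heteroskedastic noise, so different quantiles have different
# slopes, which is exactly when quantile-specific predictors matter.
n = 500
x = rng.uniform(0, 2, n)
y = 1.0 + 0.5 * x + (0.2 + 0.8 * x) * rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

def pinball(beta, q):
    # Check loss: asymmetric absolute error that targets the q-th quantile.
    u = y - X @ beta
    return np.mean(np.where(u >= 0, q * u, (q - 1) * u))

for q in (0.1, 0.5, 0.9):
    fit = minimize(pinball, x0=np.zeros(2), args=(q,), method="Nelder-Mead")
    print(f"q={q}: intercept={fit.x[0]:.2f}, slope={fit.x[1]:.2f}")
```

The fitted slopes differ markedly across quantiles here, the situation in which averaging over predictor sets separately for each quantile pays off.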
Abstract:
There is a vast literature that specifies Bayesian shrinkage priors for vector autoregressions (VARs) of possibly large dimensions. In this paper I argue that many of these priors are not appropriate for multi-country settings, which motivates me to develop priors for panel VARs (PVARs). The parametric and semi-parametric priors I suggest not only perform valuable shrinkage in large dimensions, but also allow for soft clustering of variables or countries that are homogeneous. I discuss the implications of these new priors for modelling interdependencies and heterogeneities among different countries in a panel VAR setting. Monte Carlo evidence and an empirical forecasting exercise show clear and important gains from the new priors compared to existing popular priors for VARs and PVARs.
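For contrast with the proposed PVAR priors, here is a sketch of the kind of existing popular shrinkage prior the abstract benchmarks against: a Minnesota-style prior variance matrix for one VAR equation, with illustrative tightness constants:

```python
import numpy as np

# A minimal Minnesota-style prior variance matrix for one equation of a VAR
# with n variables and p lags: own lags get looser variances than cross lags,
# and all variances tighten with lag length. Constants are illustrative.
def minnesota_prior_var(n, p, own=0.2, cross=0.1, sigma=None, eq=0):
    sigma = np.ones(n) if sigma is None else sigma   # residual scale proxies
    v = np.zeros(n * p)
    for lag in range(1, p + 1):
        for j in range(n):
            k = (lag - 1) * n + j
            if j == eq:
                v[k] = (own / lag) ** 2               # own-lag shrinkage
            else:
                v[k] = (cross / lag * sigma[eq] / sigma[j]) ** 2
    return np.diag(v)

print(np.diag(minnesota_prior_var(n=3, p=2)).round(4))
```

Priors of this type shrink each country's VAR in isolation; the paper's point is that multi-country settings additionally call for clustering homogeneous units.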
Abstract:
This paper investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement, which is mainly attributed to shocks to non-fundamentals. Global fundamentals, especially global inflation, affect yields through a ‘policy channel’ and a ‘risk compensation channel’, but the effects through the two channels offset each other. This evidence explains the unsatisfactory performance of fundamentals-driven term structure models. Our approach delineates asymmetric spillovers in global bond markets connected to diverging monetary policies. The proposed model is robust, as the identified factors have significant explanatory power for excess returns. The finding that global inflation uncertainty is useful in explaining realized excess returns does not rule out regime change as a source of non-fundamental fluctuations.
Abstract:
Most of the literature estimating DSGE models for monetary policy analysis ignores fiscal policy and assumes that monetary policy follows a simple rule. In this paper we allow both fiscal and monetary policy to be described by rules and/or optimal policy which are subject to switches over time. We find that US monetary and fiscal policy have often been in conflict, and that it is relatively rare that we observe the benign policy combination of a conservative monetary policy paired with a debt-stabilizing fiscal policy. In a series of counterfactuals, a conservative central bank following a time-consistent fiscal policy leader would come close to mimicking the cooperative Ramsey policy. However, if policy makers cannot credibly commit to such a regime, monetary accommodation of the prevailing fiscal regime may actually be welfare improving.
Abstract:
This paper provides a Bayesian parametric framework for tackling the accessibility problem across space in urban theory. Adopting continuous variables in a probabilistic setting, we are able to relate the distribution density to Kendall's tau index and replicate the general issues related to the role of proximity in a more general context. In addition, by referring to the Beta and Gamma distributions, we are able to introduce a differentiation feature in each spatial unit without incurring any a priori definition of territorial units. We also provide an empirical application of our theoretical setting, studying the density distribution of the population across Massachusetts.
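Kendall's tau itself is straightforward to compute; a small illustration on hypothetical spatial units (the variable names and data-generating process are invented for this sketch):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(6)

# Illustrative only: Kendall's tau between a distance-to-center measure and
# population density across hypothetical spatial units, the kind of rank
# association the framework ties to the density distribution.
dist = rng.uniform(0, 30, 200)                     # km from an urban center
density = np.exp(3.0 - 0.1 * dist + rng.normal(scale=0.3, size=200))
tau, pval = kendalltau(dist, density)
print(f"Kendall's tau = {tau:.2f} (p = {pval:.1e})")
```

A strongly negative tau, as produced here, is the rank-based signature of density falling with distance that a proximity-driven accessibility model would predict.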
Abstract:
The Athlete Biological Passport (ABP) is an individual electronic document that collects data regarding a specific athlete that are useful in differentiating between natural physiological variations of selected biomarkers and deviations caused by artificial manipulations. A subsidiary of the endocrine module of the ABP, here called the Athlete Steroidal Passport (ASP), collects data on markers of an altered metabolism of endogenous steroidal hormones measured in urine samples. The ASP aims to identify not only doping with anabolic-androgenic steroids, but also most indirect steroid doping strategies, such as doping with estrogen receptor antagonists and aromatase inhibitors. The development of specific markers of steroid doping, the use of the athlete's previous measurements to define individual limits (with the athlete becoming his or her own reference), the inclusion of heterogeneous factors such as the athlete's UDP-glucuronosyltransferase 2B17 (UGT2B17) genotype, knowledge of potentially confounding effects such as heavy alcohol consumption, the development of an external quality control system to control analytical uncertainty, and finally the use of Bayesian inferential methods to evaluate the value of indirect evidence have made the ASP a valuable alternative for deterring steroid doping in elite sports. The ASP can be used to target athletes for gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) testing, to temporarily withdraw the athlete from competition when an abnormality has been detected, and ultimately to lead to an anti-doping infraction if that abnormality cannot be explained by a medical condition. Although the ASP has been developed primarily to ensure fairness in elite sports, its application in endocrinology for clinical purposes is straightforward in an evidence-based medicine paradigm.
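The Bayesian core of the ASP, individual reference limits that narrow as an athlete's own measurements accumulate, can be sketched with a simple normal-normal model (all prior values are illustrative assumptions; the ABP's adaptive model is richer):

```python
import numpy as np
from scipy.stats import norm

# Normal-normal updating of an athlete's individual limits for a steroid
# marker (e.g., a urinary steroid ratio on a log scale). Population prior
# values are illustrative, not WADA's.
prior_mean, prior_var = 0.0, 0.5**2      # population distribution of the mean
obs_var = 0.15**2                        # within-athlete analytical variation

obs = np.array([-0.30, -0.25, -0.35])    # athlete's own past measurements
n = obs.size
post_var = 1.0 / (1.0 / prior_var + n / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)

# 99% predictive limits for the next measurement: they centre on the
# athlete's own level and narrow as his or her history accumulates.
pred_sd = np.sqrt(post_var + obs_var)
lo, hi = norm.ppf([0.005, 0.995], loc=post_mean, scale=pred_sd)
print(f"individual 99% limits: [{lo:.2f}, {hi:.2f}]")
```

With no history the limits default to wide population bounds; with three of the athlete's own results they already sit tightly around that individual's baseline, which is what "becoming his or her own reference" means in practice.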
Abstract:
What genotype should the scientist specify when conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? In answering this question, the scientist makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering it for one locus. This framework combines the probability distribution describing the uncertainty over the possible genotypes of the trace donor with a loss function describing the scientist's preferences concerning the false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele x_i on a single electropherogram, and (2) the case of observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points at which the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not rely on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
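A minimal sketch of the decision rule for case (1), with an invented drop-out curve, illustrative losses, and a deliberately simplified treatment of the drop-out probability (the paper's framework is more general):

```python
import numpy as np

# One peak for allele x_i is observed at a locus. Candidate designations:
# homozygote (x_i, x_i), or heterozygote (x_i, Q) with a dropped-out partner
# allele. All numbers below are illustrative, not the paper's.
def p_dropout(height, h50=150.0, slope=0.02):
    # Hypothetical drop-out probability, decreasing in observed peak height.
    return 1.0 / (1.0 + np.exp(slope * (height - h50)))

L_false_excl = 10.0   # loss if we designate (x_i, x_i) but the donor is het
L_false_incl = 1.0    # loss if we designate (x_i, Q) but the donor is homo

# Treat p_dropout as the probability that a partner allele truly dropped out.
for height in (80, 150, 300):
    d = p_dropout(height)
    el_homo = d * L_false_excl          # expected loss of designating (x_i, x_i)
    el_het = (1.0 - d) * L_false_incl   # expected loss of designating (x_i, Q)
    choice = "(x_i, x_i)" if el_homo < el_het else "(x_i, Q)"
    print(f"height={height}: P(drop-out)={d:.2f}, designate {choice}")
```

Under these losses the designation switches where d = L_false_incl / (L_false_excl + L_false_incl), so the peak-height threshold moves with the loss ratio and the drop-out curve, mirroring the abstract's conclusion that no single threshold fits all alleles, loci and laboratories.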
Abstract:
Background: Retrospective analyses suggest that personalized, PK-based dosage might be useful for imatinib, as treatment response correlates with trough concentrations (Cmin) in cancer patients. Our objectives were to improve the interpretation of randomly measured concentrations and to confirm the method's efficiency before evaluating the clinical usefulness of systematic PK-based dosage in chronic myeloid leukemia patients. Methods and Results: A Bayesian method was validated for the prediction of individual Cmin on the basis of a single random observation, and was applied in a prospective multicenter randomized controlled clinical trial. 28 out of 56 patients were enrolled in the systematic dosage individualization arm and had 44 follow-up visits (their clinical follow-up is ongoing). PK-based dose adjustments were proposed at the 39% of visits where the predicted Cmin was significantly away from the target (1000 ng/ml). Recommendations were taken up by physicians in 57% of cases; patients were considered non-compliant in 27%. Median Cmin at study inclusion was 754 ng/ml and differed significantly from the target (p=0.02, Wilcoxon test). On follow-up, Cmin was 984 ng/ml (p=0.82) in the compliant group. The coefficient of variation (CV) decreased from 46% to 27% (p=0.02, F-test). Conclusion: PK-based (Bayesian) dosage adjustment is able to bring individual drug exposure closer to a given therapeutic target. Its influence on therapeutic response remains to be evaluated.
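A minimal sketch of this kind of Bayesian trough prediction: maximum a posteriori estimation of an individual's clearance and volume from a single randomly timed level, under a steady-state one-compartment oral model with illustrative (not published) population values:

```python
import numpy as np
from scipy.optimize import minimize

# Dose in micrograms so concentrations come out in ug/L (= ng/ml).
dose, tau, ka = 400_000, 24.0, 0.6            # 400 mg, dosing interval h, 1/h
pop_cl, pop_v = 14.0, 350.0                   # population CL (L/h) and V (L)
omega_cl, omega_v = 0.35, 0.30                # between-patient SD (log scale)
sigma_prop = 0.25                             # proportional residual error

def conc(t, cl, v):
    # Steady-state one-compartment oral concentration at time t after dose.
    ke = cl / v
    return (dose / v) * (ka / (ka - ke)) * (
        np.exp(-ke * t) / (1 - np.exp(-ke * tau))
        - np.exp(-ka * t) / (1 - np.exp(-ka * tau)))

def neg_log_post(log_params, t_obs, c_obs):
    cl, v = np.exp(log_params)
    pred = conc(t_obs, cl, v)
    # Proportional-error likelihood plus lognormal priors on CL and V.
    nll = 0.5 * ((c_obs - pred) / (sigma_prop * pred)) ** 2
    nlp = (0.5 * ((log_params[0] - np.log(pop_cl)) / omega_cl) ** 2
           + 0.5 * ((log_params[1] - np.log(pop_v)) / omega_v) ** 2)
    return nll + nlp

# Single observation: 620 ng/ml drawn 6 h after the last dose (hypothetical).
fit = minimize(neg_log_post, np.log([pop_cl, pop_v]), args=(6.0, 620.0))
cl_hat, v_hat = np.exp(fit.x)
print(f"predicted Cmin: {conc(tau, cl_hat, v_hat):.0f} ng/ml (target 1000)")
```

The priors keep the individual estimates anchored to the population when only one level is available, which is what allows a trough prediction from a single random sample.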
Abstract:
As the evolutionary significance of hybridization is largely dictated by its extent beyond the first generation, we broadly surveyed patterns of introgression across a sympatric zone of two native poplars (Populus balsamifera, Populus deltoides) in Quebec, Canada, within which the European exotic Populus nigra and its hybrids have been extensively planted since the 1800s. Single nucleotide polymorphisms (SNPs) that appeared fixed within each species were identified by DNA sequencing of pools of pure individuals. Thirty-five of these diagnostic SNPs were employed in a high-throughput assay that genotyped 635 trees of different age classes, sampled from 15 sites with various degrees of anthropogenic disturbance. The degree of admixture within the sampled trees was then assessed through Bayesian clustering of genotypes. Hybrids were present in seven of the populations, with 2.4% of all sampled trees showing spontaneous admixture. Sites with hybrids were significantly more disturbed than pure stands, and the hybrids comprised both immature juveniles and trees of reproductive age. All three possible F1s were detected. Advanced-generation hybrids were consistently biased towards P. balsamifera, regardless of whether hybridization had occurred with P. deltoides or P. nigra. Gene exchange between P. deltoides and P. nigra was not detected beyond the F1 generation; however, the detection of a trihybrid demonstrates that even this apparent reproductive isolation does not necessarily result in an evolutionary dead end. Collectively, these results demonstrate the natural fertility of hybrid poplars and suggest that introduced genes could potentially affect the genetic integrity of native trees, similar to what arises from introgression between native species.