976 results for Bayesian models
Abstract:
This paper considers the lag structures of dynamic models in economics, arguing that the standard approach is too simple to capture the complexity of actual lag structures arising, for example, from production and investment decisions. It is argued that recent (1990s) developments in the theory of functional differential equations provide a means to analyse models with generalised lag structures. The stability and asymptotic stability of two growth models with generalised lag structures are analysed. The paper concludes with some speculative discussion of time-varying parameters.
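To make the idea of a generalised lag concrete, a minimal illustration (not the paper's exact specification) is a one-sector growth model in which investment becomes productive only after a gestation lag, giving a functional (delay) differential equation, and more generally a distributed lag with kernel w:

```latex
\dot K(t) = s\,f\bigl(K(t-\tau)\bigr) - \delta K(t),
\qquad\text{or}\qquad
\dot K(t) = s\int_0^{\infty} f\bigl(K(t-u)\bigr)\,w(u)\,du - \delta K(t).
```

Linearising around a steady state $K^*$, local asymptotic stability hinges on the roots of the characteristic equation $\lambda + \delta = s\,f'(K^*)\int_0^\infty e^{-\lambda u} w(u)\,du$, which is the kind of condition the theory of functional differential equations delivers.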
Abstract:
Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods- and human-capital-producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
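For readers unfamiliar with the device, a generic VECM for the measurement errors $u_t$ (the gaps between the observed series and the model-implied ones) takes the form below; the paper's exact lag order and restrictions are not reproduced here:

```latex
\Delta u_t = \alpha\,\beta' u_{t-1} + \sum_{i=1}^{p-1}\Gamma_i\,\Delta u_{t-i} + \varepsilon_t,
\qquad \varepsilon_t \sim \mathcal{N}(0,\Sigma),
```

where $\beta$ contains the cointegrating vectors picking up trends the growth model misses and $\alpha$ the adjustment loadings; the VECM parameters then enter the likelihood used for Bayesian estimation of the posterior.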
Abstract:
We present a stylized intertemporal forward-looking model that accommodates key regional economic features, an area where the literature is not well developed. The main difference from standard applications is the role of saving and its implications for the balance of payments. While maintaining dynamic forward-looking behaviour for agents, the rate of private saving is exogenously determined, and so no neoclassical financial adjustment is needed. We also focus on the similarities and differences between myopic and forward-looking models, highlighting the divergences among the main adjustment equations and the resulting simulation outcomes.
Abstract:
This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
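As a point of reference (not the algorithm proposed in the paper), one standard calculation in which the correlation structure of the predictors enters directly is model enumeration under Zellner's g-prior, where each model's weight depends on X'X through its R-squared. A hypothetical sketch, with illustrative function names:

```python
# Hypothetical sketch: Bayesian variable selection by model enumeration
# under Zellner's g-prior (Liang et al., 2008).  Generic illustration only,
# not the specific algorithm developed in the paper.
import itertools
import numpy as np

def log_bayes_factor(y, X, subset, g):
    """Log Bayes factor of the model using `subset` of columns of X
    against the intercept-only model, under Zellner's g-prior."""
    n, k = len(y), len(subset)
    if k == 0:
        return 0.0
    Z = np.column_stack([np.ones(n), X[:, subset]])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

def posterior_model_probs(y, X, g=None):
    """Enumerate all predictor subsets (feasible only for a handful of
    predictors) and return posterior probabilities under a uniform model prior."""
    n, p = X.shape
    g = n if g is None else g          # unit-information prior
    subsets = [list(s) for r in range(p + 1) for s in itertools.combinations(range(p), r)]
    logbf = np.array([log_bayes_factor(y, X, s, g) for s in subsets])
    w = np.exp(logbf - logbf.max())
    return subsets, w / w.sum()
```

With many highly correlated predictors, exhaustive enumeration is infeasible and stochastic model search is used instead, which is where the algorithmic choices studied in the paper matter.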
Abstract:
Faced with the problem of pricing complex contingent claims, an investor seeks to make his valuations robust to model uncertainty. We construct a notion of a model-uncertainty-induced utility function and show that model uncertainty increases the investor's effective risk aversion. Using the model-uncertainty-induced utility function, we extend the "No Good Deals" methodology of Cochrane and Saá-Requejo [2000] to compute lower and upper good deal bounds in the presence of model uncertainty. We illustrate the methodology using some numerical examples.
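For context, the baseline Cochrane and Saá-Requejo good-deal upper bound (without model uncertainty) solves the problem below; the paper extends this setup using the model-uncertainty-induced utility function:

```latex
\overline{C} \;=\; \max_{m}\; E[m\,x^{c}]
\quad\text{s.t.}\quad
E[m\,x^{i}] = p^{i}\ \ \forall i,\qquad
m \ge 0,\qquad
\sigma(m) \le \frac{h}{R^{f}},
```

where $x^c$ is the payoff being priced, $(x^i, p^i)$ are the basis assets, and $h$ caps the Sharpe ratio any portfolio may attain; the lower bound replaces the max with a min.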
Abstract:
AIMS/HYPOTHESIS: MicroRNAs are key regulators of gene expression involved in health and disease. The goal of our study was to investigate the global changes in beta cell microRNA expression occurring in two models of obesity-associated type 2 diabetes and to assess their potential contribution to the development of the disease. METHODS: MicroRNA profiling of pancreatic islets isolated from prediabetic and diabetic db/db mice and from mice fed a high-fat diet was performed by microarray. The functional impact of the changes in microRNA expression was assessed by reproducing them in vitro in primary rat and human beta cells. RESULTS: MicroRNAs differentially expressed in both models of obesity-associated type 2 diabetes fall into two distinct categories. A group including miR-132, miR-184 and miR-338-3p displays expression changes occurring long before the onset of diabetes. Functional studies indicate that these expression changes have positive effects on beta cell activities and mass. In contrast, modifications in the levels of miR-34a, miR-146a, miR-199a-3p, miR-203, miR-210 and miR-383 primarily occur in diabetic mice and result in increased beta cell apoptosis. These results indicate that obesity and insulin resistance trigger adaptations in the levels of particular microRNAs to allow sustained beta cell function, and that additional microRNA deregulation negatively impacting on insulin-secreting cells may cause beta cell demise and diabetes manifestation. CONCLUSIONS/INTERPRETATION: We propose that maintenance of blood glucose homeostasis or progression toward glucose intolerance and type 2 diabetes may be determined by the balance between expression changes of particular microRNAs.
Abstract:
This paper investigates the usefulness of switching Gaussian state space models as a tool for implementing dynamic model selection (DMS) or averaging (DMA) in time-varying parameter regression models. DMS methods allow for model switching, where a different model can be chosen at each point in time. Thus, they allow the explanatory variables in the time-varying parameter regression model to change over time. DMA carries out model averaging in a time-varying manner. We compare our exact approach to DMA/DMS with a popular existing procedure which relies on the use of forgetting factor approximations. In an empirical application, we use DMS to select different predictors for forecasting inflation. We also compare different ways of implementing DMA/DMS and investigate whether they lead to similar results.
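For comparison, a hypothetical sketch of the forgetting-factor approximation (in the spirit of the existing procedure the exact approach is benchmarked against) is given below; the two forgetting factors `alpha` (models) and `lam` (coefficients) and the noise variance `v` are illustrative choices, not values from the paper:

```python
# Sketch of forgetting-factor DMA/DMS for TVP regressions (Raftery-style).
import numpy as np
from scipy.stats import norm

def dma_forgetting(y, X_list, alpha=0.99, lam=0.99, v=1.0):
    """y: (T,) target; X_list: list of (T, k_j) regressor sets, one per model.
    Returns model probabilities over time and one-step-ahead DMA forecasts."""
    T, K = len(y), len(X_list)
    probs = np.full(K, 1.0 / K)                  # initial model probabilities
    states = [(np.zeros(X.shape[1]), np.eye(X.shape[1])) for X in X_list]
    prob_path, yhat = np.zeros((T, K)), np.zeros(T)
    for t in range(T):
        pred = probs ** alpha                    # model forgetting step
        pred /= pred.sum()
        like = np.zeros(K)
        for j, X in enumerate(X_list):
            m, P = states[j]
            x = X[t]
            P = P / lam                          # coefficient forgetting step
            f, q = x @ m, x @ P @ x + v          # predictive mean / variance
            like[j] = norm.pdf(y[t], f, np.sqrt(q))
            kal = P @ x / q                      # Kalman gain
            states[j] = (m + kal * (y[t] - f), P - np.outer(kal, x @ P))
            yhat[t] += pred[j] * f               # DMA forecast (argmax gives DMS)
        probs = pred * (like + 1e-12)            # posterior model probabilities
        probs /= probs.sum()
        prob_path[t] = probs
    return prob_path, yhat
```

The exact switching Gaussian state space approach replaces these forgetting-factor approximations with a fully specified model for coefficient and model change.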
Abstract:
This paper discusses the challenges faced by the empirical macroeconomist and methods for surmounting them. These challenges arise because macroeconometric models potentially include a large number of variables and allow for time variation in parameters. These considerations lead to models which have a large number of parameters to estimate relative to the number of observations. A wide range of approaches which aim to overcome the resulting problems are surveyed. We stress the related themes of prior shrinkage, model averaging and model selection. Subsequently, we consider a particular modelling approach in detail: the use of dynamic model selection methods with large TVP-VARs. A forecasting exercise involving a large US macroeconomic data set illustrates the practicality and empirical success of our approach.
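Schematically, the large TVP-VARs referred to take the generic form below (this is only the standard setup, not the paper's full specification):

```latex
y_t = Z_t\,\beta_t + \varepsilon_t,\qquad
\beta_t = \beta_{t-1} + \eta_t,\qquad
\varepsilon_t \sim \mathcal{N}(0,\Sigma_t),\quad \eta_t \sim \mathcal{N}(0,Q_t),
```

where $y_t$ stacks the macroeconomic variables and $Z_t$ contains an intercept and lags of $y_t$. With many variables the dimension of $\beta_t$ quickly exceeds the number of observations, which is why shrinkage priors (e.g. Minnesota-type tightness increasing with lag length), model averaging over restricted specifications, or dynamic model selection become necessary.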
Abstract:
Most of the literature estimating DSGE models for monetary policy analysis assumes that policy follows a simple rule. In this paper we allow policy to be described by various forms of optimal policy: commitment, discretion and quasi-commitment. We find that, even after allowing for Markov switching in shock variances, the inflation target and/or rule parameters, the data-preferred description of policy is that the US Fed operates under discretion with a marked increase in conservatism after the 1970s. Parameter estimates are similar to those obtained under simple rules, except that the degree of habits is significantly lower and the prevalence of cost-push shocks greater. Moreover, we find that the greatest welfare gains from the ‘Great Moderation’ arose from the reduction in the variances of shocks hitting the economy, rather than from increased inflation aversion. However, much of the high inflation of the 1970s could have been avoided had policy makers been able to commit, even without adopting stronger anti-inflation objectives. More recently, the Fed appears to have temporarily relaxed policy following the 1987 stock market crash, and has lost, without regaining, its post-Volcker conservatism following the bursting of the dot-com bubble in 2000.
Abstract:
An expanding literature articulates the view that Taylor rules are helpful in predicting exchange rates. In a changing world, however, Taylor rule parameters may be subject to structural instabilities, for example during the Global Financial Crisis. This paper forecasts exchange rates using such Taylor rules with Time Varying Parameters (TVP) estimated by Bayesian methods. In core out-of-sample results, we improve upon a random walk benchmark for at least half, and for as many as eight out of ten, of the currencies considered. This contrasts with a constant parameter Taylor rule model that yields a more limited improvement upon the benchmark. In further results, Purchasing Power Parity and Uncovered Interest Rate Parity TVP models beat a random walk benchmark, implying that our methods have some generality in exchange rate prediction.
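A stylized single-equation version of a Taylor-rule TVP forecasting regression of the kind described (notation is illustrative, not the paper's exact specification) is:

```latex
\Delta s_{t+1} = \beta_{0,t} + \beta_{1,t}\,(\pi_t - \pi_t^{*}) + \beta_{2,t}\,(\tilde y_t - \tilde y_t^{*}) + \varepsilon_{t+1},
\qquad \beta_t = \beta_{t-1} + \eta_t,
```

where $s_t$ is the log exchange rate, $\pi_t$ and $\tilde y_t$ are home inflation and the output gap, starred variables are their foreign counterparts, and the random-walk law of motion for $\beta_t$ is what the Bayesian TVP estimation tracks period by period.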
Abstract:
The paper considers the use of artificial regression in calculating different types of score test when the log
Abstract:
We analyse the role of time-variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in coefficient estimation, and uncertainty about the precise degree of coefficient variability, as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
Abstract:
Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than that obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
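To fix ideas about the first of these methods, here is a minimal value-function-iteration sketch for a plain stochastic growth problem; the paper's environment (a fiscal authority setting spending and income taxes without commitment, possibly risk-sensitive) is considerably richer, so treat this only as an illustration of the algorithm class:

```python
# Minimal, hypothetical value function iteration on a stochastic growth model.
import numpy as np

def vfi(alpha=0.36, beta=0.95, delta=0.1, nk=200, tol=1e-7):
    z_grid = np.array([0.95, 1.05])                     # productivity states
    P = np.array([[0.9, 0.1], [0.1, 0.9]])              # Markov transition matrix
    k_grid = np.linspace(1.0, 10.0, nk)                 # capital grid
    V = np.zeros((len(z_grid), nk))
    while True:
        EV = P @ V                                      # expected continuation value
        V_new = np.empty_like(V)
        policy = np.empty_like(V, dtype=int)
        for iz, z in enumerate(z_grid):
            # consumption for every (k, k') pair; infeasible choices get -inf utility
            c = (z * k_grid[:, None] ** alpha
                 + (1 - delta) * k_grid[:, None] - k_grid[None, :])
            util = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)
            obj = util + beta * EV[iz][None, :]
            policy[iz] = obj.argmax(axis=1)             # optimal next-period capital
            V_new[iz] = obj.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return k_grid, V_new, policy
        V = V_new
```

Markov-perfect policy problems add a fixed-point layer on top of this recursion, since the government's value function depends on how future governments behave, but the basic iterate-until-convergence structure is the same.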
Abstract:
Bayesian model averaging (BMA) methods are regularly used to deal with model uncertainty in regression models. This paper shows how to introduce Bayesian model averaging methods into quantile regressions, allowing different predictors to affect different quantiles of the dependent variable. I show that quantile regression BMA methods can help reduce uncertainty regarding outcomes of future inflation by providing superior predictive densities compared to mean regression models with and without BMA.
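The standard device in the Bayesian quantile regression literature (the paper's exact prior and weighting scheme are not reproduced here) replaces the Gaussian likelihood with an asymmetric Laplace working likelihood built from the check function, and BMA then averages predictive densities across models:

```latex
p(y_t \mid x_t, \beta_q) \propto \exp\!\bigl\{-\rho_q(y_t - x_t'\beta_q)\bigr\},\qquad
\rho_q(u) = u\,\bigl(q - \mathbf{1}\{u<0\}\bigr),
```

```latex
p(y_{T+1}\mid \text{data}) = \sum_{k} p(y_{T+1}\mid M_k, \text{data})\; p(M_k \mid \text{data}),
```

so that a different set of models $M_k$ can receive high weight at different quantiles $q$.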
Abstract:
This paper investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in bond yields of seven advanced economies is due to global co-movement, which is mainly attributed to shocks to non-fundamentals. Global fundamentals, especially global inflation, affect yields through a ‘policy channel’ and a ‘risk compensation channel’, but the effects through the two channels offset each other. This evidence explains the unsatisfactory performance of fundamentals-driven term structure models. Our approach delineates asymmetric spillovers in global bond markets connected to diverging monetary policies. The proposed model is robust, as the identified factors have significant explanatory power for excess returns. The finding that global inflation uncertainty is useful in explaining realized excess returns does not rule out regime change as a source of non-fundamental fluctuations.
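A generic two-level (hierarchical) factor structure of the kind described, with purely illustrative notation, is:

```latex
y_{i,t}(\tau) = a_i(\tau) + b_i(\tau)'\,f_{i,t} + e_{i,t}(\tau),\qquad
f_{i,t} = \Lambda_i\,g_t + u_{i,t},\qquad
g_t = \Phi\,g_{t-1} + \Gamma\,m_t + w_t,
```

where $y_{i,t}(\tau)$ is the maturity-$\tau$ yield in country $i$, $f_{i,t}$ are country-level factors, $g_t$ are global factors, and $m_t$ stands in for observed global fundamentals such as global inflation; the share of yield variation explained by $g_t$ is what the global co-movement result refers to.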