962 results for Bayesian adaptive design
Abstract:
Mucosal surfaces represent the main sites in which environmental microorganisms and antigens interact with the host. Sentinel cells, including epithelial cells, lumenal macrophages, and intraepithelial dendritic cells, continuously sense the environment and coordinate defenses for the protection of mucosal tissues. The mucosal epithelial cells are crucial actors in coordinating defenses. They sense the outside world and respond to environmental signals by releasing chemokines and cytokines that recruit inflammatory and immune cells to control potential infectious agents and to attract cells able to trigger immune responses. Among immune cells, dendritic cells (DC) play a key role in controlling adaptive immune responses, due to their capacity to internalize foreign materials and to present antigens to naive T and B lymphocytes, locally or in draining organized lymphoid tissues. Immune cells recruited in epithelial tissues can, in turn, act upon the epithelial cells and change their phenotype in a process referred to as epithelial metaplasia.
Abstract:
There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time-varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows for the cointegrating space to evolve over time in a manner comparable to the random walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
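The random-walk state-space machinery that TVP-VARs build on can be illustrated with a minimal sketch (a single time-varying coefficient, known variances, and synthetic data are assumptions made purely for illustration; the paper's cointegration extension is not reproduced here):

    import numpy as np

    rng = np.random.default_rng(0)
    T = 300
    sig_eps, sig_eta = 0.5, 0.05      # measurement and state noise std. devs (assumed known)

    # Simulate y_t = x_t * beta_t + eps_t with beta_t following a random walk
    x = rng.normal(size=T)
    beta = np.cumsum(sig_eta * rng.normal(size=T)) + 1.0
    y = x * beta + sig_eps * rng.normal(size=T)

    # Kalman filter for the time-varying coefficient
    b_filt = np.zeros(T)
    b, P = 0.0, 10.0                  # vague initial state
    for t in range(T):
        P = P + sig_eta**2            # predict: random-walk state equation
        S = x[t]**2 * P + sig_eps**2  # one-step-ahead forecast variance
        K = P * x[t] / S              # Kalman gain
        b = b + K * (y[t] - x[t] * b) # update the coefficient estimate
        P = (1.0 - K * x[t]) * P
        b_filt[t] = b

    print("mean absolute tracking error:", np.mean(np.abs(b_filt[50:] - beta[50:])))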
Abstract:
This paper uses an infinite hidden Markov model (IHMM) to analyze U.S. inflation dynamics with a particular focus on the persistence of inflation. The IHMM is a Bayesian nonparametric approach to modeling structural breaks. It allows for an unknown number of breakpoints and is a flexible and attractive alternative to existing methods. We find a clear structural break during the recent financial crisis; prior to that, inflation persistence was high and fairly constant.
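One way to build intuition for how a Bayesian nonparametric prior accommodates an unknown number of regimes is the Dirichlet-process stick-breaking construction. The sketch below is only that intuition: the truncation level and concentration parameter are arbitrary assumptions, and the hierarchical transition structure of the IHMM itself is not reproduced.

    import numpy as np

    rng = np.random.default_rng(1)
    alpha, K_trunc, T = 2.0, 50, 500   # concentration, truncation level, sample size (assumed)

    # Stick-breaking weights: beta_k ~ Beta(1, alpha), w_k = beta_k * prod_{j<k}(1 - beta_j)
    betas = rng.beta(1.0, alpha, size=K_trunc)
    w = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    w /= w.sum()                       # renormalize the truncated weights

    # Draw regime labels; the number of occupied regimes is random, not fixed in advance
    labels = rng.choice(K_trunc, size=T, p=w)
    print("distinct regimes occupied in", T, "draws:", len(np.unique(labels)))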
Abstract:
This paper studies the implications for monetary policy of heterogeneous expectations in a New Keynesian model. The assumption of rational expectations is replaced with parsimonious forecasting models where agents select between predictors that are underparameterized. In a Misspecification Equilibrium, agents only select the best-performing statistical models. We demonstrate that, even when monetary policy rules satisfy the Taylor principle by adjusting nominal interest rates more than one-for-one with inflation, there may exist equilibria with Intrinsic Heterogeneity. Under certain conditions, there may exist multiple Misspecification Equilibria. We show that these findings have important implications for business cycle dynamics and for the design of monetary policy.
Abstract:
These notes try to clarify some discussions on the formulation of individual intertemporal behavior under adaptive learning in representative agent models. First, we discuss two suggested approaches and related issues in the context of a simple consumption-saving model. Second, we show that the analysis of learning in the New Keynesian monetary policy model based on “Euler equations” provides a consistent and valid approach.
Abstract:
In recent years there has been increasing concern about the identification of parameters in dynamic stochastic general equilibrium (DSGE) models. Given the structure of DSGE models it may be difficult to determine whether a parameter is identified. For the researcher using Bayesian methods, a lack of identification may not be evident since the posterior of a parameter of interest may differ from its prior even if the parameter is unidentified. We show that this can be the case even if the priors assumed on the structural parameters are independent. We suggest two Bayesian identification indicators that do not suffer from this difficulty and are relatively easy to compute. The first applies to DSGE models where the parameters can be partitioned into those that are known to be identified and the rest where it is not known whether they are identified. In such cases the marginal posterior of an unidentified parameter will equal the posterior expectation of the prior for that parameter conditional on the identified parameters. The second indicator is more generally applicable and considers the rate at which the posterior precision gets updated as the sample size (T) is increased. For identified parameters the posterior precision rises with T, whilst for an unidentified parameter its posterior precision may be updated but its rate of update will be slower than T. This result assumes that the identified parameters are √T-consistent, but similar differential rates of update for identified and unidentified parameters can be established in the case of super-consistent estimators. These results are illustrated by means of simple DSGE models.
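The logic of the second indicator can be checked analytically in a toy model with y_i ~ N(a + b, 1), so that only the sum a + b is identified. With independent N(0, 1) priors (an assumption made purely for this illustration), the posterior precision of a + b grows linearly in T while that of a alone is updated but plateaus:

    import numpy as np

    # Toy model: y_i ~ N(a + b, 1); only the sum a + b is identified.
    # Independent priors a, b ~ N(0, 1).  Posterior precision matrix is
    # prior precision + likelihood information:  I_2 + T * [[1, 1], [1, 1]].
    for T in [10, 100, 1000, 10000]:
        H = np.eye(2) + T * np.ones((2, 2))                  # posterior precision of (a, b)
        V = np.linalg.inv(H)                                 # posterior covariance
        prec_a = 1.0 / V[0, 0]                               # precision of the unidentified a
        prec_sum = 1.0 / (V[0, 0] + V[1, 1] + 2 * V[0, 1])   # precision of the identified a + b
        print(f"T={T:>6}  precision(a)={prec_a:8.2f}   precision(a+b)={prec_sum:10.2f}")
    # precision(a) is updated but settles near 2; precision(a+b) grows at rate T.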
Abstract:
In an effort to meet its obligations under the Kyoto Protocol, in 2005 the European Union introduced a cap-and-trade scheme where mandated installations are allocated permits to emit CO2. Financial markets have developed that allow companies to trade these carbon permits. For the EU to achieve reductions in CO2 emissions at a minimum cost, it is necessary that companies make appropriate investments and policymakers design optimal policies. In an effort to clarify the workings of the carbon market, several recent papers have attempted to statistically model it. However, the European carbon market (EU ETS) has many institutional features that potentially impact on daily carbon prices (and associated financial futures). As a consequence, the carbon market has properties that are quite different from conventional financial assets traded in mature markets. In this paper, we use dynamic model averaging (DMA) in order to forecast in this newly-developing market. DMA is a recently-developed statistical method which has three advantages over conventional approaches. First, it allows the coefficients on the predictors in a forecasting model to change over time. Second, it allows for the entire forecasting model to change over time. Third, it surmounts statistical problems which arise from the large number of potential predictors that can explain carbon prices. Our empirical results indicate that there are both important policy and statistical benefits with our approach. Statistically, we present strong evidence that there is substantial turbulence and change in the EU ETS market, and that DMA can model these features and forecast accurately compared to conventional approaches. From a policy perspective, we discuss the relative and changing role of different price drivers in the EU ETS. Finally, we document the forecast performance of DMA and discuss how this relates to the efficiency and maturity of this market.
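A compact sketch of forgetting-factor recursions in the spirit of DMA is given below; the forgetting factors, fixed observation variance, and two-predictor synthetic data are illustrative assumptions rather than the specification used in the paper. Each candidate model is a regression filtered with a coefficient-forgetting Kalman update, and model probabilities are flattened toward uniform before being reweighted by one-step-ahead predictive densities.

    import numpy as np

    rng = np.random.default_rng(2)
    T = 400
    lam, alpha, V = 0.99, 0.99, 1.0        # coefficient / model forgetting factors, obs. variance (assumed)

    # Synthetic data: the relevant predictor switches halfway through the sample
    X = rng.normal(size=(T, 2))
    beta = np.where(np.arange(T)[:, None] < T // 2, [1.5, 0.0], [0.0, -1.5])
    y = np.sum(X * beta, axis=1) + rng.normal(scale=np.sqrt(V), size=T)

    models = [(0,), (1,), (0, 1)]          # candidate predictor sets
    K = len(models)
    theta = [np.zeros(len(m)) for m in models]
    P = [np.eye(len(m)) * 10.0 for m in models]
    prob = np.full(K, 1.0 / K)

    for t in range(T):
        # Model prediction step: flatten probabilities toward uniform via forgetting
        prob = prob**alpha
        prob /= prob.sum()
        dens = np.zeros(K)
        for k, m in enumerate(models):
            x = X[t, list(m)]
            R = P[k] / lam                 # coefficient forgetting (random-walk-like inflation)
            f = x @ theta[k]               # one-step-ahead point forecast of model k
            S = x @ R @ x + V              # predictive variance
            dens[k] = np.exp(-0.5 * (y[t] - f) ** 2 / S) / np.sqrt(2 * np.pi * S)
            Kg = R @ x / S                 # Kalman gain
            theta[k] = theta[k] + Kg * (y[t] - f)
            P[k] = R - np.outer(Kg, x) @ R
        prob = prob * (dens + 1e-300)      # model update step
        prob /= prob.sum()

    print("final model probabilities:", dict(zip(models, np.round(prob, 3))))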
Abstract:
This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
Abstract:
This paper is motivated by the recent interest in the use of Bayesian VARs for forecasting, even in cases where the number of dependent variables is large. In such cases, factor methods have been traditionally used but recent work using a particular prior suggests that Bayesian VAR methods can forecast better. In this paper, we consider a range of alternative priors which have been used with small VARs, discuss the issues which arise when they are used with medium and large VARs and examine their forecast performance using a US macroeconomic data set containing 168 variables. We find that Bayesian VARs do tend to forecast better than factor methods and provide an extensive comparison of the strengths and weaknesses of various approaches. Our empirical results show the importance of using forecast metrics which use the entire predictive density, instead of using only point forecasts.
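The point about using the entire predictive density can be made concrete with the average log predictive likelihood: two forecasters with identical point forecasts, and hence identical RMSE, can be ranked very differently once their predictive densities are scored. The Gaussian predictive densities below are purely illustrative.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    y = rng.normal(loc=0.0, scale=1.0, size=500)      # realized values

    # Two forecasters with the same point forecast (0) but different predictive spreads
    mean_f = np.zeros_like(y)
    sd_overconfident, sd_calibrated = 0.3, 1.0

    rmse = np.sqrt(np.mean((y - mean_f) ** 2))        # identical RMSE for both forecasters
    lps_a = np.mean(norm.logpdf(y, mean_f, sd_overconfident))
    lps_b = np.mean(norm.logpdf(y, mean_f, sd_calibrated))
    print(f"RMSE (both forecasters): {rmse:.3f}")
    print(f"avg log predictive likelihood, overconfident: {lps_a:.3f}, calibrated: {lps_b:.3f}")
    # The density-based metric penalizes the overconfident predictive distribution
    # even though the point forecasts are indistinguishable.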
Abstract:
We propose a non-equidistant Q rate matrix formula and an adaptive numerical algorithm for a continuous time Markov chain to approximate jump-diffusions with affine or non-affine functional specifications. Our approach also accommodates state-dependent jump intensity and jump distribution, a flexibility that is very hard to achieve with other numerical methods. The Kolmogorov-Smirnov test shows that the proposed Markov chain transition density converges to the one given by the likelihood expansion formula as in Ait-Sahalia (2008). We provide numerical examples for European stock option pricing in Black and Scholes (1973), Merton (1976) and Kou (2002).
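A moment-matching rate matrix on a non-equidistant grid can be sketched for the Black and Scholes (1973) case; the grid, the absorbing boundary rows, and the local mean/variance matching below are simplifying assumptions, and the state-dependent jump components discussed in the paper are omitted.

    import numpy as np
    from scipy.linalg import expm
    from scipy.stats import norm

    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0

    # Non-equidistant grid: finer near the spot/strike, coarser in the tails
    grid = np.unique(np.concatenate([np.linspace(20, 80, 60),
                                     np.linspace(80, 120, 200),
                                     np.linspace(120, 300, 80)]))
    n = len(grid)
    mu = r * grid                        # risk-neutral drift
    var = (sigma * grid) ** 2            # instantaneous variance

    Q = np.zeros((n, n))
    for i in range(1, n - 1):            # interior states: match local mean and variance
        hm = grid[i] - grid[i - 1]
        hp = grid[i + 1] - grid[i]
        up = max((var[i] + hm * mu[i]) / (hp * (hm + hp)), 0.0)
        dn = max((var[i] - hp * mu[i]) / (hm * (hm + hp)), 0.0)
        Q[i, i + 1], Q[i, i - 1] = up, dn
        Q[i, i] = -(up + dn)             # rows sum to zero; boundary rows left absorbing

    P = expm(Q * T)                      # transition matrix over the option's life
    payoff = np.maximum(grid - K, 0.0)
    i0 = np.argmin(np.abs(grid - S0))    # grid point closest to the initial price
    price_ctmc = np.exp(-r * T) * P[i0] @ payoff

    # Closed-form Black-Scholes call price for comparison
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    price_bs = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
    print(f"CTMC approximation: {price_ctmc:.3f}   Black-Scholes: {price_bs:.3f}")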
Abstract:
The Conservative Party emerged from the 2010 United Kingdom General Election as the largest single party, but their support was not geographically uniform. In this paper, we estimate a hierarchical Bayesian spatial probit model that tests for the presence of regional voting effects. This model allows for the estimation of individual region-specific effects on the probability of Conservative Party success, incorporating information on the spatial relationships between the regions of the mainland United Kingdom. After controlling for a range of important covariates, we find that these spatial relationships are significant and that our individual region-specific effects estimates provide additional evidence of North-South variations in Conservative Party support.
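The probit building block of such a model can be sketched with the standard Albert–Chib data-augmentation Gibbs sampler; the sketch below is deliberately non-spatial, omitting the region effects and hierarchical spatial prior, and the prior variance and synthetic data are assumptions made for illustration.

    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(4)
    n, p, tau2 = 500, 3, 100.0                      # sample size, predictors, prior variance (assumed)

    # Synthetic probit data
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
    beta_true = np.array([-0.5, 1.0, -1.0])
    y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

    # Albert-Chib Gibbs sampler with prior beta ~ N(0, tau2 * I)
    B = np.linalg.inv(X.T @ X + np.eye(p) / tau2)   # posterior covariance (unit latent variance)
    L = np.linalg.cholesky(B)
    beta = np.zeros(p)
    draws = []
    for it in range(3000):
        # 1. latent utilities z_i | beta, y_i: truncated normal above/below zero
        m = X @ beta
        lo = np.where(y == 1, -m, -np.inf)          # z > 0 when y = 1
        hi = np.where(y == 1, np.inf, -m)           # z < 0 when y = 0
        z = m + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # 2. beta | z: multivariate normal draw
        beta = B @ (X.T @ z) + L @ rng.normal(size=p)
        if it >= 1000:                              # discard burn-in draws
            draws.append(beta)

    print("posterior means:", np.round(np.mean(draws, axis=0), 2), " true:", beta_true)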
Abstract:
This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
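One standard way in which marginal likelihoods account for correlation among predictors is through Zellner's g-prior. The enumeration below over a small set of factor-correlated predictors is a generic illustration of that mechanism, not the algorithm proposed in the paper, and the choices of n, p and g are arbitrary.

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(5)
    n, p, g = 200, 6, 200.0                       # g = n is the unit-information choice

    # Correlated predictors: a common factor plus idiosyncratic noise
    f = rng.normal(size=(n, 1))
    X = 0.8 * f + 0.6 * rng.normal(size=(n, p))
    y = 1.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=n)

    yc = y - y.mean()                             # centered data (intercept integrated out)
    Xc = X - X.mean(axis=0)
    tss = yc @ yc

    def log_marginal(gamma):
        """Log marginal likelihood (up to a common constant) under Zellner's g-prior."""
        if not gamma:
            return 0.0                            # null model is the benchmark
        Z = Xc[:, list(gamma)]
        coef = np.linalg.lstsq(Z, yc, rcond=None)[0]
        r2 = 1.0 - (yc - Z @ coef) @ (yc - Z @ coef) / tss
        k = len(gamma)
        return 0.5 * (n - 1 - k) * np.log(1 + g) - 0.5 * (n - 1) * np.log(1 + g * (1 - r2))

    models = [m for k in range(p + 1) for m in combinations(range(p), k)]
    logml = np.array([log_marginal(m) for m in models])
    prob = np.exp(logml - logml.max())
    prob /= prob.sum()                            # posterior model probabilities, uniform model prior
    for i in np.argsort(prob)[::-1][:3]:          # three highest-probability predictor sets
        print(models[i], round(prob[i], 3))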