48 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation


Relevance: 100.00%

Abstract:

In this paper we apply GMM estimation to assess the relevance of domestic versus external determinants of CPI inflation dynamics in a sample of OECD countries typically classified as open economies. The analysis is based on a variant of the small open-economy New Keynesian Phillips Curve derived in Galí and Monacelli (Rev Econ Stud 72:707–734, 2005), where the novel feature is that expectations about fluctuations in the terms of trade enter explicitly. For most countries in our sample, the expected relative change in the terms of trade emerges as a more relevant inflation driver than the contemporaneous domestic output gap.
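The GMM estimation described above can be sketched with a minimal two-stage least squares example (2SLS is GMM under the moment condition E[Z'u] = 0 with weighting matrix (Z'Z)^-1). All data below are synthetic; the instruments, coefficients and sample size are invented for illustration and are not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Instruments (standing in for, e.g., lagged output gap and lagged
# terms-of-trade changes).
Z = rng.normal(size=(n, 2))

# Endogenous regressor: correlated with the structural error through v.
v = rng.normal(size=n)
x = Z @ np.array([1.0, 1.0]) + v
u = 0.5 * v + rng.normal(size=n)
y = 0.6 * x + u                      # true slope = 0.6

X = x[:, None]

# GMM with moment condition E[Z'u] = 0 and weight (Z'Z)^-1  (equals 2SLS).
Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
kappa_gmm = np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ y)[0]

# OLS for comparison: inconsistent because cov(x, u) != 0.
kappa_ols = np.linalg.lstsq(X, y, rcond=None)[0][0]
print(kappa_gmm, kappa_ols)   # GMM near 0.6; OLS biased upward
```

Instrumenting the endogenous regressor recovers the true slope, while OLS is pulled upward by the correlation between the regressor and the error.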

Relevance: 100.00%

Abstract:

This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion is found optimal. Instead, middle-ground monetary regimes are gaining support in theoretical models and in policy formulation and implementation: regimes based on rules (operative under 'normal' circumstances) to anchor inflation expectations over the long run, but designed with enough flexibility to mitigate the short-run effect of shocks (with communicated discretion temporarily overriding these rules in 'exceptional' circumstances). The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But this synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, very recent real-world events have made it obvious that the inflation-targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis and abrupt worldwide economic slowdown, this approach needs serious rethinking at the least, if not abandoning altogether.

Relevance: 100.00%

Abstract:

In this correspondence, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selection criteria. In parallel to parameter regularization, we use two classes of robust model selection criteria: experimental design criteria that optimize model adequacy, and the predicted residual sums of squares (PRESS) statistic, which optimizes model generalization capability. Three robust identification algorithms are introduced: the combined A-optimality with regularized orthogonal least squares algorithm, the combined D-optimality with regularized orthogonal least squares algorithm, and the combined PRESS statistic with regularized orthogonal least squares algorithm. A common characteristic of these algorithms is that the inherent computational efficiency of the orthogonalization scheme in (regularized) orthogonal least squares is retained, so the new algorithms are computationally efficient. Numerical examples are included to demonstrate the effectiveness of the algorithms.
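The PRESS criterion mentioned above can be sketched with the standard leave-one-out identity PRESS = Σ(e_i / (1 − h_ii))², computed from the hat matrix. The polynomial bases, sample size and noise level below are hypothetical, and plain (unregularized) least squares stands in for the paper's orthogonal least squares machinery:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(-1, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, n)   # the true model is linear

def press(X, y):
    """Leave-one-out PRESS via the hat matrix: sum((e_i / (1 - h_ii))^2)."""
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat (projection) matrix
    e = y - H @ y                           # ordinary residuals
    return np.sum((e / (1.0 - np.diag(H))) ** 2)

# Candidate linear-in-the-parameters models: polynomial bases of rising order.
scores = {d: press(np.vander(x, d + 1), y) for d in (1, 2, 5, 10)}
best = min(scores, key=scores.get)          # smallest PRESS generalizes best
```

Because PRESS estimates out-of-sample error, it penalizes the overfitted high-order bases and favours a model close to the true linear one.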

Relevance: 100.00%

Abstract:

A new model of dispersion has been developed to simulate the impact of pollutant discharges on river systems. The model accounts for the main dispersion processes operating in rivers as well as the dilution from incoming tributaries and first-order kinetic decay processes. The model is dynamic and simulates the hourly behaviour of river flow and pollutants along river systems. The model has been applied to the Arieș and Mureș River System in Romania and has been used to assess the impacts of potential dam releases from the Roșia Montană Mine in Transylvania, Romania. The question of mine water release is investigated under a range of scenarios. The impacts on pollution levels downstream at key sites and at the border with Hungary are investigated.
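A minimal sketch of this kind of model is a one-dimensional advection-dispersion equation with first-order decay, solved with an explicit upwind scheme. All parameters below (reach length, velocity, dispersion and decay coefficients, pulse size) are invented for illustration and are not the calibrated values for the Romanian rivers:

```python
import numpy as np

# 1-D advection-dispersion-decay sketch on a single river reach.
L, nx = 10_000.0, 200          # reach length (m), number of grid cells
dx = L / nx
u, D, k = 0.5, 5.0, 1e-4       # velocity (m/s), dispersion (m^2/s), decay (1/s)
dt = 5.0                       # time step (s); CFL number u*dt/dx = 0.05 << 1

c = np.zeros(nx)
c[10] = 100.0                  # pollutant pulse near the upstream end

for _ in range(1000):          # simulate 5000 s
    adv = -u * (c - np.roll(c, 1)) / dx                       # upwind advection
    dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2  # dispersion
    c = c + dt * (adv + dif - k * c)                          # decay term -k*c
    c[0] = 0.0                 # clean-water upstream boundary

peak_x = np.argmax(c) * dx     # pulse centre, advected ~ u * t downstream
```

After 5000 s the pulse has moved roughly u·t ≈ 2500 m downstream of its release point, spread by dispersion, and lost mass to first-order decay.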

Relevance: 100.00%

Abstract:

The aim of a phase II clinical trial is to decide whether or not to develop an experimental therapy further through phase III clinical evaluation. In this paper, we present a Bayesian approach to the phase II trial, although we assume that subsequent phase III clinical trials will have standard frequentist analyses. The decision whether to conduct the phase III trial is based on the posterior predictive probability of a significant result being obtained. This fusion of Bayesian and frequentist techniques accepts the current paradigm for expressing objective evidence of therapeutic value, while optimizing the form of the phase II investigation that leads to it. By using prior information, we can assess whether a phase II study is needed at all, and how much or what sort of evidence is required. The proposed approach is illustrated by the design of a phase II clinical trial of a multi-drug resistance modulator used in combination with standard chemotherapy in the treatment of metastatic breast cancer. Copyright (c) 2005 John Wiley & Sons, Ltd.
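The central quantity, the posterior predictive probability of a significant phase III result, can be sketched by Monte Carlo: draw the treatment effect from its phase II posterior and average the conditional frequentist power over the draws. The posterior, outcome SD, arm size and significance level below are illustrative numbers, not the trial's:

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

rng = np.random.default_rng(2)

# Posterior for the treatment effect after phase II (hypothetical).
mu_post, sd_post = 0.3, 0.15
sigma, n_arm = 1.0, 200          # outcome SD and planned phase III arm size
z_crit = 1.96                    # two-sided 5% frequentist test in phase III

# Predictive probability of significance: average the conditional power
# Phi(delta/se - z_crit) over posterior draws of the effect delta.
deltas = rng.normal(mu_post, sd_post, 100_000)
se = sigma * sqrt(2.0 / n_arm)   # standard error of the treatment difference
pred_prob = np.mean([phi(d / se - z_crit) for d in deltas])
```

If this predictive probability is already high (or hopelessly low) under the prior alone, the phase II study may add little, which is exactly the kind of judgement the paper's framework formalizes.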

Relevance: 100.00%

Abstract:

The assimilation of observations with a forecast is often heavily influenced by the description of the error covariances associated with the forecast. When a temperature inversion is present at the top of the boundary layer (BL), a significant part of the forecast error may be described as a vertical positional error (as opposed to the amplitude error normally dealt with in data assimilation). In these cases, failing to account for positional error explicitly is shown to result in an analysis for which the inversion structure is erroneously weakened and degraded. In this article, a new assimilation scheme is proposed to explicitly include the positional error associated with an inversion. This is done through the introduction of an extra control variable to allow position errors in the a priori to be treated simultaneously with the usual amplitude errors. This new scheme, referred to as the 'floating BL scheme', is applied to the one-dimensional (vertical) variational assimilation of temperature. The floating BL scheme is tested with a series of idealised experiments and with real data from radiosondes. For each idealised experiment, the floating BL scheme gives an analysis which has the inversion structure and position in agreement with the truth, and outperforms the assimilation which accounts only for forecast amplitude error. When the floating BL scheme is used to assimilate a large sample of radiosonde data, its ability to give an analysis with an inversion height in better agreement with that observed is confirmed. However, it is found that the use of Gaussian statistics is an inappropriate description of the error statistics of the extra control variable. This problem is alleviated by incorporating a non-Gaussian description of the new control variable in the new scheme. Anticipated challenges in implementing the scheme operationally are discussed towards the end of the article.
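The idea of treating inversion height as an extra control variable can be illustrated with an idealised profile: the background misplaces the inversion, and a simple grid search over a vertical shift stands in for the variational minimisation. The profile shapes, heights and noise level are all hypothetical:

```python
import numpy as np

# Idealised 1-D temperature profile with an inversion at height z_inv (m).
def profile(z, z_inv):
    # Surface lapse below the inversion, warm layer above (hypothetical).
    return np.where(z < z_inv, 288.0 - 6.5e-3 * z,
                    290.0 - 2.0e-3 * (z - z_inv))

z = np.linspace(0.0, 2000.0, 201)          # 10 m vertical grid
truth = profile(z, z_inv=1000.0)           # true inversion at 1000 m
obs = truth + np.random.default_rng(3).normal(0.0, 0.1, z.size)

# The background forecast misplaces the inversion by 300 m.
# 'Floating BL' idea: introduce a shift control variable s and pick the
# shifted background that best fits the observations.
shifts = np.arange(-500.0, 501.0, 10.0)
costs = [np.sum((obs - profile(z, 700.0 + s)) ** 2) for s in shifts]
best_shift = shifts[int(np.argmin(costs))]   # recovered positional error
```

An amplitude-only analysis would instead smear the two inversions together, which is exactly the weakened structure the article describes.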

Relevance: 100.00%

Abstract:

In the early 1920s, before Virginia Woolf wrote her now well-known essays “The New Biography” and “The Art of Biography,” the Hogarth Press published four biographies of Tolstoy. Each of these English translations of Russian works takes a different approach to biographical composition, and as a group they offer multiple and contradictory perspectives on Tolstoy’s character and on the genre of biography in the early twentieth century. These works show that Leonard and Virginia Woolf’s Hogarth Press took a multi-perspectival, modernist approach to publishing literary lives.

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to explore the role of the housing market in the transmission of monetary policy to consumption among euro area member states. It has been argued that the housing market in a country matters for this transmission only when its mortgage market is well developed. The euro area countries follow a single monetary policy; however, their housing and mortgage markets show some heterogeneity, which may lead to different policy effects on aggregate consumption through the housing market. Design/methodology/approach – The housing market can act as a channel of monetary policy shocks to household consumption through changes in house prices and residential investment – the housing market channel. We estimate vector autoregressive models for each country and conduct a counterfactual analysis in order to disentangle the housing market channel and assess its importance across the euro area member states. Findings – We find little evidence for heterogeneity of the monetary policy transmission through house prices across the euro area countries. Housing market variations in the euro area seem to be better captured by changes in residential investment than by changes in house prices. As a result, we do not find significantly large house price channels. For some of the countries, however, we observe a monetary policy channel through residential investment. The existence of a housing channel may depend on institutional features of the labour market as well as on factors capturing the degree of household debt, such as the loan-to-value (LTV) ratio. Originality/value – The study contributes to the existing literature by assessing whether a single monetary policy has a different impact on consumption across the euro area countries through their housing and mortgage markets. We disentangle monetary-policy-induced effects on consumption associated with variations in the housing markets due to either house price variations or residential investment changes. We show that the housing market can play a role in the monetary transmission mechanism even in countries with less developed mortgage markets through variations in residential investment.
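The counterfactual exercise can be sketched with a toy VAR(1): compute the impulse response of consumption to a policy-rate shock, then recompute it with the residential-investment feedback into consumption switched off; the difference is the housing channel. The coefficient matrix is hypothetical, not an estimate for any euro area country:

```python
import numpy as np

# Hypothetical VAR(1): y_t = A y_{t-1} + shocks, with
# y = [policy rate, residential investment, consumption].
A = np.array([[ 0.8, 0.0, 0.0],
              [-0.5, 0.6, 0.0],    # rate hikes depress investment
              [-0.2, 0.3, 0.7]])   # investment feeds consumption (0.3)

def irf(A, shock, horizon=20):
    """Response path of y to a one-off unit shock in variable `shock`."""
    y = np.zeros(3)
    y[shock] = 1.0
    path = [y.copy()]
    for _ in range(horizon):
        y = A @ y
        path.append(y.copy())
    return np.array(path)

base = irf(A, shock=0)                 # full transmission of a rate shock

A_cf = A.copy()
A_cf[2, 1] = 0.0                       # counterfactual: housing channel shut
counterfactual = irf(A_cf, shock=0)

# Housing-channel contribution to the consumption response at each horizon.
channel = base[:, 2] - counterfactual[:, 2]
```

Here the channel is zero on impact (investment has not yet moved) and negative thereafter: shutting the investment-to-consumption link mutes part of the contractionary effect of the rate shock.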

Relevance: 100.00%

Abstract:

In this paper, a mixed logit (ML) model estimated by Bayesian methods was employed to examine willingness-to-pay (WTP) to consume bread produced with reduced levels of pesticides, so as to improve environmental quality, using data generated by a choice experiment. Model comparison used the marginal likelihood, which is preferable for Bayesian model comparison and testing. Models containing constant and random parameters for a number of distributions were considered, along with models in 'preference space' and 'WTP space' as well as those allowing for misreporting. We found strong support for the ML estimated in WTP space; little support for fixing the price coefficient, a common practice advocated and adopted in the environmental economics literature; and weak evidence for misreporting.
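The preference-space versus WTP-space distinction can be illustrated by simulation. In preference space, WTP is the ratio of a random attribute coefficient to a random price coefficient, which can be badly behaved; in WTP space the WTP distribution is specified directly. The distributions and parameter values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Preference space: normal attribute coefficient, lognormal (negative)
# price coefficient -- a common specification in the mixed logit literature.
beta_x = rng.normal(1.0, 0.5, n)
beta_p = -np.exp(rng.normal(0.0, 0.8, n))

wtp_pref = -beta_x / beta_p              # implied WTP: a heavy-tailed ratio

# WTP space: specify the WTP distribution directly (here normal).
wtp_space = rng.normal(1.0, 0.5, n)

# Compare 95% interval widths of the two implied WTP distributions.
spread_pref = np.percentile(wtp_pref, 97.5) - np.percentile(wtp_pref, 2.5)
spread_space = np.percentile(wtp_space, 97.5) - np.percentile(wtp_space, 2.5)
```

The ratio of random coefficients produces a far wider implied WTP distribution than the directly specified one, which is one reason the paper (like much of this literature) finds WTP-space models preferable, and why fixing the price coefficient, which forces the ratio to be well behaved by fiat, is a questionable shortcut.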

Relevance: 100.00%

Abstract:

We investigate the factors precipitating market entry, where smallholders make two decisions: participation (a discrete choice about whether to sell quantities of products) and supply (a continuous choice about how much quantity to sell). Using a cross-section of smallholders in Northern Luzon, Philippines, we combine basic probit and Tobit ideas in a model implemented with Bayesian methods, generating precise estimates of the inputs required to effect entry among non-participants. We estimate the total amounts of (cattle, buffalo, pig and chicken) livestock input required to effect entry, and compare and contrast the alternative input requirements. To the extent that our smallholder sample may be representative of a wider and broader set of circumstances, our findings shed light on the offsetting impacts of conflicting factors that complicate the roles for policy in the context of expanding the density of participation.
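The probit-plus-Tobit structure can be sketched as follows: a probit equation governs participation, and the input "required to effect entry" is the level at which the participation probability reaches one half; a Tobit equation, censored at zero, governs supply. All coefficients and the simulated data are hypothetical, not the paper's estimates:

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical probit participation equation:
# P(sell) = Phi(b0 + b1 * livestock_units)
b0, b1 = -1.2, 0.4

# Livestock input needed to reach a 50% participation probability:
# Phi(0) = 0.5, so solve b0 + b1 * u = 0.
required_units = -b0 / b1
p_at_threshold = phi(b0 + b1 * required_units)   # = 0.5 by construction

# Hypothetical Tobit supply equation, censored at zero:
# quantity = max(0, g0 + g1 * units + e)
g0, g1 = -5.0, 2.5
rng = np.random.default_rng(6)
units = rng.uniform(0, 8, 1000)
supply = np.maximum(0.0, g0 + g1 * units + rng.normal(0, 1.0, 1000))

share_participating = np.mean(supply > 0)
```

In the paper this threshold calculation is done within the Bayesian posterior, so the required input comes with a full posterior distribution rather than a point value.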

Relevance: 100.00%

Abstract:

A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
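A generalized (regional) sensitivity analysis of the Hornberger-Spear type can be sketched on a toy DO model: sample parameters from priors, classify each run as behavioural or not against a DO threshold, and score each parameter by the Kolmogorov-Smirnov distance between its behavioural and non-behavioural distributions. The Streeter-Phelps-style model, priors and threshold below are all hypothetical stand-ins for Q(2):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy DO model (Streeter-Phelps style): oxygen sag downstream of a BOD load.
def do_min(kd, ka, L0=10.0, D0=1.0, Cs=9.0):
    t = np.linspace(0.01, 10.0, 200)     # travel time (days)
    D = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)
    return Cs - D.max()                  # minimum DO along the reach (mg/L)

# Monte Carlo sample from the parameter priors (ka > kd by construction).
kd = rng.uniform(0.1, 1.0, 5000)         # BOD decay rate (1/day)
ka = rng.uniform(1.1, 3.0, 5000)         # reaeration rate (1/day)
do = np.array([do_min(d, a) for d, a in zip(kd, ka)])

# Behavioural runs keep the simulated minimum DO above a threshold.
behavioural = do > 7.0

def ks(x, ok):
    """KS distance between behavioural and non-behavioural CDFs of x."""
    grid = np.sort(x)
    F_b = np.searchsorted(np.sort(x[ok]), grid, side="right") / ok.sum()
    F_n = np.searchsorted(np.sort(x[~ok]), grid, side="right") / (~ok).sum()
    return np.abs(F_b - F_n).max()

# A larger KS distance means the parameter exerts more control on behaviour.
sensitivity = {"kd": ks(kd, behavioural), "ka": ks(ka, behavioural)}
```

Parameters whose behavioural and non-behavioural distributions barely differ are insensitive, which is the mechanism behind the paper's finding that, beyond approximate values, forcing-function quality matters more than further parameter refinement.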