908 results for New-Keynesian models.


Relevance:

100.00%

Publisher:

Abstract:

Wage stickiness is incorporated into a New-Keynesian model with variable capital to drive endogenous unemployment fluctuations, defined as the log difference between aggregate labor supply and aggregate labor demand. We estimate the model using Bayesian econometric techniques and quarterly U.S. data. The second-moment statistics of the unemployment rate in the model give a good fit to those observed in U.S. data. Our results also show that wage-push shocks, demand shifts and monetary policy shocks are the three major determinants of unemployment fluctuations. Compared to an estimated New-Keynesian model without unemployment (Smets and Wouters, 2007), wage stickiness is higher, labor supply elasticity is lower, the slope of the New-Keynesian Phillips curve is flatter, and the importance of technology innovations for output variability increases.
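The unemployment definition used here, u_t = log(labor supply) - log(labor demand), and the second-moment comparison with U.S. data lend themselves to a short illustration. Below is a minimal sketch, not the paper's code: the synthetic series, the HP-filter smoothing parameter, and the particular moments shown are placeholder assumptions.

```python
# Illustrative sketch (not the paper's code): unemployment defined as the log
# difference between aggregate labor supply and labor demand, plus the kind of
# second-moment statistics typically compared with quarterly U.S. data.
# All series below are synthetic placeholders.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
T = 200                                                  # quarters
labor_supply = np.exp(0.01 * rng.standard_normal(T).cumsum())
labor_demand = np.exp(0.01 * rng.standard_normal(T).cumsum())
output = np.exp(0.01 * rng.standard_normal(T).cumsum())

# u_t = log L^s_t - log L^d_t  (the paper's definition of unemployment fluctuations)
u = np.log(labor_supply) - np.log(labor_demand)

# Detrend log output with an HP filter (lambda = 1600 for quarterly data)
y_cycle, _ = hpfilter(np.log(output), lamb=1600)

stats = {
    "std(u)": u.std(ddof=1),
    "autocorr(u, 1)": np.corrcoef(u[1:], u[:-1])[0, 1],
    "corr(u, y_cycle)": np.corrcoef(u, y_cycle)[0, 1],
}
print(stats)
```

In an actual replication the placeholder series would be replaced by the observed and model-simulated quarterly series before computing the moments.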

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we use identification-robust methods to assess the empirical adequacy of a New Keynesian Phillips Curve (NKPC) equation. We focus on the Gali and Gertler (1999) specification, using both U.S. and Canadian data. Two variants of the model are studied: one based on a rational-expectations assumption, and a modification of the latter which uses survey data on inflation expectations. The results based on these two specifications exhibit sharp differences concerning: (i) identification difficulties, (ii) backward-looking behavior, and (iii) the frequency of price adjustments. Overall, we find that there is some support for the hybrid NKPC for the U.S., whereas the model is not suited to Canada. Our findings underscore the need for employing identification-robust inference methods in the estimation of expectations-based dynamic macroeconomic relations.
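For reference, the hybrid Gali and Gertler (1999) specification tested here is usually written as below; this is the standard reduced form in our own notation, not an excerpt from the paper.

```latex
% Hybrid New Keynesian Phillips Curve (Gali and Gertler, 1999), standard reduced form.
% pi_t: inflation, mc_t: real marginal cost, theta: Calvo price stickiness,
% omega: share of backward-looking price setters, beta: discount factor.
\pi_t = \lambda\, mc_t + \gamma_f\, E_t[\pi_{t+1}] + \gamma_b\, \pi_{t-1}, \qquad
\lambda = \frac{(1-\omega)(1-\theta)(1-\beta\theta)}{\phi}, \quad
\gamma_f = \frac{\beta\theta}{\phi}, \quad
\gamma_b = \frac{\omega}{\phi}, \quad
\phi = \theta + \omega\bigl[1-\theta(1-\beta)\bigr].
```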

Relevance:

100.00%

Publisher:

Abstract:

Digital production and distribution technologies may create new opportunities for filmmaking in Australia. A culture of new approaches to filmmaking is emerging, driven by ‘next generation filmmakers’ who are willing to consider new business models: from online web series to short films produced for mobile phones. At the same time, cultural representation itself is transforming within an interactive, social-media-driven environment. Yet there is very little research into next generation filmmaking. The aim of this paper is to scope and discuss three key aspects of next generation filmmaking, namely: digital trends in film distribution and marketing; the processes and strategies of ‘next generation’ filmmakers; and case studies of viable next generation business models and filmmaking practices. We conclude with a brief examination of the implications for media and cultural policy, which suggests the future possibility of a rapprochement between creative industries discourse and cultural policy.

Relevance:

100.00%

Publisher:

Abstract:

This paper uses a new method for describing dynamic comovement and persistence in economic time series which builds on the contemporaneous forecast error method developed in den Haan (2000). This data description method is then used to address issues in New Keynesian model performance in two ways. First, well-known data patterns, such as output and inflation leads and lags and inflation persistence, are decomposed into forecast horizon components to give a more complete description of the data patterns. These results show that the well-known lead and lag patterns between output and inflation arise mostly at medium-term forecast horizons. Second, the data summary method is used to investigate a rich New Keynesian model with many modeling features to see which of these features can reproduce the lead, lag and persistence patterns seen in the data. Many studies have suggested that a backward-looking component in the Phillips curve is needed to match the data, but our simulations show this is not necessary. We show that a simple general equilibrium model with persistent IS curve shocks and persistent supply shocks can reproduce the lead, lag and persistence patterns seen in the data.
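As a rough illustration of the den Haan (2000) approach this abstract builds on, the sketch below computes the correlation of k-step-ahead VAR forecast errors between two series across a range of horizons. The VAR lag order, the horizon grid and the synthetic data are placeholder assumptions, not taken from the paper.

```python
# Sketch of the den Haan (2000) comovement statistic: correlations of
# k-step-ahead forecast errors from a bivariate VAR, traced over horizons.
# The data below are synthetic placeholders; in practice y would hold
# (detrended) output and inflation.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
T = 300
y = np.column_stack([
    np.cumsum(rng.standard_normal(T)) * 0.1,   # placeholder "output"
    np.cumsum(rng.standard_normal(T)) * 0.1,   # placeholder "inflation"
])

res = VAR(y).fit(maxlags=4)                    # lag order is an assumption
max_h = 20
Psi = res.ma_rep(maxn=max_h)                   # MA coefficient matrices Psi_0..Psi_max_h
Sigma_u = res.sigma_u                          # one-step forecast-error covariance

corr_by_horizon = []
V = np.zeros_like(Sigma_u)
for k in range(1, max_h + 1):
    # Covariance of the k-step-ahead forecast error: sum_{j<k} Psi_j Sigma_u Psi_j'
    V = V + Psi[k - 1] @ Sigma_u @ Psi[k - 1].T
    corr_by_horizon.append(V[0, 1] / np.sqrt(V[0, 0] * V[1, 1]))

for k, c in enumerate(corr_by_horizon, start=1):
    print(f"horizon {k:2d}: forecast-error correlation = {c:+.3f}")
```

Tracing this correlation across horizons is what allows the lead/lag patterns described in the abstract to be attributed to short-, medium- or long-term forecast components.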

Relevance:

100.00%

Publisher:

Abstract:

Published as an article in the Journal of Economic Dynamics and Control (2008), 32(May), pp. 1466-1488.

Relevance:

100.00%

Publisher:

Abstract:

This paper estimates a standard version of the New Keynesian monetary (NKM) model under alternative specifications of the monetary policy rule using U.S. and Eurozone data. The estimation procedure implemented is a classical method based on the indirect inference principle, with an unrestricted VAR as the auxiliary model. On the one hand, the proposed estimation method overcomes some of the shortcomings of using a structural VAR as the auxiliary model to identify the impulse responses that define the minimum-distance estimator implemented in the literature. On the other hand, by following a classical approach we can further assess the estimation results found in recent papers that follow a maximum-likelihood Bayesian approach. The estimation results show that some structural parameter estimates are quite sensitive to the specification of monetary policy. Moreover, the estimation results for the U.S. show that the fit of the NKM model under an optimal monetary plan is much worse than the fit of the NKM model assuming a forward-looking Taylor rule. In contrast, the best fit for the Eurozone is obtained assuming a backward-looking Taylor rule, although the improvement over either a forward-looking Taylor rule or an optimal plan is rather small.
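To make the estimation logic concrete, here is a minimal sketch of indirect inference with an unrestricted VAR as the auxiliary model: structural parameters are chosen so that the VAR coefficients estimated on model-simulated data are as close as possible to those estimated on the observed data. The simulate_model function below is a toy stand-in for the NKM model, and the weighting matrix, simulator and optimizer settings are illustrative assumptions, not the paper's.

```python
# Indirect inference sketch: match VAR coefficients estimated on simulated data
# to those estimated on observed data. simulate_model is a toy stand-in for a
# New Keynesian monetary model; everything here is illustrative.
import numpy as np
from scipy.optimize import minimize
from statsmodels.tsa.api import VAR


def simulate_model(theta, T, seed=0):
    """Toy bivariate data-generating process standing in for the NKM model."""
    rho, kappa = theta
    rng = np.random.default_rng(seed)
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t, 0] = rho * y[t - 1, 0] + rng.standard_normal()                    # "output"
        y[t, 1] = kappa * y[t, 0] + 0.5 * y[t - 1, 1] + rng.standard_normal()  # "inflation"
    return y


def auxiliary_params(data, lags=1):
    """Stack the VAR coefficients estimated on the data (the auxiliary model)."""
    res = VAR(data).fit(lags)
    return res.params.ravel()


observed = simulate_model((0.9, 0.3), T=500, seed=42)    # stand-in for actual data
beta_obs = auxiliary_params(observed)


def distance(theta):
    # Average auxiliary parameters over several simulations to reduce noise
    sims = [auxiliary_params(simulate_model(theta, T=500, seed=s)) for s in range(5)]
    beta_sim = np.mean(sims, axis=0)
    diff = beta_sim - beta_obs
    return float(diff @ diff)                            # identity weighting matrix


result = minimize(distance, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
print("indirect-inference estimate of (rho, kappa):", result.x)
```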

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes an extended version of the basic New Keynesian monetary (NKM) model which incorporates revision processes for output and inflation data in order to assess the importance of data revisions for the estimated monetary policy rule parameters and the transmission of policy shocks. Our empirical evidence, based on a structural econometric approach, suggests that although the initial announcements of output and inflation are not rational forecasts of the revised output and inflation data, ignoring the presence of these not-well-behaved revision processes may not be a serious drawback in the analysis of monetary policy in this framework. However, the transmission of inflation-push shocks is substantially affected by taking data revisions into account, especially when the nominal stickiness parameter is estimated allowing for data revision processes.
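The rationality claim about initial announcements can be illustrated with a standard revision-regression (news/noise style) check, which is not the paper's structural approach: if the first release is a rational forecast of the revised figure, the revision should be unpredictable given the release. The sketch below uses synthetic placeholder data and tests the intercept-zero, slope-one restriction.

```python
# Illustrative rationality check on data revisions (not the paper's structural
# approach): regress the revised figure on the initial announcement and test
# H0: intercept = 0, slope = 1. Data below are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 160                                            # quarters of releases
revised = 0.5 + 0.8 * rng.standard_normal(T)       # "final" figures
noise = 0.3 * rng.standard_normal(T)
initial = revised + 0.2 + noise                    # first release = final + bias + noise

X = sm.add_constant(initial)
ols = sm.OLS(revised, X).fit()
print(ols.summary())

# Wald test of the joint restriction (const = 0, slope = 1); rejection is
# evidence against the initial announcement being a rational forecast.
print(ols.wald_test("const = 0, x1 = 1", use_f=True))
```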

Relevance:

100.00%

Publisher:

Abstract:

Since the discovery of the Higgs boson at the LHC, its use as a probe to search for beyond the standard model physics, such as supersymmetry, has become important, as seen in a recent search by the CMS experiment using razor variables in the diphoton final state. Motivated by this search, this thesis examines the LHC discovery potential of a SUSY scenario involving bottom squark pair production with a Higgs boson in the final state. We design and implement a software-based trigger using the razor variables for the CMS experiment to record events with a bottom quark-antiquark pair from a Higgs boson. We characterize the full range of signatures at the LHC from this Higgs-aware SUSY scenario and demonstrate the sensitivity of the CMS data to this model.

Relevance:

100.00%

Publisher:

Abstract:

The increase in antibiotic resistance and the dearth of novel antibiotics have become a growing concern among policy-makers. A combination of financial, scientific, and regulatory challenges poses barriers to antibiotic innovation. However, each of these three challenges provides an opportunity to develop pathways for new business models to bring novel antibiotics to market. Pull incentives that pay for the outputs of research and development (R&D) and push incentives that pay for the inputs of R&D can be used to increase innovation for antibiotics. Financial incentives might be structured to promote delinkage of a company's return on investment from revenues of antibiotics. This delinkage strategy might not only increase innovation, but also reinforce rational use of antibiotics. Regulatory approval, however, should not and need not compromise safety and efficacy standards to bring antibiotics with novel mechanisms of action to market. Instead, regulatory agencies could encourage development of companion diagnostics, test antibiotic combinations in parallel, and pool and make transparent clinical trial data to lower R&D costs. A tax on non-human use of antibiotics might also create a disincentive for non-therapeutic use of these drugs. Finally, the new business model for antibiotic innovation should apply the 3Rs strategy for encouraging collaborative approaches to R&D in innovating novel antibiotics: sharing resources, risks, and rewards.

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, we've focused on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life cycle costs, then we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for the operators to see what's going on and to then manipulate the system so that it does what it is supposed to. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's modern operating systems with nearly 50 million source lines of code are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the Object Abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this is a lesson that can impact tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access control model and how new layers of abstraction could further enrich this model.