48 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation


Relevance:

100.00%

Publisher:

Abstract:

In this paper we evaluate the relative influence of external versus domestic inflation drivers in the 12 new European Union (EU) member countries. Our empirical analysis is based on the New Keynesian Phillips Curve (NKPC) derived in Galí and Monacelli (2005) for small open economies (SOE). Employing the generalized method of moments (GMM), we find that the SOE NKPC is well supported in the new EU member states. We also find that the inflation process is dominated by domestic variables in the larger countries of our sample, whereas external variables are mostly relevant in the smaller countries.
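The moment-condition logic behind GMM estimation of such an equation can be sketched with a toy linear example (the simulated data, instrument choice, and parameter values are illustrative assumptions, not the paper's dataset or full NKPC specification): instruments uncorrelated with the structural error identify the slope even when OLS is biased by endogeneity.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 20_000
beta_true = 0.5

# Simulated endogenous-regressor setting: the regressor x is correlated
# with the structural error u, so OLS is inconsistent.
z = rng.standard_normal((T, 2))                    # instruments
u = rng.standard_normal(T)                         # structural error
x = z @ np.array([1.0, -0.5]) + 0.8 * u + rng.standard_normal(T)
y = beta_true * x + u

X, Z = x[:, None], z
# Linear GMM with weight W = (Z'Z)^-1, i.e. the 2SLS special case:
# solves the sample moment condition E[z_t * (y_t - beta * x_t)] = 0.
W = np.linalg.inv(Z.T @ Z)
A = X.T @ Z @ W @ Z.T @ X
b = X.T @ Z @ W @ Z.T @ y
beta_gmm = float(np.linalg.solve(A, b)[0])

beta_ols = float(x @ y / (x @ x))                  # biased by endogeneity
```

The GMM estimate recovers the true slope while OLS is pulled away from it by the correlation between `x` and `u`.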


This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems whose basis functions are Bezier-Bernstein polynomial functions. The algorithm is generalized in that it copes with n-dimensional inputs by utilising an additive decomposition construction to overcome the curse of dimensionality associated with high n. The construction algorithm also introduces univariate Bezier-Bernstein polynomial functions for completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bezier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and a Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. The new modelling network combines the additive decomposition approach with two separate basis function formation approaches for the univariate and bivariate Bezier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modelling approach.
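A minimal sketch of the univariate building block, assuming nothing beyond the standard Bernstein basis definition (the target function and polynomial degree are illustrative choices): the basis is formed, its partition-of-unity property holds by construction, and the weights are fitted by ordinary least squares as the paper describes.

```python
import numpy as np
from math import comb

def bernstein_basis(x, n):
    """Degree-n Bernstein basis evaluated at points x in [0, 1].
    Each function is nonnegative and the basis sums to one at every x
    (partition of unity), which is what lets the basis functions be
    read as fuzzy membership functions."""
    x = np.asarray(x, dtype=float)
    return np.stack(
        [comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)],
        axis=1,
    )

# Fit the network weights by ordinary least squares.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)            # toy target function
B = bernstein_basis(x, n=8)
w, *_ = np.linalg.lstsq(B, y, rcond=None)
y_hat = B @ w
```

Because the basis spans all polynomials up to degree n, the least-squares fit of one sine period is already accurate at degree 8.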


Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling cyanobacterial behaviour in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of the bloom events in lakes, reservoirs and rivers. A new deterministic–mathematical model was developed, which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of the bloom formation, vertical migration and lateral transport of colonies within river environments by taking into account the major factors that affect the cyanobacterial bloom formation in rivers including light, nutrients and temperature. A parameter sensitivity analysis using a one-at-a-time approach was carried out. There were two objectives of the sensitivity analysis presented in this paper: to identify the key parameters controlling the growth and movement patterns of cyanobacteria and to provide a means for model validation. The result of the analysis suggested that maximum growth rate and day length period were the most significant parameters in determining the population growth and colony depth, respectively.
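The one-at-a-time procedure is simple enough to sketch. The toy growth function below (a Monod-type nutrient limitation scaled by a light factor; parameter names and values are illustrative assumptions, not the paper's model) shows how each parameter is perturbed in turn while the others stay at baseline.

```python
import numpy as np

def growth_rate(mu_max=1.2, light=0.8, nutrient=0.5, k_n=0.3):
    """Toy stand-in for the population growth response: Monod-type
    nutrient limitation scaled by a light factor."""
    return mu_max * light * nutrient / (k_n + nutrient)

baseline = dict(mu_max=1.2, light=0.8, nutrient=0.5, k_n=0.3)
base_out = growth_rate(**baseline)

# One-at-a-time (OAT): perturb each parameter by +10% in turn, holding
# the others at baseline, and record the relative change in the output.
sensitivity = {}
for name, value in baseline.items():
    out = growth_rate(**{**baseline, name: 1.10 * value})
    sensitivity[name] = (out - base_out) / base_out
```

Here `mu_max` moves the output one-for-one while the nutrient terms have damped influence, mirroring the kind of ranking the paper's analysis produces.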


Objectives: The overall objective of the research was to assess the impact of provider diversity on quality and innovation in the English NHS. The aims were to map the extent of diverse provider activity, identify the differences in performance between Third Sector Organisations (TSOs), for-profit private enterprises, and incumbent organisations within the NHS, and identify the factors that affect the entry and growth of new private providers and TSOs. Methods: Case studies of four Local Health Economies (LHEs). Data included: semi-structured interviews with 48 managerial and clinical staff from NHS organisations and providers from the private and Third Sector; some documentary evidence; a focus group with service users; and routine data from the Care Quality Commission and Companies House. Data collection was mainly between November 2008 and November 2009. Results: Involvement of diverse providers in the NHS is limited. Commissioners’ local strategies influence degrees of diversity. Barriers to entry for TSOs include lack of economies of scale in the bidding process. Private providers have greater concern to improve patient pathways and patient experience, whereas TSOs deliver quality improvements by using a more holistic approach and a greater degree of community involvement. Entry of new providers drives NHS Trusts to respond by making improvements. Information sharing diminishes as competition intensifies. Conclusions: There is scope to increase the participation of diverse providers in the NHS, but care must be taken not to damage public accountability, overall productivity, equity and NHS providers (especially acute hospitals, which are likely to remain in the NHS) in the process.


The purpose of this chapter is to trace the emergence of a new security imaginary in the foreign policy discourse in Germany during the 1990s and to determine whether it constitutes a return of Geopolitik in German foreign policy making. Does the reappearance of geopolitical terms and expressions in the official and the academic discourses in post-unification Germany indicate such a shift? The essay will argue that the claims about a return of Geopolitik cannot be sustained. To the extent that the rhetoric of German government officials changes during the 1990s, this does not produce a coherent geopolitical security imaginary that stands diametrically opposed to the definition of political and institutional spaces of the Bonner Republik.


The paper presents organisational semiotics (OS) as an approach for identifying organisational readiness factors for internal use of social media within information-intensive organisations (IIOs). The paper examines OS methods, such as organisational morphology, containment analysis and collateral analysis, to reveal factors of readiness within an organisation. These models also help to identify the essential patterns of activities needed for social media use within an organisation, which can provide a basis for future analysis. The findings confirmed many of the factors previously identified in the literature, while also revealing new factors through the OS methods. The factors for organisational readiness for internal use of social media include resources, organisational climate, processes, motivational readiness, benefit and organisational control factors. The organisational control factors revealed are security/privacy, policies, communication procedures, accountability and fallback.


The agricultural policy agenda has broadened, with farm policy issues now interlinking with other policy domains (food safety, energy supplies, environmental protection, development aid, etc.). New actors have entered the broader agricultural and food policy domain, promoting values which sometimes conflict with, or are not easily reconcilable with, those that previously guided agricultural policy. The studies in this special issue of various new policy issues interlinking with the agricultural policy domain show that value conflicts are addressed in different ways, and thus that inter-institutional coordination and conflict unfold differently. Studies of inter-institutional policy making in the agricultural policy sector have the potential to contribute to theoretical developments in public policy analysis in much the same way as agricultural policy studies did in the past.


Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information, in hardcopy or electronic form, can heavily influence clinical behaviour and decision-making errors. Little work has been undertaken to assess the safety impact of the clinical process planning documents that guide clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents and their perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types and approaches to reducing their causes. The EDA Error Model was then used to analyse the sample documents and identify error sources and controls. The types of knowledge artefact structure used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors, as well as issues concerning medical culture and teamwork, together with recommendations for further work.


The authors study the role of ocean heat transport (OHT) in the maintenance of a warm, equable, ice-free climate. An ensemble of idealized aquaplanet GCM calculations is used to assess the equilibrium sensitivity of global mean surface temperature and its equator-to-pole gradient (ΔT) to variations in OHT, prescribed through a simple analytical formula representing export out of the tropics and poleward convergence. Low-latitude OHT warms the mid- to high latitudes without cooling the tropics: global mean surface temperature increases by 1°C and ΔT decreases by 2.6°C for every 0.5-PW increase in OHT across 30° latitude. This warming is relatively insensitive to the detailed meridional structure of OHT. It occurs in spite of near-perfect atmospheric compensation of large imposed variations in OHT: the total poleward heat transport is nearly fixed. The warming results from a convective adjustment of the extratropical troposphere. Increased OHT drives a shift from large-scale to convective precipitation in the midlatitude storm tracks. Warming arises primarily from enhanced greenhouse trapping associated with convective moistening of the upper troposphere. Warming extends to the poles by atmospheric processes even in the absence of high-latitude OHT. A new conceptual model for equable climates is proposed, in which OHT plays a key role by driving enhanced deep convection in the midlatitude storm tracks. In this view, the climatic impact of OHT depends on its effects on the greenhouse properties of the atmosphere, rather than its ability to increase the total poleward energy transport.
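Taken at face value, the quoted sensitivities define a simple linear scaling; the back-of-envelope sketch below makes that arithmetic explicit (extrapolating linearly beyond the sampled range is our assumption, not a claim of the study).

```python
# Linear scaling implied by the quoted sensitivities:
# +1.0 degC global-mean warming and -2.6 degC in the equator-to-pole
# gradient per 0.5 PW of extra OHT across 30 degrees latitude.
WARMING_PER_PW = 1.0 / 0.5        # degC per PW
GRADIENT_PER_PW = -2.6 / 0.5      # degC per PW

def response(delta_oht_pw):
    """Back-of-envelope response to an OHT increase (in PW) across
    30 degrees latitude: (global-mean warming, change in gradient)."""
    return WARMING_PER_PW * delta_oht_pw, GRADIENT_PER_PW * delta_oht_pw

d_temp, d_gradient = response(0.5)    # reproduce the quoted 0.5-PW case
```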


This paper analyzes the dynamic interactions between real estate markets in the US and the UK and their macroeconomic environments. We apply a new approach based on a dynamic coherence function (DCF) to study these interactions, bringing together different real estate markets (the securitized market, the commercial market and the residential market). The results suggest that a common trend drives the different real estate markets in the UK and the US, particularly in the long run, since they have a similar shape of the DCF. We also find that, in the US, the wealth and housing expenditure channels are significant transmission channels during real estate crises. In the UK, however, only the wealth effect is significant as a transmission channel during real estate market downturns. In addition, real estate markets in the UK and the US react differently to institutional shocks. This offers some insights into the conduct of monetary policy aimed at avoiding disturbances in real estate markets.
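The DCF itself is the authors' construct and is not reproduced here. As a loose stand-in, ordinary magnitude-squared coherence between two simulated series sharing a stochastic trend illustrates the qualitative signature the paper reports, namely high coherence at low frequencies (the long run). The simulated series and the use of `scipy.signal.coherence` are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
n = 4096
common = np.cumsum(rng.standard_normal(n))   # shared stochastic trend
x = common + rng.standard_normal(n)          # market A (e.g. securitized)
y = common + rng.standard_normal(n)          # market B (e.g. residential)

# Magnitude-squared coherence as a function of frequency; low
# frequencies correspond to the long run.
f, cxy = coherence(x, y, nperseg=256)
long_run = cxy[(f > 0) & (f < 0.05)].mean()
short_run = cxy[f > 0.4].mean()
```

The shared trend dominates at low frequencies, so coherence is high there and decays toward the Nyquist frequency.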


We propose a new sparse model construction method aimed at maximizing a model’s generalisation capability for a large class of linear-in-the-parameters models. The coordinate descent optimization algorithm is employed with a modified l1-penalized least squares cost function in order to estimate a single parameter and its regularization parameter simultaneously based on the leave-one-out mean square error (LOOMSE). Our original contribution is to derive a closed form for the optimal LOOMSE regularization parameter of a single-term model, for which we show that the LOOMSE can be computed analytically without actually splitting the data set, leading to a very simple parameter estimation method. We then integrate the new results within the coordinate descent optimization algorithm to update the model parameters one at a time for linear-in-the-parameters models. Consequently a fully automated procedure is achieved without resorting to a separate validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approaches.
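The per-coordinate update with a fixed penalty can be sketched as standard soft-thresholding coordinate descent. The paper's distinctive step, choosing each regularization parameter from the closed-form LOOMSE, is not reproduced here, so the fixed `lam` and the simulated data are assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1,
    updating one parameter at a time via the soft-thresholding
    closed form for each coordinate."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding term j's current contribution.
            r_j = y - X @ w + X[:, j] * w[j]
            rho = X[:, j] @ r_j
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
w_true = np.zeros(10)
w_true[[0, 3]] = [2.0, -1.5]
y = X @ w_true + 0.1 * rng.standard_normal(200)
w_hat = lasso_cd(X, y, lam=5.0)
```

The penalty zeroes the irrelevant coordinates while the two true terms survive with a small shrinkage bias of roughly `lam / col_sq[j]`.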


The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC), and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol re-analysis and ground-based aerosol observations. Because of differences in wet deposition rates, the GLOMAP-mode sulphate aerosol residence time is two days longer than that of CLASSIC sulphate aerosol, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by changes in the first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. The results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol-cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.


In this paper, the monetary policy independence of European nations in the years before European Economic and Monetary Union (EMU) is investigated using cointegration techniques. Daily data are used to assess pairwise relationships between individual EMU nations and the ‘lead’ nation Germany, testing the hypothesis that Germany was the dominant European nation prior to EMU. By and large our econometric investigations support this hypothesis, and lead us to conclude that the only European nation to lose monetary policy independence with monetary union was Germany. Our results have important policy implications. Given that the loss of monetary policy independence is generally viewed as the main cost of monetary unification, our findings suggest a reconsideration of the costs and benefits of monetary integration. A country can only lose what it has, and in Europe the countries that joined EMU (save Germany) apparently did not have much to lose, at least not in terms of monetary independence. Instead, they actually gained monetary policy influence by getting a seat on the ECB's governing council, which is responsible for setting interest rate policy in the euro area.


We present five new cloud detection algorithms over land based on dynamic threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9% over SADIST (maximum score of 77.93% compared to 69.27%). We present an assessment of the impact of imperfect cloud masking, relative to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7% in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02% compared to 85.69%). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
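The true skill score used for these comparisons is the standard Hanssen-Kuipers discriminant (hit rate minus false-alarm rate); a minimal sketch on a toy binary cloud mask (the example mask values are illustrative):

```python
import numpy as np

def true_skill_score(truth, pred):
    """Hanssen-Kuipers true skill score for a binary cloud mask:
    TSS = hit rate - false alarm rate, ranging from -1 to 1."""
    truth = np.asarray(truth, dtype=bool)
    pred = np.asarray(pred, dtype=bool)
    hits = np.sum(truth & pred)
    misses = np.sum(truth & ~pred)
    false_alarms = np.sum(~truth & pred)
    correct_negs = np.sum(~truth & ~pred)
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negs)
    return hit_rate - false_alarm_rate

# Toy reference mask vs. detected mask: 3 of 4 cloudy pixels detected,
# 1 of 4 clear pixels falsely flagged.
truth = [1, 1, 1, 1, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 1, 0, 0, 0]
tss = true_skill_score(truth, pred)
```

Unlike raw accuracy, the score is insensitive to the cloudy/clear class balance, which is why it suits cloud-mask intercomparisons.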