920 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation
Abstract:
Kernel-based machine learning methods have recently gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. The paper describes the use of kernel methods for processing large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
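As a concrete illustration of the kind of kernel-based classifier applied to categorical environmental data, here is a minimal sketch (hypothetical data and features, not the paper's implementation) using an RBF-kernel support vector machine from scikit-learn:

```python
# Minimal sketch: kernel-based classification of soil types from monitoring-site
# coordinates, using an SVM with a Gaussian (RBF) kernel. Data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(0.0, 100.0, size=(n, 2))                      # hypothetical site coordinates (km)
y = (X[:, 0] + 10 * np.sin(X[:, 1] / 10) > 60).astype(int)    # hypothetical soil-type labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Standardize inputs, then fit the RBF-kernel SVM.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```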
Abstract:
This paper proposes a model of financial markets and corporate finance, with asymmetric information and no taxes, where equity issues, bank debt and bond financing may all co-exist in equilibrium. The paper emphasizes the relationship-banking aspect of financial intermediation: firms turn to banks as a source of investment mainly because banks are good at helping them through times of financial distress. The debt-restructuring service that banks may offer, however, is costly. Therefore, the firms which do not expect to be financially distressed prefer to obtain a cheaper market source of funding through bond or equity issues. This explains why bank lending and bond financing may co-exist in equilibrium. The reason why firms or banks also issue equity in our model is simply to avoid bankruptcy. Banks have the additional motive that they need to satisfy minimum capital adequacy requirements. Several types of equilibria are possible, one of which has all the main characteristics of a "credit crunch". This multiplicity implies that the channels of monetary policy may depend on the type of equilibrium that prevails, sometimes lending support to a "credit view" and at other times to the classical "money view".
Abstract:
This paper examines the properties of G-7 cycles using a multicountry Bayesian panel VAR model with time variation, unit-specific dynamics and cross-country interdependencies. We demonstrate the presence of a significant world cycle and show that country-specific indicators play a much smaller role. We detect differences across business cycle phases but, apart from an increase in synchronicity in the late 1990s, find little evidence of major structural changes. We also find no evidence of the existence of a Euro-area-specific cycle or of its emergence in the 1990s.
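As a rough schematic of the kind of model described here (notation is illustrative, not the paper's exact specification), a panel VAR with unit-specific, time-varying dynamics and cross-country interdependencies can be written as

    y_{it} = D_{it}(L) Y_{t-1} + c_{it} + e_{it},

where y_{it} stacks the indicators of country i, Y_{t-1} collects lagged values of all countries' variables (so that cross-country interdependencies are allowed), and the lag polynomials D_{it}(L) carry unit-specific coefficients that drift over time and are disciplined with Bayesian priors.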
Abstract:
In spite of the increasing representation of women in politics, little is known about their impact on policies. Comparing outcomes of parliaments with different shares of female members does not identify their causal impact because of possible differences in the underlying electorate. This paper uses a unique data set on voting decisions to shed new light on gender gaps in policy making. Our analysis focuses on Switzerland, where all citizens can directly decide on a broad range of policies in referendums and initiatives. We show that there are large gender gaps in the areas of health, environmental protection, defense spending and welfare policy, which typically persist even conditional on socio-economic characteristics. We also find that female policy makers have a substantial effect on the composition of public spending, but a small effect on the overall size of government.
Abstract:
In monetary unions, monetary policy is typically made by delegates of the member countries. This procedure raises the possibility of strategic delegation: countries may choose the types of delegates to influence outcomes in their favor. We show that without commitment in monetary policy, strategic delegation arises if and only if three conditions are met: shocks affecting individual countries are not perfectly correlated, risk-sharing across countries is imperfect, and the Phillips curve is nonlinear. Moreover, inflation rates are inefficiently high. We argue that ways of solving the commitment problem, including the emphasis on price stability in the agreements constituting the European Union, are especially valuable when strategic delegation is a problem.
Abstract:
We investigate macroeconomic fluctuations in the Mediterranean basin, their similarities and their convergence. A model with four indicators, roughly covering the West, the East, the Middle East and the North African portions of the Mediterranean, characterizes the historical experience since the early 1980s well. Idiosyncratic causes still dominate domestic cyclical fluctuations in many countries. Convergence and divergence coexist; both are local and transitory. The cyclical outlook for the next few years is rosier for the East than for the West.
Abstract:
We analyze risk sharing and fiscal spending in a two-region model with complete markets. Fiscal policy determines tax rates for each state of nature. When fiscal policy is decentralized, it can be used to affect the prices of securities. To manipulate prices to their benefit, regions choose pro-cyclical fiscal spending. This leads to incomplete risk sharing, despite the existence of complete markets and the absence of aggregate risk. When a fiscal union centralizes fiscal policy, securities prices can no longer be manipulated and complete risk sharing ensues. If regions are homogeneous, median-income residents of both regions prefer the fiscal union. If they are heterogeneous, the median resident of the rich region prefers the decentralized setting.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance, and say nothing about the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent reliance on fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least there is no consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model to be the cause of performance, and to compare it against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (nine months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results are less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, the design and setup of the experiment, and we outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
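To make the flavor of such a test-versus-control comparison concrete, the sketch below is illustrative only (the paper's companion econometric analysis is far more careful): it compares average revenue per flight between algorithm-controlled test flights and incumbent-controlled control flights, using a Welch t-test on placeholder data.

```python
# Illustrative sketch only: compare revenue per flight between test flights
# (controlled by a candidate RM algorithm) and control flights (incumbent process).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
test_revenue = rng.normal(52_000, 8_000, size=120)     # hypothetical per-flight revenue, test weeks
control_revenue = rng.normal(50_000, 8_000, size=120)  # hypothetical per-flight revenue, control weeks

# Welch's t-test: is mean revenue per flight different under the candidate algorithm?
t_stat, p_value = stats.ttest_ind(test_revenue, control_revenue, equal_var=False)
lift = test_revenue.mean() / control_revenue.mean() - 1.0
print(f"revenue lift: {lift:.1%}, t = {t_stat:.2f}, p = {p_value:.3f}")
```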
Abstract:
This paper presents evidence that deposit and lending facilities, combined with an averaging provision for the reserve requirement, are powerful tools to stabilize the overnight rate. We reach this conclusion by comparing the behavior of this rate in Germany before and after the start of Stage III of the EMU. The analysis of the German experience is useful because it allows us to isolate the effects on the overnight rate of these particular instruments of monetary policy. To show that this outcome is a general conclusion and not a particular result of the German market, we develop a theoretical model of reserve management which is able to reproduce our empirical findings.
Abstract:
The Network Revenue Management problem can be formulated as a stochastic dynamic programming problem (DP, or the "optimal" solution V*) whose exact solution is computationally intractable. Consequently, a number of heuristics have been proposed in the literature, the most popular of which are the deterministic linear programming (DLP) model and a simulation-based method, the randomized linear programming (RLP) model. Both methods give upper bounds on the optimal solution value (the DLP and PHLP bounds, respectively). These bounds are used to provide control values that can be used in practice to make accept/deny decisions for booking requests. Recently, Adelman [1] and Topaloglu [18] proposed alternative upper bounds, the affine relaxation (AR) bound and the Lagrangian relaxation (LR) bound respectively, and showed that their bounds are tighter than the DLP bound. Tight bounds are of great interest, as it appears from empirical studies and practical experience that models that give tighter bounds also lead to better controls (better in the sense that they lead to more revenue). In this paper we give tightened versions of three bounds, calling them sAR (strong Affine Relaxation), sLR (strong Lagrangian Relaxation) and sPHLP (strong Perfect Hindsight LP), and show relations between them. Specifically, we show that the sPHLP bound is tighter than the sLR bound, and the sAR bound is tighter than the LR bound. The techniques for deriving the sLR and sPHLP bounds can potentially be applied to other instances of weakly-coupled dynamic programming.
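For reference, a minimal sketch of the standard DLP that produces the DLP upper bound (the textbook formulation, not the strengthened bounds developed in the paper): with fares f_j, expected demands E[D_j], leg capacities c, and incidence matrix A mapping itinerary-fare products to flight legs,

    V^{DLP} = \max_y \sum_j f_j y_j   subject to   A y \le c,   0 \le y_j \le E[D_j],

and V^{DLP} \ge V^*, i.e. the LP value bounds the optimal DP value from above; the bid prices used for accept/deny controls are typically read off the dual variables of the capacity constraints.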
Abstract:
We develop and estimate a structural model of inflation that allows for a fraction of firms that use a backward-looking rule to set prices. The model nests the purely forward-looking New Keynesian Phillips curve as a particular case. We use measures of marginal costs as the relevant determinant of inflation, as the theory suggests, instead of an ad-hoc output gap. Real marginal costs are a significant and quantitatively important determinant of inflation. Backward-looking price setting, while statistically significant, is not quantitatively important. Thus, we conclude that the New Keynesian Phillips curve provides a good first approximation to the dynamics of inflation.
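In this literature the hybrid specification typically estimated takes the form (a sketch of the standard equation, not necessarily the paper's exact notation)

    \pi_t = \lambda \, \widehat{mc}_t + \gamma_f E_t\{\pi_{t+1}\} + \gamma_b \pi_{t-1},

where \widehat{mc}_t is real marginal cost (in practice, the labor income share), \gamma_b captures the fraction of backward-looking price setters, and the purely forward-looking New Keynesian Phillips curve is nested as the case \gamma_b = 0.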
Abstract:
In a closed economy context there is common agreement on price inflation stabilization being one of the objectives of monetary policy. Moving to an open economy context gives rise to the coexistence of two measures of inflation: domestic inflation (DI) and consumer price inflation (CPI). Which of the two measures should be the target variable? This is the question addressed in this paper. In particular, I use a small open economy model to show that once sticky wages indexed to past CPI inflation are introduced, a completely inward-looking monetary policy is no longer optimal. I first derive a loss function from a second-order approximation of the utility function and then compute the fully optimal monetary policy under commitment. I then use the optimal monetary policy as a benchmark to compare the performance of different monetary policy rules. The main result is that, once a positive degree of indexation is introduced in the model, the rule that performs best (among the Taylor-type rules considered) is the one targeting wage inflation and CPI inflation. Moreover, this rule delivers results very close to those obtained under the fully optimal monetary policy with commitment.
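An illustrative Taylor-type rule of the kind compared here (coefficients and notation are placeholders, not the paper's calibration) would set the nominal rate as

    i_t = \phi_\pi \pi^{CPI}_t + \phi_w \pi^w_t,

responding to both CPI inflation \pi^{CPI}_t and nominal wage inflation \pi^w_t; the finding reported in the abstract is that a rule reacting to these two variables comes closest to the fully optimal policy under commitment.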
Abstract:
We characterize the macroeconomic performance of a set of industrialized economies in the aftermath of the oil price shocks of the 1970s and of the last decade, focusing on the differences across episodes. We examine four different hypotheses for the mild effects on inflation and economic activity of the recent increase in the price of oil: (a) good luck (i.e. lack of concurrent adverse shocks), (b) smaller share of oil in production, (c) more flexible labor markets, and (d) improvements in monetary policy. We conclude that all four have played an important role.
Abstract:
In this paper we study the relationship between labor market institutions and monetary policy. We use a simple macroeconomic framework to show how optimal monetary policy rules depend on labor institutions (labor adjustment costs, and nominal and real wage rigidity) and social preferences regarding inflation, employment, and real wages. We also calibrate our model to compute how the change in social welfare brought about by giving up monetary policy as a result of joining the Economic and Monetary Union (EMU) depends on institutions and preferences. We then use the calibrated model to analyze how EMU affects the incentives for labor market reform, both for reforms that increase the economy's adjustment potential and for those that affect the long-run unemployment rate.
Abstract:
A taxonomic key for the genera of Elmidae (Coleoptera, Byrrhoidea) occurring in Goiás State, Brazil, including new records and distributional notes. Despite the great diversity and high abundance of Elmidae in Neotropical aquatic environments, their fauna remains practically unknown in some areas, and even entire biomes, of this region. In this work we present, for the first time, faunistic data for the Elmidae of central Brazil. The aim of this work was to inventory the Elmidae fauna of central, southwestern and southeastern Goiás State, Brazil, and to produce a taxonomic key, at genus level, for adults from the studied region. The taxonomic key presented herein provides a means of identifying all 13 genera known to occur in Goiás, 11 of which are new records for the State. Moreover, the number of named species registered for Goiás increased from one to nine.