920 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation
Abstract:
The paper presents the advantages and drawbacks of implementing an inflation-targeting (IT) regime from a Post-Keynesian and, thus, institutional standpoint. It is Post-Keynesian insofar as it perceives no benefit in the mainstream separation between monetary and fiscal policies. And it is institutional insofar as it assumes that there are several ways of implementing a policy, the chosen one being determined by historical factors, as the Brazilian case illustrates. One could even support IT policies if their targets were seen merely as “focusing devices” guiding economic policy, alongside other targets such as output growth and employment in the short run and technology and human development in the long run. Nevertheless, an inflation target is not necessary, although it can be accepted, mainly if the target is hidden from the public in order to increase the Central Bank's flexibility.
Abstract:
The inflation-targeting regime is a monetary policy framework adopted by several countries in the 1990s, among them Brazil, which adopted it in 1999 after a currency crisis. With a theoretical framework inspired by new-classical theory, the regime is adopted by countries seeking price stability, and its key feature is the prior announcement of a numerical target for inflation. The present work discusses the use of the IPCA (Consumer Price Index) as the measurement index for Brazil's inflation, after briefly explaining the theoretical basis of the IT regime.
Abstract:
We introduce a new kind of likelihood function based on the sequence of moments of the data distribution. Both binned and unbinned data samples are discussed, and the multivariate case is also derived. Building on this approach, we lay out the formalism of shape analysis for signal searches. In addition to moment-based likelihoods, standard likelihoods and approximate statistical tests are provided. Enough material is included to make the paper self-contained from the perspective of shape analysis. We argue that moment-based likelihoods can advantageously replace unbinned standard likelihoods in searches for nonlocal signals, by avoiding the step of fitting Monte Carlo generated distributions. This benefit increases with the number of variables simultaneously analyzed. The moment-based signal search is exemplified and tested in various 1D toy models mimicking typical high-energy signal-background configurations. Moment-based techniques should be particularly appropriate for searches for effective operators at the LHC.
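To make the idea concrete, here is a minimal, illustrative Python sketch of a moment-based test statistic in a 1D toy setup of the kind the abstract describes: an exponential background with a Gaussian bump signal. The first K raw sample moments are compared against those expected under a background-only hypothesis, using a Gaussian approximation for the moment estimators. The statistic, distributions, and parameter values are assumptions for illustration, not the paper's exact formalism.

```python
# Illustrative moment-based test statistic (not the paper's formalism).
import math
import numpy as np

rng = np.random.default_rng(0)

def empirical_moments(x, K):
    """First K raw moments <x^k>, k = 1..K, of a 1D sample."""
    return np.array([np.mean(x**k) for k in range(1, K + 1)])

def moment_covariance(x, K):
    """Covariance of moment estimators: Cov(<x^j>,<x^k>) ~ (m_{j+k} - m_j m_k)/n."""
    n = len(x)
    m = np.array([np.mean(x**k) for k in range(1, 2 * K + 1)])
    cov = np.empty((K, K))
    for j in range(1, K + 1):
        for k in range(1, K + 1):
            cov[j - 1, k - 1] = (m[j + k - 1] - m[j - 1] * m[k - 1]) / n
    return cov

# Toy configuration: exponential background plus a Gaussian bump signal.
n_bkg, n_sig, K = 10_000, 300, 4
data = np.concatenate([rng.exponential(1.0, n_bkg),
                       rng.normal(2.5, 0.2, n_sig)])

# Background-only expectation: raw moments of Exp(1) are m_k = k!.
m_bkg = np.array([math.factorial(k) for k in range(1, K + 1)])

# Chi-square-like statistic on the moment vector.
d = empirical_moments(data, K) - m_bkg
chi2 = d @ np.linalg.solve(moment_covariance(data, K), d)
print(f"moment-based chi2 with {K} moments: {chi2:.1f}")
```

A large value of the statistic flags a deviation of the sample's moments from the background-only expectation, without any binning or Monte Carlo template fitting.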
Abstract:
Modeling is a preliminary step in finite element analysis. Different methods of model construction are reported in the literature, such as Bio-CAD modeling. The purpose of this study was to evaluate and apply two Bio-CAD modeling methods for a human edentulous hemi-mandible in finite element analysis. A stereolithographic model was reconstructed from CT scans of a dried human skull. Two modeling methods were performed: an STL conversion approach associated with STL simplification (Model 1) and a reverse engineering approach (Model 2). For the finite element analysis, the action of the lateral pterygoid muscle was used as the loading condition to assess total displacement (D), equivalent von Mises stress (VM), and maximum principal stress (MP). The two models differed in geometry with respect to surface number (1834 for Model 1; 282 for Model 2) and in finite element mesh with respect to node and element numbers (30428 nodes/16683 elements for Model 1; 15801 nodes/8410 elements for Model 2). The D, VM, and MP stress areas showed similar distributions in the two models, but the maximum and minimum values differed: D ranged from 0 to 0.511 mm (Model 1) and 0 to 0.544 mm (Model 2); VM stress from 6.36E-04 to 11.4 MPa (Model 1) and 2.15E-04 to 14.7 MPa (Model 2); and MP stress from -1.43 to 9.14 MPa (Model 1) and -1.2 to 11.6 MPa (Model 2). Of the two Bio-CAD modeling methods, reverse engineering provided a better anatomical representation than the STL conversion approach. The models differed in finite element mesh, total displacement, and stress distribution.
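For reference, a small Python sketch (not part of the study) that tabulates the mesh and result metrics reported in the abstract for the two models and computes their relative differences:

```python
# Tabulate the reported metrics for the two Bio-CAD models and compare.
# All numbers are taken directly from the abstract above.
model1 = {"surfaces": 1834, "nodes": 30428, "elements": 16683,
          "D_max_mm": 0.511, "VM_max_MPa": 11.4, "MP_max_MPa": 9.14}
model2 = {"surfaces": 282, "nodes": 15801, "elements": 8410,
          "D_max_mm": 0.544, "VM_max_MPa": 14.7, "MP_max_MPa": 11.6}

for key in model1:
    m1, m2 = model1[key], model2[key]
    rel = (m2 - m1) / m1 * 100  # relative difference of Model 2 vs Model 1
    print(f"{key:>12}: model1={m1:>10} model2={m2:>10} diff={rel:+7.1f}%")
```

The comparison makes the trade-off visible at a glance: the reverse engineering model (Model 2) uses far fewer surfaces and elements, while its peak displacement and stress values run somewhat higher.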
Abstract:
The study introduces a new regression model developed to estimate hourly values of diffuse solar radiation at the surface. The model is based on the relationship between the clearness index and the diffuse fraction, and includes the effects of cloud (cloudiness and cloud type), traditional meteorological variables (air temperature, relative humidity, and atmospheric pressure observed at the surface), and air pollution (concentration of particulate matter observed at the surface). The new model predicts hourly values of diffuse solar radiation better than previously developed ones (R^2 = 0.93 and RMSE = 0.085). A simpler, widely applicable version that considers cloud effects only (cloudiness and cloud height) shows R^2 = 0.92.
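As a rough illustration of the kind of model described, the following Python sketch fits an ordinary least-squares regression of the diffuse fraction on the clearness index plus cloud, meteorological, and pollution predictors, using synthetic data. The functional form, variable ranges, and generating coefficients are assumptions for illustration; the paper's actual specification is not reproduced here.

```python
# Illustrative diffuse-fraction regression on synthetic hourly data.
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Synthetic hourly predictors (assumed ranges).
kt = rng.uniform(0.05, 0.85, n)      # clearness index
cloud = rng.uniform(0.0, 1.0, n)     # cloudiness fraction
temp = rng.uniform(10.0, 35.0, n)    # air temperature, deg C
rh = rng.uniform(20.0, 100.0, n)     # relative humidity, %
pm = rng.uniform(5.0, 80.0, n)       # particulate matter, ug/m3

# Synthetic "true" diffuse fraction: falls with kt, rises with cloud and PM.
kd = np.clip(0.95 - 0.9 * kt + 0.25 * cloud + 0.002 * pm
             + rng.normal(0.0, 0.05, n), 0.0, 1.0)

# Ordinary least squares on the full predictor set.
X = np.column_stack([np.ones(n), kt, cloud, temp, rh, pm])
beta, *_ = np.linalg.lstsq(X, kd, rcond=None)
pred = X @ beta

ss_res = np.sum((kd - pred) ** 2)
ss_tot = np.sum((kd - kd.mean()) ** 2)
print("coefficients:", np.round(beta, 4))
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")
```

A "simple version" in the abstract's sense would drop all columns except the intercept, clearness index, and cloud terms before refitting.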