6 results for Bayesian hierarchical model
in the Repositório digital da Fundação Getúlio Vargas - FGV
Abstract:
We propose models to analyze animal growth data with the aim of estimating and predicting quantities of biological and economical interest, such as the maturing rate and asymptotic weight. The effect of environmental factors with relevant influence on the growth process is also studied. The models considered in this paper are based on an extension and specialization of the dynamic hierarchical model (Gamerman & Migon, 1993) to a non-linear growth curve setting, where some of the growth curve parameters are considered exchangeable among the units. The inference for these models is approximate conjugate analysis based on Taylor series expansions and linear Bayes procedures.
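As an illustration of the kind of specification described above (the particular functional form is an assumption for this sketch, not taken from the paper; a Gompertz curve is one common choice), a non-linear growth curve with exchangeable unit-level parameters could be written as

\[
W_{it} = a_i \exp\{-b_i \exp(-k_i t)\} + \epsilon_{it}, \qquad \epsilon_{it} \sim N(0, \sigma^2),
\]
\[
(a_i, b_i, k_i) \sim \text{i.i.d. } N(\mu, \Sigma) \quad \text{across units } i,
\]

where \(a_i\) plays the role of the asymptotic weight and \(k_i\) of the maturing rate of unit \(i\); the exchangeability of the unit-level parameters is what the second, hierarchical stage expresses.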
Abstract:
Emerging economies face significant credit constraints compared with developed economies; nevertheless, dynamic stochastic general equilibrium (DSGE) models designed for emerging economies still need to advance in this discussion. We propose a DSGE model intended to represent an emerging economy with a banking sector, based on Gerali et al. (2010). Our contribution is to consider a share of expected income as collateral for household loans. We estimate the proposed model for Brazil using Bayesian estimation and find that economies in which households face this collateral constraint tend to feel the impact of monetary shocks more quickly, owing to the banking sector's exposure to changes in expected wages.
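A minimal sketch of the kind of collateral constraint described (the notation and timing below are assumptions for illustration, not the paper's equations): in a Gerali et al. (2010)-style setting, the household borrowing limit would tie debt to a fraction \(m\) of expected labour income rather than to housing value,

\[
(1 + r_t)\, b_t \;\le\; m \, E_t\!\left[ w_{t+1} \, n_{t+1} \right],
\]

where \(b_t\) is household debt, \(r_t\) the loan rate, \(w_{t+1} n_{t+1}\) next-period labour income and \(m\) the loan-to-income (collateral) share. Under such a constraint, revisions of expected wages shift the borrowing limit directly, which is consistent with the faster transmission of monetary shocks reported above.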
Abstract:
The first essay, "Determinants of Credit Expansion in Brazil", analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of a set of factors ranging from macroeconomic stability to the abundant liquidity in international financial markets before 2008, together with a set of deliberate decisions taken by President Lula to expand credit, boost consumption and gain political support from the lower social strata. Our investigation leads to the following conclusions: credit expansion relied on the reduction of the monetary policy rate; international financial markets are an important source of funds; and payroll-guaranteed credit and investment-grade status positively affected credit supply. We were not able to confirm the importance of financial inclusion efforts. The importance of financial-sector soundness indicators for credit conditions should not be underestimated. These results raise questions over the sustainability of this expansion process and over financial stability in the future.

The second essay, "Public Credit, Monetary Policy and Financial Stability", discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers demise. It was later transformed into a driver for economic growth as well as a regulation device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates and may induce financial instability. An additional effect is the prevention of market credit solutions. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool.

The third essay, "Bayesian Forecasting of Interest Rates: Do Priors Matter?", discusses the choice of priors when forecasting short-term interest rates. Central banks that commit to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, rendering the monetary policy response non-linear. Besides, there is no guarantee that coefficients remain stable over time. Central banks may switch to a dual-target regime that considers deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore have to consider a non-linear model with time-varying parameters. This paper uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective we focus on an augmented version of the Taylor rule and include the Real Exchange Rate, the Credit-to-GDP and the Net Public Debt-to-GDP ratios; we also take an "atheoretic" approach based on the Expectations Theory of the Term Structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy, yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can be easily extended to other key macroeconomic indicators.
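As a sketch of the augmented Taylor rule mentioned in the third essay (the functional form and coefficient names below are illustrative assumptions, not the thesis's estimated specification), a smoothed rule with the additional regressors could take the form

\[
i_t = \rho\, i_{t-1} + (1-\rho)\!\left[\alpha_0 + \alpha_\pi (\pi^{e}_t - \pi^{*}) + \alpha_y \tilde{y}_t + \alpha_q q_t + \alpha_c \,\tfrac{Credit_t}{GDP_t} + \alpha_d \,\tfrac{NetDebt_t}{GDP_t}\right] + \varepsilon_t,
\]

where \(\pi^{e}_t - \pi^{*}\) is the deviation of expected inflation from target, \(\tilde{y}_t\) the output gap and \(q_t\) the real exchange rate; letting the \(\alpha\)'s vary over time and placing hierarchical hyper-\(g\) priors on them corresponds to the non-linear, time-varying estimation discussed above.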
Abstract:
Cognition is a core subject for understanding how humans think and behave. In that sense, it is clear that Cognition is a great ally to Management, as the latter deals with people and is very interested in how they behave, think, and make decisions. However, even though Cognition shows great promise as a field, there are still many topics to be explored and learned in this fairly new area. Kemp & Tenenbaum (2008) tried to model a graph-structure problem in which, given a dataset, the best underlying structure and form would emerge from that dataset through Bayesian probabilistic inference. This work is very interesting because it addresses a key cognition problem: learning. According to the authors, analogical insights and discoveries, that is, understanding the relationships of elements and how they are organized, play a very important part in cognitive development; these are very basic phenomena that allow learning. Human minds, however, do not function as computers running Bayesian probabilistic inference; people seem to think differently. Thus, we present a cognitively inspired method, KittyCat, based on FARG computer models (such as Copycat and Numbo), to solve the proposed problem of discovering the underlying structural form of a dataset.
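The Bayesian structure-discovery idea referred to above can be summarised in one line (a schematic rendering, not the authors' exact notation): given data \(D\), candidate structures \(S\) and structural forms \(F\) are scored by their posterior probability,

\[
P(S, F \mid D) \;\propto\; P(D \mid S)\, P(S \mid F)\, P(F),
\]

so that the best-supported form and structure are those maximising this posterior; KittyCat, by contrast, is described above as searching for the underlying form through FARG-style cognitive mechanisms rather than through explicit posterior maximisation.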
Abstract:
The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through elicitation of expert quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, Nasdaq 100, an index of the financial market that presents many extreme events. Important issues such as predictive analysis and model selection are considered along with possible modeling extensions.
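A minimal sketch of the mixture density described above (the notation is assumed here for illustration): with a parametric density \(h(\cdot \mid \eta)\) for the center and a GPD tail beyond the unknown threshold \(u\),

\[
f(x \mid \eta, u, \xi, \sigma) =
\begin{cases}
h(x \mid \eta), & x \le u,\\[4pt]
\left[1 - H(u \mid \eta)\right]\, g(x \mid u, \xi, \sigma), & x > u,
\end{cases}
\]

where \(H\) is the c.d.f. of the center and \(g\) the GPD density with shape \(\xi\) and scale \(\sigma\). Treating \(u\) as a parameter with its own prior lets all observations, below and above the threshold, contribute to the posterior explored by MCMC.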