1000 results for Macroeconomic information
Abstract:
This thesis examines the effects of macroeconomic factors on inflation level and volatility in the Euro Area in order to improve the accuracy of inflation forecasts through econometric modelling. Inflation aggregates for the EU as well as inflation levels of selected countries are analysed, and the differences between these inflation estimates and forecasts are documented. The research proposes alternative models depending on the focus and the scope of inflation forecasts. I find that models with a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) in mean process have better explanatory power for inflation variance than regular GARCH models. The significant coefficients differ across EU countries in comparison to the aggregate EU-wide forecast of inflation. The presence of more pronounced GARCH components in countries with more stressed economies indicates that inflation volatility in these countries is likely to occur as a result of the stressed economy. In addition, other economies in the Euro Area are found to exhibit a relatively stable variance of inflation over time. Therefore, when analysing EU inflation one has to take the large country-level differences into consideration and examine each country individually.
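The GARCH-in-mean process described above lets the conditional variance feed back into the inflation level itself. A minimal simulation sketch, with hypothetical parameter values chosen only for illustration (the thesis does not report these numbers):

```python
import random

def simulate_garch_in_mean(n=200, mu=0.1, delta=0.5,
                           omega=0.05, alpha=0.1, beta=0.85, seed=42):
    """Simulate pi_t = mu + delta * sigma_t^2 + eps_t, where sigma_t^2
    follows a GARCH(1,1) recursion. All parameter values are illustrative."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    eps_prev = 0.0
    series, variances = [], []
    for _ in range(n):
        var = omega + alpha * eps_prev ** 2 + beta * var
        eps = rng.gauss(0.0, var ** 0.5)
        series.append(mu + delta * var + eps)  # delta * var is the "in mean" term
        variances.append(var)
        eps_prev = eps
    return series, variances

series, variances = simulate_garch_in_mean()
```

The `delta * var` term is exactly what distinguishes the in-mean variant from a regular GARCH model: periods of high inflation volatility shift the inflation level as well.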
Abstract:
This paper provides evidence on the sources of co-movement in monthly US and UK stock price movements by investigating the role of macroeconomic and financial variables in a bivariate system with time-varying conditional correlations. Cross-country commonality in response is uncovered, with changes in the US Federal Funds rate, UK bond yields and oil prices having similar negative effects in both markets. Other variables also play a role, especially for the UK market. These effects do not, however, explain the marked increase in cross-market correlations observed from around 2000, which we attribute to time variation in the correlations of shocks to these markets. A regime-switching smooth transition model captures this time variation well and shows the correlations increase dramatically around 1999-2000. JEL classifications: C32, C51, G15. Keywords: international stock returns, DCC-GARCH model, smooth transition conditional correlation GARCH model, model evaluation.
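The smooth-transition idea behind the correlation model can be sketched with a logistic function of time moving the correlation between two regimes. The regime levels, speed and midpoint below are hypothetical, chosen only to mimic the increase around 1999-2000 that the paper reports:

```python
import math

def smooth_transition_correlation(year, rho_low=0.3, rho_high=0.8,
                                  gamma=2.0, midpoint=1999.5):
    """Logistic smooth transition between a low- and a high-correlation
    regime. All parameter values are illustrative, not estimates."""
    g = 1.0 / (1.0 + math.exp(-gamma * (year - midpoint)))
    return rho_low + (rho_high - rho_low) * g

early = smooth_transition_correlation(1990.0)   # deep in the low regime
late = smooth_transition_correlation(2010.0)    # deep in the high regime
```

In the estimated model the transition variable and parameters are fitted to the data; this sketch only shows the functional form that lets the conditional correlation drift smoothly from one regime to the other.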
Abstract:
This thesis work is the joint effort of the Secretaria de Desarrollo Económico, Maloka, CIDEM, Universidad del Rosario and the student to contribute to the learning and development of small and medium-sized Colombian entrepreneurs, given that the country lacks adequate academic and practical training to create, develop and manage companies capable of successfully introducing and positioning their products in foreign markets, and thereby increasing the country's presence, quality and competitiveness relative to other markets. The work was carried out in five modules that served as the basis for decision-making and strategy definition: Market Intelligence, Costs, Logistics, Commercialization and Export Plan.
Abstract:
This thesis project is the joint effort of the Secretaria de Desarrollo Económico, Maloka, CIDEM, Universidad del Rosario and the student to support the training and growth of small and medium-sized Colombian companies, since the country currently lacks adequate support for learning how to create successful companies able to compete in the international market. The work lasted 5 months, during which 5 modules were carried out, each focused on building an export plan: Market Intelligence, Production Costs, Prices and Logistics, Commercialization Strategies and Export Plan. Together these provided the foundations for the company MELTRONIC LTDA. to implement a successful export plan for the chosen markets: Peru, Costa Rica and Chile.
Abstract:
This work lasted 5 months, during which 5 modules were carried out, each focused on building an export plan: Market Intelligence, Production Costs, Prices and International Logistics, Commercialization Strategies and Export Plan. Together these provided the foundations for the company 3 D.I.T LTDA. to implement a successful export plan for the chosen markets: Mexico, Chile and the United States.
Abstract:
This work proposes a methodology for stress testing and, consequently, for computing the additional capital buffer for credit risk, as required by the Committee on Banking Supervision. The methodology uses macroeconomic information to determine the behaviour of the default rate. We can thus simulate possible economic scenarios and, with them, the default rate associated with each scenario. Each economic scenario yields a default rate, and each default rate yields a loss curve. By simulating many economic scenarios it was possible to obtain many loss curves and, from them, the probability of occurrence of the expected and unexpected losses. The methodology was applied to a personal credit portfolio for individuals. The results proved quite effective in determining the probability of occurrence of the Allocated Capital. As a consequence of the test, given a confidence level, it was possible to determine the Allocated Capital needed to cover losses beyond the unexpected loss.
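The scenario-to-loss pipeline described above can be sketched in a few lines: draw a macro factor, map it to a default rate through a link function, and read the capital buffer off the simulated loss distribution. The single-factor logistic link, the exposure, LGD and coefficient values below are all hypothetical stand-ins for the portfolio-specific model the thesis estimates:

```python
import math
import random
import statistics

def simulate_losses(n_scenarios=10_000, exposure=1_000_000.0,
                    lgd=0.6, intercept=-2.0, beta=-0.8, seed=7):
    """Draw macro scenarios, map each to a default rate through a logistic
    link, and return the sorted portfolio loss distribution.
    All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_scenarios):
        gdp_shock = rng.gauss(0.0, 1.0)    # hypothetical macro factor
        z = intercept + beta * gdp_shock   # worse GDP -> higher default rate
        default_rate = 1.0 / (1.0 + math.exp(-z))
        losses.append(exposure * default_rate * lgd)
    return sorted(losses)

def capital_buffer(losses, confidence=0.999):
    """Allocated capital: loss at the confidence level minus the expected loss."""
    expected = statistics.fmean(losses)
    return losses[int(confidence * (len(losses) - 1))] - expected

losses = simulate_losses()
buffer = capital_buffer(losses)
```

Each simulated scenario contributes one point of the loss distribution; the gap between the high quantile and the mean is the buffer that covers losses beyond the expected loss at the chosen confidence level.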
Abstract:
The traditional representation of the term structure of interest rates by three latent factors (level, slope and curvature) was originally formulated by Charles R. Nelson and Andrew F. Siegel in 1987. Since then, numerous applications have been developed by academics and market practitioners on the basis of this class of models, chiefly with the aim of anticipating movements in yield curves. At the same time, recent studies such as Diebold, Piazzesi and Rudebusch (2010), Diebold, Rudebusch and Aruoba (2006), Pooter, Ravazzolo and van Dijk (2010) and Li, Niu and Zeng (2012) suggest that incorporating macroeconomic information into term-structure models can yield greater predictive power. In this work, the dynamic version of the Nelson-Siegel model, as proposed by Diebold and Li (2006), was compared with an analogous model that includes exogenous macroeconomic variables. In parallel, two different estimation methods were tested: the traditional two-step approach (Two-Step DNS) and estimation with the Extended Kalman Filter, which allows the parameters to be estimated recursively each time new information enters the system. Regarding the models tested, the results are rather inconclusive, pointing to only a marginal improvement in the in-sample and out-of-sample estimates when the exogenous variables are included. The Extended Kalman Filter, in turn, produced more consistent results than the two-step method for virtually all time horizons studied.
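The three Nelson-Siegel factors enter the yield curve through fixed loading functions of maturity. A minimal sketch of those loadings, using the decay parameter value 0.0609 that Diebold and Li fix in their monthly-maturity application:

```python
import math

def nelson_siegel_loadings(tau, lam=0.0609):
    """Loadings of the three Nelson-Siegel factors at maturity tau (months):
    y(tau) = level * 1 + slope * b2(tau) + curvature * b3(tau)."""
    x = lam * tau
    b2 = (1.0 - math.exp(-x)) / x          # slope loading: 1 at tau -> 0, decays to 0
    b3 = b2 - math.exp(-x)                 # curvature loading: humped in tau
    return 1.0, b2, b3

level, b2_short, _ = nelson_siegel_loadings(0.001)   # very short maturity
```

The level loads equally on all maturities, the slope loading starts at one and decays, and the curvature loading is hump-shaped; the dynamic version lets the three factors evolve over time, which is where exogenous macro variables (or a Kalman filter recursion) can be introduced.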
Abstract:
In the last decade, the potential macroeconomic effects of intermittent large adjustments in microeconomic decision variables such as prices, investment, consumption of durables or employment – a behavior which may be justified by the presence of kinked adjustment costs – have been studied in models where economic agents continuously observe the optimal level of their decision variable. In this paper, we develop a simple model which introduces infrequent information in a kinked adjustment cost model by assuming that agents do not observe continuously the frictionless optimal level of the control variable. Periodic releases of macroeconomic statistics or dividend announcements are examples of such infrequent information arrivals. We first solve for the optimal individual decision rule, which is found to be both state and time dependent. We then develop an aggregation framework to study the macroeconomic implications of such optimal individual decision rules. Our model has the distinct characteristic that a vast number of agents tend to act together, and more so when uncertainty is large. The average effect of an aggregate shock is inversely related to its size and to aggregate uncertainty. We show that these results differ substantially from the ones obtained with full information adjustment cost models.
Abstract:
We extend the macroeconomic literature on (S,s)-type rules by introducing infrequent information in a kinked adjustment cost model. We first show that optimal individual decision rules are both state- and time-dependent. We then develop an aggregation framework to study the macroeconomic implications of such optimal individual decision rules. In our model, a vast number of agents act together, and more so when uncertainty is large. The average effect of an aggregate shock is inversely related to its size and to aggregate uncertainty. These results are in contrast with those obtained with full information adjustment cost models.
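The state-dependent half of an (S,s)-type rule is an inaction band: the agent pays the fixed adjustment cost and jumps to the frictionless optimum only when the gap becomes large enough. A minimal sketch with a hypothetical band width (the paper's rule is also time-dependent, because the optimum is observed infrequently; that part is omitted here):

```python
def ss_adjust(current, frictionless_optimum, band=0.1):
    """Kinked-adjustment-cost rule of (S,s) type: jump to the frictionless
    optimum only when the gap leaves the inaction band; otherwise stay put.
    The band width is an illustrative assumption."""
    if abs(current - frictionless_optimum) > band:
        return frictionless_optimum   # adjust fully, paying the fixed cost
    return current                    # inside the band: inaction is optimal

small_gap = ss_adjust(1.0, 1.05)   # stays at 1.0
large_gap = ss_adjust(1.0, 1.25)   # jumps to 1.25
```

Because every agent uses a band of this kind, small aggregate shocks leave most agents inside their bands while large shocks push many of them past the trigger at once, which is the synchronization mechanism the abstract describes.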
Abstract:
Sticky information monetary models have been used in the macroeconomic literature to explain some of the observed features regarding inflation dynamics. In this paper, we explore the consequences of relaxing the rational expectations assumption usually taken in this type of model; in particular, by considering expectations formed through adaptive learning, it is possible to arrive at results other than the trivial convergence to a fixed-point long-term equilibrium. The results involve the possibility of endogenous cyclical motion (periodic and aperiodic), which emerges essentially in scenarios of hyperinflation. In low inflation settings, the introduction of learning implies a less severe impact of monetary shocks that, nevertheless, tend to last for additional time periods relative to the pure perfect foresight setup.
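The adaptive-learning mechanism replacing rational expectations can be sketched as a constant-gain revision rule: each period the forecast moves a fixed fraction of the way toward the last realized value. The gain value below is a hypothetical illustration, not one used in the paper:

```python
def update_expectation(previous, realized, gain=0.1):
    """Adaptive-learning revision: move the inflation forecast a fraction
    `gain` of the way toward the last realized inflation rate."""
    return previous + gain * (realized - previous)

# Illustrative path: inflation jumps from 2% to a steady 4%; with gain 0.5
# the forecast closes half of the remaining gap each period.
forecast = 2.0
for realized in [4.0, 4.0, 4.0]:
    forecast = update_expectation(forecast, realized, gain=0.5)
```

Because the forecast only closes part of the gap each period, shocks propagate gradually, which is consistent with the abstract's finding that monetary shocks are milder but longer-lived under learning than under perfect foresight.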
Abstract:
The flow of new information is what produces price changes, so understanding whether the market is unbalanced is fundamental to knowing how much inventory market makers should keep during an important economic release. After identifying which economic indicators impact the S&P and 10-year Treasuries, the Volume-Synchronized Probability of Information-Based Trading (VPIN) is used as a predictability measure. The results point to some power of the VPIN metric to predict economic surprises, mainly when it is calculated using the S&P. This finding appears to be supported when analysing depth imbalance before economic releases. Inferior results were achieved when using Treasuries. The final aim of this study is to fill the gap between microstructural changes and macroeconomic events.
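At its core, VPIN is an average order-flow imbalance computed over equal-volume buckets. A minimal sketch, assuming buy/sell volumes per bucket have already been classified (in practice this classification, e.g. bulk volume classification, is itself a substantial step that is omitted here):

```python
def vpin(buy_volumes, sell_volumes, window=50):
    """VPIN over equal-volume buckets: the average absolute order-flow
    imbalance |V_buy - V_sell| / (V_buy + V_sell) over the last
    `window` buckets. Inputs are per-bucket classified volumes."""
    imbalances = [abs(b - s) / (b + s)
                  for b, s in zip(buy_volumes, sell_volumes)]
    recent = imbalances[-window:]
    return sum(recent) / len(recent)

balanced = vpin([10.0] * 60, [10.0] * 60)   # no imbalance
one_sided = vpin([10.0] * 60, [0.0] * 60)   # fully buy-driven flow
```

A VPIN near zero indicates balanced flow, while values near one flag toxic, one-sided flow, which is the signal the study relates to upcoming economic surprises.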
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These extract the information in these predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows for different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
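The weight-updating mechanism behind dynamic model averaging can be sketched as a two-step recursion: a "forgetting" step that flattens the previous weights, followed by a Bayesian update using each model's predictive likelihood. The forgetting factor below follows the common 0.99 convention; the likelihood values are hypothetical:

```python
def dma_update(weights, likelihoods, alpha=0.99):
    """One dynamic-model-averaging step: 'forget' by raising each weight
    to the power alpha, renormalize, multiply by each model's predictive
    likelihood, and renormalize again."""
    forgotten = [w ** alpha for w in weights]
    total = sum(forgotten)
    predicted = [w / total for w in forgotten]
    posterior = [p * l for p, l in zip(predicted, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Two models start with equal weight; model 0 forecast the last
# observation twice as well, so its weight rises.
weights = dma_update([0.5, 0.5], [2.0, 1.0])
```

The forgetting exponent keeps old evidence from dominating, which is what lets the weights, and hence the effective set of predictor blocks, shift substantially over time as the abstract reports.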
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These extract the information in these predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows for different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterative deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply through "a priori" reasoning the iterated deletion of dominated strategies. Instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies for the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
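The unique equilibrium of a global game of this kind takes the form of a cutoff strategy: a player acts exactly when the noisy payoff signal exceeds a threshold. A minimal sketch, with a hypothetical threshold value standing in for the one pinned down by iterated deletion of dominated strategies in the actual game:

```python
def cutoff_strategy(signal, threshold=0.5):
    """Global-game equilibrium play: choose the risky action exactly when
    the noisy payoff signal exceeds the cutoff. The threshold value here
    is an illustrative assumption, not the one derived in the paper."""
    return "act" if signal > threshold else "not_act"

high = cutoff_strategy(0.9)   # signal above the cutoff
low = cutoff_strategy(0.1)    # signal below the cutoff
```

The experimental question is whether subjects reach this cutoff behavior by a-priori reasoning or, as the data suggest, by gradually adapting to the other players' responses.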
Abstract:
Loans are illiquid assets that can be sold in a secondary market even though buyers have no certainty about their quality. I study a model in which a lender has access to new investment opportunities when all her assets are illiquid. To raise funds, the lender may either borrow using her assets as collateral or sell them in a secondary market. Given asymmetric information about asset quality, the lender cannot recover the total value of her assets. There is then a role for the government to correct the information problem using fiscal tools.