21 results for Risk models

in the Digital Repository of Fundação Getúlio Vargas (FGV)


Relevance:

100.00%

Publisher:

Abstract:

This paper is concerned with evaluating value-at-risk (VaR) estimates. It is well known that using only binary variables for this purpose sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker & Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with the daily S&P500 time series.
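As background for the critique above, a purely binary backtest can be sketched in a few lines. This is an illustrative stdlib-Python sketch, not the authors' proposed test; the function names are chosen here for illustration, and the unconditional-coverage statistic follows the standard Kupiec (1995) form.

```python
import math

def hit_sequence(returns, var_forecasts):
    """Binary 'hit' (violation) indicators: 1 when the loss exceeds the VaR forecast."""
    return [1 if r < -v else 0 for r, v in zip(returns, var_forecasts)]

def kupiec_lr(hits, coverage=0.05):
    """Unconditional-coverage likelihood-ratio statistic (Kupiec, 1995).

    Under the null the hit frequency equals the nominal coverage; the
    statistic is asymptotically chi-squared with one degree of freedom.
    """
    n, x = len(hits), sum(hits)
    pi_hat = x / n
    if pi_hat in (0.0, 1.0):  # degenerate sample: alternative log-likelihood undefined
        return float("inf")
    ll_null = x * math.log(coverage) + (n - x) * math.log(1 - coverage)
    ll_alt = x * math.log(pi_hat) + (n - x) * math.log(1 - pi_hat)
    return -2.0 * (ll_null - ll_alt)
```

A test built only on `hit_sequence` discards the magnitude of each exceedance, which is precisely the information loss the proposed quantile-regression backtest is meant to avoid.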

Relevance:

60.00%

Publisher:

Abstract:

This thesis is composed of three essays on the subjects of macroeconometrics and finance. In each essay, which corresponds to one chapter, the objective is to investigate and analyze advanced econometric techniques applied to relevant macroeconomic questions, such as the capital mobility hypothesis and the sustainability of public debt. A finance topic regarding portfolio risk management is also investigated, through an econometric technique used to evaluate Value-at-Risk models. The first chapter investigates an intertemporal optimization model to analyze the current account. Based on Campbell & Shiller's (1987) approach, a Wald test is conducted to analyze a set of restrictions imposed on a VAR used to forecast the current account. The estimation is based on three different procedures: OLS, SUR and the two-way error decomposition of Fuller & Battese (1974), due to the presence of global shocks. A note on Granger causality is also provided, which is shown to be a necessary condition to perform the Wald test, with serious implications for the validation of the model. An empirical exercise for the G-7 countries is presented, and the results change substantially with the different estimation techniques. A small Monte Carlo simulation is also presented to investigate the size and power of the Wald test based on the considered estimators. The second chapter presents a study of fiscal sustainability based on a quantile autoregression (QAR) model. A novel methodology to separate periods of nonstationarity from stationary ones is proposed, which allows one to identify trajectories of public debt that are not compatible with fiscal sustainability. Moreover, such trajectories are used to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. An out-of-sample forecast of such a ceiling is also constructed, and can be used by policy makers interested in keeping the public debt on a sustainable path.
An empirical exercise using Brazilian data is conducted to show the applicability of the methodology. In the third chapter, an alternative backtest to evaluate the performance of Value-at-Risk (VaR) models is proposed. The econometric methodology allows one to directly test the overall performance of a VaR model, as well as identify periods of increased risk exposure, which seems to be a novelty in the literature. Quantile regressions provide an appropriate environment to investigate VaR models, since they can naturally be viewed as a conditional quantile function of a given return series. An empirical exercise is conducted for the daily S&P500 series, and a Monte Carlo simulation is also presented, revealing that the proposed test might exhibit more power in comparison to other backtests.
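Both the QAR model of the second chapter and the quantile-regression backtest of the third rest on the Koenker-Bassett check function. A minimal sketch of the idea follows; the helper names are hypothetical, and the brute-force minimization over sample points stands in for the linear-programming solution used in practice.

```python
def check_loss(u, tau):
    """Koenker-Bassett check function: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def empirical_quantile(sample, tau):
    """The tau-quantile minimizes the summed check loss; for a finite
    sample the minimizer can be found among the data points themselves."""
    return min(sample, key=lambda q: sum(check_loss(y - q, tau) for y in sample))
```

Quantile regression replaces the scalar candidate `q` with a linear function of covariates, which is what lets the backtest view a VaR model as a conditional quantile function.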

Relevance:

60.00%

Publisher:

Abstract:

Hazard models, also known as time-to-failure or duration models, are employed to determine which independent variables have the greatest explanatory power in predicting corporate bankruptcy. They are an alternative approach to the binary logit and probit models and to discriminant analysis. Duration models should be more efficient than discrete-choice models, because they take survival time into account when estimating the instantaneous probability of bankruptcy for a set of observations on an independent variable. Discrete-choice models typically ignore the time-to-failure information and provide only an estimate of failure within a given time interval. The question discussed in this work is how to use hazard models to project default rates and to build migration matrices conditioned on the state of the economy. Conceptually, the model is quite analogous to the historical default and mortality rates used in the credit literature. The Cox semiparametric proportional hazards model is tested on Brazilian non-financial firms, and it is observed that the probability of default decreases markedly after the third year of the loan's issuance. It is also observed that the mean and the standard deviation of the default probabilities are affected by economic cycles. It is discussed how the Cox proportional hazards model can be incorporated into the four best-known credit risk management models currently in use (CreditRisk+, KMV, CreditPortfolio View and CreditMetrics), and the improvements resulting from this incorporation.
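The partial likelihood at the heart of the Cox proportional hazards model can be sketched in a few lines. This is a toy single-covariate example with hypothetical data, assuming no tied failure times; actual estimation maximizes this function over beta.

```python
import math

# toy right-censored sample: (survival_time, event_indicator, covariate)
# event_indicator = 1 for an observed default, 0 for a censored observation
sample = [(2.0, 1, 0.5), (3.0, 1, -0.2), (5.0, 0, 0.1), (7.0, 1, 0.8)]

def cox_partial_loglik(beta, data):
    """Cox partial log-likelihood for a single covariate (no ties).

    Each observed default contributes its log relative hazard minus the log
    of the summed hazards over the risk set (subjects still at risk then).
    """
    ll = 0.0
    for t_i, d_i, x_i in data:
        if d_i:
            risk_set = sum(math.exp(beta * x_j) for t_j, _, x_j in data if t_j >= t_i)
            ll += beta * x_i - math.log(risk_set)
    return ll
```

At beta = 0 every subject has equal hazard, so each default contributes minus the log of its risk-set size; for the toy data above that gives -(log 4 + log 3 + log 1) = -log 12.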

Relevance:

60.00%

Publisher:

Abstract:

Companies are undergoing an intense transformation, which demands greater effectiveness in their processes. In this context full of uncertainty, decision making takes on great importance. An organization's decision-making process reflects how it interacts with its environment (Simon, 1965). Decision making becomes very important for the organization, involving several variables in the constitution of a decision model. This research aims to study the models adopted by a financial institution, which reflect the way it decides. The choice of this topic is justified by its importance to financial organizations and by the need to improve decision-making effectiveness. The company studied is Banco do Brasil, through an analysis of the credit process for corporate clients ("pessoas jurídicas"), describing how decisions are made. The hierarchy of the decision-making process is described, together with the credit risk analysis models, as preponderant factors in the success of the credit business. The Methodology chapter characterizes the company, the problem situation and the topic of study. The theoretical framework analyzes the decision-making process, examining the credit risk analysis models and the formation of decision support systems in financial institutions. The results of the proposed study answer the research problem, analyzing the company through documentary research and empirical observation. This research revealed an organization with an extremely well-planned decision model, yet one that does not yet employ modern decision-making practices with respect to credit. The conclusion of this work, in addition to offering several observations on the topic analyzed, also makes some suggestions to the company studied.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we test a version of the conditional CAPM with respect to a local market portfolio, proxied by the Brazilian stock index, during the period 1976-1992. We also test a conditional APT model by using the difference between the 3-day rate (CDB) and the overnight rate as a second factor, in addition to the market portfolio, in order to capture the large inflation risk present during this period. The conditional CAPM and APT models are estimated by the Generalized Method of Moments (GMM) and tested on a set of size portfolios created from individual securities traded on the Brazilian markets. The inclusion of this second factor proves to be important for the appropriate pricing of the portfolios.
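For reference, the unconditional building block that the GMM estimation above generalizes can be sketched as follows. This is an illustrative sketch only; the conditional estimation in the paper adds instruments and a full system of moment conditions.

```python
def capm_beta(asset_excess, market_excess):
    """Unconditional CAPM beta from the moment condition
    E[(r_i - beta * r_m) * r_m] = 0  =>  beta = E[r_i * r_m] / E[r_m^2],
    where r_i and r_m are excess returns on the asset and the market."""
    num = sum(ri * rm for ri, rm in zip(asset_excess, market_excess))
    den = sum(rm * rm for rm in market_excess)
    return num / den
```

In the conditional version, the same moment is required to hold against each instrument in the information set, giving the overidentified system that GMM estimates and tests.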

Relevance:

30.00%

Publisher:

Abstract:

Standard models of moral hazard predict a negative relationship between risk and incentives, but the empirical work has not confirmed this prediction. In this paper, we propose a model with adverse selection followed by moral hazard, where effort and the degree of risk aversion are private information of an agent who can control the mean and the variance of profits. For a given contract, more risk-averse agents supply more effort in risk reduction. If the marginal utility of incentives decreases with risk aversion, more risk-averse agents prefer lower-incentive contracts; thus, in the optimal contract, incentives are positively correlated with endogenous risk. In contrast, if risk aversion is high enough, the possibility of risk reduction makes the marginal utility of incentives increasing in risk aversion and, in this case, risk and incentives are negatively related.

Relevance:

30.00%

Publisher:

Abstract:

Whether human capital increases or decreases wage uncertainty is an open question from an empirical standpoint. Yet most policy prescriptions regarding human capital formation are based on models that impose riskiness on this type of investment. In a two-period, finite-type optimal income taxation problem, we derive prescriptions that are robust to the risk characteristics of human capital: savings should be discouraged, human capital investments encouraged, and both types of investment driven to an efficient level from an aggregate perspective. These prescriptions are also robust to the assumptions regarding which choices are observed, even though the policy instruments are not.

Relevance:

30.00%

Publisher:

Abstract:

This paper confronts the Capital Asset Pricing Model (CAPM) and the Fama-French 3-factor (FF) model using both Brazilian and US stock market data for the same sample period (1999-2007). The US data serve only as a benchmark for comparative purposes. We use two competing econometric methods: the Generalized Method of Moments (GMM) of Hansen (1982) and the Iterative Nonlinear Seemingly Unrelated Regression Estimation (ITNLSUR) of Burmeister and McElroy (1988). Both methods nest other options based on the procedure of Fama and MacBeth (1973). The estimations show that the FF model fits the Brazilian data better than the CAPM, although it is imprecise compared with its US analog. We argue that this is a consequence of the absence of clear-cut anomalies in Brazilian data, especially those related to firm size. The tests of the efficiency of the models (nullity of intercepts and fit of the cross-sectional regressions) presented mixed conclusions. The intercept tests failed to reject the CAPM when Brazilian value-premium-wise portfolios were used, contrasting with US data, a very well documented conclusion. The ITNLSUR estimated an economically reasonable and statistically significant market risk premium for Brazil of around 6.5% per year without resorting to any particular data set aggregation. However, we could not find the same for the US data during the identical period, even using a larger data set.
This study seeks to contribute to the Brazilian empirical literature on asset pricing models. Two of the main pricing models are confronted: the Capital Asset Pricing Model (CAPM) and the Fama-French 3-factor model. Econometric tools little explored in the national literature, the GMM and ITNLSUR methods, are applied to the estimation of pricing equations. The estimates are compared with those obtained from US data for the same period, and it is concluded that in Brazil the success of the Fama-French model is limited. As a by-product of the analysis, (i) the presence of the so-called anomalies in returns is tested, and (ii) the risk premium implicit in stock returns is calculated. The data reveal the presence of a value premium, but not of a size premium. Using the ITNLSUR method, the market risk premium is positive and significant, at around 6.5% per year.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is composed of three articles on the subjects of macroeconomics and finance. Each article corresponds to a chapter and is written in paper format. In the first article, co-authored with Axel Simonsen, we model and estimate a small open economy for Canada in a two-country Dynamic Stochastic General Equilibrium (DSGE) framework. We show that it is important to account for the correlation between domestic and foreign shocks and for incomplete pass-through. In the second chapter, co-authored with Hedibert Freitas Lopes, we estimate a regime-switching macro-finance model of the term structure of interest rates to study the post-World War II joint behavior of US macro-variables and the yield curve. We show that our model tracks the US NBER cycles well, that the addition of regime changes is important to explain the Expectations Theory of the term structure, and that macro-variables have increasing importance in recessions for explaining the variability of the yield curve. We also present a novel sequential Monte Carlo algorithm to learn about the parameters and the latent states of the economy. In the third chapter, I present a Gaussian Affine Term Structure Model (ATSM) with latent jumps in order to address two questions: (1) what are the implications of incorporating jumps in an ATSM for Asian option pricing, in the particular case of the Brazilian DI Index (IDI) option, and (2) how jumps and options affect the bond risk-premia dynamics. I show that the jump risk premium is negative in a scenario of decreasing interest rates (my sample period) and is important to explain the level of yields, and that Gaussian models without jumps and with constant-intensity jumps are good for pricing Asian options.
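A generic bootstrap (SIR) particle filter, the simplest member of the sequential Monte Carlo family mentioned above, can be sketched for a toy local-level model. All names and parameter values here are hypothetical, and unlike the thesis's algorithm this sketch filters the latent state only, without learning parameters.

```python
import math, random

def bootstrap_particle_filter(ys, n_particles=500, q=0.1, r=0.5, seed=7):
    """Minimal bootstrap (SIR) particle filter for a local-level model:
    x_t = x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
    Returns the filtered mean of the latent state at each time step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in ys:
        # propagate each particle through the state equation
        particles = [x + rng.gauss(0.0, math.sqrt(q)) for x in particles]
        # weight by the Gaussian measurement likelihood
        weights = [math.exp(-(y - x) ** 2 / (2 * r)) for x in particles]
        total = sum(weights)
        if total == 0.0:  # numerical underflow guard: fall back to uniform weights
            weights = [1.0 / n_particles] * n_particles
        else:
            weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # multinomial resampling
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

With a constant observation sequence the filtered mean should settle near the observed value, which gives a quick sanity check of the weighting and resampling steps.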

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates whether there is evidence of structural change in the Brazilian term structure of interest rates. Multivariate cointegration techniques are used to verify this evidence. Two econometric models are estimated. The first is a Vector Autoregressive Model with Error Correction Mechanism (VECM) with smooth transition in the deterministic coefficients (Ripatti and Saikkonen [25]). The second is a VECM with abrupt structural change, as formulated by Hansen [13]. Two datasets were analysed, both at monthly frequency. The first contains nominal interest rates with maturities up to three years, over a sample period from 1995 to 2010; the second focuses on maturities up to one year, from 1998 to 2010. The estimated models suggest the existence of structural change in the Brazilian term structure. It was possible to document the existence of multiple regimes using both techniques on both datasets. The risk premium for different spreads varied considerably during the earliest period of both samples and seemed to converge to stable and lower values at the end of the sample period. Long-term risk premiums seemed to converge to international standards, although the Brazilian term structure is still subject to liquidity problems for longer maturities.

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to contribute to the research agenda on the sources of price stickiness, showing that the adoption of nominal price rigidity may be an optimal reaction by firms to consumers' behavior, even if firms face no adjustment costs. Under standard, broadly accepted assumptions on the behavior of economic agents, we show that competition among firms can lead to the adoption of sticky prices as a (sub-game perfect) equilibrium strategy. We introduce the concept of a consumption-centers model economy, in which there are several complete markets. Moreover, we weaken some traditional assumptions used in standard monetary policy models by assuming that households have imperfect information about the inefficient time-varying cost shocks faced by the firms, e.g. those regarding inefficient equilibrium output levels under flexible prices. The timing of events is assumed to be such that, in every period, consumers have access to the actual prices prevailing in the market only after choosing a particular consumption center. Since such choices under uncertainty may decrease the expected utilities of risk-averse consumers, competitive firms adopt some degree of price stickiness in order to minimize price uncertainty and "attract more customers".

Relevance:

30.00%

Publisher:

Abstract:

Our main goal is to investigate which interest-rate option valuation models are better suited to support the management of interest-rate risk. We use the German market to test seven spot-rate and forward-rate models, with one and two factors, on interest-rate warrants for the period from 1990 to 1993. We identify a one-factor forward-rate model and two spot-rate models with two factors that are not significantly outperformed by any of the other four models. Further rankings are possible if additional criteria are applied.

Relevance:

30.00%

Publisher:

Abstract:

Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models, which incorporate the dynamic structure of volatility and are capable of forecasting the future behavior of risk, should perform better than constant or rolling-window volatility models. For the same asset, the model that is 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms the others over a class of re-sampled time series. The test is based on re-sampling the data using the stationary bootstrap. For each re-sample we find the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant-volatility, EWMA and GARCH models using a quadratic utility function and a risk management measure as comparison criteria. No model consistently outperforms the benchmark.
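The EWMA recursion referred to above is simple enough to state in full. This is a RiskMetrics-style sketch; the smoothing parameter `lam = 0.94` and the initialization from the first squared return are illustrative choices, and practice varies on both.

```python
import math

def ewma_volatility(returns, lam=0.94, init_var=None):
    """RiskMetrics-style EWMA variance recursion:
    sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2.
    Returns the one-step-ahead volatility (standard deviation) forecasts."""
    sigma2 = init_var if init_var is not None else returns[0] ** 2
    vols = []
    for r in returns:
        vols.append(math.sqrt(sigma2))  # forecast made before observing r
        sigma2 = lam * sigma2 + (1 - lam) * r * r
    return vols
```

Unlike a rolling window, which drops old observations abruptly, the exponential weighting lets the forecast adapt smoothly, which is the dynamic structure the comparison above is about.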

Relevance:

30.00%

Publisher:

Abstract:

This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four estimation approaches are compared in order to assess the significance of factors affecting inefficiency: nonparametric analysis of covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is on a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis, and likely to affect efficiency, are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans. The latter is a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric analysis of covariance. The significance of a factor, however, varies according to the model fitted, although there is some agreement among the best models. A highly significant association in all fitted models is observed only for non-performing loans. The nonparametric analysis of covariance is more consistent with the median inefficiency responses observed for the qualitative factors. The findings of the analysis reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
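Full DEA requires solving a linear program for each bank, but in the degenerate one-input/one-output case under constant returns to scale the efficiency score reduces to a ratio comparison, which conveys the idea. This is an illustrative sketch, not the paper's multi-variable model; the function name and data are hypothetical.

```python
def dea_crs_efficiency(inputs, outputs):
    """Technical efficiency under constant returns to scale for the
    one-input/one-output case, where the DEA linear program collapses to a
    ratio test: a unit's efficiency is its output-input ratio relative to
    the best ratio in the sample (frontier units score 1.0)."""
    best = max(y / x for x, y in zip(inputs, outputs))
    return [(y / x) / best for x, y in zip(inputs, outputs)]
```

With multiple inputs and outputs the frontier is a piecewise-linear envelope rather than a single ratio, which is why the general case needs one LP per decision-making unit.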

Relevance:

30.00%

Publisher:

Abstract:

In da Costa et al. (2006) we have shown how the same pricing kernel can account for the excess returns of the S&P500 over the US short-term bond and of the uncovered over the covered trading of foreign government bonds. In this paper we estimate and test the overidentifying restrictions of the Euler equations associated with six different versions of the Consumption Capital Asset Pricing Model. Our main finding is that the same (however often unreasonable) values for the parameters are estimated for all models in both markets. In most cases, the rejections, or otherwise, of the overidentifying restrictions occur in both markets, suggesting that success and failure stories for the equity premium repeat themselves in foreign exchange markets. Our results corroborate the findings in da Costa et al. (2006), which indicate a strong similarity between the behavior of excess returns in the two markets when modeled as risk premiums, providing empirical grounds to believe that the proposed preference-based solutions to puzzles in domestic financial markets can certainly shed light on the Forward Premium Puzzle.
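The Euler equation whose overidentifying restrictions are tested has a simple sample analogue. The sketch below assumes power (CRRA) utility; the variable names are hypothetical, and a full GMM exercise would stack one such moment per instrument and asset.

```python
def euler_moment(consumption, gross_returns, beta, gamma):
    """Sample analogue of the CCAPM Euler equation with power utility:
    E[beta * (c_{t+1} / c_t)^(-gamma) * R_{t+1} - 1] = 0.
    Returns the mean pricing error; GMM drives this toward zero and tests
    the overidentifying restrictions on the remaining moments."""
    errors = [
        beta * (c1 / c0) ** (-gamma) * r - 1.0
        for c0, c1, r in zip(consumption, consumption[1:], gross_returns)
    ]
    return sum(errors) / len(errors)
```

As a sanity check, with constant consumption growth g the moment is exactly zero when the gross return equals g**gamma / beta.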