852 results for Parametric VaR (Value-at-Risk)


Relevance:

100.00%

Publisher:

Abstract:

In recent years, measuring Operational Risk (OR) has become a major challenge for financial institutions worldwide, especially with the implementation of the regulatory capital allocation rules of the New Basel Capital Accord (NACB). In Brazil, at the end of 2004, the Central Bank of Brazil (BACEN) established a schedule of targets and assigned a team responsible for adapting and implementing these rules in the national financial system. The Brazilian Federation of Banks (FEBRABAN) also published a recent survey on OR management covering several banks. This whole process has spawned extensive and growing research and activity on OR modelling in Brazil. In this work, we measure the overall impact on Brazilian banks of the new OR capital allocation rules under the most basic NACB models. We also introduce an advanced risk measurement model, the Loss Distribution Approach (LDA), which some practitioners with a market risk background have come to call Operational Value-at-Risk (Operational VaR). We conclude with a practical case study based on an implementation of the LDA, or Operational VaR.
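
The LDA described above aggregates a loss-frequency distribution with a loss-severity distribution and reads the operational VaR off a high quantile of the simulated aggregate annual loss. A minimal Monte Carlo sketch, assuming Poisson frequency and lognormal severity with purely hypothetical parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters: annual loss count ~ Poisson(lam),
# individual loss severity ~ Lognormal(mu, sigma).
lam, mu, sigma = 25.0, 9.0, 1.4
n_years = 50_000

# Simulate the aggregate annual loss L = sum of N individual severities.
counts = rng.poisson(lam, size=n_years)
annual_loss = np.array(
    [rng.lognormal(mu, sigma, size=n).sum() for n in counts]
)

# Operational VaR: the 99.9% quantile of the aggregate loss distribution,
# the confidence level used for operational risk capital under Basel II.
print(f"Expected annual loss:    {annual_loss.mean():,.0f}")
print(f"Operational VaR (99.9%): {np.quantile(annual_loss, 0.999):,.0f}")
```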

Relevance:

100.00%

Publisher:

Abstract:

This thesis is composed of three essays on the subjects of macroeconometrics and finance. In each essay, which corresponds to one chapter, the objective is to investigate and analyze advanced econometric techniques, applied to relevant macroeconomic questions, such as the capital mobility hypothesis and the sustainability of public debt. A finance topic regarding portfolio risk management is also investigated, through an econometric technique used to evaluate Value-at-Risk models. The first chapter investigates an intertemporal optimization model to analyze the current account. Based on Campbell & Shiller's (1987) approach, a Wald test is conducted to analyze a set of restrictions imposed on a VAR used to forecast the current account. The estimation is based on three different procedures: OLS, SUR and the two-way error decomposition of Fuller & Battese (1974), due to the presence of global shocks. A note on Granger causality is also provided, which is shown to be a necessary condition to perform the Wald test, with serious implications for the validation of the model. An empirical exercise for the G-7 countries is presented, and the results change substantially with the different estimation techniques. A small Monte Carlo simulation is also presented to investigate the size and power of the Wald test based on the considered estimators. The second chapter presents a study of fiscal sustainability based on a quantile autoregression (QAR) model. A novel methodology to separate periods of nonstationarity from stationary ones is proposed, which allows one to identify trajectories of public debt that are not compatible with fiscal sustainability. Moreover, such trajectories are used to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. An out-of-sample forecast of such a ceiling is also constructed, and can be used by policy makers interested in keeping the public debt on a sustainable path. An empirical exercise using Brazilian data is conducted to show the applicability of the methodology. In the third chapter, an alternative backtest to evaluate the performance of Value-at-Risk (VaR) models is proposed. The econometric methodology allows one to directly test the overall performance of a VaR model, as well as identify periods of increased risk exposure, which seems to be a novelty in the literature. Quantile regressions provide an appropriate environment to investigate VaR models, since a VaR series can naturally be viewed as the conditional quantile function of a given return series. An empirical exercise is conducted for the daily S&P 500 series, and a Monte Carlo simulation is also presented, revealing that the proposed test might exhibit more power in comparison to other backtests.
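
The backtest in the third chapter builds on the fact that a correctly specified VaR forecast is the conditional quantile of the return series. A minimal sketch in that spirit, assuming simulated data: regress returns on the VaR forecasts with a quantile regression at the VaR level, where correct specification implies an intercept of 0 and a slope of 1.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated daily returns with time-varying volatility, and a
# hypothetical parametric 5% VaR forecast series.
n = 2000
sigma = 0.01 * np.exp(0.3 * np.sin(np.arange(n) / 50.0))
returns = sigma * rng.standard_normal(n)
var_forecast = -1.645 * sigma

# Quantile regression of returns on the VaR forecast at theta = 0.05.
theta = 0.05
X = sm.add_constant(var_forecast)
fit = sm.QuantReg(returns, X).fit(q=theta)

# Under a correct VaR model the conditional 5% quantile equals the
# forecast: intercept = 0 and slope = 1, tested jointly with a Wald test.
print(fit.params)
print(fit.f_test("const = 0, x1 = 1"))
```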

Relevance:

100.00%

Publisher:

Abstract:

In this work, we analyse the methodology used by the Central Bank of Brazil to calculate the capital required of Brazilian banks under the Basel II rules. The objective was to compare regulatory capital with economic capital as measured by Value-at-Risk (VaR) models. We present examples applying these concepts to portfolios typically traded by Brazilian banks, showing the relationship between regulatory and economic capital across several markets and strategies. Based on these analyses, we highlight the points of greatest divergence between the two types of capital. We conclude by emphasizing the importance of revising certain aspects of the Basel II rules so as to promote greater convergence between economic and regulatory capital.
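
For a single hypothetical position, the gap between economic and regulatory capital can be illustrated with a delta-normal VaR and the Basel internal-models multiplier; a minimal sketch with illustrative numbers only:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical single-asset position (all figures illustrative).
position_value = 100_000_000.0   # BRL
daily_vol = 0.012                # daily return volatility
horizon_days = 10
confidence = 0.99

# Economic capital proxy: delta-normal VaR scaled to the holding
# period with the square-root-of-time rule.
var_10d = (position_value * daily_vol
           * norm.ppf(confidence) * np.sqrt(horizon_days))

# Regulatory-style charge: internal-model rules apply a multiplier of
# at least 3 to the 10-day 99% VaR.
print(f"10-day 99% VaR (economic capital proxy): {var_10d:,.0f}")
print(f"Regulatory charge with multiplier 3:     {3.0 * var_10d:,.0f}")
```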

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work is to analyse the performance of volatility estimators that use extreme values (high, low, open, and close) for assets in the Brazilian market. We discuss the bias of the estimators, using as references the classical estimator, realized volatility, and the one-month implied volatility of the at-the-money (ATM) option series. We also discuss their efficiency as predictors of future volatility, using the classical estimator and the implied volatility shifted one period ahead as dependent variables, as well as efficiency proper, that is, the size of each estimator's variance. The spot BRL/USD exchange rate and the Bovespa Index were chosen to represent Brazilian assets. Besides being highly liquid, these two assets are very important in the Brazilian market, both as investment benchmarks and as underlying assets for many derivatives traded on the Brazilian Mercantile and Futures Exchange (BM&F) and the São Paulo Stock Exchange (Bovespa). The volatility of the underlying asset is one of the inputs for pricing derivatives such as options, and many strategies in the financial markets are built around it. Volatility is also widely used in risk management, for example in models such as Value-at-Risk (VaR).
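
The abstract does not name the extreme-value estimators; the classic OHLC-based choices are Parkinson (1980), which uses only the high-low range, and Garman-Klass (1980), which also uses the open and close. A sketch with simulated data standing in for the BRL/USD or Ibovespa series:

```python
import numpy as np

def parkinson(high, low):
    """Parkinson (1980) range-based volatility estimator (per period)."""
    return np.sqrt(np.mean(np.log(high / low) ** 2) / (4.0 * np.log(2.0)))

def garman_klass(open_, high, low, close):
    """Garman-Klass (1980) estimator using open, high, low and close."""
    hl = np.log(high / low) ** 2
    co = np.log(close / open_) ** 2
    return np.sqrt(np.mean(0.5 * hl - (2.0 * np.log(2.0) - 1.0) * co))

# Simulated OHLC series (252 trading days) standing in for real data.
rng = np.random.default_rng(1)
close = 100.0 * np.exp(np.cumsum(0.01 * rng.standard_normal(252)))
open_ = np.roll(close, 1)
open_[0] = 100.0
high = np.maximum(open_, close) * (1.0 + 0.004 * rng.random(252))
low = np.minimum(open_, close) * (1.0 - 0.004 * rng.random(252))

print("Parkinson:   ", parkinson(high, low))
print("Garman-Klass:", garman_klass(open_, high, low, close))
```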

Relevance:

100.00%

Publisher:

Abstract:

Accurate risk measurement by non-financial firms has become increasingly important and relevant to their day-to-day operations, something that until recently was largely confined to banks, investment funds, and financial institutions, which use VaR as one of the main components of their risk management systems. There is no consensus, however, on the best metric or definition for measuring risk in non-financial companies. The objective of this work is to analyse market risk in corporations, reviewing the main concepts presented in the literature on the subject and proposing a taxonomy better suited to corporations, bringing the universe of financial institutions closer to that of non-financial firms. A practical example, an analysis of Aracruz Celulose, seeks to demonstrate the degree of complexity of the calculations, which combine asset pricing, risk, and corporate finance.

Relevance:

100.00%

Publisher:

Abstract:

Labour-related losses at financial institutions represent a considerable amount that must be taken into account in the regulatory capital model for operational risk under Basel. This dissertation demonstrates a way of measuring the risk to which financial institutions are exposed through this type of loss. Several types of distributions are analysed for their goodness of fit to both the frequency and the severity of losses. For the frequency values, a sample of real data was obtained, while for severity we used values taken from research-institute reports, which served as inputs for calculating labour claims under the Brazilian labour legislation currently in force, the CLT (Consolidação das Leis do Trabalho).
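
The fitting step, choosing distributions by their adherence to the observed frequency and severity data, can be sketched as maximum-likelihood fits ranked by a goodness-of-fit statistic. A minimal example with hypothetical data; the Kolmogorov-Smirnov statistic below stands in for whatever adherence criterion the dissertation uses:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical severity sample: labour-claim payout values in BRL.
severities = rng.lognormal(mean=10.0, sigma=1.2, size=500)

# Fit candidate severity distributions by maximum likelihood and rank
# them by the Kolmogorov-Smirnov statistic (lower = better adherence).
candidates = {"lognorm": stats.lognorm,
              "gamma": stats.gamma,
              "weibull_min": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(severities, floc=0)        # fix location at zero
    ks = stats.kstest(severities, name, args=params)
    print(f"{name:12s} KS = {ks.statistic:.4f}  p = {ks.pvalue:.3f}")

# Frequency side: the Poisson rate is estimated by the sample mean of
# hypothetical annual claim counts.
annual_counts = rng.poisson(40, size=10)
print("Poisson MLE rate:", annual_counts.mean())
```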

Relevance:

100.00%

Publisher:

Abstract:

In the present thesis I study the contribution of inventories management to firm value from a risk management perspective. I find a significant contribution of inventories to the value of risk management, especially through the operating-flexibility channel. In contrast, I do not find evidence supporting the view of inventories as a reserve of liquidity. Inventories substitute, albeit not perfectly, for derivatives or cash holdings. The substitution between hedging with derivatives and inventory is moderated by the correlation between cash flow and the underlying asset of the derivative contract. Hedge ratios increase with the effectiveness of the derivatives. The decision to hedge with cash holdings or inventories is strongly influenced by the degree of complementarity between production factors and by cash flow volatility. In addition, I provide a risk-management-based explanation of the secular substitution between inventories and cash holdings documented, among others, in Bates et al. (2009, Journal of Finance). In a sample of U.S. firms between 1980 and 2006, I empirically confirm the negative relation between inventories and cash and provide evidence of the poor performance of investment-cash-flow sensitivities as a measure of financial constraints, also in the case of inventories investment. This result can be explained by firms' scarce reliance on inventories as a reserve of liquidity. Finally, as an extension of my study, I contrast the theoretical predictions of a model of the integrated management of inventories, trade credit, and cash holdings with empirical data.

Relevance:

100.00%

Publisher:

Abstract:

Traditional methods do not measure people's risk attitudes naturally and precisely. Therefore, a fuzzy risk-attitude classification method is developed. Since prospect theory is usually considered an effective model of decision making, the personalized parameters in prospect theory are first fuzzified to distinguish people with different risk attitudes, and a fuzzy classification database schema is then applied to calculate the exact value of the risk value attitude and the risk behavior attitude. Finally, by applying a two-level hierarchical classification model, the precise value of the synthetic risk attitude can be acquired.
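
For reference, the crisp (non-fuzzy) prospect-theory value function whose parameters the paper fuzzifies; the defaults below are the classic Tversky-Kahneman (1992) estimates, and the two parameterizations at the end are hypothetical illustrations of opposite risk attitudes:

```python
# Tversky-Kahneman prospect-theory value function. alpha and beta bend
# the curve for gains and losses; lam is the loss-aversion coefficient.
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of outcome x relative to a reference point of 0."""
    if x >= 0:
        return x ** alpha            # concave over gains
    return -lam * (-x) ** beta       # convex and steeper over losses

# Hypothetical parameterizations: a risk-averse and a risk-seeking
# decision maker, compared on the same outcomes.
profiles = {"averse": (0.7, 0.7, 2.5), "seeking": (1.0, 1.0, 1.2)}
for label, (a, b, l) in profiles.items():
    values = [round(pt_value(x, a, b, l), 2) for x in (-100, -10, 10, 100)]
    print(label, values)
```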

Relevance:

100.00%

Publisher:

Abstract:

The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios, derived from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new-snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analyses of short-term avalanche-induced fatality risk revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence of the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the three-day precipitation sum is a simplified model, so further research is needed to improve the determination of the diurnal avalanche probability. Nevertheless, the presented approach may serve as a conceptual step towards risk-based decision making in risk management.
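
A minimal sketch of the per-scenario risk calculation described above; the exposure model and every number are illustrative assumptions, not the study's calibration:

```python
# Expected fatalities per hour on one road segment for one hazard
# scenario: occurrence probability x persons at risk x lethality.
def fatality_risk(p_release, traffic_density, path_width_km,
                  occupants=1.7, p_death_if_hit=0.3):
    """p_release: avalanche occurrence probability per hour for the
    current new-snow class (from the statistical release model).
    traffic_density: vehicles per km of road (from traffic counts).
    path_width_km: length of road inside the avalanche path."""
    vehicles_exposed = traffic_density * path_width_km
    persons_at_risk = vehicles_exposed * occupants
    return p_release * persons_at_risk * p_death_if_hit

# A risk peak: high hazard level combined with high traffic density.
print(fatality_risk(p_release=0.02, traffic_density=8.0, path_width_km=0.1))
```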

Relevance:

100.00%

Publisher:

Abstract:

Elicitability has recently been discussed as a desirable property for risk measures. Kou and Peng (2014) showed that an elicitable distortion risk measure is either a Value-at-Risk or the mean. We give a concise alternative proof of this result, and discuss the conflict between comonotonic additivity and elicitability.
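
For reference, VaR taken as the α-quantile is elicitable because it minimizes the expected pinball (quantile) loss:

```latex
% VaR as an elicitable functional: it is a minimizer of the expected
% pinball score S_alpha.
\mathrm{VaR}_\alpha(X) = \inf\{x \in \mathbb{R} : F_X(x) \ge \alpha\}
  \in \arg\min_{x \in \mathbb{R}} \mathbb{E}\left[S_\alpha(x, X)\right],
\qquad
S_\alpha(x, y) = \left(\mathbf{1}\{x \ge y\} - \alpha\right)(x - y).
```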

Relevance:

100.00%

Publisher:

Abstract:

Identifying, quantifying, and minimizing the technical risks associated with investment decisions is a key challenge for mineral-industry decision makers and investors. However, risk analysis in most bankable mine feasibility studies is based on stochastic modelling of the project's Net Present Value (NPV), which in most cases fails to provide decision makers with a truly comprehensive analysis of the risks associated with technical and management uncertainty and, as a result, is of little use for risk management and project optimization. This paper presents a value-chain risk management approach in which project risk is evaluated at each step of the project lifecycle, from exploration to mine closure, and risk management is performed as part of a stepwise value-added optimization process.
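
The NPV-only baseline the paper criticizes is typically a Monte Carlo simulation over price, grade, and cost uncertainty. A minimal sketch with hypothetical figures; under the value-chain approach, an assessment of this kind would be repeated at each lifecycle step rather than once for the whole project:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 50_000

# Illustrative stochastic NPV model for a mine project; every figure
# below is a hypothetical assumption.
price = rng.normal(8_000, 1_200, n_sim)    # metal price, $/t of metal
grade = rng.normal(0.015, 0.003, n_sim)    # recoverable metal, t per t of ore
opex = rng.normal(60, 10, n_sim)           # operating cost, $/t of ore
ore_per_year, years, rate, capex = 2e6, 10, 0.10, 4e8

# Constant annual cash flow discounted as an annuity.
annual_cf = ore_per_year * (grade * price - opex)
annuity = (1 - (1 + rate) ** -years) / rate
npv = annual_cf * annuity - capex

print(f"Mean NPV:        {npv.mean():,.0f}")
print(f"P(NPV < 0):      {(npv < 0).mean():.1%}")
print(f"5% NPV quantile: {np.quantile(npv, 0.05):,.0f}")
```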

Relevance:

100.00%

Publisher:

Abstract:

This work presents a new methodology for optimizing portfolios of financial assets. The proposed methodology, based on universal interpolators such as artificial neural networks and kriging, approximates the risk surface, and hence the solution of the optimization problem associated with it, in a general way that is applicable to any risk measure available in the literature. Moreover, the suggested methodology relaxes restrictive assumptions inherent in existing methodologies, simplifying the optimization problem and allowing the approximation errors of the risk surface to be estimated. As an illustration, the proposed methodology is applied to the portfolio composition problem using Variance (as a control), Value-at-Risk (VaR), and Conditional Value-at-Risk (CVaR) as objective functions. The results are compared with those obtained by the Markowitz and Rockafellar models, respectively.
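
The surrogate approach requires evaluating the chosen risk measure at many sampled weight vectors; those (weights → risk) pairs are the training data for the neural network or kriging interpolator. A minimal sketch of one such evaluation, computing empirical VaR and CVaR on hypothetical return scenarios:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical return scenarios for three assets (rows = scenarios).
returns = rng.multivariate_normal(
    mean=[0.0005, 0.0003, 0.0004],
    cov=[[4.0e-4, 1.0e-4, 5.0e-5],
         [1.0e-4, 2.5e-4, 8.0e-5],
         [5.0e-5, 8.0e-5, 3.0e-4]],
    size=10_000,
)

def var_cvar(weights, returns, alpha=0.95):
    """Empirical VaR and CVaR of the portfolio loss at level alpha."""
    losses = -returns @ weights
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()   # mean loss beyond the VaR
    return var, cvar

# One point on the risk surface; the interpolator would be fitted to
# many such evaluations at different weight vectors.
print(var_cvar(np.array([0.4, 0.3, 0.3]), returns))
```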

Relevance:

100.00%

Publisher:

Abstract:

Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling.

A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed.

The stochastic aspects of the expected return of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time series of the securities are fitted to a probability distribution that matches the time-series characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P 500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets.

The results indicate an increase in risk-return performance. Choosing Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark for comparison, the new greedy technique clearly outperforms the others on samples of the S&P 500 and Russell 1000 securities. The resulting improvements in performance are consistent across five securities-selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
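
The thesis's exact greedy rule is not given in the abstract; below is a generic greedy-allocation sketch that distributes portfolio weight in small increments, each time to whichever asset most improves a hypothetical mean-return-minus-VaR criterion:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical return scenarios for five assets.
returns = rng.standard_normal((5_000, 5)) * 0.01 + 0.0004

def mean_minus_var(w, returns, alpha=0.95):
    """Criterion to maximize: mean return minus empirical VaR."""
    port = returns @ w
    return port.mean() - np.quantile(-port, alpha)

# Greedy allocation: at each step, move one weight increment to the
# asset whose increment most improves the criterion.
n_assets, step = returns.shape[1], 0.02
w = np.zeros(n_assets)
while w.sum() < 1.0 - 1e-9:
    best = max(range(n_assets),
               key=lambda i: mean_minus_var(w + step * np.eye(n_assets)[i],
                                            returns))
    w[best] += step

print("Greedy weights:", np.round(w, 2))
```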

Relevance:

100.00%

Publisher:

Abstract:

This article proposes a three-step procedure to estimate portfolio return distributions under the multivariate Gram-Charlier (MGC) distribution. The method combines quasi-maximum likelihood (QML) estimation for the conditional means and variances with method-of-moments (MM) estimation for the remaining density parameters, including the correlation coefficients. The procedure yields consistent estimates even under density misspecification and solves the so-called 'curse of dimensionality' of multivariate modelling. Furthermore, the MGC distribution provides a flexible and general approximation to the true distribution of portfolio returns that accounts for all of its empirical regularities. As an illustration, the procedure is applied to a portfolio composed of three European indices. The MM estimation of the MGC (MGC-MM) is compared with traditional maximum likelihood estimation of both the MGC and the multivariate Student's t (benchmark) densities. A simulation of Value-at-Risk (VaR) performance for an equally weighted portfolio at the 1% and 5% levels indicates that the MGC-MM method provides reasonable approximations to the true empirical VaR. The procedure therefore seems to be a useful tool for risk managers and practitioners.
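
The VaR performance comparison at 1% and 5% amounts to checking unconditional coverage: the fraction of portfolio returns falling below the model's VaR forecast should match the nominal level. A minimal sketch with a Gaussian stand-in for the fitted density; in the article the quantiles would come from the fitted Gram-Charlier expansion instead:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)

# Hypothetical equally weighted portfolio of three index return series.
cov = np.full((3, 3), 5.0e-5) + np.eye(3) * 1.0e-4
index_returns = rng.multivariate_normal([0.0, 0.0, 0.0], cov, size=2_500)
portfolio = index_returns.mean(axis=1)

# In-sample VaR forecasts from a fitted Gaussian density (stand-in for
# the MGC), checked against their empirical coverage.
mu, sd = portfolio.mean(), portfolio.std(ddof=1)
for alpha in (0.01, 0.05):
    var_alpha = mu + sd * norm.ppf(alpha)
    coverage = (portfolio < var_alpha).mean()
    print(f"{alpha:.0%} VaR: nominal {alpha:.2%}, empirical {coverage:.2%}")
```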