862 results for Value at Risk (VaR)
Abstract:
We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multivariate normal densities and relies on a rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also its ability to evaluate forecast accuracy in specific regions of the domain, as defined by the user's interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of Value at Risk. We examine the performance of both tests in simulations and in an empirical study.
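The probability integral transformation on which both methods build can be sketched in the unidimensional case. The following is a minimal illustration with a hypothetical Gaussian forecast, not the authors' multivariate tests: if the density forecast is correct, the PIT values z_t = F_t(y_t) are i.i.d. Uniform(0,1), which can be checked with a Kolmogorov-Smirnov distance.

```python
import math
import numpy as np

def pit_values(y, mu, sigma):
    """Probability integral transform under a Gaussian forecast density."""
    u = (np.asarray(y) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + np.vectorize(math.erf)(u))

def ks_uniform_stat(z):
    """Kolmogorov-Smirnov distance between the PITs and Uniform(0,1)."""
    z = np.sort(np.asarray(z))
    n = len(z)
    hi = np.arange(1, n + 1) / n   # empirical CDF just after each point
    lo = np.arange(0, n) / n       # ... and just before
    return max(np.max(hi - z), np.max(z - lo))

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0
y = rng.normal(mu, sigma, size=2000)   # realizations drawn from the forecast itself
stat = ks_uniform_stat(pit_values(y, mu, sigma))   # small => well calibrated
```

A well-calibrated forecast yields a small KS distance; a misspecified one (wrong variance, say) inflates it.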
Abstract:
Adolescents who drop out of secondary school struggle to integrate into a knowledge-based economy and experience many adjustment problems in adolescence and adulthood. A crucial step in preventing school dropout is to effectively screen the students most at risk. Two data-driven forms of screening can be used in schools: one using information self-reported by students through questionnaires, and another based on administrative information recorded in student files. To our knowledge, however, the effectiveness of these approaches has never been compared directly. Moreover, the relative effectiveness of these screening tools may differ by student sex. This study compares different screening tools for predicting school dropout, taking into account the moderating effect of sex. The tools used are a) a validated self-report questionnaire (Archambault & Janosz, 2009) and b) a tool built from administrative data, created by a Quebec school board. The tools are compared in terms of psychometric quality and practicality for the school setting. The sample comprises 1,557 students (50% boys) aged 14 to 18. The results indicate that the administrative index has adequate discriminant capacity, but inferior to that of the self-report index, which is judged excellent. The moderating effect of sex was not confirmed. The respective advantages and disadvantages of these two screening approaches are discussed.
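The "discriminant capacity" used to compare the two screening indices is typically summarized by the area under the ROC curve. As a generic illustration (not the study's analysis code), AUC can be computed from the Mann-Whitney rank-sum identity:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a random at-risk case scores above a random
    not-at-risk case (ties count one half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

An AUC of 1.0 means perfect separation and 0.5 is chance level; labels such as "adequate" versus "excellent" usually refer to conventional AUC bands.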
Abstract:
This project aims to identify the concepts of health, illness, epidemiology, and risk applicable to companies in the oil and natural gas extraction sector in Colombia. Given the low predictive power of traditional financial analyses and their insufficiency for long-term investment and decision making, in addition to their failure to consider variables such as risk and future expectations, there is a need to draw on different perspectives and integrative models. This is particularly relevant in the oil and natural gas extraction sector, given the growing foreign investment it has reported: US$2,862 million in 2010, more than ten times its 2003 value. Multidimensional models could therefore be developed, based on the concepts of financial health together with epidemiological and statistical concepts. The term "health" and its adoption in the business sector proves useful and conceptually coherent, revealing the presence of different interacting and interconnected subsystems or factors. It should also be noted that a multidimensional (multi-stage) model must take risk into account, and epidemiological analysis has proved useful in determining risk and integrating it into the system along with related concepts such as the hazard ratio and relative risk. This will be analyzed through a theoretical-conceptual study, complementing a previous study, as a contribution to the corporate finance project of the Management research line.
Abstract:
Automotive maintenance activities in the auto-parts sector involve the use of chemical agents under diverse exposure circumstances, arising both from how chemical products are handled and from the characteristics of each maintenance activity associated with specific work tasks. Traditionally, the evaluation of chemical contaminants from the perspective of occupational hygiene includes quantitative exposure assessment using specific, standardized instrumental techniques, determining the airborne concentration to which a worker is exposed; comparison with permissible limit values (PLVs) then drives the establishment of control and surveillance measures according to the characterized risk level. However, this approach is clearly limited, particularly in micro and small enterprises that lack the resources to address the problem objectively. In this context, various qualitative or subjective evaluation methodologies have been developed by organizations around the world to narrow the gap between establishing control measures and assessing risk, offering reliable alternatives for preventive decision making without resorting to quantitative measurements. This research seeks to validate the effectiveness of a simplified chemical-risk assessment tool proposed by the INRS (the French Institut National de Recherche et de Sécurité) by determining the potential chemical-exposure profile of the workforce of 36 auto-parts shops located in the La Paz neighborhood of Bogotá, Colombia, classified by main activity into External Parts, Electrical Parts and Injection, Mechanical Parts, and Multiple Parts, through a cross-sectional study.
The study made it possible to rank the potential risk, to assess inhalation and dermal risk, and finally to build the workers' potential chemical-exposure profile. The analysis variables were consolidated in a software tool designed for this purpose, which facilitated data management and analysis. Based on the findings, it was possible to identify the chemical products whose working conditions and exposure circumstances call for specific measures to reduce potential risk, according to the agents' overall rating, supporting the viability of qualitative risk-assessment tools for evaluating chemical risk as a primary prevention strategy.
Abstract:
This paper introduces a method for simulating multivariate samples that have exact means, covariances, skewness and kurtosis. We introduce a new class of rectangular orthogonal matrices that is fundamental to the methodology; we call these L matrices. They may be deterministic, parametric or data-specific in nature. The target moments determine the L matrix; infinitely many random samples with the same exact moments may then be generated by multiplying the L matrix by arbitrary random orthogonal matrices. This methodology is thus termed "ROM simulation". Considering certain elementary types of random orthogonal matrices, we demonstrate that they generate samples with different characteristics. ROM simulation has applications to many problems that are otherwise resolved using standard Monte Carlo methods, but no parametric assumptions are required (unless parametric L matrices are used), so there is no sampling error caused by the discrete approximation of a continuous distribution, which is a major source of error in standard Monte Carlo simulations. For illustration, we apply ROM simulation to determine the value-at-risk of a stock portfolio.
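The full ROM construction (matching skewness and kurtosis through rectangular orthogonal L matrices) is beyond a short sketch, but the core idea of hitting target moments exactly can be illustrated for the first two moments: center the sample, whiten it so its sample covariance is exactly the identity, then recolour to the target. This is an illustrative analogue, not the authors' algorithm.

```python
import numpy as np

def exact_mean_cov_sample(mu, cov, m, rng):
    """Draw an m-by-n sample whose sample mean and sample covariance
    equal mu and cov exactly (up to floating point). Requires m > n."""
    n = len(mu)
    X = rng.standard_normal((m, n))
    X = X - X.mean(axis=0)            # exact zero sample mean
    S = np.cov(X, rowvar=False)       # current sample covariance
    L = np.linalg.cholesky(S)
    X = np.linalg.solve(L, X.T).T     # whiten: sample covariance becomes I
    C = np.linalg.cholesky(cov)
    return X @ C.T + np.asarray(mu)   # recolour and shift to the targets

mu = np.array([1.0, -2.0, 0.5])
cov = np.array([[2.0, 0.3, 0.0],
                [0.3, 1.0, 0.2],
                [0.0, 0.2, 0.5]])
X = exact_mean_cov_sample(mu, cov, 200, np.random.default_rng(1))
```

Every resample produced this way has identical first and second sample moments, mirroring the "infinitely many samples, same exact moments" property of ROM simulation.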
Abstract:
Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts.
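The two factors of a quantile forecast described above can be sketched for a one-day horizon (illustrative names only; the volatility forecast itself, e.g. from a HAR model, is taken as given): either impose a distribution on future returns, or scale the empirical quantile of past standardized returns r_t / σ_t.

```python
import statistics
import numpy as np

def var_gaussian(sigma_fcst, alpha=0.01):
    """alpha-quantile of next-day returns under a Normal(0, sigma^2) assumption."""
    return sigma_fcst * statistics.NormalDist().inv_cdf(alpha)

def var_empirical(sigma_fcst, std_returns, alpha=0.01):
    """alpha-quantile scaled from the empirical distribution of past
    standardized returns r_t / sigma_t (no distributional assumption)."""
    return sigma_fcst * np.quantile(np.asarray(std_returns), alpha)

sigma_hat = 0.012   # e.g. a HAR one-day volatility forecast (hypothetical value)
hist = np.random.default_rng(0).standard_normal(5000)   # stand-in for r_t / sigma_t
q_norm = var_gaussian(sigma_hat, 0.01)
q_emp = var_empirical(sigma_hat, hist, 0.01)
```

The empirical variant adapts to fat tails or skew in the standardized returns, which is why it can improve forecasts when the Gaussian assumption fails.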
Abstract:
The purpose of this article is to explore customer retention strategies and tactics implemented by firms in recession. Our investigations show just how big a challenge many organizations face in their ability to manage customer retention effectively. While leading organizations have embedded real-time customer life cycle management, developed accurate early warning systems, price elasticity models and ‘deal calculators’, the organizations we spoke to have only gone as far as calculating the value at risk and building simple predictive models.
Abstract:
This dissertation studies the implied volatility surface of European options on the Real/US Dollar exchange rate in the Brazilian market. The work does not attempt to explain the deformations or deviations of implied volatility relative to the constant-volatility assumption of the Black & Scholes (1973) model; instead, it treats implied volatility as a financial variable of interest in its own right, analyzing the dynamics of its surface. For this analysis, the study applies a tool used in empirical studies across many fields of science: Principal Component Analysis (PCA). Changes in the volatility surface alter the pricing of the options in a portfolio. They therefore constitute a risk factor that must be studied and understood in order to develop hedging strategies and risk-management techniques, among them the calculation of Value at Risk (VaR). Using the results of the principal component analysis of the implied volatility surface, the study aims to obtain limit values for the variation of this implied volatility, as a way of estimating the resulting extreme variations in the value of an options portfolio. It builds on studies of the application of PCA to the implied volatility surface developed by Alexander (2001), which in turn derive from the study of volatility-curve dynamics proposed by Derman (1999). To verify the efficiency of the proposed methodology, the extreme values obtained are tested according to the backtesting criteria proposed by the 1996 amendment to the Basel Accord.
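The PCA step at the core of this approach can be sketched as follows, using synthetic data rather than Real/Dollar surfaces: stack each day's surface into a row and eigendecompose the covariance of the panel; the leading eigenvectors are the dominant deformation modes (level, slope, curvature in Derman-style analyses).

```python
import numpy as np

def surface_pca(panel):
    """PCA of a T-by-K panel of flattened implied-vol surfaces.
    Returns eigenvalues (descending) and the matching eigenvectors."""
    X = panel - panel.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]

# synthetic panel driven mostly by a single common "level" factor
rng = np.random.default_rng(0)
level = rng.normal(0.0, 0.05, size=(500, 1))     # common shocks to all points
noise = rng.normal(0.0, 0.005, size=(500, 12))   # idiosyncratic noise
panel = 0.15 + level + noise                     # 12 surface points, 500 days
vals, vecs = surface_pca(panel)
share = vals[0] / vals.sum()                     # variance explained by PC1
```

Extreme moves in the first few component scores can then be translated into limit scenarios for the whole surface, which is the idea behind the VaR bounds pursued in the dissertation.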
Abstract:
The objective of this work is to show that optimizing a portfolio of Brazilian multimarket (hedge) funds yields better results when the risk measure used is Conditional Value-at-Risk. Portfolio optimization models aim to select assets that maximize the investor's return for a given level of risk, so the definition of an appropriate risk measure is of fundamental importance to the allocation process. The traditional portfolio optimization methodology, developed by Markowitz, uses the variance of returns as its risk measure. However, variance is only appropriate when returns are normally distributed or when investors have quadratic utility functions. This work shows that the returns of Brazilian multimarket funds tend not to be normally distributed; hence, optimizing a portfolio composed of such funds requires an alternative risk measure.
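As a point of reference, the historical-simulation estimator of Conditional Value-at-Risk, the alternative risk measure the work argues for, can be written in a few lines (an illustrative estimator, not the optimization model itself):

```python
import numpy as np

def cvar(returns, alpha=0.05):
    """Historical Conditional Value-at-Risk: the average loss in the worst
    alpha tail of outcomes, reported as a positive number."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, 1.0 - alpha)   # historical VaR threshold
    return losses[losses >= var].mean()      # mean loss beyond the threshold

returns = -np.arange(1, 101) / 100.0         # stylized ladder of losses
es = cvar(returns, alpha=0.05)
```

Unlike VaR, CVaR looks beyond the quantile into the tail and is a coherent (in particular subadditive) risk measure, which is what makes it attractive for non-normally distributed fund returns.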
Abstract:
We study the direct and indirect ownership structure of Brazilian corporations and their market value and risk at the end of 1996 and 1998. Ownership is quite concentrated, with most companies being controlled by a single direct shareholder. We find evidence that indirect control structures may be used to concentrate control even further rather than to keep control of the company with a smaller share of total capital. The greater the concentration of voting rights, the lower the value of the firm should be, due to the potential expropriation of minority shareholders. We find evidence that, when there is a majority shareholder and indirect ownership structures are used without loss of control, corporate valuations are greater when control is diluted through the indirect ownership structure. This evidence is consistent with the existence of private benefits of control that can be translated as potential minority shareholder expropriation.
Abstract:
The goal of this paper is twofold. First, using five of the most actively traded stocks in the Brazilian financial market, this paper shows that the normality assumption commonly used in risk management to describe the distribution of returns standardized by volatility is not compatible with volatilities estimated by EWMA or GARCH models. In sharp contrast, when the information contained in high-frequency data is used to construct realized volatility measures, we attain normality of the standardized returns, giving promise of improvements in Value at Risk statistics. We also describe the distributions of volatilities of the Brazilian stocks, showing that they are nearly lognormal. Second, we estimate a simple linear model for the log of realized volatility that differs from those in other studies; the main difference is that we do not find evidence of long memory. The estimated model is compared with commonly used alternatives in an out-of-sample experiment.
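The normality claim for returns standardized by realized volatility can be illustrated with a stylized stochastic-volatility simulation (synthetic data, not the Brazilian stock sample): raw returns are fat-tailed, but dividing by the true, well-measured volatility restores approximate normality.

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis (0 for a normal distribution)."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 4) - 3.0)

rng = np.random.default_rng(0)
n = 20000
sigma = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # time-varying volatility
r = sigma * rng.standard_normal(n)                   # fat-tailed raw returns
k_raw = excess_kurtosis(r)           # well above 0
k_std = excess_kurtosis(r / sigma)   # near 0: normality restored
```

The same logic explains the paper's contrast: EWMA/GARCH filters measure σ_t imperfectly, leaving excess kurtosis in the standardized returns, while realized volatility built from high-frequency data comes closer to the true σ_t.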
Abstract:
Asset allocation decisions and value at risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA-type models that incorporate the dynamic structure of volatility and are capable of forecasting future behavior of risk should perform better than constant, rolling-window volatility models. For the same asset, the model that is "best" according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms the others over a class of re-sampled time series. The test is based on re-sampling the data using the stationary bootstrap. For each re-sample we identify the "best" model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk-management measure as comparison criteria. No model consistently outperforms the benchmark.
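The resampling scheme named above can be sketched as follows (a generic Politis-Romano stationary bootstrap with mean block length 1/p; not the authors' exact implementation):

```python
import numpy as np

def stationary_bootstrap(x, p, rng):
    """One stationary-bootstrap resample (Politis-Romano): blocks start at
    random positions, block lengths are geometric with mean 1/p, and
    indices wrap around the end of the series."""
    x = np.asarray(x)
    n = len(x)
    idx = np.empty(n, dtype=int)
    t = int(rng.integers(n))
    for i in range(n):
        idx[i] = t
        if rng.random() < p:          # start a new block
            t = int(rng.integers(n))
        else:                         # continue the current block
            t = (t + 1) % n
    return x[idx]

series = np.arange(250.0)
resample = stationary_bootstrap(series, p=0.1, rng=np.random.default_rng(0))
```

Repeating this over many resamples and recording which model is "best" in each yields the distribution of performance statistics used by the reality check.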
Abstract:
Despite the large size of the Brazilian debt market, as well as the large diversity of its bonds, the picture that emerges is of a market that has not yet completed its transition from the role it performed during the mega-inflation years, namely that of providing a liquid asset with positive real returns. This unfinished transition is currently placing the market under severe stress, as fears grow of a possible default by the next administration. This paper analyzes several aspects of the management of the domestic public debt. Section 2 discusses the causes of the extremely large and fast growth of the domestic public debt during the seven-year period of the Cardoso administration. Section 3 computes Value at Risk and Cash Flow at Risk measures for the domestic public debt. Section 4 introduces rollover risk in a mean-variance framework. Section 5 discusses a few issues in the overlap between debt management and monetary policy. Finally, Section 6 wraps up with policy discussion and policy recommendations.
Abstract:
Objectives: To evaluate the association of the consumption of different dietary fats with diet quality, insulin resistance, and hyperhomocysteinemia in adults. Methods: Cross-sectional study of 624 overweight subjects (73.7% female). Food intake (24-h food recall and Healthy Eating Index, HEI), anthropometry, and biochemical assays of fasting glucose, insulin (HOMA-IR and HOMA-β), and homocysteine were assessed. Results: Low diet quality was associated with vegetable oil intake in the 3rd quintile (≥1.5-2.0 servings; 2.9 times higher risk) and with cholesterol intake in the 2nd, 3rd, and 4th quintiles (2.0 times). HOMA-IR was higher in the 5th quintile of saturated fat (≥10.7% of total caloric value; 60% higher risk), and hyperhomocysteinemia was associated with vegetable oil in the 3rd quintile (>1.5-2.0 servings; 12.0 times) and in the 5th (≥3.5 servings; 7.1 times). However, significance disappeared when adjusted for anthropometric variables. Conclusion: Dietary fats were associated with poorer diet quality, insulin resistance, and hyperhomocysteinemia; however, these associations depend on demographic, dietary, and nutritional-status variables. © 2011 CELOM.