900 results for Parametric VaR (Value-at-Risk)


Relevance:

100.00%

Abstract:

Adolescents who drop out of secondary school have difficulty integrating into a knowledge-based economy and experience a range of adjustment problems in adolescence and adulthood. A crucial step in preventing dropout is to screen efficiently for the students most at risk. Two data-driven forms of screening can be used in schools: one based on information self-reported by students through questionnaires, and another based on administrative information recorded in student files. To our knowledge, however, the effectiveness of these two approaches has never been compared directly. Moreover, the relative effectiveness of these screening tools may differ by student sex. This study compares different screening tools for predicting school dropout, taking the moderating effect of sex into account. The tools are (a) a validated self-report questionnaire (Archambault and Janosz, 2009) and (b) a tool built from administrative data, created by a Quebec school board. The tools are compared in terms of psychometric quality and of practicality for schools. A sample of 1,557 students (50% boys) aged 14 to 18 is used. The results indicate that the administrative index has adequate discriminative ability, but lower than that of the self-report index, which is rated excellent. The moderating effect of sex was not confirmed. The respective advantages and disadvantages of the two screening approaches are discussed.
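
As an illustration of the kind of comparison described above (discriminative ability of two screening scores), here is a minimal sketch using ROC AUC on simulated data; the outcome, the two scores, and their signal strengths are hypothetical and not drawn from the study.

```python
# Hypothetical sketch: comparing the discriminative ability (ROC AUC) of a
# self-report screening score and an administrative-record score.
# Data and signal strengths are illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1557                                    # sample size reported in the abstract
dropout = rng.binomial(1, 0.15, n)          # simulated dropout outcome (1 = dropped out)
# Simulated risk scores: higher values indicate higher predicted risk.
self_report = dropout * 1.5 + rng.normal(0, 1, n)     # stronger signal
administrative = dropout * 0.8 + rng.normal(0, 1, n)  # weaker signal

auc_self = roc_auc_score(dropout, self_report)
auc_admin = roc_auc_score(dropout, administrative)
print(f"Self-report AUC:    {auc_self:.3f}")   # an 'excellent' index would sit around 0.80 or above
print(f"Administrative AUC: {auc_admin:.3f}")  # an 'adequate' index would sit roughly at 0.70-0.80
```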

Relevance:

100.00%

Abstract:

This project aims to identify which concepts of health, illness, epidemiology, and risk are applicable to firms in the oil and natural gas extraction sector in Colombia. Given the low predictive power of traditional financial analysis and its insufficiency for long-term investment and decision-making, and since it does not consider variables such as risk and future expectations, there is a need for different perspectives and integrative models. This observation is pertinent to the oil and natural gas extraction sector, given the growing foreign investment it has reported: US$2,862 million in 2010, more than ten times the 2003 figure. Multidimensional models could therefore be developed, based on concepts of financial health together with epidemiological and statistical concepts. The notion of health, as adopted in the business sector, is useful and conceptually coherent, reflecting the presence of different interacting, interconnected subsystems and factors. It should also be noted that a multidimensional (multi-stage) model must take risk into account, and epidemiological analysis has proved useful for determining risk and integrating it into the system alongside related concepts such as the risk ratio and relative risk. This will be examined through a theoretical-conceptual study that complements a previous study and contributes to the corporate finance project of the Management research line.
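
Since the abstract leans on epidemiological measures such as the risk ratio (relative risk), a minimal worked example of that computation follows; the 2x2 counts are invented for illustration and are not drawn from the study.

```python
# Hypothetical 2x2 table: exposed vs unexposed firms, event vs no event.
# Figures are illustrative, not taken from the study.
exposed_events, exposed_total = 12, 40
unexposed_events, unexposed_total = 6, 60

risk_exposed = exposed_events / exposed_total        # incidence among exposed
risk_unexposed = unexposed_events / unexposed_total  # incidence among unexposed
relative_risk = risk_exposed / risk_unexposed        # risk ratio (RR)

print(f"Risk (exposed):   {risk_exposed:.3f}")
print(f"Risk (unexposed): {risk_unexposed:.3f}")
print(f"Relative risk:    {relative_risk:.2f}")      # RR > 1 means higher risk among the exposed
```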

Relevance:

100.00%

Abstract:

Automotive maintenance activities in the auto-parts sector involve the use of chemical agents under a variety of exposure circumstances, related both to how chemical products are handled and to the characteristics of each maintenance activity and its specific work tasks. Traditionally, the evaluation of chemical contaminants from an occupational hygiene perspective includes quantitative exposure assessment using specific, standardized instrumental techniques to determine the airborne concentration to which a worker is exposed; comparison with permissible exposure limits then leads to control and surveillance measures according to the characterized risk level. This approach is clearly difficult to implement, however, in micro and small enterprises, which lack the resources to address the problem objectively. In this context, several qualitative (or subjective) assessment methodologies have been developed by organizations around the world to narrow the gap between risk assessment and the adoption of control measures, offering reliable alternatives for preventive decision-making without the need for quantitative measurements. This study aims to validate the effectiveness of a simplified chemical risk assessment tool proposed by the INRS (Institut National de Recherche et de Sécurité, France) by determining the potential chemical exposure profile of the workforce of 36 auto-parts stores located in the La Paz neighbourhood of Bogotá, Colombia, grouped by main activity into External Parts, Electrical Parts and Injection, Mechanical Parts, and Multiple Parts, through a cross-sectional study. The study made it possible to rank potential risk, to assess inhalation and dermal risk, and finally to build the workers' potential chemical exposure profile. The analysis variables were consolidated in a software tool designed for this purpose, which facilitated data management and analysis. Based on the findings, it was possible to identify the chemical products whose working conditions and exposure circumstances call for specific measures to reduce potential risk, according to the overall rating of the agents, supporting the viability of qualitative assessment tools for chemical risk evaluation as a primary prevention strategy.
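
As a rough illustration of the banding logic behind simplified chemical risk assessment tools, here is a hedged sketch that combines hazard, quantity, and frequency bands into a potential-risk score; the bands, thresholds, and product names are hypothetical placeholders and do not reproduce the actual INRS scales.

```python
# Illustrative hazard-banding logic: combine a hazard band with quantity and
# frequency-of-use bands into a potential-risk score, then rank products.
# Band values and the action threshold are hypothetical, not the INRS scales.
def potential_risk_score(hazard_band: int, quantity_band: int, frequency_band: int) -> int:
    """Return a simple multiplicative potential-risk score (higher = riskier)."""
    return hazard_band * quantity_band * frequency_band

products = {
    "brake cleaner":  potential_risk_score(hazard_band=4, quantity_band=3, frequency_band=3),
    "multigrade oil": potential_risk_score(hazard_band=2, quantity_band=3, frequency_band=2),
    "contact grease": potential_risk_score(hazard_band=1, quantity_band=1, frequency_band=2),
}
for name, score in sorted(products.items(), key=lambda kv: kv[1], reverse=True):
    priority = "act first" if score >= 24 else "monitor"   # hypothetical cut-off
    print(f"{name:15s} score={score:3d} -> {priority}")
```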

Relevance:

100.00%

Abstract:

The abattoir and the fallen stock surveys constitute the active surveillance component aimed at improving the detection of scrapie across the European Union. Previous studies have suggested the occurrence of significant differences in the operation of the surveys across the EU. In the present study we assessed the standardisation of the surveys over time across the EU and identified clusters of countries with similar underlying characteristics, allowing comparisons between them. In the absence of sufficient covariate information to explain the observed variability across countries, we modelled the unobserved heterogeneity by means of non-parametric distributions on the risk ratios of the fallen stock over the abattoir survey. More specifically, we used the profile likelihood method on 2003, 2004 and 2005 active surveillance data for 18 European countries on classical scrapie, and on 2004 and 2005 data for atypical scrapie separately. We extended our analyses to include the limited covariate information available, more specifically the proportion of the adult sheep population sampled by the fallen stock survey every year. Our results show that the between-country heterogeneity dropped in 2004 and 2005 relative to that of 2003 for classical scrapie. As a consequence, the number of clusters in the last two years was also reduced, indicating the gradual standardisation of the surveillance efforts across the EU. The crude analyses of the atypical data grouped all the countries into one cluster and showed no significant gain in the detection of this type of scrapie by either of the two sources. The proportion of the population sampled by the fallen stock appeared significantly associated with our risk ratio for both types of scrapie, although in opposite directions: negative for classical and positive for atypical. The initial justification for the fallen stock survey, targeting a high-risk population to increase the likelihood of case finding, appears compromised for both types of scrapie in some countries.
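
The core quantity in the analysis is the risk ratio of detection in the fallen stock survey relative to the abattoir survey. Below is a minimal sketch of the crude version of that calculation, with a Wald interval on the log scale; the counts are invented, and the profile-likelihood mixture modelling used to capture between-country heterogeneity is not reproduced.

```python
# Crude detection risk ratio: fallen stock survey vs abattoir survey.
# Counts are hypothetical; the non-parametric mixture / profile-likelihood step
# of the study is not reproduced here.
import numpy as np
from scipy import stats

cases_fs, tested_fs = 35, 20_000      # fallen stock: cases detected / animals tested
cases_ab, tested_ab = 60, 120_000     # abattoir: cases detected / animals tested

rr = (cases_fs / tested_fs) / (cases_ab / tested_ab)
# Wald 95% confidence interval on the log risk-ratio scale.
se_log_rr = np.sqrt(1 / cases_fs - 1 / tested_fs + 1 / cases_ab - 1 / tested_ab)
ci_lo, ci_hi = np.exp(np.log(rr) + np.array([-1, 1]) * stats.norm.ppf(0.975) * se_log_rr)
print(f"Risk ratio (fallen stock / abattoir): {rr:.2f}  95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
# RR > 1 with a CI excluding 1 would indicate a detection gain from the fallen stock survey.
```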

Relevance:

100.00%

Abstract:

Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts in volatility.
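
A minimal sketch of the two quantile-construction routes contrasted in the paper: turning a given one-day volatility forecast into a 1% quantile (VaR) forecast either under a Gaussian assumption or from the empirical distribution of standardized returns. The returns are simulated, and `sigma_forecast` is a placeholder standing in for any volatility model's output.

```python
# Two ways to turn a volatility forecast into a 1% quantile (VaR) forecast:
# (a) assume normality of standardized returns, (b) use their empirical quantile.
# Returns are simulated fat-tailed daily FX returns; sigma_forecast is a placeholder.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
sigma_true = 0.007
returns = rng.standard_t(df=5, size=1000) * sigma_true    # fat-tailed daily returns
sigma_hist = np.full_like(returns, returns.std())          # placeholder in-sample vol estimates
std_returns = returns / sigma_hist                          # standardized returns

sigma_forecast = 0.008                                      # one-day-ahead volatility forecast
alpha = 0.01

var_gaussian = sigma_forecast * norm.ppf(alpha)                     # distributional assumption
var_empirical = sigma_forecast * np.quantile(std_returns, alpha)    # empirical standardized quantile

print(f"1% VaR, Gaussian quantile:  {var_gaussian:.4%}")
print(f"1% VaR, empirical quantile: {var_empirical:.4%}")  # typically further in the tail
```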

Relevance:

100.00%

Abstract:

The purpose of this article is to explore customer retention strategies and tactics implemented by firms in recession. Our investigations show just how big a challenge many organizations face in their ability to manage customer retention effectively. While leading organizations have embedded real-time customer life cycle management, developed accurate early warning systems, price elasticity models and ‘deal calculators’, the organizations we spoke to have only gone as far as calculating the value at risk and building simple predictive models.

Relevance:

100.00%

Abstract:

The aim of this literature review was to describe the risk factors that can lead to hip fracture in the elderly and the preventive measures nurses can use in care to prevent hip fractures. The results were based on 21 scientific articles written in English. Articles based on individuals younger than 50 years were excluded. The articles were retrieved from the Elin and Blackwell Synergy databases and had to be published between 1996 and 2006. A manual search was also carried out using article reference lists and one journal. The search terms used were hip fracture, risk factor, prevention, cause, nursing, and nursing care, in various combinations. The results showed that female sex, old age, osteoporosis, previous fractures, impaired vision, urinary incontinence, medication, impaired cognition, reduced mobility, and factors in the immediate environment were risk factors that could lead to hip fractures in the elderly. Preventive work to reduce the risk of hip fracture included lifestyle changes, fall-risk assessment, and assessment of the immediate environment. Exercise programmes to improve strength and balance, and increased use of hip protectors, were further preventive measures available to nurses. With knowledge of risk factors and prevention, nurses could reduce the frequency of hip fractures with simple means.

Relevance:

100.00%

Abstract:

This paper provides an examination of the determinants of derivative use by Australian corporations. We analysed the characteristics of a sample of 469 firm/year observations drawn from the largest Australian publicly listed companies in 1999 and 2000 to address two issues: the decision to use financial derivatives and the extent to which they are used. Logit analysis suggests that a firm's leverage (distress proxy), size (financial distress and setup costs) and liquidity (financial constraints proxy) are important factors associated with the decision to use derivatives. These findings support the financial distress hypothesis, while the evidence on the underinvestment hypothesis is mixed. Additionally, setup costs appear to be important, as larger firms are more likely to use derivatives. Tobit results, on the other hand, show that once the decision to use derivatives has been made, a firm uses more derivatives as its leverage increases and as it pays out more dividends (hedging substitute proxy). The overall results indicate that Australian companies use derivatives with a view to enhancing firm value rather than maximizing managerial wealth. In particular, corporations' derivative policies are mostly concerned with reducing the expected cost of financial distress and managing cash flows. Our inability to identify managerial influences behind the derivative decision suggests a competitive Australian managerial labor market.
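
A minimal sketch of the first-stage modelling choice described above, a logit of the derivative-use decision on leverage, size, and liquidity, fitted with statsmodels on simulated data; the proxies, coefficients, and data-generating process are illustrative assumptions, and the second-stage Tobit is omitted.

```python
# Logit of the decision to use derivatives on leverage, size and liquidity,
# mirroring the first stage described in the abstract. Data are simulated;
# the second-stage Tobit on the extent of use is omitted for brevity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 469
leverage = rng.uniform(0, 0.8, n)          # financial distress proxy
log_size = rng.normal(20, 1.5, n)          # size / setup-cost proxy
liquidity = rng.uniform(0.5, 3.0, n)       # financial constraints proxy

# Hypothetical latent propensity to use derivatives.
latent = -25 + 2.0 * leverage + 1.2 * log_size - 0.5 * liquidity + rng.logistic(size=n)
uses_derivatives = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([leverage, log_size, liquidity]))
logit_res = sm.Logit(uses_derivatives, X).fit(disp=False)
print(logit_res.summary(xname=["const", "leverage", "log_size", "liquidity"]))
```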

Relevance:

100.00%

Abstract:

In this paper, we consider an extension of the recently proposed bivariate Markov-switching multifractal model of Calvet, Fisher, and Thompson [2006. "Volatility Comovement: A Multifrequency Approach." Journal of Econometrics 131: 179-215]. In particular, we allow correlations between volatility components to be non-homogeneous, with two different parameters governing the volatility correlations at high and low frequencies. Specification tests confirm the added explanatory value of this specification. In order to explore its practical performance, we apply the model for computing value-at-risk statistics for different classes of financial assets and compare the results with the baseline, homogeneous bivariate multifractal model and the bivariate DCC-GARCH of Engle [2002. "Dynamic Conditional Correlation: A Simple Class of Multivariate Generalized Autoregressive Conditional Heteroskedasticity Models." Journal of Business & Economic Statistics 20 (3): 339-350]. As it turns out, the multifractal model with heterogeneous volatility correlations provides more reliable results than both the homogeneous benchmark and the DCC-GARCH model.
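
Comparisons of VaR "reliability" of the kind reported above typically rest on coverage tests of the exceedance rate. A minimal sketch of Kupiec's unconditional-coverage test follows, with a simulated return series and a placeholder VaR path standing in for the multifractal or DCC-GARCH forecasts.

```python
# Kupiec unconditional-coverage test: do 1% VaR forecasts produce ~1% exceedances?
# The returns and the VaR series are simulated placeholders, not the paper's models.
import numpy as np
from scipy.stats import chi2, norm

rng = np.random.default_rng(7)
T, alpha, sigma = 1500, 0.01, 0.01
returns = rng.normal(0, sigma, T)
var_forecast = np.full(T, sigma * norm.ppf(alpha))   # constant, correctly specified 1% VaR

exceedances = int((returns < var_forecast).sum())
pi_hat = exceedances / T
# Likelihood-ratio statistic for H0: exceedance probability equals alpha.
lr_uc = -2 * (
    exceedances * np.log(alpha) + (T - exceedances) * np.log(1 - alpha)
    - exceedances * np.log(pi_hat) - (T - exceedances) * np.log(1 - pi_hat)
)
print(f"exceedances: {exceedances}/{T}  LR_uc = {lr_uc:.2f}  p-value = {chi2.sf(lr_uc, df=1):.3f}")
```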

Relevance:

100.00%

Abstract:

This dissertation studies the implied volatility surface of European options on the Brazilian real / US dollar exchange rate traded in the Brazilian market. The work does not attempt to explain the deformations or deviations of implied volatility from the constant-volatility assumption of the Black & Scholes (1973) model; rather, it treats implied volatility as a financial variable of interest in its own right and analyzes the dynamics of its surface. To analyze this surface, the study applies a tool used in empirical work across many fields: Principal Component Analysis (PCA). Changes in the volatility surface alter the pricing of the options in a portfolio; they therefore constitute a risk factor that must be studied and understood in order to develop hedging strategies and risk management techniques, among them the calculation of Value at Risk (VaR). Based on the results of the principal component analysis of the implied volatility surface, the study then derives limit values for the variation of implied volatility, as a way of estimating the resulting extreme variations in the value of an option portfolio. The approach builds on studies of principal component analysis of the implied volatility surface developed by Alexander (2001), which in turn derive from the analysis of volatility curve dynamics proposed by Derman (1999). To verify the effectiveness of the proposed methodology, the extreme values obtained are backtested according to the criteria proposed in the 1996 amendment of the Basel Committee.
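
A minimal sketch of the PCA step on daily changes of an implied-volatility surface flattened into vectors, using simulated data; the grid size and the factor structure are invented, and the subsequent step of turning the leading components into extreme-move limits is not reproduced.

```python
# PCA on daily changes of an implied-volatility surface (flattened to vectors),
# in the spirit of the analysis described above. The surface data are simulated;
# in practice each column would be a (delta, maturity) grid point of the surface.
import numpy as np

rng = np.random.default_rng(3)
n_days, n_points = 500, 15                 # e.g. 5 deltas x 3 maturities
level = rng.normal(0, 0.8, (n_days, 1))    # common level shifts of the surface
slope = rng.normal(0, 0.3, (n_days, 1)) * np.linspace(-1, 1, n_points)   # skew changes
surface_changes = level + slope + rng.normal(0, 0.1, (n_days, n_points))

cov = np.cov(surface_changes, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
explained = eigvals / eigvals.sum()
print("variance explained by first 3 components:", np.round(explained[:3], 3))
# The leading components typically read as level / slope / curvature of the surface.
```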

Relevance:

100.00%

Abstract:

The objective of this work is to show that optimizing a portfolio of Brazilian multimercado funds yields better results when the risk measure used is Conditional Value-at-Risk. Portfolio optimization models aim to select the assets that maximize the investor's return for a given level of risk, so the choice of an appropriate risk measure is fundamental to the allocation process. The traditional portfolio optimization methodology, developed by Markowitz, uses the variance of returns as the risk measure. Variance, however, is appropriate only when returns are normally distributed or when investors have quadratic utility functions. This work shows that the returns of Brazilian multimercado funds tend not to be normally distributed; hence, optimizing a portfolio of these funds requires an alternative risk measure.
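
A minimal sketch contrasting the two risk measures on historical scenarios: portfolio variance versus historical VaR and CVaR (the average loss beyond the VaR quantile). The fund returns and weights are simulated, fat-tailed stand-ins rather than actual multimercado data.

```python
# Historical VaR and CVaR (expected shortfall) of a fund-of-funds portfolio,
# contrasted with its variance. Returns are simulated, fat-tailed stand-ins;
# weights are an arbitrary example.
import numpy as np

rng = np.random.default_rng(11)
n_obs, n_funds = 1000, 4
fund_returns = rng.standard_t(df=4, size=(n_obs, n_funds)) * 0.006 + 0.0004

weights = np.array([0.4, 0.3, 0.2, 0.1])
port = fund_returns @ weights
alpha = 0.05

var_5 = np.quantile(port, alpha)            # historical 5% VaR (a return threshold)
cvar_5 = port[port <= var_5].mean()         # CVaR: average return beyond the VaR
print(f"variance: {port.var():.6f}")
print(f"5% VaR:   {var_5:.4%}")
print(f"5% CVaR:  {cvar_5:.4%}")            # more negative than VaR; captures tail shape
# A CVaR-based optimizer would choose weights minimizing portfolio CVaR subject to a
# return target, e.g. via the Rockafellar-Uryasev linear programming formulation.
```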

Relevance:

100.00%

Abstract:

We study the direct and indirect ownership structure of Brazilian corporations and their market value and risk at the end of 1996 and 1998. Ownership is quite concentrated, with most companies being controlled by a single direct shareholder. We find evidence that indirect control structures may be used to concentrate control even further rather than to keep control of the company with a smaller share of total capital. The greater the concentration of voting rights, the lower the value of the firm should be, due to the potential expropriation of minority shareholders. We find evidence that when there is a majority shareholder and indirect ownership structures are used without loss of control, corporate valuations are greater when control is diluted through the indirect ownership structure. This evidence is consistent with the existence of private benefits of control, which can be read as potential minority shareholder expropriation.

Relevance:

100.00%

Abstract:

The goal of this paper is twofold. First, using five of the most actively traded stocks in the Brazilian financial market, this paper shows that the normality assumption commonly used in the risk management area to describe the distributions of returns standardized by volatilities is not compatible with volatilities estimated by EWMA or GARCH models. In sharp contrast, when the information contained in high frequency data is used to construct the realized volatility measures, we recover the normality of the standardized returns, giving promise of improvements in Value at Risk statistics. We also describe the distributions of volatilities of the Brazilian stocks, showing that the distributions of volatilities are nearly lognormal. Second, we estimate a simple linear model for the log of realized volatilities that differs from the ones in other studies. The main difference is that we do not find evidence of long memory. The estimated model is compared with commonly used alternatives in an out-of-sample experiment.
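
A minimal sketch of the central empirical check: building a realized-volatility measure from intraday returns and testing whether returns standardized by it look Gaussian. The intraday process, sampling frequency, and the normality test used here are illustrative assumptions, not the paper's exact estimators.

```python
# Standardizing daily returns by realized volatility built from intraday returns,
# then checking normality with a Jarque-Bera test. Intraday data are simulated
# with time-varying volatility; this illustrates the mechanics only.
import numpy as np
from scipy.stats import jarque_bera, skew

rng = np.random.default_rng(5)
n_days, n_intraday = 500, 48                       # e.g. 48 half-hour returns per day
day_vol = 0.01 * np.exp(rng.normal(0, 0.4, n_days))                      # stochastic daily volatility
intraday = rng.normal(0, 1, (n_days, n_intraday)) * (day_vol[:, None] / np.sqrt(n_intraday))

daily_returns = intraday.sum(axis=1)
realized_vol = np.sqrt((intraday ** 2).sum(axis=1))                      # realized volatility measure
standardized = daily_returns / realized_vol

print(f"Jarque-Bera p-value, raw returns:          {jarque_bera(daily_returns).pvalue:.3f}")
print(f"Jarque-Bera p-value, standardized returns: {jarque_bera(standardized).pvalue:.3f}")
print(f"skewness of log realized volatility:       {skew(np.log(realized_vol)):.2f}")  # near 0 suggests ~lognormal vol
```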

Relevance:

100.00%

Abstract:

Asset allocation decisions and value at risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models that incorporate the dynamic structure of volatility and are capable of forecasting future behavior of risk should perform better than constant, rolling-window volatility models. For the same asset, the model that is 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms others over a class of re-sampled time-series data. The test is based on re-sampling the data using the stationary bootstrap. For each re-sample we identify the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk management measure as comparison criteria. No model consistently outperforms the benchmark.
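
A minimal sketch of the resampling idea: stationary-bootstrap resamples of a return series, with a simple squared-error loss comparing a naive EWMA variance forecast against constant variance on each resample. This is a simplified stand-in for White's reality check, and the data, block length, and loss function are assumptions.

```python
# Stationary bootstrap (Politis-Romano): geometric block lengths, wrapped indices.
# On each resample we compare a naive EWMA variance forecast with constant variance,
# using squared error against realized squared returns as the loss. Simplified
# illustration of the reality-check idea, not White's full test.
import numpy as np

rng = np.random.default_rng(9)
returns = rng.normal(0, 0.01, 1000) * (1 + 0.5 * np.sin(np.arange(1000) / 50))  # mildly time-varying vol
T, p = len(returns), 1 / 20                      # mean block length of 20 observations

def stationary_bootstrap_indices(T, p, rng):
    idx = np.empty(T, dtype=int)
    idx[0] = rng.integers(T)
    for t in range(1, T):
        # With probability p start a new block, otherwise continue the current one.
        idx[t] = rng.integers(T) if rng.random() < p else (idx[t - 1] + 1) % T
    return idx

def losses(x, lam=0.94):
    ewma = np.zeros_like(x)
    ewma[0] = x[:20].var()
    for t in range(1, len(x)):
        ewma[t] = lam * ewma[t - 1] + (1 - lam) * x[t - 1] ** 2   # one-step EWMA variance forecast
    const = np.full_like(x, x.var())
    return np.mean((x ** 2 - ewma) ** 2), np.mean((x ** 2 - const) ** 2)

n_boot, wins = 200, 0
for _ in range(n_boot):
    resample = returns[stationary_bootstrap_indices(T, p, rng)]
    loss_ewma, loss_const = losses(resample)
    wins += loss_ewma < loss_const
print(f"EWMA beats constant volatility on {wins}/{n_boot} resamples")
```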

Relevance:

100.00%

Abstract:

Despite the large size of the Brazilian debt market, as well as the large diversity of its bonds, the picture that emerges is of a market that has not yet completed its transition from the role it performed during the megainflation years, namely that of providing a liquid asset with positive real returns. This unfinished transition is currently placing the market under severe stress, as fears of a possible default by the next administration grow. This paper analyzes several aspects of the management of the domestic public debt. The causes of the extremely large and fast growth of the domestic public debt during the seven-year period of the Cardoso administration are discussed in Section 2. Section 3 computes Value at Risk and Cash Flow at Risk measures for the domestic public debt. The rollover risk is introduced in a mean-variance framework in Section 4. Section 5 discusses a few issues pertaining to the overlap between debt management and monetary policy. Finally, Section 6 wraps up with policy discussion and policy recommendations.
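
Given the section computing Value at Risk for the debt, a minimal sketch of a delta-normal (parametric) VaR for a stylized two-factor debt portfolio follows; the exposures, volatilities, correlation, and factor mapping are invented illustration values, not figures from the paper.

```python
# Delta-normal (parametric) VaR for a stylized debt portfolio exposed to two
# risk factors: the domestic interest rate and the exchange rate.
# All exposures, volatilities and the correlation are hypothetical.
import numpy as np
from scipy.stats import norm

exposures = np.array([600.0, 400.0])          # R$ millions mapped to each risk factor
daily_vol = np.array([0.012, 0.018])          # daily return volatility of each factor
corr = np.array([[1.0, 0.35],
                 [0.35, 1.0]])
cov = np.outer(daily_vol, daily_vol) * corr

portfolio_sigma = np.sqrt(exposures @ cov @ exposures)   # daily P&L standard deviation
var_99_1d = norm.ppf(0.99) * portfolio_sigma             # 99% one-day parametric VaR
var_99_10d = var_99_1d * np.sqrt(10)                     # 10-day horizon via square-root-of-time
print(f"99% 1-day VaR:  R$ {var_99_1d:,.1f} million")
print(f"99% 10-day VaR: R$ {var_99_10d:,.1f} million")
```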