950 results for Value at Risk


Relevance: 60.00%

Abstract:

This study examines the use of risk management in a number of small and medium-sized enterprises (SMEs) in the city of São Bernardo do Campo. Business risk analysis has become increasingly important and can contribute strongly to business continuity. The ability to manage business risks in the face of inevitable uncertainty, while preserving the future value of results, is a substantial source of competitive advantage. This value-creation process provides the discipline and the tools for managing business risks, allowing the organization to create value. Most risk analysis methodologies are applied to large corporations, and one motivation of this work is to assess how useful such methodologies are for the SMEs selected for the study in São Bernardo do Campo. The study is based on bibliographic research and exploratory research in the selected companies, followed by a qualitative analysis using the case study method. It concludes that the companies surveyed in São Bernardo do Campo can obtain significant advantages by adopting risk management methodologies. All of the companies surveyed are more than ten years old and consider it important to safeguard the continuity of their business.

Relevance: 60.00%

Abstract:

The most liquid stocks in the IBOVESPA index reflect the behavior of stocks in general, as well as the influence of macroeconomic variables on that behavior, and they are among the most heavily traded in the Brazilian capital market. Factors that affect the most liquid companies are therefore reflected in the behavior of macroeconomic variables, and the reverse also holds: fluctuations in macroeconomic factors such as the IPCA, GDP, the SELIC rate and the exchange rate also affect the most liquid stocks. The study analyzes the relationship between macroeconomic variables and the behavior of the most liquid stocks in the IBOVESPA index, corroborating research that seeks to understand the influence of macroeconomic factors on stock prices and contributing empirically to the construction of investment portfolios. The study covers the period from 2008 to 2014. The results indicate that portfolios built to protect the invested capital should contain assets that are negatively correlated with the variables studied, which makes it possible to compose a portfolio with reduced risk.
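
As a hedged illustration of the screening idea described above (assets negatively correlated with the macro variables can reduce portfolio risk), the sketch below computes stock-macro correlations and filters candidate hedging assets. All series and tickers are hypothetical stand-ins for the IBOVESPA constituents and the IPCA/SELIC/exchange-rate data used in the study.

```python
# Sketch: screening assets by their correlation with macro variables, as a
# hedge-oriented portfolio filter. All series are hypothetical; in practice the
# returns and macro data would come from B3 and IBGE/BCB sources.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 84  # e.g. monthly observations, 2008-2014

# Hypothetical monthly stock returns and macro variables
stocks = pd.DataFrame(rng.normal(0, 0.05, size=(n, 3)),
                      columns=["PETR4", "VALE3", "ITUB4"])
macro = pd.DataFrame({
    "IPCA": rng.normal(0.5, 0.2, n),    # inflation, % per month
    "SELIC": rng.normal(10.0, 1.0, n),  # policy rate, % per year
    "FX": rng.normal(0, 0.03, n),       # BRL/USD log change
})

# Correlation of each stock with each macro variable
corr = pd.concat([stocks, macro], axis=1).corr().loc[stocks.columns, macro.columns]
print(corr)

# Keep assets negatively correlated with a chosen risk factor (here: FX)
hedging_candidates = corr.index[corr["FX"] < 0].tolist()
print("Candidates for an FX-hedging portfolio:", hedging_candidates)
```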

Relevance: 60.00%

Abstract:

The carbon dioxide emissions of a gas-fuelled power plant subject to compliance under the EU Emissions Trading System are modelled with a real-options model on four underlying instruments (peak and off-peak electricity prices, the gas price, and the emission quota). The profit-maximizing plant operates, and therefore emits, only when the spread earned on the electricity produced is positive, so total future emissions can be represented as a sum of European binary spread options. Within the model, the expected value and the probability density function of carbon dioxide emissions can be estimated; from the latter, the Value at Risk of the emission quota position can be determined, which gives the cost of meeting the plant's compliance obligation at a given confidence level. In the stochastic model the underlying prices follow geometric Ornstein-Uhlenbeck processes, fitted to publicly available market data from the German energy exchange (EEX). Based on the simulation model, the effects of ceteris paribus changes in various technological and market factors on the emissions level and on the cost of compliance (the Value at Risk) are analysed.
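
The sketch below is a minimal Monte Carlo illustration rather than the paper's calibrated model: it mimics the dispatch logic described (the plant emits only in hours with a positive clean spark spread, so total emissions resemble a sum of binary spread options) and reads the Value at Risk of the compliance cost from the simulated distribution. All plant and price parameters are hypothetical, and plain lognormal draws stand in for the fitted geometric Ornstein-Uhlenbeck processes.

```python
# Monte Carlo sketch of the idea in the abstract: the plant runs an hour only
# when its spread (power price minus fuel and carbon cost) is positive, so total
# emissions behave like a sum of binary spread options. Parameters are
# hypothetical, and simple lognormal draws stand in for the calibrated
# geometric Ornstein-Uhlenbeck dynamics used in the paper.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_hours = 20_000, 24 * 30            # one month of hourly dispatch
heat_rate = 2.0                               # MWh of gas per MWh of power
emission_factor = 0.4                         # tCO2 per MWh of power
capacity = 100.0                              # MW

power = rng.lognormal(mean=np.log(60), sigma=0.3, size=(n_paths, n_hours))  # EUR/MWh
gas   = rng.lognormal(mean=np.log(25), sigma=0.2, size=(n_paths, n_hours))  # EUR/MWh
eua   = rng.lognormal(mean=np.log(20), sigma=0.2, size=(n_paths, 1))        # EUR/tCO2

spread = power - heat_rate * gas - emission_factor * eua       # clean spark spread
running = spread > 0.0                                         # binary dispatch decision
emissions = emission_factor * capacity * running.sum(axis=1)   # tCO2 per path

expected_emissions = emissions.mean()
# Cost of compliance if quotas must be bought at the spot EUA price
compliance_cost = emissions * eua[:, 0]
var_95 = np.quantile(compliance_cost, 0.95)    # 95% Value at Risk of the position
print(f"E[emissions] = {expected_emissions:,.0f} tCO2, 95% VaR of cost = {var_95:,.0f} EUR")
```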

Relevance: 60.00%

Abstract:

Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend, among other factors, has made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) enable large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique uses a novel approach to construct a time-varying covariance matrix, applying a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to capture the dynamics of the conditional covariance matrices employed. The stochastic behaviour of the securities' expected returns is incorporated through Monte Carlo simulation: instead of being treated as deterministic values, expected returns are assigned simulated values based on their historical measures, with each security's time series fitted to the probability distribution that best matches its characteristics according to the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to generalize the results: 2000 simulated data sets are created by Monte Carlo simulation using the S&P500 securities as the base, and 50 sample data sets are generated from the Russell 1000 securities. The results indicate an improvement in risk-return performance. Using Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently on the market, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P500 and Russell 1000 securities. The improvements in performance are consistent across five security selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
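
A hedged sketch of a greedy weight allocator under a VaR criterion follows. It is not the dissertation's algorithm: the simulated return scenarios below replace the Anderson-Darling-fitted distributions and IDCC-GARCH covariances described above, and the parameters are hypothetical.

```python
# Sketch of a greedy weight allocator under a Value-at-Risk criterion: weight is
# handed out in small increments, each increment going to the asset that yields
# the lowest portfolio VaR. This illustrates the general idea only; the
# dissertation's algorithm, return distributions and covariance models are more
# elaborate. The scenario returns below are simulated and hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_scenarios, n_assets = 10_000, 8
scenarios = rng.multivariate_normal(
    mean=rng.normal(0.0005, 0.0002, n_assets),
    cov=0.0001 * np.eye(n_assets),
    size=n_scenarios,
)  # simulated one-day returns

def var_95(weights: np.ndarray) -> float:
    """95% VaR (a positive number) of the portfolio loss distribution."""
    pnl = scenarios @ weights
    return -np.quantile(pnl, 0.05)

step, n_steps = 0.01, 100          # allocate 1% of capital per iteration
weights = np.zeros(n_assets)
for _ in range(n_steps):
    candidates = []
    for j in range(n_assets):
        trial = weights.copy()
        trial[j] += step
        candidates.append(var_95(trial))
    weights[int(np.argmin(candidates))] += step   # greedy choice

print("weights:", np.round(weights, 2), "portfolio 95% VaR:", round(var_95(weights), 5))
```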

Relevance: 60.00%

Abstract:

Strategic Alignment of IT is considered the first step in the IT Governance process of any institution. Starting from the recognition that corporate governance takes an overall view of the organization, IT Governance emerges as the subset responsible for implementing the organization's strategies with respect to providing the tools needed to achieve the goals set in the Institutional Development Plan. To that end, COBIT specifies that such governance should be built on the following principles: Strategic Alignment, Value Delivery, Risk Management, and Performance Measurement. This paper focuses on Strategic Alignment, regarded by the authors as the foundation on which the entire core of IT Governance is built. By deepening its technical knowledge of management system development, UFRN has taken a decisive step towards the technical capability required for Value Delivery; however, an examination of the processes primarily associated with Strategic Alignment revealed gaps that limited the strategic view of IT in the implementation of organizational goals. A qualitative study, based on documentary research with content analysis and on interviews with strategic and tactical managers, mapped the perceived role of SINFO (Superintendência de Informática). The documentary research covered public documents available on the institutional website and documents from TCU (Tribunal de Contas da União) that profile IT Governance across the federal public service. To make the documentary results comparable, questionnaires/interviews and the iGovTI index were used as quantitative tools for standardization, always preserving the scale elements used in the TCU analysis. Accordingly, just as the TCU study provides the iGovTI index, this paper proposes an index specific to the study area, SA (Strategic Alignment), calculated from variables representing the COBIT 4.1 domains and having the variables representing the Strategic Alignment primary process as components. The resulting index lies between the values found in the two adjacent TCU surveys of 2010 and 2012, which reflects the attitude and view of managers towards IT governance: still tied to a Data Processing model, in which a department performs its tasks according to the demands of the various departments or sectors, although a committee discusses issues related to infrastructure acquisition and systems development. The view is operational rather than strategic/managerial, with little use of the tools established in the market, and several processes defined in the COBIT framework are not addressed, mainly because there is no formal strategic plan for IT; hence, the congruence between organizational goals and IT goals is only partial.

Relevance: 60.00%

Abstract:

This article proposes a three-step procedure to estimate portfolio return distributions under the multivariate Gram-Charlier (MGC) distribution. The method combines quasi-maximum likelihood (QML) estimation of the conditional means and variances with method of moments (MM) estimation of the remaining density parameters, including the correlation coefficients. The procedure yields consistent estimates even under density misspecification and overcomes the so-called ‘curse of dimensionality’ of multivariate modelling. Furthermore, the MGC distribution provides a flexible and general approximation to the true distribution of portfolio returns and accounts for all of its empirical regularities. The procedure is illustrated on a portfolio composed of three European indices. The MM estimation of the MGC (MGC-MM) is compared with traditional maximum likelihood estimation of both the MGC and the multivariate Student’s t (benchmark) densities. A simulation of Value-at-Risk (VaR) performance for an equally weighted portfolio at the 1% and 5% levels indicates that the MGC-MM method provides reasonable approximations to the true empirical VaR. The procedure therefore seems to be a useful tool for risk managers and practitioners.
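
The MGC-MM estimator itself is not reproduced here. As a simpler, hedged illustration of how higher moments reshape VaR, the sketch below uses the Cornish-Fisher expansion to adjust the Gaussian quantile with sample skewness and excess kurtosis, on simulated heavy-tailed returns.

```python
# Not the MGC-MM procedure itself: a simpler, moment-based illustration of the
# same idea (letting skewness and kurtosis reshape the tail), using the
# Cornish-Fisher expansion to adjust the Gaussian quantile before computing VaR.
# The portfolio returns below are simulated and hypothetical.
import numpy as np
from scipy import stats

def cornish_fisher_var(returns: np.ndarray, alpha: float = 0.01) -> float:
    """One-period VaR (positive number) at level alpha via Cornish-Fisher."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    s = stats.skew(returns)
    k = stats.kurtosis(returns)            # excess kurtosis
    z = stats.norm.ppf(alpha)
    z_cf = (z
            + (z**2 - 1) * s / 6
            + (z**3 - 3 * z) * k / 24
            - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + sigma * z_cf)

rng = np.random.default_rng(7)
returns = rng.standard_t(df=5, size=2000) * 0.01      # heavy-tailed daily returns

for a in (0.01, 0.05):
    cf = cornish_fisher_var(returns, a)
    empirical = -np.quantile(returns, a)
    print(f"alpha={a:.2f}  Cornish-Fisher VaR={cf:.4f}  empirical VaR={empirical:.4f}")
```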

Relevance: 60.00%

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide tools for univariate applications, while the last two develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains for the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
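
A hedged sketch of the building block shared by the four chapters, the realized measure, follows: a daily realized variance is computed from simulated 5-minute returns and turned into a naive next-day 1% Gaussian VaR. The FloGARCH and Realized LGARCH models themselves are far richer; all figures below are hypothetical.

```python
# The building block shared by the four chapters is the realized measure: a
# daily variance proxy built from intraday returns. This sketch computes a
# 5-minute realized variance and turns it into a naive next-day 1% Gaussian
# VaR. Prices here are simulated and hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_intraday = 78                       # 5-minute returns in a 6.5-hour session
true_daily_vol = 0.015
intraday_returns = rng.normal(0.0, true_daily_vol / np.sqrt(n_intraday), n_intraday)

realized_variance = np.sum(intraday_returns**2)      # sum of squared intraday returns
realized_vol = np.sqrt(realized_variance)

var_1pct = -norm.ppf(0.01) * realized_vol            # naive next-day 1% VaR
print(f"realized vol = {realized_vol:.4f}, next-day 1% VaR = {var_1pct:.4f}")
```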

Relevance: 60.00%

Abstract:

Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Estatística, 2015.

Relevance: 60.00%

Abstract:

Investors today are particularly concerned with making their investments more safely, obtaining a good return without putting their capital at risk. In this context, the possibility of developing new tools that support better investment decisions is increasingly relevant in the financial world. One of the most important contributions available for this purpose is that of Markowitz, who proposed the construction of optimally diversified portfolios. The problem, however, is how to choose among these portfolios. This project therefore aimed to compare the standard deviation model (Sharpe Ratio) with Value at Risk (VaR) as the risk concept for selecting an optimal portfolio in a developed market, in this case the US market; a backtest was also used to analyze whether bearish, stable or bullish market cycles affect this choice. After building and applying the model, it was concluded that, under normal conditions in a developed market, choosing one portfolio over another yields greater benefits when the risk concept used is VaR estimated through Monte Carlo simulation rather than the standard deviation. When the model was applied to a less developed and more volatile market such as the Colombian one, no significant advantage was found between the two approaches (standard deviation and VaR).
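
As a hedged illustration of the comparison described, the sketch below ranks two hypothetical portfolios once by the Sharpe ratio and once by Monte Carlo VaR; the return assumptions are invented, whereas the study calibrates them to US (and later Colombian) market data.

```python
# Sketch of the comparison in the abstract: the same two candidate portfolios
# ranked once by the Sharpe ratio and once by Monte Carlo Value at Risk. All
# return assumptions are hypothetical.
import numpy as np

rng = np.random.default_rng(5)

def simulate(mu: float, sigma: float, n: int = 50_000) -> np.ndarray:
    """Simulated one-period portfolio returns (normal for simplicity)."""
    return rng.normal(mu, sigma, n)

portfolios = {
    "A": simulate(mu=0.006, sigma=0.03),
    "B": simulate(mu=0.012, sigma=0.06),
}
risk_free = 0.002

for name, r in portfolios.items():
    sharpe = (r.mean() - risk_free) / r.std(ddof=1)
    var_95 = -np.quantile(r, 0.05)        # 95% Monte Carlo VaR
    print(f"Portfolio {name}: Sharpe = {sharpe:.3f}, 95% VaR = {var_95:.3f}")

# With these figures B wins on the Sharpe ratio while A wins on VaR, so the
# chosen risk measure changes the ranking of the portfolios.
```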

Relevance: 60.00%

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints, limited resources to implement savings retrofits, various suppliers in the market and project financing alternatives. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at the lowest cost, yet the most common methods of implementation planning are suboptimal with respect to cost. Federal, state and local agencies can obtain greater returns on their energy conservation investment than with traditional methods, regardless of the implementing organization. This dissertation outlines several approaches to improve the traditional energy conservation models. Public buildings in regions with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private building owners are under mandates to conserve energy; for example, Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code upon any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves the current best practice for agencies seeking the most cost-effective selection when leveraging energy services companies or utilities; the two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2). An additional model leverages a single congressional appropriation to implement ECM projects, with returns from implemented projects used to fund additional ones (Chapter 3). In these cases, fluctuations in energy costs and uncertainty in the estimated savings strongly influence ECM project selection and the amount of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage, a comparative method using Conditional Value at Risk is analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach uses inequalities (McCormick, 1976) to re-express constraints that involve products of binary variables with an exact linearization (related to the convex hull of those constraints). This model also shows the benefits of learning between stages while remaining consistent with the single-congressional-appropriation framework.
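
A hedged sketch of the Conditional Value at Risk measure used for comparison follows, computed on simulated shortfalls of annual ECM savings; the multi-stage stochastic programs and MPEC models themselves are not reproduced, and all figures are hypothetical.

```python
# Sketch of the Conditional Value at Risk (CVaR) measure used for comparison in
# the dissertation, computed here on simulated shortfalls of annual ECM savings
# relative to their estimate. All figures are hypothetical.
import numpy as np

rng = np.random.default_rng(11)
estimated_savings = 1_000_000.0                       # $ per year, hypothetical
realized_savings = rng.normal(estimated_savings, 150_000.0, size=50_000)
shortfall = estimated_savings - realized_savings      # loss relative to the estimate

alpha = 0.95
var = np.quantile(shortfall, alpha)                   # 95% Value at Risk
cvar = shortfall[shortfall >= var].mean()             # mean loss beyond the VaR

print(f"95% VaR of the savings shortfall:  ${var:,.0f}")
print(f"95% CVaR of the savings shortfall: ${cvar:,.0f}")
```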

Relevance: 60.00%

Abstract:

The first paper sheds light on the informational content of high-frequency and daily data. I assess the economic value of the two families of models by comparing their performance in forecasting asset volatility through the Value at Risk metric. The comparison rests on two key modelling features: jumps in prices and a leverage effect in the volatility dynamics. The findings suggest that high-frequency data models do not outperform daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, characterized by a multifactor volatility specification. In the VARG-J model, volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. Estimation under the historical measure is done by quasi-maximum likelihood and the Extended Kalman Filter; this strategy filters out both volatility factors by introducing a measurement equation that relates realized volatility to latent volatility. The risk premia parameters are calibrated using call options written on the S&P500 Index. The results clearly illustrate the important contribution of the jump factor to the pricing performance of options and the economic significance of the volatility jump risk premia. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and size of contagion transmission between European markets. To understand and quantify contagion transmission in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are modeled directly as a Hawkes jump diffusion process. The empirical analysis indicates clear evidence of contagion from Greece to European countries as well as self-contagion in all countries.
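
A common way to compare the VaR forecasts of the two model families in the first paper is a violation-based backtest. The hedged sketch below implements the Kupiec unconditional-coverage likelihood-ratio test with hypothetical violation counts; the paper's own evaluation may differ.

```python
# One standard way to compare the VaR forecasts of two model families is a
# violation-based backtest. This sketch implements the Kupiec unconditional
# coverage likelihood-ratio test; the violation counts below are hypothetical.
import numpy as np
from scipy.stats import chi2

def kupiec_test(n_obs: int, n_violations: int, p: float) -> tuple[float, float]:
    """Likelihood-ratio statistic and p-value for correct VaR coverage p."""
    x, n = n_violations, n_obs
    pi_hat = x / n
    log_l0 = (n - x) * np.log(1 - p) + x * np.log(p)            # coverage = p
    log_l1 = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)  # observed coverage
    lr = -2.0 * (log_l0 - log_l1)
    return lr, 1.0 - chi2.cdf(lr, df=1)

# Hypothetical 1% VaR backtests over 1000 out-of-sample days
for model, violations in {"daily-data GARCH": 17, "high-frequency model": 12}.items():
    lr, pval = kupiec_test(1000, violations, p=0.01)
    print(f"{model}: {violations} violations, LR = {lr:.2f}, p-value = {pval:.3f}")
```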

Relevance: 40.00%

Abstract:

The value of a seasonal forecasting system based on phases of the Southern Oscillation was estimated for a representative dryland wheat grower in the vicinity of Goondiwindi. In particular, the effects of risk attitude and planting conditions on this estimate were examined. A recursive stochastic programming approach was used to identify the grower's utility-maximising action set in the event that each of the climate patterns of the period 1894-1991 recurred in the imminent season. The approach was repeated with and without use of the forecasts. The choices examined were, at planting, the nitrogen application rate and cultivar and, later in the season, whether to proceed with or abandon each wheat activity. The value of the forecasting system was estimated as the maximum amount the grower could afford to pay for its use without expected utility being lowered relative to non-use.
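
A hedged numerical sketch of the valuation principle described above follows: the forecast is worth the largest payment that leaves expected utility no lower than without it. The profit distributions and the exponential utility function below are hypothetical stand-ins for the paper's recursive stochastic programming model.

```python
# Numerical sketch of the valuation principle in the abstract: the forecast is
# worth the payment v that leaves expected utility with the forecast (minus v)
# equal to expected utility without it. All figures are hypothetical.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(9)

def utility(wealth, a=0.005):
    """Exponential (constant absolute risk aversion) utility."""
    return -np.exp(-a * wealth)

# Hypothetical $/ha profit distributions across the historical climate years
profit_without = rng.normal(250.0, 120.0, 5000)   # fixed management, no forecast
profit_with = rng.normal(280.0, 100.0, 5000)      # tactics adjusted to the SOI phase

eu_without = utility(profit_without).mean()

def gap(v: float) -> float:
    """Expected-utility difference if the grower pays v for the forecast."""
    return utility(profit_with - v).mean() - eu_without

value_of_forecast = brentq(gap, 0.0, 200.0)   # payment leaving the grower indifferent
print(f"Value of the forecasting system: about ${value_of_forecast:.0f}/ha")
```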

Relevance: 40.00%

Abstract:

OBJECTIVE: To determine, in arrhythmogenic right ventricular cardiomyopathy, the value of QT interval dispersion for identifying the induction of sustained ventricular tachycardia in the electrophysiological study or the risk of sudden cardiac death. METHODS: We assessed QT interval dispersion in the 12-lead electrocardiogram of 26 patients with arrhythmogenic right ventricular cardiomyopathy and analyzed its association with sustained ventricular tachycardia and sudden cardiac death, comparing the patients with 16 controls of similar age and sex. RESULTS (mean ± SD): QT interval dispersion: patients = 53.8 ± 14.1 ms; control group = 35.0 ± 10.6 ms, p = 0.001. Patients with induction of ventricular tachycardia: 52.5 ± 13.8 ms; without induction of ventricular tachycardia: 57.5 ± 12.8 ms, p = 0.420. In a mean follow-up period of 41 ± 11 months, five sudden cardiac deaths occurred; QT interval dispersion in this group was 62.0 ± 17.8 ms, and in the others it was 51.9 ± 12.8 ms, p = 0.852. Using a cutoff ≥ 60 ms to define an increased degree of QT interval dispersion, we were able to identify patients at risk of sudden cardiac death with a sensitivity of 60%, a specificity of 57%, and positive and negative predictive values of 25% and 85%, respectively. CONCLUSION: Patients with arrhythmogenic right ventricular cardiomyopathy have a significantly greater degree of QT interval dispersion than the healthy population. However, it did not identify patients with induction of ventricular tachycardia in the electrophysiological study and showed a very low predictive value for defining the risk of sudden cardiac death in the population studied.

Relevance: 40.00%

Abstract:

At the age of 50, a woman has a lifetime risk of more than 40% of presenting a vertebral fracture, yet more than 60% of vertebral fractures remain undiagnosed. It is therefore of major importance to develop screening strategies to detect these fractures. Vertebral fracture assessment (VFA) by DXA allows vertebral fractures from T4 to L4 to be detected on DXA devices while the bone mineral density measurement is performed during the same visit. Such an approach should improve the evaluation of fracture risk and the therapeutic indication. Compared with standard X-ray assessment, VFA reliably detects moderate or severe vertebral fractures below T6.