869 results for Value-at-Risk
Abstract:
Traditional methods do not measure people's risk attitudes naturally or precisely. A fuzzy risk attitude classification method is therefore developed. Since prospect theory is widely considered an effective model of decision making, the personalized parameters in prospect theory are first fuzzified to distinguish people with different risk attitudes, and a fuzzy classification database schema is then applied to calculate the exact values of risk value attitude and risk behavior attitude. Finally, by applying a two-hierarchical classification model, a precise value of the synthetic risk attitude can be acquired.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed with a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared to the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analysis of short-term avalanche-induced fatality risk revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the three-day precipitation sum is a simplified model. Thus, further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may contribute a conceptual step towards risk-based decision-making in risk management.
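The scenario-based risk relation described above can be sketched minimally as the product of occurrence probability, exposure, and lethality; the factorisation and all numbers below are illustrative assumptions, not values from the study.

```python
# Illustrative sketch (not the study's actual model): short-term fatality
# risk for one hazard scenario as occurrence probability x persons exposed
# (from traffic density) x an assumed lethality factor.

def fatality_risk(p_avalanche, persons_exposed, lethality):
    """Expected fatalities for one hazard scenario on one road segment."""
    return p_avalanche * persons_exposed * lethality

# Same hazard level, low vs high traffic density: exposure drives the peak.
low_traffic = fatality_risk(p_avalanche=0.10, persons_exposed=2.0, lethality=0.2)
high_traffic = fatality_risk(p_avalanche=0.10, persons_exposed=40.0, lethality=0.2)
```

The two calls mirror the paper's observation that risk peaks arise from either factor being high while the other is low.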
Abstract:
Elicitability has recently been discussed as a desirable property for risk measures. Kou and Peng (2014) showed that an elicitable distortion risk measure is either a Value-at-Risk or the mean. We give a concise alternative proof of this result, and discuss the conflict between comonotonic additivity and elicitability.
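The two elicitable distortion risk measures named in this result, Value-at-Risk (a quantile) and the mean, can be computed empirically as a minimal sketch; the function name, quantile convention, and sample data are illustrative, not from the paper.

```python
# Illustrative sketch: the two elicitable distortion risk measures of the
# Kou-Peng result, computed on a loss sample. VaR is an empirical quantile;
# the index convention below is one common choice.

def value_at_risk(losses, alpha):
    """Empirical VaR at level alpha: an alpha-quantile of the loss sample."""
    ordered = sorted(losses)
    idx = min(len(ordered) - 1, int(alpha * len(ordered)))
    return ordered[idx]

losses = [-1.2, 0.4, 2.5, 0.1, 3.8, -0.7, 1.9, 0.3, 5.0, -2.1]
var95 = value_at_risk(losses, 0.95)
mean_loss = sum(losses) / len(losses)
```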
A Methodological model to assist the optimization and risk management of mining investment decisions
Abstract:
Identifying, quantifying, and minimizing the technical risks associated with investment decisions is a key challenge for mineral industry decision makers and investors. However, risk analysis in most bankable mine feasibility studies is based on stochastic modelling of the project's Net Present Value (NPV), which in most cases fails to provide decision makers with a truly comprehensive analysis of the risks associated with technical and management uncertainty and is therefore of little use for risk management and project optimization. This paper presents a value-chain risk management approach in which project risk is evaluated at each step of the project lifecycle, from exploration to mine closure, and risk management is performed as part of a stepwise value-added optimization process.
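The stochastic NPV modelling the abstract critiques can be sketched with a Monte Carlo loop over uncertain cash flows; the discount rate, cash-flow distribution, and outlay below are illustrative assumptions.

```python
# Illustrative sketch of stochastic NPV modelling: sample uncertain yearly
# cash flows, discount them, and build an NPV distribution. All numbers
# are assumptions for illustration.

import random

def npv(cash_flows, rate):
    """Discount yearly cash flows (years 1..n) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

random.seed(1)
samples = []
for _ in range(5_000):
    # uncertain yearly cash flow around 100, initial outlay 400 at t = 0
    flows = [random.gauss(100, 30) for _ in range(6)]
    samples.append(-400 + npv(flows, 0.08))
mean_npv = sum(samples) / len(samples)
```

A distribution of such samples is what a feasibility-study risk analysis typically summarises; the paper argues this alone is insufficient for step-by-step risk management.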
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
We present a general multistage stochastic mixed 0-1 problem where uncertainty appears everywhere: in the objective function, the constraint matrix and the right-hand side. The uncertainty is represented by a scenario tree, which can be symmetric or nonsymmetric. The stochastic model is converted into a mixed 0-1 Deterministic Equivalent Model in compact representation. Due to the difficulty of the problem, the solution of the stochastic model has traditionally been obtained by optimizing the expected value (i.e., mean) of the objective function over the scenarios, usually along a time horizon. This approach (so-called risk neutral) has the drawback of providing a solution that ignores the variance of the objective value across scenarios and, hence, the occurrence of scenarios with an objective value below the expected one. Alternatively, we present several approaches for risk-averse management: a scenario immunization strategy; the optimization of the well-known Value-at-Risk (VaR) and several variants of the Conditional Value-at-Risk strategies; the optimization of the expected mean minus the weighted probability of a "bad" scenario occurring for the solution provided by the model; the optimization of the objective function expected value subject to stochastic dominance constraints (SDC) for a set of profiles given by pairs of threshold objective values and either bounds on the probability of not reaching the thresholds or the expected shortfall over them; and the optimization of a mixture of the VaR and SDC strategies.
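Two of the risk-averse measures listed above, scenario VaR and Conditional VaR, can be sketched for equiprobable scenarios; the quantile convention and data below are illustrative, not the paper's formulation.

```python
# Illustrative sketch: VaR and CVaR (mean loss in the alpha-tail) over
# equiprobable scenario losses, as used in risk-averse scenario optimization.

def scenario_var_cvar(losses, alpha):
    """Return (VaR, CVaR) at level alpha for equiprobable scenario losses."""
    ordered = sorted(losses)
    n = len(ordered)
    k = int(alpha * n)                # scenarios at or below the VaR level
    var = ordered[min(k, n - 1)]
    tail = [x for x in ordered if x >= var]
    cvar = sum(tail) / len(tail)      # expected loss given loss >= VaR
    return var, cvar

losses = [10, 12, 8, 30, 25, 11, 9, 40, 13, 7]   # one loss per scenario
var, cvar = scenario_var_cvar(losses, 0.8)
```

In a risk-averse model, VaR or CVaR would replace (or constrain) the plain expected value as the optimization objective.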
Abstract:
This thesis addresses modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well known time series models such as GARCH, ACD and CARR models. They are able to capture many well established features of financial time series including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables that have values on the real line. In the multivariate context asymmetries can be observed in the marginal distributions as well as in the relationships of the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and modeling of returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both conditional and unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model as well as an expression for the autocorrelation function are obtained by writing the MCARR model as a first order autoregressive process with random coefficients. 
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are found in the empirical application both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
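The GARCH family nested by the MEM framework above rests on a simple variance recursion; a minimal GARCH(1,1) filter is sketched below with fixed illustrative parameters rather than estimates from the thesis data.

```python
# Illustrative sketch: the GARCH(1,1) conditional-variance recursion
# h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}, started at the
# unconditional variance. Parameters are assumptions, not fitted values.

def garch_variances(returns, omega, alpha, beta):
    """Conditional variances for a return series under fixed parameters."""
    h = omega / (1 - alpha - beta)    # unconditional variance as start value
    out = []
    for r in returns:
        out.append(h)
        h = omega + alpha * r * r + beta * h
    return out

returns = [0.01, -0.02, 0.015, -0.03, 0.005]
h = garch_variances(returns, omega=1e-6, alpha=0.05, beta=0.90)
```

The asymmetric GH error distributions studied in Chapter 2 would enter when converting these variances into quantiles for Value-at-Risk.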
Abstract:
The Fuzzy Waste Load Allocation Model (FWLAM), developed in an earlier study, derives the optimal fractional levels for base flow conditions, considering the goals of the Pollution Control Agency (PCA) and the dischargers. The Modified Fuzzy Waste Load Allocation Model (MFWLAM), developed subsequently, is a stochastic model that considers the moments (mean, variance and skewness) of water quality indicators, incorporating uncertainty due to randomness of input variables along with uncertainty due to imprecision. The risk of low water quality is reduced significantly by this modified model, but the inclusion of new constraints leads to a low value of the acceptability level, A, interpreted as the maximized minimum satisfaction in the system. To improve this value, a new model combining FWLAM and MFWLAM is presented, allowing some violations of the constraints of MFWLAM. This combined model is a multiobjective optimization model with two objectives: maximization of the acceptability level and minimization of the violation of constraints. Fuzzy multiobjective programming, goal programming and fuzzy goal programming are used to find the solutions. Probabilistic Global Search Lausanne (PGSL) is used as the nonlinear optimization tool. The methodology is applied to a case study of the Tunga-Bhadra river system in southern India. The model yields a compromise solution with a higher acceptability level than MFWLAM and a satisfactory value of risk. Thus the goal of risk minimization is achieved with a comparatively better acceptability level.
Abstract:
The interdependence of Greece and other European stock markets, and the subsequent portfolio implications, are examined in the wavelet and variational mode decomposition domains. In applying the decomposition techniques, we analyze the structural properties of the data and distinguish between short- and long-term dynamics of stock market returns. First, GARCH-type models are fitted to obtain the standardized residuals. Next, different copula functions are evaluated and, based on the conventional information criteria and the time-varying parameter, the Joe-Clayton copula is chosen to model the tail dependence between the stock markets. The short-run lower tail dependence time paths show a sudden increase in comovement during the global financial crisis. The results on long-run dependence suggest that European stock markets have higher interdependence with the Greek stock market. Individual countries' Value at Risk (VaR) separates the countries into two distinct groups. Finally, the two-asset portfolio VaR measures identify potential markets for diversifying investment in the Greek stock market.
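The two-asset portfolio VaR step above can be sketched in its simplest historical-simulation form; the study itself uses copula-based models, and the return series, weights, and confidence level below are illustrative.

```python
# Illustrative sketch: two-asset portfolio VaR by historical simulation.
# Losses are negated portfolio returns; VaR is an empirical quantile.

def portfolio_var(returns_a, returns_b, w_a, alpha):
    """Empirical VaR of a two-asset portfolio from paired return histories."""
    port = [w_a * ra + (1 - w_a) * rb for ra, rb in zip(returns_a, returns_b)]
    losses = sorted(-r for r in port)
    idx = min(len(losses) - 1, int(alpha * len(losses)))
    return losses[idx]

# Hypothetical daily returns for an equally weighted two-market portfolio.
market_a = [0.01, -0.03, 0.002, -0.015, 0.004, -0.05, 0.02, -0.01]
market_b = [0.005, -0.01, 0.001, -0.004, 0.002, -0.02, 0.01, -0.003]
var95 = portfolio_var(market_a, market_b, 0.5, 0.95)
```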
Abstract:
Seismic microzonation has generally been recognized as the most accepted tool in seismic hazard assessment and risk evaluation. In general, risk reduction can be achieved by reducing the hazard, the vulnerability or the value at risk. Since the earthquake hazard cannot be reduced, one has to concentrate on vulnerability and value at risk. The vulnerability of an urban area or municipality depends on the vulnerability of its infrastructure and the redundancies within that infrastructure. The earthquake risk is the damage to buildings, together with the number of people killed or injured and the economic losses, during an earthquake with a return period corresponding to the considered time period. The principal approaches to reducing these losses are to avoid, where possible, siting buildings and infrastructure in high-hazard areas, and to ensure that buildings and infrastructure are designed and constructed to resist expected earthquake loads. This can be done if the hazard can be assessed at local scales. Seismic microzonation maps provide the basis for scientifically based decision-making to reduce earthquake risk for government agencies, private owners and the general public. Further, seismic microzonation carried out at an appropriate scale provides a valuable tool for disaster mitigation planning and emergency response planning for urban centers and municipalities. It provides the basis for identifying the areas of a city or municipality that are most likely to experience serious damage in the event of an earthquake.
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with 9 different error distributions on Standard and Poor's 500 Index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly and monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts and, on the contrary, makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
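Evaluating VaR forecasts of the kind described above typically starts with counting violations, i.e. days when the realised loss exceeds the forecast VaR; the sketch below is illustrative, with invented data and a deliberately simple constant forecast.

```python
# Illustrative sketch of a VaR backtest: the violation (hit) rate should
# be close to the nominal level (1% or 5%) if the forecasts are adequate.

def violation_rate(losses, var_forecasts):
    """Fraction of days on which the realised loss exceeded the VaR forecast."""
    hits = sum(1 for l, v in zip(losses, var_forecasts) if l > v)
    return hits / len(losses)

losses = [0.5, 2.1, -0.3, 1.8, 3.2, 0.1, 2.6, -1.0, 0.9, 2.4]
var_forecasts = [2.0] * len(losses)   # a constant forecast, for illustration
rate = violation_rate(losses, var_forecasts)
```

A hit rate far above the nominal level (as here, on purpose) signals an unsatisfactory VaR model.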
Abstract:
Financial time series tend to behave in a manner not directly drawn from a normal distribution. Asymmetries and nonlinearities are usually present, and these characteristics need to be taken into account. Forecasting future return and risk is rather complicated: the existing models for predicting risk help to a certain degree, but the complexity of financial time series data makes it difficult. The essays in this dissertation support the introduction of nonlinearities and asymmetries for the purpose of better models and forecasts of both mean and variance. Linear and nonlinear models are consequently introduced. The advantage of nonlinear models is that they can accommodate asymmetries. Asymmetric patterns usually mean that large negative returns appear more often than positive returns of the same magnitude, which is consistent with negative returns being associated with higher risk than positive returns of the same magnitude. These models are important because they enable the best possible estimates and predictions of future returns and risk.
Abstract:
The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to the Value-at-Risk. The figure is calculated in two steps. First, there is a need to estimate the value of a portfolio of options for a number of different market scenarios, while the second step is to summarize the information content of the estimated scenarios into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problems of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used to describe one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns enabling a more accurate definition of statistical intervals and extreme events.
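The two-step procedure described above can be sketched minimally: revalue the portfolio under many market scenarios, then summarise the scenario values into a single-sum VaR-style figure. The scenario generator below is a Gaussian stand-in for the paper's option-pricing and hyperbolic-distribution machinery; all numbers are illustrative.

```python
# Illustrative sketch of the paper's two steps: (1) simulated end-of-horizon
# portfolio values per market scenario, (2) a single-sum quantile loss.

import random

def single_sum_risk(portfolio_value, scenario_values, alpha):
    """Step 2: the alpha-quantile loss across revalued scenarios."""
    losses = sorted(portfolio_value - v for v in scenario_values)
    idx = min(len(losses) - 1, int(alpha * len(losses)))
    return losses[idx]

random.seed(0)
current = 100.0
# Step 1 stand-in: Gaussian scenarios instead of full option revaluation.
scenarios = [current * (1 + random.gauss(0.0, 0.02)) for _ in range(10_000)]
risk = single_sum_risk(current, scenarios, 0.99)
```

Replacing the Gaussian stand-in with a heavy-tailed (e.g. hyperbolic) scenario generator is precisely the refinement the paper advocates.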
Abstract:
Financial institutions are required by international agreements, such as the Basel Accord, to assess the market risk to which they are exposed, in order to prevent financial disasters from spreading to their capital. To capture such phenomena, there is a need for models that capture extreme movements in return series more accurately. The main objective of this work was to apply Extreme Value Theory together with copulas to the estimation of extreme quantiles for the VaR. It uses Monte Carlo simulation techniques, Extreme Value Theory, and copulas with Gaussian and t distributions. The resulting estimates are compared with those of a second model, known as filtered historical simulation (FHS). The techniques are applied to a portfolio of stocks of Brazilian companies.
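The extreme-quantile step can be sketched with the standard peaks-over-threshold VaR formula from Extreme Value Theory, treating the generalized Pareto shape and scale as already estimated; the parameter values below are illustrative, not estimates from this study.

```python
# Illustrative sketch: GPD-based (peaks-over-threshold) VaR,
# VaR_p = u + (sigma/xi) * [((n/n_u) * (1 - p))^(-xi) - 1],
# with threshold u, shape xi, scale sigma, sample size n, n_u exceedances.

def evt_var(u, xi, sigma, n, n_exceed, p):
    """Extreme quantile (VaR) at confidence p from fitted GPD tail parameters."""
    return u + (sigma / xi) * (((n / n_exceed) * (1 - p)) ** (-xi) - 1)

# Assumed inputs: 1000 daily losses, 50 exceedances over a 2% threshold.
var99 = evt_var(u=0.02, xi=0.2, sigma=0.01, n=1000, n_exceed=50, p=0.99)
```

In the copula-based approach, such marginal tail quantiles would be combined through the fitted dependence structure before aggregating to portfolio VaR.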
Abstract:
From 2008 to 2012, a dramatic upsurge in incidents of maritime piracy in the Western Indian Ocean led to renewed global attention to this region, including the deployment of multinational naval patrols, attempts to prosecute suspected pirates, and the development of financial interdiction systems to track and stop the flow of piracy ransoms. Largely seen as the maritime ripple effect of anarchy on land, piracy has been slotted into narratives of state failure and problems of governance and criminality in this region.
This view fails to account for a number of factors that were crucial in making possible the unprecedented rise of Somali piracy and its contemporary transformation. Instead of an emphasis on failed states and crises of governance, my dissertation approaches maritime piracy within a historical and regional configuration of actors and relationships that precede this round of piracy and will outlive it. The story I tell in this work begins before the contemporary upsurge of piracy and closes with a foretaste of the itineraries beyond piracy that are being crafted along the East African coast.
Beginning in the world of port cities in the long nineteenth century, my dissertation locates piracy and the relationship between trade, plunder, and state formation within worlds of exchange, including European incursions into this oceanic space. Scholars of long distance trade have emphasized the sociality engendered through commerce and the centrality of idioms of trust and kinship in structuring mercantile relationships across oceanic divides. To complement this scholarship, my work brings into view the idiom of protection: as a claim to surety, a form of tax, and a moral claim to authority in trans-regional commerce.
To build this theory of protection, my work combines archival sources with a sustained ethnographic engagement in coastal East Africa, including the pirate ports of Northern Somalia, and focuses on the interaction between land-based pastoral economies and maritime trade. This connection between land and sea calls attention to two distinct visions of the ocean: one built around trade and mobility and the other built on the ocean as a space of extraction and sovereignty. Moving between historical encounters over trade and piracy and the development of a national maritime economy during the height of the Somali state, I link the contemporary upsurge of maritime piracy to the confluence of these two conceptualizations of the ocean and the ideas of capture, exchange, and redistribution embedded within them.
The second section of my dissertation reframes piracy as an economy of protection and a form of labor implicated within other legal and illegal economies in the Indian Ocean. Based on extensive field research, including interviews with self-identified pirates, I emphasize the forms of labor, value, and risk that characterize piracy as an economy of protection. The final section of my dissertation focuses on the diverse international, regional, and local responses to maritime piracy. This section locates the response to piracy within a post-Cold War and post-9/11 global order and longer attempts to regulate and assuage the risks of maritime trade. Through an ethnographic focus on maritime insurance markets, navies, and private security contractors, I analyze the centrality of protection as a calculation of risk and profit in the contemporary economy of counter-piracy.
Through this focus on longer histories of trade, empire, and regulation my dissertation reframes maritime piracy as an economy of protection straddling boundaries of land and sea, legality and illegality, law and economy, and history and anthropology.