963 results for Rischio finanziario, Value-at-Risk, Expected Shortfall


Relevance:

100.00%

Publisher:

Abstract:

Objectives: to evaluate, in patients with a congenital solitary kidney, the correlation between glomerular filtration rate measured with DTPA (DTPA-GFR) and 1) laboratory markers of renal damage (creatinine, cystatin C, proteinuria); 2) formulas for estimating glomerular filtration rate; 3) ultrasound parameters of renal growth. Materials and methods: 118 patients with a congenital solitary kidney, aged 0 to 18 years, were enrolled. Height, creatinine, cystatin C, proteinuria and renal ultrasound length were assessed at each visit. Estimated GFR was calculated with formulas based on creatinine (Schwartz), on cystatin C (Zappitelli, Filler, Grubb and Bokenkamp) and on both (Zappitelli equation). Renal growth was evaluated as the ratio of renal ultrasound length to body height (USL/H), the percentage difference between measured renal length and that expected for age (delta%), and the presence or absence of compensatory hypertrophy. DTPA-GFR was measured in 74 children. Results: Mean follow-up was 2.1 ± 0.9 years; 65% of patients were male. No patient developed chronic kidney damage. Mean DTPA-GFR was 135 ± 44 mL/min/1.73 m², mean creatinine 0.47 ± 0.17 mg/dL and mean cystatin C 1 ± 0.4 mg/L. Mean renal ultrasound length was 100 ± 17 mm, the mean USL/H ratio 0.8 ± 0.1 and delta% 1.13 ± 11.4; 66% showed renal hypertrophy. The only significant correlations with DTPA-GFR were an inverse correlation with creatinine (p < .001) and a linear correlation with USL/H (p < .001). Discussion: The study showed that, as in other nephro-uropathies, creatinine and renal ultrasound are two valid tools for the follow-up of patients with a congenital solitary kidney. The main limitation is that no patient developed chronic kidney damage, so it was not possible to establish risk cut-offs for parameters such as USL/H.
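As an illustration of the creatinine-based estimate named above, here is a minimal sketch of the bedside Schwartz formula (eGFR = k × height / serum creatinine, with k = 0.413); the function name and the example values are mine, not the study's:

```python
def schwartz_egfr(height_cm: float, creatinine_mg_dl: float, k: float = 0.413) -> float:
    """Bedside Schwartz estimate of GFR in mL/min/1.73 m^2."""
    return k * height_cm / creatinine_mg_dl

# e.g. a child 120 cm tall with the cohort's mean creatinine of 0.47 mg/dL
print(round(schwartz_egfr(120, 0.47), 1))  # ~105.4 mL/min/1.73 m^2
```

The cystatin C-based equations cited (Zappitelli, Filler, Grubb, Bokenkamp) follow the same pattern with different inputs and coefficients.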


The risk of a financial position is usually summarized by a risk measure. As this risk measure has to be estimated from historical data, it is important to be able to verify and compare competing estimation procedures. In statistical decision theory, risk measures for which such verification and comparison are possible are called elicitable. It is known that quantile-based risk measures such as value at risk are elicitable. In this paper, the existing result on the nonelicitability of expected shortfall is extended to all law-invariant spectral risk measures unless they reduce to minus the expected value; hence it is unclear how to perform forecast verification or comparison for these measures. However, the class of elicitable law-invariant coherent risk measures does not reduce to minus the expected value: we show that it consists of certain expectiles.
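Since the result turns on expectiles, here is a minimal sketch of computing a tau-expectile by bisection on its first-order condition tau·E[(X−e)+] = (1−tau)·E[(e−X)+]; the function name and tolerance are my own illustration, not code from the paper:

```python
def expectile(xs, tau, tol=1e-9):
    """tau-expectile of a sample: the e solving tau*E[(X-e)+] = (1-tau)*E[(e-X)+]."""
    lo, hi = min(xs), max(xs)
    while hi - lo > tol:
        e = (lo + hi) / 2
        up = sum(x - e for x in xs if x > e) / len(xs)    # E[(X-e)+]
        down = sum(e - x for x in xs if x < e) / len(xs)  # E[(e-X)+]
        if tau * up > (1 - tau) * down:
            lo = e  # e too low: raise it
        else:
            hi = e
    return (lo + hi) / 2
```

For tau = 0.5 the expectile reduces to the sample mean, mirroring the paper's boundary case of (minus) the expected value.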


Thesis (Master's)--University of Washington, 2016-06


Dissertation (Master's)--Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Estatística, 2015.


The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some patients subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for the removal of outliers and the calculation of moving averages, as well as to data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population, and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads).
Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with the misclassification rate and the area under the ROC curve (AUC), are used to evaluate models generated by different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to give slightly higher accuracy at markedly increased complexity. The use of the misclassification rate (MR) for model performance evaluation is influenced by class distribution. This influence can be eliminated by considering the AUC or Kappa statistic, as well as by evaluating subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) at 9.8 and the raw time-series summary data (dataset A) at 9.92. However, for all datasets based on time-series data alone, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets consist of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method. For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09.
The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, the nearest-neighbour method (NNge) and the support vector machine based method (SMO) have the highest MR, at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method J48 have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to time-series-based models is significant, while adding time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules that are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables compared with risk factors alone is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are both used as model input.
In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
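The evaluation machinery described above (majority-class under-sampling plus the Kappa statistic as a class-distribution-robust score) can be sketched as follows; the function names are hypothetical and this is not the study's code:

```python
import random

def undersample(majority, minority, seed=0):
    """Randomly drop majority-class rows until the two classes are balanced."""
    rng = random.Random(seed)
    return rng.sample(list(majority), len(minority)) + list(minority)

def cohen_kappa(y_true, y_pred):
    """Cohen's kappa for binary labels: observed agreement corrected for chance."""
    n = len(y_true)
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n      # observed agreement
    p1 = (sum(y_true) / n) * (sum(y_pred) / n)                # chance agreement on class 1
    p0 = (1 - sum(y_true) / n) * (1 - sum(y_pred) / n)        # chance agreement on class 0
    pe = p1 + p0
    return (po - pe) / (1 - pe)
```

Unlike raw misclassification rate, kappa is near zero for a classifier that merely reproduces the class distribution, which is why it is paired with MR in the evaluation above.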


Recent literature has focused on realized volatility models to predict financial risk. This paper studies the benefit of explicitly modeling jumps in this class of models for value at risk (VaR) prediction. Several popular realized volatility models are compared in terms of their VaR forecasting performance through a Monte Carlo study and an analysis based on empirical data for eight Chinese stocks. The results suggest that careful modeling of jumps in realized volatility models can substantially improve VaR prediction, especially in emerging markets, where jumps play a stronger role than in developed markets.
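A standard way to make jumps explicit in realized volatility modeling is to split realized variance into a continuous component, estimated by bipower variation, and a jump component; a minimal sketch using these textbook estimators (not the paper's specific models):

```python
import math

def realized_variance(returns):
    """Realized variance: sum of squared intraday returns."""
    return sum(r * r for r in returns)

def bipower_variation(returns):
    """Jump-robust estimator of the continuous variance component,
    (pi/2) * sum |r_i| * |r_{i-1}|."""
    return (math.pi / 2) * sum(abs(a) * abs(b)
                               for a, b in zip(returns[1:], returns[:-1]))

def jump_component(returns):
    """Nonnegative jump contribution: RV minus bipower variation, floored at 0."""
    return max(realized_variance(returns) - bipower_variation(returns), 0.0)
```

An isolated price spike inflates realized variance but barely moves bipower variation, so their difference isolates the jump contribution that the VaR models above condition on.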


Background: Genome-wide association studies have identified multiple genetic variants associated with prostate cancer risk which together explain a substantial proportion of familial relative risk. These variants can be used to stratify individuals by their risk of prostate cancer. Methods: We genotyped 25 prostate cancer susceptibility loci in 40,414 individuals and derived a polygenic risk score (PRS). We estimated empirical odds ratios (OR) for prostate cancer associated with different risk strata defined by the PRS, and derived age-specific absolute risks of developing prostate cancer by PRS stratum and family history. Results: The prostate cancer risk for men in the top 1% of the PRS distribution was 30.6-fold (95% CI, 16.4-57.3) that of men in the bottom 1%, and 4.2-fold (95% CI, 3.2-5.5) the median risk. The absolute risk of prostate cancer by age 85 was 65.8% for a man with family history in the top 1% of the PRS distribution, compared with 3.7% for a man in the bottom 1%. The PRS was only weakly correlated with serum PSA level (correlation = 0.09). Conclusions: Risk profiling can identify men at substantially increased or reduced risk of prostate cancer. The effect size, measured by OR per unit PRS, was higher in men at younger ages and in men with a family history of prostate cancer. Incorporating additional newly identified loci into the PRS should improve the predictive value of risk profiles. Impact: We demonstrate that risk profiling based on SNPs can identify men at substantially increased or reduced risk, which could have useful implications for targeted prevention and screening programs.
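A polygenic risk score of the kind described is conventionally a weighted sum of risk-allele counts, with per-allele log odds ratios as weights; a minimal sketch (function name and example values are mine, not the paper's):

```python
import math

def polygenic_risk_score(dosages, odds_ratios):
    """PRS = sum over loci of risk-allele dosage (0, 1 or 2) times
    the per-allele log odds ratio for that locus."""
    return sum(d * math.log(o) for d, o in zip(dosages, odds_ratios))

# hypothetical 3-locus example: dosages and per-allele ORs
print(polygenic_risk_score([2, 1, 0], [1.25, 1.10, 1.40]))
```

Individuals are then ranked by this score to define the percentile strata (top 1%, median, bottom 1%) compared in the abstract.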


Seismic microzonation has generally been recognized as the most accepted tool in seismic hazard assessment and risk evaluation. In general, risk reduction can be achieved by reducing the hazard, the vulnerability or the value at risk. Since the earthquake hazard cannot be reduced, one has to concentrate on vulnerability and value at risk. The vulnerability of an urban area or municipality depends on the vulnerability of its infrastructure and the redundancies within that infrastructure. The earthquake risk is the damage to buildings, together with the number of people killed or injured and the economic losses, during an earthquake with a return period corresponding to the time period considered. The principal approaches for reducing these losses are to avoid, where possible, high-hazard areas for the siting of buildings and infrastructure, and to ensure that buildings and infrastructure are designed and constructed to resist expected earthquake loads. This is possible only if the hazard can be assessed at local scales. Seismic microzonation maps provide the basis for scientifically based decision-making to reduce earthquake risk for government agencies, private owners and the general public. Further, seismic microzonation carried out at an appropriate scale provides a valuable tool for disaster mitigation planning and emergency response planning for urban centers and municipalities. It provides the basis for identifying the areas of a city or municipality most likely to experience serious damage in the event of an earthquake.
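The hazard/vulnerability/value decomposition above is often summarized as a multiplicative expected-loss model; a schematic sketch (the function and the numbers are illustrative assumptions, not from the abstract):

```python
def expected_earthquake_loss(hazard, vulnerability, value_exposed):
    """Schematic risk model: expected loss = hazard (probability of the
    shaking level over the period) x vulnerability (fraction of value
    lost at that level) x value exposed."""
    return hazard * vulnerability * value_exposed

# hypothetical district: 10% exceedance probability, 30% loss ratio, 1M exposed
print(expected_earthquake_loss(0.10, 0.30, 1_000_000))
```

The decomposition makes the abstract's point concrete: with the hazard factor fixed, only the vulnerability and exposure factors are available for risk reduction.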


Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require accurate volatility estimates. The task has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure, which, examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS; of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions, monotonically increasing from the median quantile to the uppermost quantile (95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, calibration results for the string market model show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
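The unconditional coverage backtest mentioned above is conventionally Kupiec's proportion-of-failures likelihood ratio, which compares the observed VaR violation rate with the nominal one; a sketch of the statistic (my implementation, not the thesis code):

```python
import math

def kupiec_pof(n_obs, n_violations, var_level):
    """Kupiec proportion-of-failures LR statistic; approximately chi-square(1)
    under correct unconditional coverage (5% critical value ~ 3.841)."""
    p = 1 - var_level             # expected violation rate, e.g. 0.01 for 99% VaR
    pi = n_violations / n_obs     # observed violation rate
    pi = min(max(pi, 1e-12), 1 - 1e-12)  # guard the log at 0 or 1
    ll_null = (n_obs - n_violations) * math.log(1 - p) + n_violations * math.log(p)
    ll_alt = (n_obs - n_violations) * math.log(1 - pi) + n_violations * math.log(pi)
    return -2 * (ll_null - ll_alt)
```

For 250 trading days of 99% VaR, about 2.5 violations are expected; 3 observed violations yield a small statistic (no rejection), while 20 violations reject decisively.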


The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to the Value-at-Risk. The figure is calculated in two steps. First, there is a need to estimate the value of a portfolio of options for a number of different market scenarios, while the second step is to summarize the information content of the estimated scenarios into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problems of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used to describe one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns enabling a more accurate definition of statistical intervals and extreme events.
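The two-step calculation described (revalue the option portfolio under scenarios, then summarize into one number) can be sketched with an empirical-quantile summary; the function and the scenario convention are my own illustration, not the paper's method:

```python
def scenario_var(pnl_scenarios, alpha=0.95):
    """Step 2 of the single-sum risk figure: given portfolio P&L under each
    market scenario (step 1), return the empirical loss quantile at level alpha."""
    losses = sorted(-p for p in pnl_scenarios)  # losses, ascending
    idx = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[idx]
```

Replacing the empirical quantile with a quantile of a fitted heavy-tailed distribution (the paper uses the hyperbolic family) is what refines the treatment of extreme events.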


Systematic liquidity shocks should affect the optimal behavior of agents in financial markets. Indeed, fluctuations in various measures of liquidity are significantly correlated across common stocks. Accordingly, this paper empirically analyzes whether Spanish average returns vary cross-sectionally with betas estimated relative to two competing liquidity risk factors. The first, proposed by Pastor and Stambaugh (2002), is associated with the strength of volume-related return reversals. Our marketwide liquidity factor is defined as the difference between returns highly sensitive to changes in the relative bid-ask spread and returns with low sensitivity to those changes. Our empirical results show that neither of these proxies for systematic liquidity risk seems to be priced in the Spanish stock market. Further international evidence is warranted.
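The liquidity betas referred to above come from time-series regressions of asset returns on the factor; as a simplified single-factor sketch (the paper's estimation is multivariate, and this function name is mine), the OLS slope is covariance over variance:

```python
def factor_beta(asset_returns, factor_returns):
    """OLS slope of asset returns on one factor: cov(asset, factor) / var(factor)."""
    n = len(asset_returns)
    ma = sum(asset_returns) / n
    mf = sum(factor_returns) / n
    cov = sum((a - ma) * (f - mf) for a, f in zip(asset_returns, factor_returns))
    var = sum((f - mf) ** 2 for f in factor_returns)
    return cov / var
```

Cross-sectional pricing then asks whether average returns line up with these betas; the paper's finding is that for both liquidity factors they do not.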


We investigate the applicability of the present-value asset pricing model to fishing quota markets by applying instrumental variable panel data estimation techniques to 15 years of market transactions from New Zealand's individual transferable quota (ITQ) market. In addition to the influence of current fishing rents, we explore the effect of market interest rates, risk, and expected changes in future rents on quota asset prices. The results indicate that quota asset prices are positively related to declines in interest rates, lower levels of risk, expected increases in future fish prices, and expected cost reductions from rationalization under the quota system. © 2007 American Agricultural Economics Association.
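The present-value model invoked above prices a perpetual harvest right as the discounted stream of expected fishing rents; a minimal growing-perpetuity sketch (the function and numbers are illustrative, not the paper's estimator):

```python
def quota_asset_price(expected_rent, discount_rate, rent_growth=0.0):
    """Present value of a perpetual quota right under a Gordon-style
    growing-perpetuity model: rent / (discount rate - rent growth)."""
    if discount_rate <= rent_growth:
        raise ValueError("discount rate must exceed expected rent growth")
    return expected_rent / (discount_rate - rent_growth)

# hypothetical: $100/tonne annual rent, 5% discount rate
print(quota_asset_price(100.0, 0.05))  # 2000.0
```

The comparative statics match the abstract's findings: prices rise as interest (discount) rates fall and as expected future rents grow.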


Background Previous research has shown that home ownership is associated with a reduced risk of admission to institutional care. The extent to which this reflects associations between wealth and health, between wealth and ability to buy in care or increased motivation to avoid admission related to policies on charging is unclear. Taking account of the value of the home, as well as housing tenure, may provide some clarification as to the relative importance of these factors.
Aims To analyse the probability of admission to residential and nursing home care according to housing tenure and house value.
Methods Cox regression was used to examine the association between home ownership, house value and risk of care home admissions over 6 years of follow-up among a cohort of 51 619 people aged 65 years or older drawn from the Northern Ireland Longitudinal Study, a representative sample of approximate to 28% of the population of Northern Ireland.
Results 4% of the cohort (2138) was admitted during follow-up. Homeowners were less likely than those who rented to be admitted to care homes (HR 0.77, 95% CI 0.70 to 0.85, after adjusting for age, sex, health, living arrangement and urban/rural differences). There was a strong association between house value/tenure and health with those in the highest valued houses having the lowest odds of less than good health or limiting long-term illness. However, there was no difference in probability of admission according to house value; HRs of 0.78 (95% CI 0.67 to 0.90) and 0.81 (95% CI 0.70 to 0.95), respectively, for the lowest and highest value houses compared with renters.
Conclusions The requirement for people in the UK with capital resources to contribute to their care is a significant disincentive to institutional admission. This may place an additional burden on carers.


In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are duly drawn. (C) 2011 Elsevier Ltd. All rights reserved.
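The scenario representation above pairs naturally with an expected-profit objective and a tail risk measure; a sketch of both under discrete scenarios (CVaR of profit is a common choice in this literature, but the abstract does not name the paper's measure, so this is an assumption):

```python
def expected_profit(profits, probs):
    """Probability-weighted average profit over scenarios."""
    return sum(p * w for p, w in zip(profits, probs))

def cvar_profit(profits, probs, alpha=0.95):
    """CVaR of profit: probability-weighted average of the worst (1 - alpha)
    tail of the scenario profit distribution (requires alpha < 1)."""
    scenarios = sorted(zip(profits, probs))  # worst profits first
    tail_mass = 1 - alpha
    tail_value, acc = 0.0, 0.0
    for value, prob in scenarios:
        take = min(prob, tail_mass - acc)
        if take <= 0:
            break
        tail_value += take * value
        acc += take
    return tail_value / tail_mass
```

A risk-averse producer would then trade off `expected_profit` against `cvar_profit` when choosing market offers.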


This paper studies a risk measure inherited from ruin theory and investigates some of its properties. Specifically, we consider a value-at-risk (VaR)-type risk measure defined as the smallest initial capital needed to ensure that the ultimate ruin probability is less than a given level. This VaR-type risk measure turns out to be equivalent to the VaR of the maximal deficit of the ruin process in infinite time. A related Tail-VaR-type risk measure is also discussed.
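Using the equivalence stated above, the measure can be estimated as the (1 − level)-quantile of the simulated maximal deficit of a surplus process; this is a finite-horizon Monte Carlo sketch with exponential claims and all parameter names of my choosing, not the paper's model:

```python
import random

def ruin_capital(level, n_paths=5000, n_steps=200, premium=1.1, seed=0):
    """Smallest initial capital whose simulated ruin probability is <= level,
    computed as the (1 - level)-quantile of the maximal deficit of a
    discrete surplus process (finite-horizon approximation of infinite time)."""
    rng = random.Random(seed)
    deficits = []
    for _ in range(n_paths):
        surplus, worst = 0.0, 0.0
        for _ in range(n_steps):
            surplus += premium - rng.expovariate(1.0)  # premium income minus an exp(1) claim
            worst = min(worst, surplus)
        deficits.append(-worst)  # maximal deficit of this path (>= 0)
    deficits.sort()
    idx = min(int((1 - level) * n_paths), n_paths - 1)
    return deficits[idx]
```

Tightening the admissible ruin probability raises the required capital, exactly as a VaR at a higher confidence level would; averaging the tail beyond this quantile would give the related Tail-VaR-type measure.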