14 results for clinical risk management

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management, all of which require accurate volatility estimates. The task has become more challenging since the 1987 stock market crash, because implied volatilities recovered from stock index options display two patterns: the volatility smirk (skew) and the volatility term structure, which together form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, which consists of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework; using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, different models are estimated by nonlinear optimization, producing rich, smooth IVSs. The constant-volatility model fails to explain the variation in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strongly negative asymmetric return-volatility relationship at various quantiles of the IV distribution, which increases monotonically from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates the relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced for the smirk- (skew-) adjusted volatility index measure than for the old volatility index measure. The volatility indices rank as follows in terms of asymmetric volatility: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. The VDAX is found to subsume almost all the information required for daily VaR forecasts for a portfolio tracking the DAX30 index; the implied-volatility VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. Three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, calibration results for the string market model show that it can efficiently reproduce (or forecast) the volatility surface in each swaption market.
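The factor decomposition reported in the first essay can be illustrated with a small Python sketch. The data below are synthetic, and the interpretation of the components as level, term-structure, and jump-fear factors is the thesis's, not something this code derives; the sketch only shows how explained-variance shares of principal components are computed from a panel of IV changes.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 500, 8  # days, grid points on the (hypothetical) IV surface

# Synthetic daily IV changes: one dominant "level" factor plus noise.
level = rng.normal(0, 1.0, T)
loadings = np.linspace(0.8, 1.2, K)
panel = np.outer(level, loadings) + rng.normal(0, 0.3, (T, K))

# PCA via eigendecomposition of the sample covariance matrix.
cov = np.cov(panel, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
explained = eigvals / eigvals.sum()  # share of variance per component

print("share of variance, first 3 PCs:", explained[:3].round(3))
```

With real IV panels the first three shares would play the role of the 56%/15%/7% split quoted above.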

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, normative information on quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling regimes. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bull and bear markets: it is strongly positive in rising markets, whereas bear markets are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data: it is neither well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment, we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
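The realized volatility technique examined in essays two and three can be sketched as follows, with simulated one-minute returns standing in for real high-frequency data; the sketch sums squared intraday returns at two sampling frequencies to show how the estimator is built and how coarser sampling makes it noisier.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate one trading day of returns with constant true volatility.
n = 390                      # one-minute returns in a 6.5-hour session
true_daily_var = 0.02 ** 2   # 2% daily volatility
returns = rng.normal(0, np.sqrt(true_daily_var / n), n)

def realized_variance(r, every=1):
    """Sum of squared returns after aggregating to a coarser grid."""
    coarse = r.reshape(-1, every).sum(axis=1)  # every-minute returns
    return float((coarse ** 2).sum())

rv_fine = realized_variance(returns, every=1)      # 1-minute sampling
rv_coarse = realized_variance(returns, every=30)   # 30-minute sampling
print(rv_fine, rv_coarse)
```

Both estimates are unbiased here, but with fewer terms the coarse estimate varies far more around the true variance, which is one of the error sources studied in the third essay.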

Relevance:

90.00%

Publisher:

Abstract:

During the last few decades there has been far-reaching financial market deregulation, technical development, advances in information technology, and standardization of legislation across countries. As a result, one can expect financial markets to have grown more interlinked. A proper understanding of cross-market linkages has implications for investment and risk management, diversification, asset pricing, and regulation. The purpose of this research is to assess the degree of price, return, and volatility linkages both between geographic markets and between asset categories within one country, Finland. A further purpose is to analyze risk asymmetries, i.e., the tendency of equity risk to be higher after negative events than after positive events of equal magnitude. The analysis is conducted with respect both to total risk (volatility) and to systematic risk (beta). The thesis consists of an introductory part and four essays. The first essay studies to what extent international stock prices comove. The degree of comovement is low, indicating benefits from international diversification. The second essay examines the degree to which the Finnish market is linked to the "world market". Total risk is divided into two parts, one relating to world factors and one relating to domestic factors. The impact of world factors has increased over time. After 1993, when foreign investors were allowed to invest freely in Finnish assets, the risk level has been higher than before; this was also the case during the economic recession at the beginning of the 1990s. The third essay focuses on the stock, bond, and money markets in Finland. According to a trading model, the volatility linkages should be strong; however, the results contradict this. The linkages are surprisingly weak, even negative. The stock market is the most independent, while the money market is affected by events in the two other markets.
The fourth essay concentrates on volatility and beta asymmetries. Contrary to many international studies, there are only a few cases of risk asymmetry. When asymmetries occur, they tend to be driven by the market-wide component rather than the portfolio-specific element.
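The notion of risk asymmetry studied in the fourth essay can be illustrated with a simple simulation. The GJR-style variance recursion and the parameter values below are illustrative assumptions, not the thesis's model; the sketch merely shows what "volatility is higher after negative events" looks like in simulated data.

```python
import numpy as np

rng = np.random.default_rng(2)

# GJR-GARCH-style asymmetry: variance reacts more strongly to negative
# shocks (extra coefficient gamma) than to positive ones.
omega, alpha, gamma, beta = 1e-6, 0.05, 0.10, 0.85
n = 5000
r = np.zeros(n)
h = np.full(n, omega / (1 - alpha - gamma / 2 - beta))  # stationary variance
for t in range(1, n):
    neg = float(r[t - 1] < 0)
    h[t] = omega + (alpha + gamma * neg) * r[t - 1] ** 2 + beta * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.standard_normal()

# Average next-day variance following down days vs. up days.
after_neg = h[1:][r[:-1] < 0].mean()
after_pos = h[1:][r[:-1] > 0].mean()
print(after_neg > after_pos)
```

In this specification the asymmetry is built in through gamma; the essay's question is whether such asymmetry shows up empirically in Finnish volatilities and betas.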

Relevance:

80.00%

Publisher:

Abstract:

Fluid bed granulation is a key pharmaceutical process that improves many powder properties for tablet compression. The process comprises dry mixing, wetting, and drying phases. Granules of high quality can be obtained by understanding and controlling the critical process parameters through timely measurements. Process analytical technologies (PAT) include physical process measurements and particle size data from a fluid bed granulator, analysed in an integrated manner. Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to use PAT tools to increase process understanding of fluid bed granulation and drying. Inlet air humidity and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture in turn influences many process, granule, and tablet qualities. The approach in this thesis was to identify sources of variation mainly related to moisture, to determine correlations and relationships, and to apply the PAT and design space concepts to fluid bed granulation and drying. Monitoring material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator; good criteria for characterising material behaviour during the spraying and drying phases have been lacking, even though overall process performance and end-product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effects of inlet air humidity and granulation liquid feed on temperature measurements at different locations of the granulator system were determined. This revealed dynamic changes in the measurements and identified the best sites for process control.
The moisture originating from the granulation liquid and inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimize particle size. Various end-point indication techniques for drying were compared. The ΔT method, which is based on thermodynamic principles, eliminated the effects of humidity variations and gave the most precise estimate of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined: the feasibility of the ΔT method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter describing the behaviour of material in a fluid bed was developed; it is calculated from the process air flow rate and turbine fan speed, and it was compared with the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation, based on the fluidisation parameters, were determined. With this design space it is possible to avoid both excessive fluidisation and improper fluidisation leading to bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser: both rapid increases and decreases in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had formed. The various physical parameters evaluated in this thesis give valuable information on fluid bed process performance and increase process understanding.
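A hypothetical sketch of the ΔT idea: during drying, evaporative cooling keeps the product temperature below the inlet air temperature, and the gap collapses once the moisture is gone. The temperature series and the 2 °C cut-off below are invented for illustration and are not values from the thesis.

```python
# Invented measurements: constant inlet air at 60 °C, product warming up
# as drying proceeds (evaporative cooling fades away).
inlet_temp = [60.0] * 10
product_temp = [35, 37, 39, 42, 45, 49, 53, 57, 59, 59.5]

THRESHOLD = 2.0  # °C; assumed cut-off for "drying complete"

def drying_endpoint(inlet, product, threshold=THRESHOLD):
    """Return the index of the first sample where ΔT drops below threshold."""
    for i, (ti, tp) in enumerate(zip(inlet, product)):
        if ti - tp < threshold:
            return i
    return None  # end-point not reached within the measured window

print(drying_endpoint(inlet_temp, product_temp))
```

The thesis's point that the method's feasibility depends on fluidisation is visible here too: if poor fluidisation distorts the product temperature reading, the detected index shifts.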

Relevance:

80.00%

Publisher:

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws which, to a significant degree, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates the modelling of durations between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models.
According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison with another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization, the change from fractional to decimal pricing, was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for highly active stocks. The results are useful for risk management and market mechanism design.
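The standard ACD(1,1) model that the second essay generalizes can be sketched as a simulation; the parameter values below are illustrative, not estimates from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Standard ACD(1,1): psi_t = omega + alpha*x_{t-1} + beta*psi_{t-1},
# x_t = psi_t * eps_t, with unit-exponential innovations eps_t.
omega, alpha, beta = 0.1, 0.1, 0.8
n = 10_000
x = np.empty(n)    # durations between trades
psi = np.empty(n)  # conditional expected durations
psi[0] = omega / (1 - alpha - beta)  # unconditional mean duration
x[0] = psi[0] * rng.exponential(1.0)
for t in range(1, n):
    psi[t] = omega + alpha * x[t - 1] + beta * psi[t - 1]
    x[t] = psi[t] * rng.exponential(1.0)

print("mean duration:", x.mean())  # should hover near omega/(1-alpha-beta)
```

The generalizations in the essay replace, among other things, the exponential innovation distribution; swapping `rng.exponential` for another positive distribution is the natural entry point in a sketch like this.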

Relevance:

80.00%

Publisher:

Abstract:

Long QT syndrome (LQTS) is a congenital or acquired arrhythmic disorder that manifests as a prolonged QT interval on the electrocardiogram and as a tendency to develop ventricular arrhythmias that can lead to sudden death. Arrhythmias often occur during intense exercise and/or emotional stress. The two most common subtypes of LQTS are LQT1, caused by mutations in the KCNQ1 gene, and LQT2, caused by mutations in the KCNH2 gene. LQT1 and LQT2 patients exhibit arrhythmias in different situations: in LQT1 the trigger is usually vigorous exercise, whereas in LQT2 arrhythmia typically results from the patient being startled from rest. It is not clear why trigger factors and clinical outcome differ between the LQTS subtypes. Stress hormones such as catecholamines may have different effects depending on the exact nature of the genetic defect, or sensitivity to catecholamines may vary from subject to subject. Furthermore, subtle genetic variants of putative modifier genes, including those coding for ion channels and hormone receptors, may act as determinants of individual sensitivity to life-threatening arrhythmias. The present study was designed to identify some of these risk modifiers. LQT1 and LQT2 patients were found to show abnormal QT adaptation to both mental and physical stress. Furthermore, in epinephrine infusion experiments in which the heart was paced and action potentials were measured from the right ventricular septum, LQT1 patients showed repolarization abnormalities related to their propensity to develop arrhythmia during intense, prolonged sympathetic tone, such as exercise. In LQT2 patients, this repolarization abnormality was present already at rest, corresponding to their arrhythmic episodes triggered by intense, sudden surges in adrenergic tone, such as fright or rage.
A common KCNH2 polymorphism was found to affect KCNH2 channel function, as demonstrated both by in vitro experiments using mammalian cells transfected with the KCNH2 potassium channel and by QT dynamics in vivo. Finally, the present study identified a common β1-adrenergic receptor genotype related to a shorter QT interval in LQT1 patients. It was also discovered that compound homozygosity for two common β-adrenergic polymorphisms was related to the occurrence of symptoms in the LQT1 subtype. The studies demonstrate important genotype-phenotype differences between LQTS subtypes and suggest that common modifier gene polymorphisms may affect cardiac repolarization in LQTS. An important future task will be to study prospectively whether such gene polymorphisms can assist in the clinical risk profiling of LQTS patients.

Relevance:

80.00%

Publisher:

Abstract:

Background and Purpose: Subarachnoid hemorrhage (SAH) caused by rupture of a saccular cerebral artery aneurysm (SCAA) is fatal in 35-50% of cases and affects mainly the working-aged population. The incidence of SAH is 10-11 per 100 000 in Western countries and twice as high in Finland and Japan. The estimated prevalence of SCAAs is around 2%, and many of them never rupture. Currently, however, there are no diagnostic methods for distinguishing rupture-prone SCAAs from quiescent (dormant) ones. Finding diagnostic markers for rupture-prone SCAAs is of primary importance, since SCAA rupture has such a sinister outcome and all current treatment modalities are associated with morbidity and mortality. The therapies that prevent SCAA rupture also need to be made as minimally invasive as possible. Although the clinical risk factors for SCAA rupture have been extensively studied and documented in large patient series, the cellular and molecular mechanisms by which these risk factors lead to rupture of the SCAA wall remain incompletely known. Elucidation of the molecular and cellular pathobiology of the SCAA wall is needed in order to develop i) novel diagnostic tools that could identify rupture-prone SCAAs or patients at risk of SAH, and ii) novel biological therapies that prevent SCAA wall rupture. Materials and Methods: In this study, histological samples from unruptured and ruptured SCAAs and plasma samples from SCAA carriers were compared in order to identify structural changes, cell populations, growth factor receptors, or other molecular markers associated with SCAA wall rupture. In addition, experimental saccular aneurysm models and experimental models of mechanical vascular injury were used to study the cellular mechanisms of scar formation in the arterial wall and the adaptation of the arterial wall to increased mechanical stress.
Results and Interpretation: Inflammation and degeneration of the SCAA wall, namely loss of mural cells and degradation of the wall matrix, were found to be associated with rupture. Unruptured SCAA walls structurally resembled pads of myointimal hyperplasia, or so-called neointima, which characterizes early atherosclerotic lesions and is the repair and adaptation mechanism of the arterial wall after injury or increased mechanical stress. As in pads of myointimal hyperplasia elsewhere in the vasculature, oxidized LDL (OxLDL) was found in the SCAA walls. Immunity against OxLDL was demonstrated in SAH patients by the detection of circulating anti-OxLDL antibodies, which were significantly associated with the risk of rupture in patients with solitary SCAAs. Growth factor receptors associated with arterial wall remodeling and angiogenesis were more strongly expressed in ruptured SCAA walls. In experimental saccular aneurysm models, capillary growth, arterial wall remodeling, and neointima formation were found. The neointimal cells were shown to originate from the experimental aneurysm wall, with a minor contribution from the adjacent artery and a negligible contribution from bone marrow-derived cells. Since loss of mural cells characterizes ruptured human SCAAs and likely impairs the adaptation and repair capacity of rupture-prone walls, we also investigated the hypothesis that bone marrow-derived or circulating neointimal precursor cells could be used to enhance neointima formation and compensate for the impaired repair capacity in ruptured SCAA walls. However, no significant contribution of bone marrow cells or circulating mononuclear cells to neointima formation was found.

Relevance:

80.00%

Publisher:

Abstract:

Uveal melanoma is the most common primary intraocular malignancy in adults. Vision in the affected eye is threatened both by the tumor and by side effects of the currently available treatments. The prognosis for saving vision worsens with tumor size and, consequently, enucleation has been the treatment of choice for large uveal melanomas in most centers. However, increasing evidence suggests that no survival benefit is gained (or lost) by enucleation compared with eye-conserving methods. Since 1990, the Helsinki University Eye Hospital has offered episcleral iodine-125 plaque brachytherapy (IBT) to all patients unwilling to undergo enucleation for a large uveal melanoma. The primary aim of this study was to assess survival, local tumor recurrence, and preservation of the eye and vision after IBT in a population-based series of 97 patients with uveal melanomas classified as large by the Collaborative Ocular Melanoma Study (COMS) criteria. Further aims included reporting the incidence of side effects and assessing the role of intraocular dose distribution and clinical risk factors in their development. Finally, means of improving the current treatment were investigated by using computer models to compare existing plaques with collimating ones, and by comparing the outcomes of a subgroup of 54 IBT patients with very thick tumors with those of 33 patients with similarly sized tumors managed by transscleral local resection (TSR) in Liverpool, United Kingdom. Kaplan-Meier estimates of all-cause and melanoma-specific survival at 5 years after IBT were 62% and 65%, respectively, comparable with the survival experience reported by the COMS after enucleation. Local recurrence developed in 6% of eyes, and 84% of eyes were conserved at 5 years. Visual prognosis was guarded: 11% of patients avoided loss of 20/70 vision and 26% avoided loss of 20/400 vision in the tumor eye at 2 years.
Large tumor height and a short distance from the posterior pole were independently associated with loss of vision. Using cumulative incidence analysis to account for competing risks, such as enucleation and metastatic death, the 5-year incidence of cataract after IBT was 79%; glaucoma, 60%; optic neuropathy, 46%; maculopathy, 52%; persistent or recurring retinal detachment (RD), 25%; and vitreous hemorrhage, 36%. In multivariate competing-risks regression models, increasing tumor height was associated with cataract, iris neovascularization, and RD, while maculopathy and optic neuropathy were associated with the distance from the tumor to the respective structure. Median doses to the tumor apex, macula, and optic disc were 81 Gy (range, 40-158), 79 Gy (range, 12-632), and 83 Gy (range, 10-377), respectively. Dose to the optic disc was independently associated with optic neuropathy, and both the dose to the optic disc and the dose to the macula predicted vision loss after IBT. Simulated treatment with collimating plaques yielded clinically meaningful reductions in both optic disc (median reduction, 30 Gy) and macular (median reduction, 36 Gy) doses compared with the actual treatment with standard plaques. In the subgroup of patients with uveal melanomas classified as large because of tumor height, cumulative incidence analysis revealed that although long-term preservation of 20/70 vision was rare after both IBT and TSR, preservation of 20/400 vision was better after TSR (32% vs. 5% at 5 years). In multivariate logistic regression models, TSR was independently associated with better preservation of 20/400 vision (OR 0.03 at 2 years, P=0.005). No cases of secondary glaucoma were observed after TSR, and optic neuropathy was rare. However, local tumor recurrence was more common after TSR than after IBT (cumulative incidence 41% vs. 7% at 5 years). In terms of survival, IBT appears to be a safe alternative to enucleation in the management of large uveal melanomas.
Local tumor control is no worse than with medium-sized tumors, and the chances of avoiding secondary enucleation are good. Unfortunately, side effects of radiotherapy are frequent, especially with thick tumors, and the long-term prognosis for saving vision is consequently guarded. Some complications can be limited by using collimating plaques and by managing uveal melanomas that are large because of tumor height with TSR rather than IBT. However, the patient must be willing to accept a substantial risk of local tumor recurrence after TSR, which is therefore best suited to cases in which the preservation of vision in the tumor eye is critical.
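The Kaplan-Meier product-limit estimator behind the survival figures above can be sketched on toy follow-up data (times in years, event=1 for death, 0 for censoring; not the study's data, and assuming no tied event times for simplicity):

```python
# Made-up follow-up records: (time, event); event=0 means censored.
data = [(1.0, 1), (2.0, 0), (2.5, 1), (3.0, 1), (4.0, 0), (5.0, 1), (6.0, 0)]

def kaplan_meier(records):
    """Return (time, survival) steps of the product-limit estimator."""
    s, at_risk = 1.0, len(records)
    steps = []
    for t, event in sorted(records):
        if event:                    # death observed at time t
            s *= 1 - 1 / at_risk     # multiply by conditional survival
            steps.append((t, s))
        at_risk -= 1                 # subject leaves the risk set either way
    return steps

for t, s in kaplan_meier(data):
    print(f"t={t}: S={s:.3f}")
```

Censored subjects shrink the risk set without dropping the curve, which is exactly how the study can report 5-year survival despite incomplete follow-up.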

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this research is to construct a clear model of an anticipatory communicative decision-making process and a working implementation of a Bayesian application that can serve as an anticipatory communicative decision support system. The study is a decision-oriented, constructive research project and includes examples of simulated situations. As a basis for the methodological discussion of different approaches to management research, a decision-oriented approach is used here; it is based on mathematics and logic and aims to develop problem-solving methods. The approach is theoretical and characteristic of normative management science research. The approach is also constructive: an essential part of the constructive approach is to tie the problem to its solution with theoretical knowledge. First, the basic definitions and behaviours of anticipatory management and managerial communication are provided, including a discussion of the research environment and the management processes formed; these issues define and explain the background to the further research. Second, managerial communication and anticipatory decision-making are treated in terms of preparation, problem solving, and solution search, which are also related to risk management analysis. A decision support application is then constructed using four different Bayesian methods: the Bayesian network, the influence diagram, the qualitative probabilistic network, and the time-critical dynamic network. The purpose of the discussion is not to compare different theories but to explain the theories being implemented. Finally, an application of Bayesian networks to the research problem is presented, and the usefulness of the model in examining the problem, together with the results obtained, is shown.
The theoretical contribution includes definitions and a model of anticipatory decision-making. The main theoretical contribution of this study is a process for anticipatory decision-making that combines management with communication, problem solving, and the improvement of knowledge. The practical contribution is a Bayesian decision support model based on Bayesian influence diagrams. The main contributions of this research are thus two processes: one for anticipatory decision-making, and one for producing a Bayesian network model for anticipatory decision-making. In summary, this research contributes to decision support by offering one of the few publicly available academic descriptions of an anticipatory decision support system, by presenting a Bayesian model grounded in firm theoretical discussion, by publishing algorithms suitable for decision support, and by defining the idea of anticipatory decision-making for a parallel version. Finally, based on the results, an analysis of anticipatory management for planned decision-making is presented, built on observation of the environment, analysis of weak signals, and alternatives for creative problem solving and communication.
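A minimal sketch of the kind of Bayesian decision support described above, assuming a hypothetical two-node structure (a weak signal W updating the probability of a risk event R) with invented probabilities and utilities; this is not the thesis's model, only an illustration of the influence-diagram mechanics.

```python
# Hypothetical numbers: prior risk, and how likely the weak signal is
# under each state of the world.
P_R = 0.10                  # prior probability of the risk event
P_W_GIVEN_R = 0.80          # signal is likely if the risk is brewing
P_W_GIVEN_NOT_R = 0.20      # false-alarm rate

def posterior_risk(signal_seen: bool) -> float:
    """Bayes' rule for P(R | W)."""
    like_r = P_W_GIVEN_R if signal_seen else 1 - P_W_GIVEN_R
    like_n = P_W_GIVEN_NOT_R if signal_seen else 1 - P_W_GIVEN_NOT_R
    num = like_r * P_R
    return num / (num + like_n * (1 - P_R))

# Utilities U(action, risk occurs?); "mitigate" has a fixed up-front cost.
UTILITY = {("mitigate", True): -10, ("mitigate", False): -10,
           ("ignore", True): -100, ("ignore", False): 0}

def best_action(signal_seen: bool) -> str:
    """Pick the action with the highest expected utility."""
    p = posterior_risk(signal_seen)
    eu = {a: p * UTILITY[(a, True)] + (1 - p) * UTILITY[(a, False)]
          for a in ("mitigate", "ignore")}
    return max(eu, key=eu.get)

print(best_action(True), best_action(False))
```

Observing the weak signal flips the recommendation from "ignore" to "mitigate", which is the anticipatory element: acting on evidence before the risk event itself materializes.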

Relevance:

80.00%

Publisher:

Abstract:

Follicular lymphoma (FL) is the second most common non-Hodgkin lymphoma. It is an indolent and clinically heterogeneous disease that is generally considered incurable. Immunochemotherapy, the combination of rituximab, a monoclonal anti-CD20 antibody, with chemotherapy, has significantly improved the outcome of FL patients and is currently used as standard first-line therapy. Thus far, however, patients have been selected for treatment on the basis of clinical risk factors and indices developed before the rituximab era. There is therefore a growing need to understand the molecular mechanisms underlying the disease, which would not only help predict survival in the rituximab era but also enable the design of more targeted therapeutic strategies. In this study, our aim was to identify genes predicting outcome in FL patients treated with immunochemotherapy. We performed a cDNA microarray analysis on 24 FL patients. When gene expression differences in diagnostic tumour samples were related to clinical outcome, we identified novel genes with prognostic impact on survival. The expression of selected genes was further characterized with quantitative PCR and immunohistochemistry (IHC). Interestingly, the prognostic influence of these genes was often associated with their expression in non-malignant cells rather than in tumour cells. Based on the observed gene expression patterns, we analyzed the abundance and prognostic value of non-malignant immune cells in 95-98 FL patients treated with immunochemotherapy. A high content of tumour-associated macrophages was a marker of favourable prognosis. In contrast, the accumulation of mast cells correlated with poor outcome and was further associated with tumour vascularity; increased microvessel density also correlated with an inferior outcome.
In addition, we used the same microarray data with a systems biology approach to identify signalling pathways or groups of genes capable of separating patients with favourable and adverse outcomes. Among the transcripts were many genes associated with the signal transducer and activator of transcription 5a (STAT5a) pathway. In IHC validation, STAT5a expression was observed mostly in T cells and follicular dendritic cells, and its expression predicted a favourable outcome. In cell cultures, rituximab induced the expression of STAT5a-associated interleukins in human lymphoma cell lines, which might provide a link for the cross-talk between rituximab-treated FL cells and their microenvironment. In conclusion, we have demonstrated that the microenvironment has a prognostic role in FL patients treated with immunochemotherapy. The results also underline the importance of re-evaluating prognostic markers in the rituximab era of lymphoma therapies.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps. First, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. The results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
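The two-step calculation can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the portfolio is reduced to a single European call revalued with Black-Scholes over a one-day horizon, the scenarios are drawn from a lognormal (Gaussian-return) model rather than the hyperbolic distribution the paper advocates, and all parameter values are hypothetical.

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

random.seed(42)
S0, K, T, r, sigma = 100.0, 100.0, 0.25, 0.02, 0.20  # hypothetical parameters
v0 = bs_call(S0, K, T, r, sigma)
horizon = 1.0 / 252.0  # one trading day

# Step 1: revalue the option portfolio under simulated market scenarios.
pnl = []
for _ in range(10000):
    z = random.gauss(0.0, 1.0)
    S1 = S0 * math.exp((r - 0.5 * sigma ** 2) * horizon
                       + sigma * math.sqrt(horizon) * z)
    pnl.append(bs_call(S1, K, T - horizon, r, sigma) - v0)

# Step 2: summarize the scenario distribution into a single-sum risk figure,
# here the 95% Value-at-Risk (the 5% loss quantile).
pnl.sort()
var_95 = -pnl[int(0.05 * len(pnl))]
```

Substituting the hyperbolic distribution for the Gaussian changes only the scenario-generation line; the revaluation and quantile steps are identical.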

Relevância:

80.00%

Publicador:

Resumo:

The objective of this paper is to suggest a method that accounts for the impact of volatility smile dynamics when performing scenario analysis for a portfolio of vanilla options. As the volatility smile is documented to change at least with the level of implied at-the-money volatility, a suitable model of this dependence is included in the calculation of the simulated market scenarios. By constructing simple portfolios of index options and comparing the ex ante risk exposure measured with different pricing methods to realized market values ex post, the improvement from incorporating the model is monitored. The examples analyzed in the study generate results that statistically support that the most accurate scenarios are those calculated with the model that accounts for the dynamics of the smile. Thus, we show that the differences emanating from the volatility smile are material and should be accounted for, and that the methodology presented herein is one suitable alternative for doing so.
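The effect being measured can be illustrated with a toy comparison. This sketch is not the paper's model: it assumes a simple linear-in-log-moneyness smile whose level follows at-the-money volatility, with hypothetical parameters, and compares repricing a small put portfolio with and without letting the smile respond to an ATM volatility scenario.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_put(S, K, T, r, sigma):
    """Black-Scholes value of a European put."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

def smile_vol(atm_vol, S, K, skew=-0.3):
    # Illustrative smile: implied vol is linear in log-moneyness and its
    # level moves one-for-one with at-the-money volatility.
    return atm_vol + skew * math.log(K / S)

S0, T, r = 100.0, 0.25, 0.02
strikes = [90.0, 100.0, 110.0]
atm_now, atm_scenario = 0.20, 0.30  # scenario: ATM vol rises 10 vol points

# (a) static repricing: each strike keeps its current implied volatility,
#     so the vol scenario is ignored entirely
pv_static = sum(bs_put(S0, K, T, r, smile_vol(atm_now, S0, K))
                for K in strikes)
# (b) smile-dynamics repricing: the whole smile is re-evaluated at the
#     scenario's new ATM level before revaluing the portfolio
pv_dynamic = sum(bs_put(S0, K, T, r, smile_vol(atm_scenario, S0, K))
                 for K in strikes)
```

The gap between `pv_dynamic` and `pv_static` is the scenario P&L that a pricing method without a smile model fails to capture.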

Relevância:

80.00%

Publicador:

Resumo:

Africa is threatened by climate change. The adaptive capacity of local communities continues to be weakened by ineffective and inefficient livelihood strategies and inappropriate development interventions. One of the greatest challenges for climate change adaptation in Africa relates to the governance of natural resources used by vulnerable poor groups as assets for adaptation. Practical and good governance activities for adaptation in Africa are urgently needed to support adaptation actions, interventions and planning. The adaptation role of forests has not been as prominent in international discourse and action as their mitigation role. This study therefore focused on the forest as one of the natural resources used for adaptation. The general objective of this research was to assess the extent to which cases of current forest governance practices in four African countries (Burkina Faso, the Democratic Republic of Congo (DRC), Ghana and Sudan) are supportive of the adaptation of vulnerable societies and ecosystems to the impacts of climate change. Qualitative and quantitative analyses of surveys, expert consultations and group discussions were used in analysing the case studies. The entire research was guided by three conceptual sets of thinking: forest governance, climate change vulnerability and ecosystem services. Data for the research were collected from selected ongoing forestry activities and programmes. The study mainly dealt with forest management policies and practices that can improve the adaptation of forest ecosystems (Study I) and the adaptive capacity through the management of forest resources by vulnerable farmers (Studies II, III, IV and V). It was found that adaptation is not part of current forest policies; instead, policies contain elements of risk management practices, which are also relevant to the adaptation of forest ecosystems.
These practices include, among others, the management of forest fires, forest genetic resources and non-timber resources, as well as silvicultural practices. Better livelihood opportunities emerged as the priority for the farmers. These vulnerable farmers had different forms of forest management and a wide range of experience and practical knowledge relevant to achieving livelihood improvement alongside sustainable management and good governance of natural resources. The contributions of traded non-timber forest products to climate change adaptation appear limited for local communities, based on their distribution among the stakeholders in the market chain. Plantation (agro)forestry, if well implemented and managed by communities, has a high potential to reduce socio-ecological vulnerability by increasing food production and restocking degraded forest lands. The integration of legal arrangements with continuous monitoring, evaluation and improvement may drive this activity to support short-, medium- and long-term expectations related to adaptation processes. The study concludes that effective forest governance initiatives led by vulnerable poor groups represent one practical way to improve the adaptive capacities of socio-ecological systems against the impacts of climate change in Africa.