837 results for Risk measures
Abstract:
We describe several simulation algorithms that yield random probability distributions with given values of risk measures. In the case of vanilla risk measures, the algorithms involve combining and transforming random cumulative distribution functions or random Lorenz curves obtained by simulating rather general random probability distributions on the unit interval. A new algorithm based on the simulation of an array of weighted barycentres is suggested to generate random probability distributions with a given value of the spectral risk measure.
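As a rough illustration of the vanilla case, the sketch below (not the paper's algorithm) draws a random discrete distribution on the unit interval via Dirichlet weights and then shifts its support so that the α-quantile, taken here as VaR_α, hits a prescribed target value; the function name and the particular choice of random CDF are illustrative assumptions.

```python
import numpy as np

def random_distribution_with_var(target_var, alpha=0.95, n_atoms=50, rng=None):
    """Illustrative sketch: simulate a random discrete distribution whose
    alpha-quantile (taken here as VaR_alpha) equals target_var.

    A random CDF on [0, 1] is drawn (random atoms with Dirichlet weights),
    its alpha-quantile is located, and the support is shifted so that the
    quantile lands on the target."""
    rng = np.random.default_rng(rng)
    support = np.sort(rng.uniform(0.0, 1.0, size=n_atoms))  # random atoms in [0, 1]
    weights = rng.dirichlet(np.ones(n_atoms))                # random probability weights
    cdf = np.cumsum(weights)
    q = support[np.searchsorted(cdf, alpha)]                 # generalized inverse at alpha
    return support + (target_var - q), weights               # shifted atoms, same weights

atoms, probs = random_distribution_with_var(target_var=2.5, alpha=0.95, rng=1)
cdf = np.cumsum(probs)
print("VaR_0.95 =", atoms[np.searchsorted(cdf, 0.95)])       # ~2.5 up to rounding
```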
Abstract:
Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
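For concreteness, the sketch below shows only the basic (institutional-style) conditional value-at-risk model in its standard Rockafellar-Uryasev linear-programming form, applied to hypothetical scenario returns; the small-investor extensions described in the abstract (piecewise-constant transaction costs, integral transaction units, dividends) would require a mixed-integer solver and are omitted.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical scenario returns: N equally likely scenarios for n assets.
rng = np.random.default_rng(0)
N, n = 500, 4
R = rng.normal(loc=0.001, scale=0.02, size=(N, n))

alpha = 0.95  # CVaR confidence level

# Decision vector x = [w (n weights), t (VaR auxiliary), u (N excess losses)].
# Minimize CVaR_alpha of the portfolio loss -R @ w:
#   min  t + 1/((1-alpha) N) * sum(u)
#   s.t. u_i >= -R_i . w - t,  u_i >= 0,  sum(w) = 1,  0 <= w <= 1
c = np.concatenate([np.zeros(n), [1.0], np.full(N, 1.0 / ((1 - alpha) * N))])

# Scenario constraints: -R_i . w - t - u_i <= 0 for every scenario i.
A_ub = np.hstack([-R, -np.ones((N, 1)), -np.eye(N)])
b_ub = np.zeros(N)

# Budget constraint: weights sum to one.
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(N)]).reshape(1, -1)
b_eq = np.array([1.0])

bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * N
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
w = res.x[:n]
print("weights:", np.round(w, 3), " minimal CVaR:", round(res.fun, 5))
```

In the extended small-investor models of the paper, the continuous weights would be replaced by integer numbers of transaction units, with additional binary variables activating the piecewise-constant transaction-cost bands.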
Abstract:
The risk of a financial position is usually summarized by a risk measure. As this risk measure has to be estimated from historical data, it is important to be able to verify and compare competing estimation procedures. In statistical decision theory, risk measures for which such verification and comparison are possible are called elicitable. It is known that quantile-based risk measures such as value at risk are elicitable. In this paper, the existing result on the non-elicitability of expected shortfall is extended to all law-invariant spectral risk measures, unless they reduce to minus the expected value. Hence, it is unclear how to perform forecast verification or comparison for these measures. However, the class of elicitable law-invariant coherent risk measures does not reduce to minus the expected value: we show that it consists of certain expectiles.
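Elicitability is made operational through strictly consistent scoring functions. The sketch below illustrates the two families mentioned in the abstract with standard choices: the pinball score for the α-quantile (value at risk) and the asymmetric squared score for the τ-expectile, applied to hypothetical data to rank two competing constant forecasts.

```python
import numpy as np

def pinball_score(forecast, y, alpha):
    """Strictly consistent scoring function for the alpha-quantile (VaR_alpha)."""
    return ((y <= forecast).astype(float) - alpha) * (forecast - y)

def expectile_score(forecast, y, tau):
    """Strictly consistent scoring function for the tau-expectile."""
    w = np.where(y <= forecast, 1.0 - tau, tau)
    return w * (y - forecast) ** 2

# Hypothetical realized outcomes and two competing constant VaR_0.95 forecasts.
rng = np.random.default_rng(42)
y = rng.standard_t(df=5, size=1000)
forecast_a = np.full_like(y, np.quantile(y, 0.95))   # in-sample quantile forecast
forecast_b = np.full_like(y, 1.0)                     # naive constant forecast

for name, f in [("A", forecast_a), ("B", forecast_b)]:
    print(name, "mean pinball score:", pinball_score(f, y, 0.95).mean().round(4))
# The forecast with the lower mean realized score is preferred.
```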
Abstract:
A statistical functional, such as the mean or the median, is called elicitable if there is a scoring function or loss function such that the correct forecast of the functional is the unique minimizer of the expected score. Such scoring functions are called strictly consistent for the functional. The elicitability of a functional opens the possibility to compare competing forecasts and to rank them in terms of their realized scores. In this paper, we explore the notion of elicitability for multi-dimensional functionals and give both necessary and sufficient conditions for strictly consistent scoring functions. We cover the case of functionals with elicitable components, but we also show that one-dimensional functionals that are not elicitable can be a component of a higher order elicitable functional. In the case of the variance, this is a known result. However, an important result of this paper is that spectral risk measures with a spectral measure with finite support are jointly elicitable if one adds the “correct” quantiles. A direct consequence of applied interest is that the pair (Value at Risk, Expected Shortfall) is jointly elicitable under mild conditions that are usually fulfilled in risk management applications.
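The variance example mentioned above can be made concrete via the revelation principle: (mean, variance) is a bijection of the componentwise-elicitable pair (E[Y], E[Y²]), so summing squared-error scores for those expectations and re-parametrizing yields a strictly consistent joint score. The sketch below demonstrates this known case only; the (Value at Risk, Expected Shortfall) construction of the paper uses a different, more involved scoring family that is not reproduced here.

```python
import numpy as np

def joint_score(mean_fc, var_fc, y):
    """Strictly consistent joint score for the pair (mean, variance), built by
    scoring the elicitable pair (E[Y], E[Y^2]) with squared errors and
    re-parametrizing via E[Y^2] = variance + mean^2."""
    second_moment_fc = var_fc + mean_fc ** 2
    return (mean_fc - y) ** 2 + (second_moment_fc - y ** 2) ** 2

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=100_000)

# The true pair (mean, variance) = (1, 4) attains the lowest expected score.
for m, v in [(1.0, 4.0), (1.0, 3.0), (0.5, 4.0)]:
    print((m, v), "mean realized score:", joint_score(m, v, y).mean().round(3))
```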
Abstract:
A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment. The methodology performs the investment appraisal within a probabilistic framework. The probabilistic production simulation (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while a Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model is able to capture the price, and hence profitability, uncertainties for generator companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS system, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments applying either conventional or directly connected generator (powerformer) technologies. The significance of the results is assessed using several financial risk measures.
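A minimal sketch of the valuation step described above, with purely illustrative numbers rather than the paper's IEEE RTS data: Monte Carlo price paths are converted into an NPV distribution for a candidate plant, from which financial risk measures such as VaR and CVaR are read off.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative inputs only (not the paper's case-study data).
years, n_sims = 15, 20_000
energy_mwh = 800_000      # expected annual energy, e.g. from a PPC-type calculation
fixed_cost = 10e6         # annual fixed and fuel costs, $
capex = 200e6             # investment outlay, $
discount = 0.08

# Monte Carlo electricity prices ($/MWh), lognormal around 45 with yearly variation.
prices = rng.lognormal(mean=np.log(45), sigma=0.25, size=(n_sims, years))

profits = prices * energy_mwh - fixed_cost            # annual operating profit per path
disc = (1 + discount) ** -np.arange(1, years + 1)
npv = profits @ disc - capex                          # NPV for each simulated path

alpha = 0.95
var = -np.quantile(npv, 1 - alpha)                    # loss not exceeded with prob. alpha
cvar = -npv[npv <= -var].mean()                       # mean loss beyond the VaR level
print(f"E[NPV] = {npv.mean():,.0f}  VaR_95 = {var:,.0f}  CVaR_95 = {cvar:,.0f}")
```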
Abstract:
2000 Mathematics Subject Classification: Primary 90C31. Secondary 62C12, 62P05, 93C41.
Abstract:
This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas, with a large number of tradable permits on hand, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in the returns a few months before the market collapse. Three possible regulatory and market-based forces are identified as probable causes of market volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avert emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and the EU ETS (European Union Emission Trading Scheme) Phase I and II markets by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to estimate the mean price exceedance over the threshold levels and the associated risk of loss. The calculated risk measures suggest that CCX and EU ETS Phase I are extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
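As a rough illustration of the tail analysis mentioned in the third essay, the sketch below performs a static peaks-over-threshold fit on hypothetical returns (not the essay's dynamic EVT model): excesses over a high threshold are fitted with a generalized Pareto distribution, from which the mean exceedance and tail VaR/ES are obtained.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Hypothetical daily carbon-price log-returns; losses taken as positive numbers.
losses = -rng.standard_t(df=4, size=2500) * 0.02

u = np.quantile(losses, 0.95)              # threshold: 95th percentile of losses
excesses = losses[losses > u] - u
print("threshold:", round(u, 4), " mean excess:", round(excesses.mean(), 4))

# Peaks-over-threshold: fit a generalized Pareto distribution to the excesses.
xi, loc, beta = genpareto.fit(excesses, floc=0.0)

# Tail VaR and ES at level alpha from the fitted GPD (standard POT formulas,
# assuming the fitted shape xi is nonzero and below one).
alpha, n, n_u = 0.99, losses.size, excesses.size
var = u + beta / xi * (((1 - alpha) * n / n_u) ** (-xi) - 1)
es = (var + beta - xi * u) / (1 - xi)
print(f"VaR_{alpha} = {var:.4f}   ES_{alpha} = {es:.4f}")
```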
Abstract:
This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In Chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In Chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model that allows for a time-varying intercept and is implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
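The realized measures these models build on start from simple quantities such as daily realized variance and realized correlation computed from intraday returns. The sketch below computes both from hypothetical 5-minute returns, with none of the noise corrections (pre-averaging, realized kernels) used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(11)
n_days, n_intraday = 20, 78        # 78 five-minute returns per 6.5-hour session

# Hypothetical 5-minute log-returns for two correlated assets.
cov = np.array([[1.0, 0.6], [0.6, 1.0]]) * (1e-3) ** 2
r = rng.multivariate_normal([0.0, 0.0], cov, size=(n_days, n_intraday))  # (day, bar, asset)

# Daily realized variance: sum of squared intraday returns, per asset.
rv = (r ** 2).sum(axis=1)                        # shape (n_days, 2)

# Daily realized covariance and correlation between the two assets.
rcov = (r[..., 0] * r[..., 1]).sum(axis=1)       # shape (n_days,)
rcorr = rcov / np.sqrt(rv[:, 0] * rv[:, 1])

print("mean annualized realized volatility:", np.sqrt(252 * rv.mean(axis=0)).round(3))
print("mean realized correlation:", rcorr.mean().round(3))
```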
Abstract:
ABSTRACT INTRODUCTION: In recent years, psychosocial risks have gained importance because they are closely related to the development of stress-induced illnesses both in the general population and among workers. Psychosocial risks therefore become a focus of interest where workplace prevention is concerned. OBJECTIVE: To identify the psychosocial risk factors to which the workers of an agricultural and food company located in Chile were exposed in 2016. METHODS: A cross-sectional study was carried out using secondary data from 194 workers. Socio-demographic, occupational and psychosocial-risk variables were included. The instrument used was the ISTAS 21 questionnaire, which allows the evaluation and prevention of this risk and measures the dimensions of psychosocial exposure and of health, stress and satisfaction. In the descriptive analysis, measures of central tendency (mean, median) and measures of dispersion such as range and standard deviation were calculated for the quantitative variables. For the qualitative variables, absolute frequencies and percentages were calculated; the statistical software SPSS version 22 was used. RESULTS: The minimum age was 25 years and the maximum 55 years. The predominant age groups were 25-30 years and 36-45 years. Most members of the organization were men. The high level of psychosocial risk perceived by the workers, by percentage, in three of the five dimensions was as follows: active work and development possibilities (48.5%); social support and quality of leadership (52.0%); and double presence (52.6%). The subdimensions most affected by high risk were: influence at work (69.4%); control over working time (53.1%); development possibilities at work (50.0%); role clarity (66.8%); quality of the relationship with superiors (52.6%) and with co-workers (48.8%); and domestic workload (50.0%). CONCLUSIONS: The level of psychosocial risk perceived by the workers was high in three of the five dimensions of the SUSESO ISTAS-21 questionnaire (active work and development possibilities; social support and quality of leadership; and double presence). Based on these findings, it is concluded that there is a substantial presence of situations unfavourable to workers' mental health and a high psychosocial risk, which must be addressed through strategies aimed at reducing risk in the subdimensions that showed the highest risk.
Abstract:
The QUT Outdoor Worker Sun Protection (OWSP) project was a comprehensive applied health promotion project designed to demonstrate the effectiveness of sun protection measures that influence high-risk outdoor workers in Queensland to adopt sun-safe behaviours. The three-year project (2010-2013) was driven by two key concepts: 1) the hierarchy of control, which is used to address risks in the workplace and advocates six control measures to be considered in order of priority (refer to Section 3.4.2); and 2) the Ottawa Charter, which recommends five action areas for achieving health promotion (refer to Section 2.1). The project framework was underpinned by a participatory action research approach that valued people's input, took advantage of existing skills and resources, and stimulated innovation (refer to Section 4.2). Fourteen workplaces (small and large) with a majority outdoor workforce were recruited across regional Queensland (Darling Downs, Northwest, Mackay and Cairns) from four industry types: 1) building and construction, 2) rural and farming, 3) local government, and 4) public sector. A workplace champion was identified at each workplace and was supported (through resource provision, regular contact and site visits) over a 14- to 18-month intervention period to make sun safety a priority in their workplace. Employees and employers were independently assessed for pre- and post-intervention sun protection behaviours. As part of the intervention, an individualised sun safety action plan was developed in conjunction with each workplace to guide changes across six key strategy areas: 1) Policy (e.g., adopt sun safety practices during all company events); 2) Structural and environmental (e.g., shade on worksites; eliminate or minimise reflective surfaces); 3) Personal protective equipment (PPE) (e.g., trial different types of sunscreens or wide-brimmed hats); 4) Education and awareness (e.g., include sun safety in inductions and toolbox talks; send reminder emails or text messages to workers); 5) Role modelling (e.g., by managers, supervisors, workplace champions and mentors); and 6) Skin examinations (e.g., allow time off work for skin checks). The participatory action process revealed that there was no "one size fits all" approach to sun safety in the workplace; a comprehensive, tailored approach was fundamental. This included providing workplaces with information, resources, skills, know-how, incentives and practical help. For example, workplaces engaged in farming complete different seasonal tasks across the year and needed to prepare for optimal sun safety of their workers during less labour-intensive times. In some construction workplaces, long pants were considered a trip hazard and could not be used as part of a PPE strategy. Culture change was difficult to achieve, and workplace champions needed guidance on the steps to facilitate it (e.g., influencing leaders through peer support, mentoring and role modelling). With the assistance of the project team, the majority of workplaces were able to successfully implement the sun safety strategies contained within their action plans, upskilling them in the evidence for sun safety, how to overcome barriers, how to negotiate with all relevant parties, and how to assess success.
The most important enablers of a successful action plan were a pro-active workplace champion, strong employee engagement, supportive management, the use of highly visual educational resources, and external support (provided by the project team through regular contact, either directly through phone calls or indirectly through emails and e-newsletters). Identified barriers included a lack of time, the multiple roles of workplace champions (especially in smaller workplaces), competing issues leading to a lack of priority for sun safety, the culture of outdoor workers, and cost or budgeting constraints. The level of sun safety awareness, knowledge and sun-protective behaviours reported by the workers increased between pre- and post-intervention. Of the nine sun-protective behaviours assessed, the largest reported changes included a 26% increase in workers who "usually or always" wore a broad-brimmed hat, a 20% increase in the use of natural shade, a 19% increase in workers wearing long-sleeved collared shirts, and a 16% increase in workers wearing long trousers.
Abstract:
In this work we clarify the relationships between riskiness, risk acceptance and bankruptcy avoidance. We distinguish between the restriction on the current wealth required to make a gamble acceptable to the decision maker and the restriction on the current wealth required to guarantee no bankruptcy if a gamble is accepted. We focus on the measure of riskiness proposed by Foster and Hart.
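For reference, the Foster-Hart riskiness R(g) of a gamble g with positive expectation and possible losses is the unique reserve level (above the maximal loss) solving E[log(1 + g/R)] = 0. The sketch below computes it for a discrete gamble by root-finding; the function name is illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def foster_hart_riskiness(outcomes, probs):
    """Foster-Hart riskiness R(g): the unique level R above the maximal loss
    solving E[log(1 + g/R)] = 0, for a gamble g with E[g] > 0 and P(g < 0) > 0
    (the conditions assumed by Foster and Hart)."""
    g = np.asarray(outcomes, dtype=float)
    p = np.asarray(probs, dtype=float)
    assert np.dot(p, g) > 0 and g.min() < 0, "need positive mean and possible loss"

    max_loss = -g.min()
    phi = lambda R: np.dot(p, np.log1p(g / R))   # E[log(1 + g/R)]

    lo = max_loss * (1 + 1e-9)                   # phi -> -inf as R approaches the max loss
    hi = max_loss * 2
    while phi(hi) <= 0:                          # phi -> 0 from above as R grows, so this ends
        hi *= 2
    return brentq(phi, lo, hi)

# Example gamble: win 120 or lose 100 with equal probability; riskiness is 600.
print(foster_hart_riskiness([120, -100], [0.5, 0.5]))
```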
Treatment intensification and risk factor control: toward more clinically relevant quality measures.
Abstract:
BACKGROUND: Intensification of pharmacotherapy in persons with poorly controlled chronic conditions has been proposed as a clinically meaningful process measure of quality. OBJECTIVE: To validate measures of treatment intensification by evaluating their associations with subsequent control of hypertension, hyperlipidemia, and diabetes mellitus across 35 medical facility populations in Kaiser Permanente, Northern California. DESIGN: Hierarchical analyses of associations of improvements in facility-level treatment intensification rates from 2001 to 2003 with patient-level risk factor levels at the end of 2003. PATIENTS: Members (515,072 and 626,130; age >20 years) with hypertension, hyperlipidemia, and/or diabetes mellitus in 2001 and 2003, respectively. MEASUREMENTS: Treatment intensification for each risk factor was defined as an increase in the number of drug classes prescribed, an increase in the dosage of at least 1 drug, or a switch to a drug from another class within 3 months of observed poor risk factor control. RESULTS: Facility-level improvements in treatment intensification rates between 2001 and 2003 were strongly associated with a greater likelihood of being in control at the end of 2003 (P ≤ 0.05 for each risk factor) after adjustment for patient- and facility-level covariates. Compared with facility rankings based solely on control, adding the percentages of poorly controlled patients who received treatment intensification changed 2003 rankings substantially: 14%, 51%, and 29% of the facilities changed ranks by 5 or more positions for hypertension, hyperlipidemia, and diabetes, respectively. CONCLUSIONS: Treatment intensification is tightly linked to improved control. Thus, it deserves consideration as a process measure for motivating quality improvement and possibly for measuring clinical performance.