862 results for Value at Risk (VaR)


Relevance: 80.00%

Abstract:

A statistical functional, such as the mean or the median, is called elicitable if there is a scoring function or loss function such that the correct forecast of the functional is the unique minimizer of the expected score. Such scoring functions are called strictly consistent for the functional. The elicitability of a functional opens the possibility to compare competing forecasts and to rank them in terms of their realized scores. In this paper, we explore the notion of elicitability for multi-dimensional functionals and give both necessary and sufficient conditions for strictly consistent scoring functions. We cover the case of functionals with elicitable components, but we also show that one-dimensional functionals that are not elicitable can be a component of a higher-order elicitable functional. In the case of the variance, this is a known result. However, an important result of this paper is that spectral risk measures whose spectral measure has finite support are jointly elicitable if one adds the “correct” quantiles. A direct consequence of applied interest is that the pair (Value at Risk, Expected Shortfall) is jointly elicitable under mild conditions that are usually fulfilled in risk management applications.
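
To make the final claim concrete, the following is a minimal numerical sketch (not taken from the paper) using one strictly consistent joint scoring function for the pair (Value at Risk, Expected Shortfall), the "FZ0" form of the Fissler-Ziegel family used in the follow-up forecasting literature; the Gaussian data, sample size and search grids are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Convention: VaR is the lower alpha-quantile of returns and ES is the mean
# return beyond it, so both are negative here; the FZ0 score below is a
# strictly consistent joint score in that regime (an assumption of this
# sketch, following the related forecasting literature).
rng = np.random.default_rng(42)
alpha = 0.05
y = rng.standard_normal(200_000)                 # i.i.d. N(0, 1) "returns"

def fz0_score(v, e, y, alpha):
    """Average FZ0 joint score of the forecast pair (v, e); requires e < 0."""
    hit = (y <= v).astype(float)
    return (-hit * (v - y) / (alpha * e) + v / e + np.log(-e) - 1.0).mean()

v_true = norm.ppf(alpha)                         # true VaR_5%  ~ -1.645
e_true = -norm.pdf(norm.ppf(alpha)) / alpha      # true ES_5%   ~ -2.063

# Grid search; the empirical minimizer should land next to the true pair.
grid_v = np.linspace(-2.2, -1.1, 23)
grid_e = np.linspace(-2.8, -1.4, 29)
scores = [(fz0_score(v, e, y, alpha), v, e)
          for v in grid_v for e in grid_e if e <= v]   # ES lies below VaR
_, v_hat, e_hat = min(scores)
print(f"true  (VaR, ES) = ({v_true:.3f}, {e_true:.3f})")
print(f"argmin(VaR, ES) = ({v_hat:.3f}, {e_hat:.3f})")
```

Because the score is strictly consistent, ranking competing (VaR, ES) forecast pairs by their realized average score, as the abstract describes, singles out the correct forecast in expectation.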

Relevance: 80.00%

Abstract:

This study analyzes the use of risk management in a group of small and medium-sized enterprises (SMEs) in the city of São Bernardo do Campo. Business risk analysis is of growing importance and can contribute strongly to business continuity. The ability to manage business risks in the face of inevitable uncertainties, while adding future value to results, is a substantial source of competitive advantage. This value-creation process provides the discipline and tools for managing business risks, allowing value to be created for the organization. Most risk analysis methodologies are applied to large corporations, and one motivation of this work is to assess how useful such methodologies are for the SMEs selected for the survey in São Bernardo do Campo. The study was developed through bibliographic research and exploratory research in the selected companies, followed by a qualitative analysis using the case study method. It concludes that the surveyed companies in São Bernardo do Campo can obtain significant advantages by adopting risk management methodologies. All of the surveyed companies are more than ten years old and consider it important to safeguard the continuity of their business.

Relevance: 80.00%

Abstract:

The most liquid stocks in the IBOVESPA index reflect the behavior of stocks in general, as well as the influence of macroeconomic variables on that behavior, and they are among the most heavily traded in the Brazilian capital market. One can therefore argue that factors affecting the most liquid companies shape the behavior of macroeconomic variables, and that the reverse is also true: swings in macroeconomic factors such as the IPCA, PIB (GDP), the SELIC rate and the exchange rate also affect the most liquid stocks. This study analyzes the relationship between macroeconomic variables and the behavior of the most liquid stocks in the IBOVESPA index, corroborating studies that seek to understand the influence of macroeconomic factors on stock prices and contributing empirically to the construction of investment portfolios. The study covers the period from 2008 to 2014. The results indicate that portfolios built to protect invested capital should contain assets that are negatively correlated with the variables studied, which makes it possible to compose a portfolio with reduced risk.
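
The concluding recommendation follows from the two-asset portfolio variance formula, in which the cross term carries the correlation. The sketch below, with made-up volatilities and equal weights, shows the risk reduction as correlation turns negative.

```python
import numpy as np

# Hypothetical two-asset illustration of the abstract's conclusion:
# negative correlation between holdings lowers portfolio volatility.
sigma1, sigma2 = 0.20, 0.25      # assumed annualized volatilities
w = 0.5                          # equal weights

def portfolio_vol(rho: float) -> float:
    var = (w * sigma1) ** 2 + ((1 - w) * sigma2) ** 2 \
          + 2 * w * (1 - w) * rho * sigma1 * sigma2
    return float(np.sqrt(var))

for rho in (0.8, 0.0, -0.8):
    print(f"rho = {rho:+.1f}  ->  portfolio vol = {portfolio_vol(rho):.3f}")
# The printed volatility shrinks monotonically as rho decreases.
```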

Relevance: 80.00%

Abstract:

The author models the carbon-dioxide emissions of a gas-fired power plant subject to compliance obligations under the European emissions trading system, using a real-options model written on four underlying products (off-peak and peak power prices, the gas price and the emission quota). The profit-maximizing plant operates, and therefore emits, only when the margin it can earn on the electricity produced is positive. The aggregate carbon-dioxide emissions of a future period can then be represented as a sum of European-style binary spread options. Within this framework, the expected value and the probability-density function of emissions can be estimated; from the latter, the Value at Risk of the emission-quota position can be determined, which gives the cost, at a given confidence level, of meeting the plant's compliance obligation. In the stochastic model, the underlying products follow geometric Ornstein-Uhlenbeck processes, which the author fitted to publicly available market data from the EEX energy exchange. Using the simulation model, the author examined how ceteris paribus changes in various technological and market factors affect the emissions level and the cost of compliance, i.e. the Value at Risk.
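
The modelling chain described above can be sketched as a short Monte Carlo exercise. Every parameter below (mean-reversion speeds, price levels, heat rate, emission factor, capacity) is an invented placeholder rather than a value fitted to EEX data, and the quota bill is settled at the simulated year-end quota price for simplicity.

```python
import numpy as np

# Sketch: prices follow geometric Ornstein-Uhlenbeck processes, the plant
# runs (and emits) only on days with a positive clean spread, and the VaR
# of the compliance cost is read off the simulated distribution.
rng = np.random.default_rng(1)
n_paths, n_days, dt = 10_000, 250, 1.0 / 250

def geo_ou_paths(x0, kappa, theta, sigma):
    """Simulate exp(Z) with dZ = kappa*(log(theta) - Z) dt + sigma dW."""
    z = np.full(n_paths, np.log(x0))
    out = np.empty((n_days, n_paths))
    for t in range(n_days):
        z += kappa * (np.log(theta) - z) * dt \
             + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        out[t] = np.exp(z)
    return out

power = geo_ou_paths(60.0, 3.0, 60.0, 0.8)   # EUR/MWh (assumed)
gas   = geo_ou_paths(25.0, 2.0, 25.0, 0.5)   # EUR/MWh thermal (assumed)
quota = geo_ou_paths(20.0, 1.0, 20.0, 0.4)   # EUR/tCO2 (assumed)

heat_rate, emission_factor, capacity_mw = 2.0, 0.4, 400.0   # assumed plant
spread = power - heat_rate * gas - emission_factor * quota  # per-MWh margin
runs = spread > 0.0                       # the daily binary "option" payoffs
daily_tco2 = emission_factor * capacity_mw * 24.0
emissions = runs.sum(axis=0) * daily_tco2      # tCO2 per simulated year

cost = emissions * quota[-1]              # quota bill at year-end price
print(f"expected emissions: {emissions.mean():,.0f} tCO2")
print(f"95% VaR of compliance cost: EUR {np.quantile(cost, 0.95):,.0f}")
```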

Relevance: 80.00%

Abstract:

Strategic IT alignment is considered the first step in the IT governance process of any institution. Starting from the recognition that corporate governance takes an overall view of the organization, IT governance emerges as the subset responsible for implementing organizational strategy, providing the tools necessary to achieve the goals set out in the Institutional Development Plan. To that end, COBIT specifies that such governance shall be built on the following principles: Strategic Alignment, Value Delivery, Risk Management and Performance Measurement. This paper focuses on Strategic Alignment, which the authors regard as the foundation on which the entire core of IT governance is developed. By deepening its technical knowledge of management-system development, UFRN has taken a decisive step towards the technical capability needed for Value Delivery; yet, in examining the processes primarily assigned to Strategic Alignment, gaps were found that limited the strategic view of IT in the implementation of organizational goals. In a qualitative study using documentary research with content analysis and interviews with strategic and tactical managers, the perceived role of SINFO (Superintendência de Informática) was mapped. The documentary research covered public documents available on the institutional site and documents of TCU (Tribunal de Contas da União) that profile IT governance across the federal public service as a whole. To make the documentary-research results comparable, questionnaires/interviews and the iGovTI index, quantitative tools for standardizing results, were used, always keeping the same scale elements present in the TCU analysis. Accordingly, just as the TCU study provides the iGovTI index, this paper proposes an index specific to the study area, SA (Strategic Alignment), calculated from variables representative of the COBIT 4.1 domains and having as components the variables representative of the primary Strategic Alignment process. The result was an index intermediate between the values of the two adjacent TCU surveys of 2010 and 2012, which reflects the attitude and view of managers towards IT governance: still tied to a data-processing model in which one department performs its tasks according to the demands of the various departments or sectors, although a commission discusses issues related to infrastructure acquisition and systems development. With an operational rather than a strategic/managerial view and little attachment to the tools consecrated by the market, several processes in the set defined by the COBIT framework are not covered, mainly because no formal strategic plan for IT exists; hence the only partial congruence between organizational goals and IT goals.

Relevance: 80.00%

Abstract:

This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
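
As background to the realized measures all four chapters build on, here is a minimal sketch of the simplest such estimator: realized variance as the sum of squared intraday returns. The 5-minute sampling and the simulated day are illustrative assumptions; the dissertation's own estimators (realized kernels, quantile- and median-based measures) are noise-robust refinements of this idea.

```python
import numpy as np

def realized_variance(intraday_prices: np.ndarray) -> float:
    """Sum of squared intraday log-returns over one trading day."""
    r = np.diff(np.log(intraday_prices))
    return float(np.sum(r ** 2))

# Toy check: one simulated day of 5-minute prices (78 bars in a 6.5-hour
# session) with 25% annualized volatility and no microstructure noise.
rng = np.random.default_rng(7)
sigma_daily = 0.25 / np.sqrt(252)
steps = rng.normal(0.0, sigma_daily / np.sqrt(78), 78)
prices = 100.0 * np.exp(np.concatenate(([0.0], np.cumsum(steps))))
rv = realized_variance(prices)
print(f"realized vol estimate {np.sqrt(rv):.4%} vs true daily vol {sigma_daily:.4%}")
```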

Relevance: 80.00%

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints: limited resources to implement savings retrofits, various suppliers in the market and alternative project financing arrangements. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at the lowest cost. The most common methods of implementation planning are suboptimal with respect to cost. Federal, state and local agencies can obtain greater returns on their energy conservation investment than under traditional methods, regardless of the implementing organization. This dissertation outlines several approaches that improve on the traditional energy conservation models. Public buildings in regions with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private owners of buildings are under mandates to conserve energy; e.g., Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code for any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves on the current best practice for agencies concerned with making the most cost-effective selection while leveraging energy services companies or utilities. The two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2). An additional model leverages a single congressional appropriation to implement ECM projects (Chapter 3). Returns from implemented ECM projects are used to fund additional ECM projects. In these cases, fluctuations in energy costs and uncertainty in the estimated savings severely influence ECM project selection and the amount of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage, a comparative method using Conditional Value at Risk is analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach makes use of inequalities (McCormick, 1976) to re-express constraints that involve the product of binary variables with an exact linearization (related to the convex hull of those constraints), as sketched below. This model additionally shows the benefits of learning between stages while remaining consistent with the single-congressional-appropriation framework.
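
The exact linearization cited in Chapter 4 is easy to state: for binary x and y, the product z = x*y is enforced by the three linear inequalities z <= x, z <= y and z >= x + y - 1 (plus z >= 0). The snippet below, an illustration rather than the dissertation's model, verifies exactness by enumeration.

```python
import itertools

# McCormick (1976) inequalities, exact for binary variables: they admit
# z = x*y as the only feasible binary value, so products of binaries can
# model interactive (sub/superadditive) savings inside a linear program.
def mccormick_feasible(x: int, y: int, z: int) -> bool:
    return z <= x and z <= y and z >= x + y - 1 and z >= 0

for x, y in itertools.product((0, 1), repeat=2):
    feasible = [z for z in (0, 1) if mccormick_feasible(x, y, z)]
    assert feasible == [x * y]          # the relaxation is exact
    print(f"x={x}, y={y}: feasible z = {feasible[0]} = x*y")
```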

Relevance: 40.00%

Abstract:

Hedging against tail events in equity markets has been forcefully advocated in the aftermath of the recent global financial crisis. Whether this is beneficial to long-horizon investors, such as employees enrolled in defined contribution (DC) plans, has however been subject to criticism. We conduct historical simulation since 1928 to examine the effectiveness of active and passive tail risk hedging using out-of-the-money put options for hypothetical equity portfolios of DC plan participants with 20 years to retirement. Our findings show that the cost of tail hedging exceeds the benefits for a majority of the plan participants during the sample period. However, for a significant number of simulations, hedging results in superior outcomes relative to an unhedged position. Active tail hedging is more effective when employees confront several panic-driven periods characterized by short and sharp market swings over the investment horizon. Passive hedging, on the other hand, proves beneficial when they encounter an extremely rare event like the Great Depression, in which equity markets go into deep and prolonged decline.
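
A stylized version of the passive overlay illustrates the trade-off the paper measures: the put premium is a steady drag on wealth, while the floor pays off only in deep drawdowns. All numbers below (the return distribution, the 20% out-of-the-money strike and the 1% annual premium) are invented placeholders, not the paper's historical inputs.

```python
import numpy as np

# Toy 20-year accumulation with and without a yearly protective put.
rng = np.random.default_rng(3)
n_paths, n_years = 10_000, 20
returns = rng.normal(0.07, 0.18, (n_paths, n_years))  # assumed equity returns

def terminal_wealth(hedged: bool) -> np.ndarray:
    w = np.ones(n_paths)
    for t in range(n_years):
        gross = 1.0 + returns[:, t]
        if hedged:
            gross = np.maximum(gross, 0.80)  # put floors the yearly loss at -20%
            gross = gross - 0.01             # less the assumed 1% premium
        w *= gross
    return w

for label, h in (("unhedged", False), ("hedged", True)):
    w = terminal_wealth(h)
    print(f"{label:>8}: median wealth {np.median(w):.2f}, "
          f"5th percentile {np.quantile(w, 0.05):.2f}")
# The hedge trims the median (premium drag) but lifts the worst outcomes,
# mirroring the cost-versus-benefit pattern the abstract reports.
```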

Relevance: 40.00%

Abstract:

We used an established seagrass monitoring programme to examine the short- and longer-term impacts of an oil spill event on intertidal seagrass meadows. Results for potentially impacted seagrass areas were compared with existing monitoring data and with control seagrass meadows located outside the oil spill area. Seagrass meadows were not significantly affected by the oil spill. Declines in seagrass biomass and area one month post-spill were consistent between control and impact meadows. Eight months post-spill, seagrass density and area had increased to within historical ranges. The declines in seagrass meadows were likely attributable to natural seasonal variation and a combination of climatic and anthropogenic impacts. The lack of impact from the oil spill was due to several mitigating factors rather than to a lack of toxic effects on seagrasses. The study demonstrates the value of long-term monitoring of critical habitats in high-risk areas to effectively assess impacts.

Relevance: 40.00%

Abstract:

Over the past two decades, the poultry sector in China went through a phase of tremendous growth as well as rapid intensification and concentration. Highly pathogenic avian influenza virus (HPAIV) subtype H5N1 was first detected in 1996 in Guangdong province, South China, and started spreading throughout Asia in early 2004. Since then, control of the disease in China has relied heavily on wide-scale preventive vaccination combined with movement control, quarantine and stamping out. This strategy has been successful in drastically reducing the number of outbreaks during the past 5 years. However, HPAIV H5N1 is still circulating and is regularly isolated in traditional live bird markets (LBMs), where viral infection can persist and which represent a public health hazard for the people who visit them. The use of social network analysis in combination with epidemiological surveillance in South China has identified areas where current HPAI control strategies in the poultry production sector could benefit from better knowledge of poultry trading patterns and of the LBM network configuration, as well as of its capacity to maintain HPAIV H5N1 infection. We produced a set of LBM network maps and estimated the associated risk of HPAIV H5N1 within LBMs and along poultry market chains, providing new insights into how live poultry trade and infection are intertwined. More specifically, our study provides evidence that several biosecurity factors, such as daily cage cleaning, daily cage disinfection or manure processing, contribute to a reduction in HPAIV H5N1 presence in LBMs. Significantly, our results also show an association between social network indicators and the presence of HPAIV H5N1 in specific network configurations, such as the one defined by the counties of origin of the birds traded in LBMs. This new information could be used to develop more targeted and effective control interventions.

Relevance: 40.00%

Abstract:

Background: One-third of patients with type 1 diabetes develop diabetic complications, such as diabetic nephropathy. Diabetic complications are related to a high mortality from cardiovascular disease, impose a great burden on the health care system, and reduce the health-related quality of life of patients. Aims: This thesis assessed whether parental risk factors identify subjects at a greater risk of developing diabetic complications. Another aim was to evaluate the impact of a parental history of type 2 diabetes on patients with type 1 diabetes. A third aim was to assess the role of the metabolic syndrome in patients with type 1 diabetes, both its presence and its predictive value with respect to complications. Subjects and methods: This study is part of the ongoing nationwide Finnish Diabetic Nephropathy (FinnDiane) Study. The study was initiated in 1997, and, thus far, 4,800 adult patients with type 1 diabetes have been recruited. Since 2004, follow-up data have also been collected in parallel with the recruitment of new patients. Studies I to III have a cross-sectional design, whereas Study IV has a prospective design. Information on parents was obtained from the patients with type 1 diabetes by questionnaire. Results: Clustering of parental hypertension, cardiovascular disease, and diabetes (type 1 and type 2) was associated with diabetic nephropathy in patients with type 1 diabetes, as was paternal mortality. A parental history of type 2 diabetes was associated with a later onset of type 1 diabetes, a higher prevalence of the metabolic syndrome, and a metabolic profile related to insulin resistance, despite no difference in the distribution of human leukocyte antigen genotypes or the presence of diabetic complications. A maternal history of type 2 diabetes seemed to contribute to a worse metabolic profile in patients with type 1 diabetes than a paternal history did. The metabolic syndrome was a frequent finding in patients with type 1 diabetes, observed in 38% of males and 40% of females. The prevalence increased with worsening glycemic control and more severe renal disease. The metabolic syndrome was associated with an odds ratio of 3.75 for diabetic nephropathy, and all of the components of the syndrome were independently associated with diabetic nephropathy. The metabolic syndrome, independent of diabetic nephropathy, increased the risk of cardiovascular events and of cardiovascular and diabetes-related mortality over a 5.5-year follow-up. With respect to the progression of diabetic nephropathy, the role of the metabolic syndrome was less clear, playing a strong role only in the progression from macroalbuminuria to end-stage renal disease. Conclusions: Familial factors and the metabolic syndrome play an important role in patients with type 1 diabetes. Assessment of these factors is an easily applicable tool in clinical practice to identify patients at a greater risk of developing diabetic complications.

Relevance: 40.00%

Abstract:

This study examines the role of corporate philanthropy in the management of reputation risk and shareholder value of the top 100 ASX-listed Australian firms over the three years 2011-2013. The results demonstrate the business case for corporate philanthropy, and hence encourage it, by showing that increasing a firm's investment in corporate giving as a percentage of profit before tax increases the likelihood of an increase in shareholder value, with the proviso that firms must also manage their reputation risk at the same time. There is a negative association between corporate giving and shareholder value (Tobin's Q) that is mitigated by firms' management of reputation. The economic significance of this result is that for every cent in the dollar the firm spends on corporate giving, Tobin's Q decreases by 0.413%, whereas if the firm increases its reputation by one point, Tobin's Q increases by 0.267%. Consequently, the interaction of corporate giving and reputation risk management is positively associated with shareholder value. These results are robust to controls for potential endogeneity and reverse causality. The paper assists both academics and practitioners by demonstrating that the benefits of corporate philanthropy extend beyond a gesture to improve reputation or an attempt to increase financial performance, to a direct interplay among all the factors in which the benefits far outweigh the costs.
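
Read literally, the reported effects combine through an interaction model of the usual form, Tobin's Q regressed on giving, reputation and their product. The sketch below nets the effects out for hypothetical inputs; only the two marginal effects are quoted from the abstract, and the interaction coefficient is invented to show the mechanics.

```python
# Hypothetical reading of the abstract's estimates: -0.413% per cent in the
# dollar of giving, +0.267% per reputation point, plus a positive
# interaction (coefficient assumed, since only its sign is reported).
beta_giving, beta_reputation, beta_interaction = -0.413, 0.267, 0.05

giving_cents = 2.0       # cents of giving per dollar of profit before tax
reputation_pts = 5.0     # points of reputation improvement

delta_q_pct = (beta_giving * giving_cents
               + beta_reputation * reputation_pts
               + beta_interaction * giving_cents * reputation_pts)
print(f"net change in Tobin's Q: {delta_q_pct:+.3f}%")
# Giving alone destroys value; paired with reputation management the net
# effect turns positive, which is the abstract's headline result.
```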

Relevance: 40.00%

Abstract:

Acute heart failure (AHF) is a complex syndrome associated with exceptionally high mortality. Still, the characteristics and prognostic factors of contemporary AHF patients have been inadequately studied. Kidney function has emerged as a very powerful prognostic risk factor in cardiovascular disease. This is believed to be the consequence of an interaction between the heart and kidneys, also termed the cardiorenal syndrome, whose mechanisms are not fully understood. Renal insufficiency is common in heart failure and of particular interest for predicting outcome in AHF. Cystatin C (CysC) is a marker of glomerular filtration rate with properties that make it a prospective alternative to creatinine, the currently used measure for assessing renal function. The aim of this thesis is to characterize a representative cohort of patients hospitalized for AHF and to identify risk factors for poor outcome in AHF. In particular, the role of CysC as a marker of renal function is evaluated, including an examination of the value of CysC as a predictor of mortality in AHF. The FINN-AKVA (Finnish Acute Heart Failure) study is a national prospective multicenter study conducted to investigate the clinical presentation, aetiology, treatment, concomitant diseases and outcome of AHF. Patients hospitalized for AHF were enrolled in the FINN-AKVA study, and mortality was followed for 12 months. The mean age of patients with AHF is 75 years, and they frequently have both cardiovascular and non-cardiovascular co-morbidities. Mortality after hospitalization for AHF is high, rising to 27% by 12 months. The present study shows that renal dysfunction is very common in AHF. CysC detects impaired renal function in 40% of patients. Renal function, measured by CysC, is one of the strongest predictors of mortality, independently of other prognostic risk markers such as age, gender, co-morbidities and systolic blood pressure on admission. Moreover, in patients with normal creatinine values, elevated CysC is associated with a marked increase in mortality. Acute kidney injury, defined as an increase in CysC within 48 hours of hospital admission, occurs in a significant proportion of patients and is associated with increased short- and mid-term mortality. The results suggest that CysC can be used for risk stratification in AHF. Markers of inflammation are elevated both in heart failure and in chronic kidney disease, and inflammation is one of the mechanisms thought to mediate heart-kidney interactions in the cardiorenal syndrome. Inflammatory cytokines such as interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-α) correlate very differently with markers of cardiac stress and renal function. In particular, TNF-α showed a robust correlation with CysC but was not associated with levels of NT-proBNP, a marker of hemodynamic cardiac stress. Compared to CysC, the inflammatory markers were not strongly related to mortality in AHF. In conclusion, patients with AHF are elderly with multiple co-morbidities, and renal dysfunction is very common. CysC demonstrates good diagnostic properties both in identifying impaired renal function and in identifying acute kidney injury in patients with AHF. CysC, as a measure of renal function, is also a powerful prognostic marker in AHF. CysC shows promise as a marker for the assessment of kidney function and for risk stratification in patients hospitalized for AHF.