947 results for Parametric VaR (Value-at-Risk)


Relevance:

100.00%

Publisher:

Abstract:

This dissertation addresses the importance of Corporate Governance and Risk Management for Brazilian companies whose shares are traded on the New York and São Paulo stock exchanges. Its main objectives are: to assess how far these Brazilian companies have adapted to the requirements of the Sarbanes-Oxley Act, and to confirm the importance of risk management for Corporate Governance, associating the occurrence of asset losses with risk-management tools, and fraud with weak internal control standards and with the rules issued by external regulatory bodies. This academic work, an exploratory study, started from a bibliographic survey of books and technical articles on Corporate Governance with a focus on risk management. The research was carried out by reading the management reports of the selected companies and examining the applicability of the Sarbanes-Oxley Act's requirements. As a conclusion, it was possible to confirm with reasonable certainty that the large losses that drove international companies to bankruptcy occurred through the lack of effective risk management or through deficient internal control systems, combined with the absence of preventive action. On the other hand, despite the efforts of Brazilian companies to adapt to the new requirements for operating in the financial markets of Brazil and the United States, part of the companies surveyed are still implementing Audit Committees, internal control standards and procedures, and the other Corporate Governance practices. Further research on the central theme of this study could deepen the question of the cost-benefit trade-off of implementing Corporate Governance practices and the question of the effectiveness of corporate management and control systems, weighing the costs incurred in their implementation and maintenance against the benefits obtained.
A further study is also proposed to review the responsibilities of regulatory authorities regarding ex-ante and ex-post control, a dilemma still to be resolved that should motivate future researchers. (AU)

Relevance:

100.00%

Publisher:

Abstract:

This study analyzes the use of risk management in a sample of small and medium-sized enterprises (SMEs) in the city of São Bernardo do Campo. Business risk analysis is of growing importance and can contribute strongly to business continuity. The ability to manage business risks in the face of inevitable uncertainty, while preserving the future value of results, is a substantial source of competitive advantage; this value-creation process provides the discipline and tools for managing business risks, allowing the organization to create value. Most risk analysis methodologies are applied to large corporations, and one motivation for this work is to verify how useful these methodologies are for the SMEs selected for the survey in São Bernardo do Campo. The study was developed through bibliographic research and exploratory research at the selected companies, followed by a qualitative analysis using the case-study method. It concludes that the companies surveyed in São Bernardo do Campo can obtain significant advantages by implementing risk-management methodologies. All the companies surveyed are more than ten years old and consider it important to safeguard the continuity of their business.

Relevance:

100.00%

Publisher:

Abstract:

The most liquid stocks in the IBOVESPA index reflect the behavior of stocks in general, as well as the influence of macroeconomic variables on that behavior, and are among the most traded in the Brazilian capital market. It can thus be argued that factors affecting the most liquid companies feed into the behavior of macroeconomic variables, and that the reverse is also true: fluctuations in macroeconomic factors such as the IPCA, GDP, the SELIC rate and the exchange rate also affect the most liquid stocks. This study analyzes the relationship between macroeconomic variables and the behavior of the most liquid stocks in the IBOVESPA index, corroborating studies that seek to understand the influence of macroeconomic factors on stock prices and contributing empirically to the construction of investment portfolios. The study covers the period from 2008 to 2014. The results indicate that portfolios built to protect invested capital should contain assets with negative correlation to the variables studied, which makes it possible to compose a portfolio with reduced risk.
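As a rough illustration of the screening rule the abstract arrives at, the sketch below selects assets whose returns correlate negatively with a macro series. The data and asset names are synthetic assumptions for illustration, not the study's dataset:

```python
import numpy as np

def negative_corr_assets(asset_returns, macro_series, names):
    """Return the names of assets whose returns correlate negatively
    with the macro series (candidates for a protective portfolio)."""
    picks = []
    for col, name in zip(asset_returns.T, names):
        corr = np.corrcoef(col, macro_series)[0, 1]
        if corr < 0:
            picks.append(name)
    return picks

rng = np.random.default_rng(0)
macro = rng.normal(size=500)                        # e.g. monthly rate surprises
a = 0.6 * macro + rng.normal(scale=0.5, size=500)   # positively exposed asset
b = -0.6 * macro + rng.normal(scale=0.5, size=500)  # negatively exposed asset
picks = negative_corr_assets(np.column_stack([a, b]), macro, ["A", "B"])
```

With these synthetic series, only asset B (built with a negative loading on the macro factor) survives the screen.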

Relevance:

100.00%

Publisher:

Abstract:

The carbon-dioxide emissions of a gas-fuelled power generator subject to compliance under the EU Emissions Trading System are modelled with a real-options model on four underlying instruments (peak and off-peak electricity, gas, and the emission quota). The profit-maximizing power plant operates, and emits, only if the spread it earns on the energy produced is positive. Future aggregate emissions can therefore be represented as a sum of European binary spread options. Within the model, the expected value of emissions and its probability density function can be derived; from the latter, the Value at Risk of the emission-quota position can be determined, which gives the cost of meeting the plant's compliance obligation at a given confidence level. In the stochastic model, the underlying instruments follow geometric Ornstein-Uhlenbeck processes, fitted by the author to publicly available price data from the German energy exchange (EEX). Based on the simulation model, the effects of ceteris paribus changes in various technological and market factors on the emissions level and the cost of compliance (the Value at Risk) are analysed.
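A minimal simulation sketch of this setup, under assumed parameter values (heat rate, emission factor, mean-reversion speeds and volatilities are illustrative, not the paper's calibration): log-prices follow mean-reverting Ornstein-Uhlenbeck dynamics, the plant runs only on a positive spread, and a quantile of the simulated emissions plays the role of the at-risk position:

```python
import numpy as np

def sim_geo_ou(x0, mu, kappa, sigma, n_steps, n_paths, rng):
    """Geometric Ornstein-Uhlenbeck: the log-price mean-reverts
    towards log(mu) at speed kappa with volatility sigma (dt = 1)."""
    x = np.full(n_paths, np.log(x0))
    path = np.empty((n_steps, n_paths))
    for t in range(n_steps):
        x += kappa * (np.log(mu) - x) + sigma * rng.normal(size=n_paths)
        path[t] = np.exp(x)
    return path

rng = np.random.default_rng(1)
power = sim_geo_ou(40.0, 40.0, 0.1, 0.05, 250, 2000, rng)  # EUR/MWh, illustrative
gas = sim_geo_ou(20.0, 20.0, 0.1, 0.04, 250, 2000, rng)    # EUR/MWh_th, illustrative
heat_rate, emission_factor = 1.5, 0.4    # assumed plant characteristics
spread = power - heat_rate * gas         # margin per period before quota costs
runs = spread > 0.0                      # produce (and emit) only on a positive spread
emissions = emission_factor * runs.sum(axis=0)   # tCO2 per simulated path
var95 = np.quantile(emissions, 0.95)     # 95% quantile of the emission position
```

Each period in which the spread is positive behaves like an exercised binary spread option, so summing the indicator across periods reproduces the option-sum representation of total emissions.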

Relevance:

100.00%

Publisher:

Abstract:

Strategic IT Alignment is considered the first step in the IT Governance process of any institution. Starting from the recognition that corporate governance takes an overall view of the organization, IT Governance emerges as the subset responsible for implementing the organization's strategies, providing the tools needed to achieve the goals set in the Institutional Development Plan. To that end, COBIT specifies that such governance should be built on the following principles: Strategic Alignment, Value Delivery, Risk Management and Performance Measurement. This paper focuses on Strategic Alignment, which the authors regard as the foundation on which the entire core of IT Governance is built. By deepening its technical knowledge of management-system development, UFRN has taken a decisive step towards the technical capability needed for "Value Delivery"; yet an examination of the processes primarily assigned to "Strategic Alignment" revealed gaps that limited the strategic view of IT in the implementation of organizational goals. A qualitative study, combining documentary research with content analysis and interviews with strategic and tactical managers, mapped the perceived role of SINFO (Superintendência de Informática). The documentary research drew on public documents available on the institutional website and on documents from TCU (Tribunal de Contas da União) that map IT Governance profiles across the federal public service as a whole. To make the documentary results comparable, questionnaires/interviews and the iGovTI index, quantitative tools for standardizing results, were used, always keeping the same scale elements present in the TCU analysis.
Accordingly, in the same way that the TCU study provides the iGovTI index, this paper proposes an index specific to the study area, SA (Strategic Alignment), calculated from variables representative of the COBIT 4.1 domains and having as components the variables representative of the primary Strategic Alignment process. The result was an index intermediate between the values of the two adjacent TCU surveys of 2010 and 2012, which reflects the managers' attitude and view of IT governance: still tied to a Data Processing model, in which a department performs its tasks according to the demands of the various departments or sectors, although a commission discusses issues related to infrastructure acquisition and systems development. With an operational rather than strategic/managerial view and little use of the tools established in the market, several processes of the COBIT framework's defined set are not covered, mainly because no formal strategic plan for IT exists; hence only partial congruence between the organization's goals and the IT goals.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for measuring, modeling and forecasting the volatility and correlations of financial assets. The first two chapters provide tools for univariate applications, while the last two develop multivariate methodologies. In Chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In Chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations from volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator across numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose using the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context, compare it to models used in the industry, and demonstrate significant economic gains in a realistic setting that includes short-selling constraints and transaction costs.
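The multi-period Value-at-Risk forecasts referred to above reduce, in the simplest Gaussian case, to scaling a conditional volatility forecast by a normal quantile. A minimal sketch of that parametric step, not the chapter's refined Realized LGARCH machinery:

```python
from statistics import NormalDist

def parametric_var(sigma_forecast, alpha=0.01, mu=0.0):
    """Gaussian (parametric) one-step VaR implied by a conditional
    volatility forecast, reported as a positive loss fraction."""
    z = NormalDist().inv_cdf(alpha)   # about -2.326 at the 1% level
    return -(mu + sigma_forecast * z)

# A model forecasting 2% daily volatility implies a 1% VaR of about 4.65%:
var_1pct = parametric_var(0.02, alpha=0.01)
```

Any conditionally heteroskedastic model that outputs a volatility forecast can be plugged into `sigma_forecast`; multi-period versions replace it with the volatility of the cumulated return.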

Relevance:

100.00%

Publisher:

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints: limited resources to implement savings retrofits, various suppliers in the market and alternative ways of financing projects. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at the lowest cost; the most common methods of implementation planning are suboptimal with respect to cost, and agencies can obtain greater returns on their energy conservation investment than under traditional methods, regardless of the implementing organization. This dissertation outlines several approaches that improve on the traditional energy conservation models. Public buildings in any region with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private building owners are under mandates to conserve energy; e.g., Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code on any alteration or renovation. Thus both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves on the current best practice for agencies seeking the most cost-effective selection when leveraging energy services companies or utilities; the two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2).
A further model leverages a single congressional appropriation to implement ECM projects (Chapter 3), with returns from implemented projects used to fund additional ones. In these cases, fluctuations in energy costs and uncertainty in the estimated savings severely influence ECM project selection and the size of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage, a comparative method using Conditional Value at Risk is analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach uses McCormick (1976) inequalities to re-express constraints involving products of binary variables with an exact linearization (related to the convex hull of those constraints). This model additionally shows the benefits of learning between stages while remaining consistent with the single-appropriation framework.
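The Conditional Value at Risk comparison mentioned above rests on a simple tail expectation: the average loss beyond the VaR threshold. A minimal sketch on toy loss data, not the chapter's stochastic program:

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Conditional Value at Risk: the expected loss in the worst
    (1 - alpha) tail of the loss distribution."""
    var = np.quantile(losses, alpha)   # the VaR threshold at level alpha
    return losses[losses >= var].mean()

# For losses 1..100, the worst-5% tail is {96, ..., 100}, so CVaR is 98:
tail_mean = cvar(np.arange(1.0, 101.0), alpha=0.95)
```

Unlike VaR, this tail average is coherent and remains linear-programming-friendly, which is one reason it appears in multi-stage project-selection models of this kind.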

Relevance:

100.00%

Publisher:

Abstract:

The first paper sheds light on the informational content of high-frequency and daily data. I assess the economic value of the two families of models by comparing their performance in forecasting asset volatility through the Value at Risk metric. In running the comparison, this paper introduces two key assumptions: jumps in prices and a leverage effect in the volatility dynamics. The findings suggest that high-frequency data models do not exhibit superior performance over daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, characterized by a multifactor volatility specification in which volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. Estimation under the historical measure is carried out by quasi-maximum likelihood and the Extended Kalman Filter; this strategy allows both volatility factors to be filtered out by introducing a measurement equation that relates realized volatility to latent volatility. The risk-premia parameters are calibrated using call options written on the S&P 500 Index. The results clearly illustrate the important contribution of the jump factor to the pricing performance of options and the economic significance of the volatility jump risk premia. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and size of contagion transmission between European markets. To understand and quantify contagion in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are modeled directly as a Hawkes jump-diffusion process. The empirical analysis indicates clear evidence of contagion from Greece to other European countries, as well as self-contagion in all countries.
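Comparing model families "through the Value at Risk metric" is usually operationalized by backtesting violation counts; one standard device is Kupiec's unconditional-coverage test, sketched below (a common choice, though the paper's own loss functions and tests may differ):

```python
import math

def kupiec_lr(violations, n_obs, alpha=0.01):
    """Kupiec unconditional-coverage statistic: a likelihood ratio testing
    whether the observed VaR violation rate equals the nominal level alpha.
    Asymptotically chi-squared with 1 degree of freedom."""
    x, n = violations, n_obs
    p_hat = x / n

    def log_lik(p):
        # Binomial log-likelihood of x violations in n trials, guarded
        # against log(0) at the boundary cases x == 0 and x == n.
        a = (n - x) * math.log(1.0 - p) if x < n else 0.0
        b = x * math.log(p) if x > 0 else 0.0
        return a + b

    return -2.0 * (log_lik(alpha) - log_lik(p_hat))

# A correctly sized 1% VaR model: 10 violations in 1000 days gives LR = 0.
lr_ok = kupiec_lr(10, 1000, alpha=0.01)
# 30 violations in 1000 days is clearly rejected (chi-squared 5% cutoff = 3.84).
lr_bad = kupiec_lr(30, 1000, alpha=0.01)
```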

Relevance:

40.00%

Publisher:

Abstract:

Banana flour obtained from unripe banana (Musa acuminata, var. Nanico) under specific drying conditions was evaluated for its chemical composition and nutritional value. Results are expressed in dry weight (dw). The unripe banana flour (UBF) presented a high amount of total dietary fiber (DF) (56.24 g/100 g), consisting of resistant starch (RS) (48.99 g/100 g), fructans (0.05 g/100 g) and DF without RS or fructans (7.2 g/100 g). The contents of available starch (AS) (27.78 g/100 g) and soluble sugars (1.81 g/100 g) were low. The main phytosterols found were campesterol (4.1 mg/100 g), stigmasterol (2.5 mg/100 g) and beta-sitosterol (6.2 mg/100 g). The total polyphenol content was 50.65 mg GAE/100 g. Antioxidant activity, by the FRAP and ORAC methods, was moderate: 358.67 and 261.00 μmol of Trolox equivalent/100 g, respectively. The contents of Zn, Ca and Fe and the mineral dialyzability were low. The procedure used to obtain UBF resulted in the recovery of undamaged starch granules and in a low-energy product (597 kJ/100 g).

Relevance:

40.00%

Publisher:

Abstract:

The value of a seasonal forecasting system based on phases of the Southern Oscillation was estimated for a representative dryland wheat grower in the vicinity of Goondiwindi; in particular, the effects of risk attitude and planting conditions on this estimate were examined. A recursive stochastic programming approach was used to identify the grower's utility-maximizing action set in the event of each of the climate patterns over the period 1894-1991 recurring in the imminent season, and the approach was repeated with and without use of the forecasts. The choices examined were, at planting, the nitrogen application rate and cultivar and, later in the season, whether to proceed with or abandon each wheat activity. The value of the forecasting system was estimated as the maximum amount the grower could afford to pay for its use without expected utility being lowered relative to non-use.
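The valuation rule in the last sentence can be sketched numerically: find, by bisection, the largest fee that leaves expected utility with the forecast no lower than without it. Log utility and the outcome vectors below are illustrative assumptions, not the study's model or data:

```python
import numpy as np

def value_of_forecast(wealth_with, wealth_without, u=np.log, tol=1e-6):
    """Largest fee payable for the forecast without expected utility
    falling below the no-forecast benchmark (found by bisection)."""
    target = u(wealth_without).mean()
    lo, hi = 0.0, float(wealth_with.min()) - 1e-9  # fee must keep wealth positive
    while hi - lo > tol:
        fee = 0.5 * (lo + hi)
        if u(wealth_with - fee).mean() >= target:
            lo = fee       # still affordable: push the fee up
        else:
            hi = fee       # too expensive: pull it back down
    return lo

# Illustrative per-season outcomes (hypothetical numbers):
with_fc = np.array([120.0, 110.0, 105.0, 130.0])
without = np.array([100.0, 95.0, 110.0, 90.0])
fee = value_of_forecast(with_fc, without)
```

Averaging utility over the historical seasons mirrors the study's device of treating each climate pattern from 1894-1991 as equally likely to recur.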

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: To determine, in arrhythmogenic right ventricular cardiomyopathy, the value of QT interval dispersion for identifying the induction of sustained ventricular tachycardia in the electrophysiological study or the risk of sudden cardiac death. METHODS: We assessed QT interval dispersion in the 12-lead electrocardiogram of 26 patients with arrhythmogenic right ventricular cardiomyopathy and analyzed its association with sustained ventricular tachycardia and sudden cardiac death, comparing the patients with 16 controls of similar age and sex. RESULTS (mean ± SD): QT interval dispersion: patients = 53.8 ± 14.1 ms; control group = 35.0 ± 10.6 ms, p = 0.001. Patients with induction of ventricular tachycardia: 52.5 ± 13.8 ms; without induction: 57.5 ± 12.8 ms, p = 0.420. Over a mean follow-up period of 41 ± 11 months, five sudden cardiac deaths occurred; QT interval dispersion in this group was 62.0 ± 17.8 ms, versus 51.9 ± 12.8 ms in the others, p = 0.852. Using a cutoff ≥ 60 ms to define an increased degree of QT interval dispersion, we were able to identify patients at risk of sudden cardiac death with a sensitivity of 60%, a specificity of 57%, and positive and negative predictive values of 25% and 85%, respectively. CONCLUSION: Patients with arrhythmogenic right ventricular cardiomyopathy show a significant increase in QT interval dispersion compared with the healthy population. However, it did not identify patients with induction of ventricular tachycardia in the electrophysiological study, showing a very low predictive value for defining the risk of sudden cardiac death in the population studied.
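The predictive-value arithmetic in the results can be reproduced from a 2x2 contingency table. The counts below (3 true positives, 2 false negatives, 9 false positives, 12 true negatives) are inferred to be consistent with the reported 5 deaths, 26 patients and quoted percentages; they are not stated in the abstract itself:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard screening-test metrics from a 2x2 contingency table."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of deaths flagged by the cutoff
        "specificity": tn / (tn + fp),   # fraction of survivors below the cutoff
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts consistent with the reported 60% / 57% / 25% / 85% figures:
m = diagnostic_metrics(tp=3, fn=2, fp=9, tn=12)
# sensitivity 0.60, specificity ~0.57, PPV 0.25, NPV ~0.86
```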

Relevance:

40.00%

Publisher:

Abstract:

At the age of 50, a woman has a lifetime risk of more than 40% of presenting a vertebral fracture, yet more than 60% of vertebral fractures remain undiagnosed. It is therefore of major importance to develop screening strategies to detect these fractures. Vertebral fracture assessment (VFA) by DXA allows vertebral fractures from T4 to L4 to be detected on DXA devices while the bone mineral density measurement is performed during the same visit. Such an approach should improve the evaluation of fracture risk and the therapeutic indication. Compared to standard X-ray assessment, VFA reliably detects moderate or severe vertebral fractures below T6.

Relevance:

40.00%

Publisher:

Abstract:

Purpose: SIOPEN scoring of 123I-mIBG imaging has been shown to predict response to induction chemotherapy and outcome at diagnosis in children with high-risk neuroblastoma (HRN). Method: Patterns of skeletal 123I-mIBG uptake were assigned numerical scores (Mscore) ranging from 0 (no metastasis) to 72 (diffuse metastases) within 12 body areas, as described previously. 271 anonymised, paired image data sets acquired at diagnosis and on completion of Rapid COJEC induction chemotherapy were reviewed, constituting a representative sample of the 1602 children treated prospectively within the HR-NBL1/SIOPEN trial. Pre- and post-treatment Mscores were compared with bone marrow cytology (BM) and 3-year event-free survival (EFS). Results: 224/271 patients showed skeletal mIBG uptake at diagnosis and were evaluable for mIBG response. Complete response (CR) on mIBG to Rapid COJEC induction was achieved by 66%, 34% and 15% of patients who had pre-treatment Mscores of <18 (n=65, 29%), 18-44 (n=95, 42%) and ≥45 (n=64, 28.5%), respectively (chi-squared test, p<0.0001). Mscore at diagnosis and on completion of Rapid COJEC correlated strongly with BM involvement (p<0.0001). The correlation of pre-treatment with post-treatment scores and response was highly significant (p<0.001). Most importantly, the 3-year EFS in the 47 children with Mscore 0 at diagnosis was 0.68 (±0.07), by comparison with 0.42 (±0.06), 0.35 (±0.05) and 0.25 (±0.06) for patients in pre-treatment score groups <18, 18-44 and ≥45, respectively (p<0.001). An Mscore threshold of ≥45 at diagnosis was associated with significantly worse outcome by comparison with all other Mscore groups (p=0.029). The 3-year EFS of 0.53 (±0.07) for patients in metastatic CR (mIBG and BM) after Rapid COJEC (33%) is clearly superior to that of patients not achieving metastatic CR (0.24 (±0.04), p=0.005). Conclusion: SIOPEN scoring of 123I-mIBG imaging predicts response to induction chemotherapy and outcome at diagnosis in children with HRN.

Relevance:

40.00%

Publisher:

Abstract:

In a cohort study of 182 consecutive patients with active endogenous Cushing's syndrome, the only predictor of fracture occurrence after adjustment for age, gender, bone mineral density (BMD) and trabecular bone score (TBS) was the 24-h urinary free cortisol (24hUFC) level, with a threshold of 1472 nmol/24 h (odds ratio, 3.00 (95% confidence interval (CI), 1.52-5.92); p = 0.002). INTRODUCTION: The aim was to estimate the risk factors for fracture in subjects with endogenous Cushing's syndrome (CS) and to evaluate the value of the TBS in these patients. METHODS: All enrolled patients with CS (n = 182) were interviewed about low-traumatic fractures and underwent lateral X-ray imaging from T4 to L5. BMD measurements were performed using a DXA Prodigy device (GEHC Lunar, Madison, Wisconsin, USA). The TBS was derived retrospectively from existing BMD scans, blinded to clinical outcome, using TBS iNsight software v2.1 (Medimaps, Merignac, France). Urinary free cortisol (24hUFC) was measured by immunochemiluminescence assay (reference range, 60-413 nmol/24 h). RESULTS: Among the enrolled patients with CS (149 females, 33 males; mean age, 37.8 years (95% CI, 34.2-39.1); 24hUFC, 2370 nmol/24 h (2087-2632)), fractures were confirmed in 81 (44.5%) patients, 70 of whom suffered vertebral fractures (multiple in 53 cases); 24 patients reported non-vertebral fractures. The mean spine TBS was 1.207 (1.187-1.228) and the TBS Z-score was -1.86 (-2.07 to -1.65); the area under the curve (AUC) for predicting fracture from the mean spine TBS was 0.548 (95% CI, 0.454-0.641). In the final regression model, the only predictor of fracture occurrence was the 24hUFC level (p = 0.001), with an odds increase of 1.041 (95% CI, 1.019-1.063) for every 100 nmol/24-h cortisol elevation (AUC (24hUFC) = 0.705 (95% CI, 0.629-0.782)). CONCLUSIONS: Young patients with CS have a low TBS.
However, the only predictor of low-traumatic fracture is the severity of the disease itself, indicated by high 24hUFC levels.

Relevance:

40.00%

Publisher:

Abstract:

This thesis examines the effect of operating leverage and financial leverage on the value premium in the Finnish stock market over 2002-2012. The purpose is to examine whether operating leverage and financial leverage affect a firm's BE/ME and stock returns. The accounting data were collected from the Amadeus database and the market-based data from the Datastream database; the sample covers the years 1998 to 2012. The thesis confirms the findings of previous research of a tight connection between operating leverage and BE/ME, and reinforces earlier findings that the relation between financial leverage and BE/ME is not robust. In turn, a relation between operating leverage, BE/ME and stock returns is not clearly perceived in the Finnish stock market during the 2002-2012 period.