851 results for Competing Risk Model
Abstract:
This paper traces the developments of credit risk modeling over the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on the historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we select the articles most closely related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, comparing our selection with the classical review of Altman and Saunders (1998), we observe some important changes in the research methods of credit risk. The main finding is that the current focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.
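Below is a minimal sketch of the article-selection step described in this abstract, assuming a synthetic dataset of one-hot JEL-code indicators and an ordinal relevance label (all column names and data are hypothetical); it fits an ordered logit with statsmodels and ranks articles by the predicted probability of the top relevance class.

```python
# Sketch: ordered logit over JEL-code indicators to rank article relevance.
# The columns and synthetic labels are hypothetical stand-ins, not the paper's data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
X = pd.DataFrame({
    "G21": rng.integers(0, 2, n),   # banks and other depository institutions
    "C41": rng.integers(0, 2, n),   # duration analysis
    "E44": rng.integers(0, 2, n),   # financial markets and the macroeconomy
})
# Synthetic ordinal label: 0 = unrelated, 1 = somewhat related, 2 = highly related.
latent = 1.2 * X["G21"] + 0.8 * X["C41"] + 0.3 * X["E44"] + rng.logistic(size=n)
relevance = pd.cut(latent, bins=[-np.inf, 0.5, 1.5, np.inf], labels=[0, 1, 2]).astype(int)

model = OrderedModel(relevance, X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.params)

# Rank articles by the predicted probability of the "highly related" class.
p_high = np.asarray(result.predict(X))[:, -1]
ranked = X.assign(p_highly_related=p_high).sort_values("p_highly_related", ascending=False)
print(ranked.head())
```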
Abstract:
An operational complexity model (OCM) is proposed to enable the complexity of both the cognitive and the computational components of a process to be determined. From the complexity of formation of a set of traces via a specified route a measure of the probability of that route can be determined. By determining the complexities of alternative routes leading to the formation of the same set of traces, the odds ratio indicating the relative plausibility of the alternative routes can be found. An illustrative application to a BitTorrent piracy case is presented, and the results obtained suggest that the OCM is capable of providing a realistic estimate of the odds ratio for two competing hypotheses. It is also demonstrated that the OCM can be straightforwardly refined to encompass a variety of circumstances.
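As a toy illustration of turning two route complexities into an odds ratio, the snippet below assumes (purely for illustration, not from the paper) that route probability decays exponentially with complexity measured in bits, so the odds ratio depends only on the complexity gap between the competing hypotheses.

```python
# Toy illustration of comparing two competing hypotheses by relative plausibility.
# Assumption (not the paper's OCM definition): route probability is proportional
# to 2 ** (-complexity_in_bits), so the odds ratio depends only on the gap.

def odds_ratio(complexity_h1_bits: float, complexity_h2_bits: float) -> float:
    """Odds of hypothesis H1 versus H2 under the exponential-decay assumption."""
    return 2.0 ** (complexity_h2_bits - complexity_h1_bits)

# Example: the observed traces are 6 bits "cheaper" to produce under H1 than
# under H2, giving odds of 64:1 in favour of H1.
print(odds_ratio(complexity_h1_bits=20.0, complexity_h2_bits=26.0))
```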
Abstract:
The reliable evaluation of flood forecasts is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed or lumped) have been proposed to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty for a correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed, uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into different sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented in order to estimate the outflow hydrograph. Then, synthetic observations of uncertain discharge were generated as a function of the observed and simulated flow at the basin outlet and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed to update the model state in response to the uncertain discharge observations. The results of this work point out that, overall, the assimilation of uncertain observations can improve hydrological model performance. In particular, it was found that the model structure is an important factor, difficult to characterize, since it can induce different forecasts in terms of outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
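A minimal scalar Kalman filter update is sketched below to show how an uncertain discharge observation could correct a lumped storage state; the linear-reservoir dynamics, noise variances and variable names are illustrative assumptions, not the Bacchiglione model used in the study.

```python
# Sketch: assimilating one uncertain discharge observation into a lumped storage
# state with a scalar Kalman filter. Model, noise levels and names are illustrative.

# Forecast step: linear reservoir, S_{k+1} = a*S_k + P (storage), Q = c*S (discharge).
a, c = 0.9, 0.05            # recession and storage-discharge coefficients (assumed)
S, P_var = 120.0, 25.0      # prior storage state [mm] and its error variance

S_fcst = a * S + 10.0                 # propagate state with 10 mm of effective rainfall
P_fcst = a ** 2 * P_var + 4.0         # propagate variance, adding model-error variance

# Update step with an uncertain (synthetic) discharge observation.
q_obs, R = 6.8, 0.5         # observed discharge [mm/day] and its error variance
H = c                        # observation operator: Q = c * S
innovation = q_obs - H * S_fcst
K = P_fcst * H / (H * P_fcst * H + R)   # Kalman gain
S_upd = S_fcst + K * innovation
P_upd = (1.0 - K * H) * P_fcst

print(f"storage: {S_fcst:.1f} -> {S_upd:.1f} mm, variance: {P_fcst:.1f} -> {P_upd:.1f}")
```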
Abstract:
In the UK, urban river basins are particularly vulnerable to flash floods caused by short, intense rainfall. This paper presents potential flood resilience approaches for the highly urbanised Wortley Beck river basin, south west of Leeds city centre. The reach of Wortley Beck is approximately 6 km long, with a contributing catchment area of 30 km2 that drains into the River Aire. Lower Wortley has experienced regular flooding over the last few years from a range of sources, including Wortley Beck and surface and ground water, affecting properties both upstream and downstream of Farnley Lake as well as Wortley Ring Road. This has serious implications for society, the environment and economic activity in the City of Leeds. The first stage of the study involves systematically incorporating Wortley Beck's landscape features on an ArcGIS platform to identify existing green features in the region. This process also enables the exploration of potential blue-green features (green spaces, green roofs, water retention ponds and swales) at appropriate locations and their connection with existing green corridors to maximize their productivity. The next stage involves developing a detailed 2D urban flood inundation model for the Wortley Beck region using the CityCat model. CityCat is capable of modelling the effects of permeable/impermeable ground surfaces and buildings/roofs to generate flood depth and velocity maps at 1 m resolution for design storm events. The final stage of the study involves simulating a range of rainfall and flood event scenarios through the CityCat model with different blue-green features. The installation of other hard-engineering individual property protection measures, such as water butts and flood walls, is also incorporated in the CityCat model. This enables an integrated, sustainable flood resilience strategy for the region.
Abstract:
The digital elevation model (DEM) plays a substantial role in hydrological studies, from understanding catchment characteristics and setting up a hydrological model to mapping flood risk in a region. Depending on the nature of a study and its objectives, a high-resolution and reliable DEM is often desired to set up a sound hydrological model. However, such a DEM is not always available and is generally expensive. Obtained through radar-based remote sensing, the Shuttle Radar Topography Mission (SRTM) is a publicly available DEM with a resolution of 92 m outside the US. It is a great source of elevation data where no surveyed DEM is available. However, apart from its coarse resolution, SRTM suffers from inaccuracy, especially in areas with dense vegetation cover, because radar signals do not penetrate the canopy. This can lead to improper model setup as well as erroneous mapping of flood risk. This paper attempts to improve the SRTM dataset using the Normalised Difference Vegetation Index (NDVI), derived from the visible red and near-infrared bands of Landsat at 30 m resolution, and artificial neural networks (ANN). The assessment of the improvement and the applicability of this method in hydrology are highlighted and discussed.
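The snippet below sketches the general idea, assuming a small neural network (scikit-learn's MLPRegressor) learns a vegetation-related elevation error from NDVI and then corrects the DEM; all arrays are synthetic, and the choice of predictors is an assumption rather than the paper's exact inputs.

```python
# Sketch: learning a vegetation-related SRTM elevation error from NDVI with a small
# neural network, then correcting the DEM. Data are synthetic; in practice the error
# target would come from co-located reference elevations (e.g. surveyed points).
# NDVI itself is computed as (NIR - Red) / (NIR + Red) from the Landsat bands.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n = 2000
ndvi = rng.uniform(0.0, 0.9, n)        # NDVI derived from Landsat red/NIR bands
slope = rng.uniform(0.0, 20.0, n)      # optional terrain predictor (assumed)
# Synthetic "truth": canopy makes SRTM read high where NDVI is large.
dem_error = 12.0 * ndvi + 0.1 * slope + rng.normal(0.0, 1.0, n)

X = np.column_stack([ndvi, slope])
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X, dem_error)

# Apply the learned correction: subtract the predicted vegetation bias from SRTM.
srtm = rng.uniform(50.0, 300.0, n)     # raw SRTM elevations [m]
srtm_corrected = srtm - model.predict(X)
print("mean predicted bias [m]:", round(float(model.predict(X).mean()), 2))
```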
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open space is limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested on the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation, with the aim of minimizing flood damage. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
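A compact sketch of the MPC-GA idea follows: a genetic algorithm proposes gate settings, a toy conceptual routing model predicts levels over the prediction horizon, and a cost function penalises levels above a flood threshold. The dynamics, thresholds and GA settings are all illustrative assumptions, not the Demer model.

```python
# Sketch: a genetic algorithm proposing gate settings that a toy conceptual routing
# model evaluates over a prediction horizon; the cost penalises water levels above
# a flood threshold. All dynamics and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
horizon, n_gates = 24, 3
inflow_forecast = 40.0 + 30.0 * np.exp(-0.5 * ((np.arange(horizon) - 8) / 3.0) ** 2)

def simulate_levels(gate_openings):
    """Toy conceptual model: one storage per gate; the level rises with inflow and
    falls with the release allowed by the gate opening (0..1)."""
    levels = np.zeros((n_gates, horizon))
    storage = np.full(n_gates, 100.0)
    for t in range(horizon):
        for g in range(n_gates):
            release = 50.0 * gate_openings[g, t]
            storage[g] += inflow_forecast[t] / n_gates - release
            levels[g, t] = storage[g] / 20.0      # crude stage-storage relation
    return levels

def cost(gate_openings, flood_level=7.0):
    levels = simulate_levels(gate_openings)
    return np.sum(np.maximum(levels - flood_level, 0.0) ** 2)

# Plain GA: random initialisation, truncation selection, uniform crossover, mutation.
pop = rng.uniform(0.0, 1.0, (40, n_gates, horizon))
for generation in range(60):
    scores = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:10]]
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        mask = rng.random((n_gates, horizon)) < 0.5
        child = np.where(mask, a, b) + rng.normal(0.0, 0.05, (n_gates, horizon))
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.array(children)

best = min(pop, key=cost)
print("minimum flood cost over horizon:", round(cost(best), 2))
```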
Abstract:
A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model using grey numbers is presented. With the grey numbers technique, the uncertainty is characterized by an interval; once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values while being as narrow as possible. The approach is applied to a real case study, which highlights that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced using a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual rainfall-runoff grey model is calibrated, and the uncertainty bands obtained after calibration and after validation are compared with those obtained with a well-established approach for characterizing uncertainty, GLUE. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
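A minimal sketch of grey-number (interval) propagation is given below, assuming a toy linear-reservoir model that is monotone in its parameters so the band can be obtained from the lower and upper parameter bounds; calibration would then adjust those bounds to meet the coverage target with the narrowest possible band. Model and numbers are illustrative only.

```python
# Sketch: grey (interval) parameters propagated through a toy linear-reservoir
# rainfall-runoff model that is monotone in its parameters, so the envelope is
# obtained from the parameter bounds. All values are illustrative.
import numpy as np

rain = np.array([0.0, 5.0, 20.0, 12.0, 3.0, 0.0, 0.0, 1.0])

def simulate(k, c, rain):
    """Linear reservoir: S' = (1-k)*S + rain, Q = c*S."""
    S, q = 0.0, []
    for p in rain:
        S = (1.0 - k) * S + p
        q.append(c * S)
    return np.array(q)

# Grey parameters expressed as intervals [lower, upper].
k_grey, c_grey = (0.15, 0.35), (0.2, 0.4)

# Discharge decreases with k and increases with c, so the band follows directly.
q_lower = simulate(k_grey[1], c_grey[0], rain)
q_upper = simulate(k_grey[0], c_grey[1], rain)

q_obs = np.array([0.0, 1.2, 5.5, 6.0, 4.1, 2.9, 2.0, 1.5])
coverage = ((q_obs >= q_lower) & (q_obs <= q_upper)).mean()
mean_width = (q_upper - q_lower).mean()
print(f"coverage of observations: {coverage:.0%}, mean band width: {mean_width:.2f}")
# Calibration would adjust the grey bounds so coverage meets the assigned
# percentage while keeping the mean band width as small as possible.
```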
Abstract:
Due to the increase in water demand and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consulting offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, merging hydrological and reservoir simulation/optimization models with numerical weather predictions, radar and satellite data. Model performance is strongly related to the streamflow forecast, its uncertainty, and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on a previously developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in the northwestern part of Turkey. The reservoir is a typical example because of its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System (EPS) data are used as the main input, and the flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Thus, compared with the deterministic approach, it provides the operator with risk ranges in terms of spillway discharges and reservoir level. The framework is fully data driven, applicable and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
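The sketch below illustrates the deterministic-versus-ensemble comparison with a toy mass-balance reservoir and a single spill rule; capacities, demands and the synthetic inflow ensemble are assumptions, not the Yuvacık/HEC-ResSim configuration.

```python
# Sketch: running a rule-based reservoir mass balance over an ensemble of inflow
# forecasts versus a single deterministic trace, to express spill as a risk range.
# Reservoir characteristics and operating rule are illustrative.
import numpy as np

rng = np.random.default_rng(7)
horizon, capacity, demand = 10, 60.0, 3.0      # days, hm3, hm3/day (assumed)
deterministic = 5.0 + 2.0 * np.sin(np.linspace(0, 3, horizon))
ensemble = deterministic * rng.lognormal(mean=0.0, sigma=0.3, size=(50, horizon))

def simulate(inflows, storage0=45.0):
    storage, spills = storage0, []
    for q_in in inflows:
        storage += q_in - demand
        spill = max(storage - capacity, 0.0)   # simple rule: spill the excess
        storage -= spill
        spills.append(spill)
    return storage, sum(spills)

det_storage, det_spill = simulate(deterministic)
ens = np.array([simulate(trace) for trace in ensemble])
print("deterministic: final storage %.1f hm3, spill %.1f hm3" % (det_storage, det_spill))
print("ensemble spill, 5th-95th percentile: %.1f-%.1f hm3"
      % tuple(np.percentile(ens[:, 1], [5, 95])))
```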
Abstract:
The Delaware River provides half of New York City's drinking water and is a habitat for wild trout, American shad and the federally endangered dwarf wedge mussel. It has suffered four 100-year floods in the last seven years. A drought during the 1960s stands as a warning of the potential vulnerability of the New York City area to severe water shortages if a similar drought were to recur. The water releases from three New York City dams on the Delaware River's headwaters affect not only the reliability of the city's water supply, but also the potential impact of floods and the quality of aquatic habitat in the upper river. The goal of this work is to influence the Delaware River water release policies (FFMP/OST) so as to further benefit river habitat and fisheries without increasing New York City's drought risk or the flood risk to down-basin residents. The Delaware water release policies are constrained by the dictates of two US Supreme Court decrees (1931 and 1954) and the need for unanimity among four states (New York, New Jersey, Pennsylvania and Delaware) and New York City. Coordination of their activities and operation under the existing decrees is provided by the Delaware River Basin Commission (DRBC). Questions such as the probability of the system approaching a drought state under the current FFMP plan and the severity of the 1960s drought are addressed using long-record paleo-reconstructions of flows. For this study, we developed reconstructed total annual flows (water year) for three reservoir inflows using regional tree rings going back to 1754 (a total of 246 years). The reconstructed flows are used with a simple reservoir model to quantify droughts. We observe that the 1960s drought is by far the worst drought in the 246 years of simulations (since 1754).
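A minimal sketch of the drought-quantification step follows: a simple annual storage model driven by a 246-year flow series with a drought-state threshold. The flow series here is synthetic (the study uses tree-ring reconstructions), and all reservoir numbers are illustrative.

```python
# Sketch: driving a simple annual storage model with reconstructed water-year flows
# and flagging drought years when storage falls below a threshold. The flow series
# is synthetic; the study uses tree-ring reconstructions back to 1754.
import numpy as np

rng = np.random.default_rng(1960)
years = np.arange(1754, 2000)                                 # 246 water years
flows = rng.gamma(shape=4.0, scale=250.0, size=years.size)    # annual inflow [Mm3]
flows[206:213] *= 0.45        # impose a sustained 1960s-style dry spell (illustrative)

capacity, demand = 1500.0, 800.0                              # storage and draft [Mm3]
storage, drought_years = capacity, []
for year, q in zip(years, flows):
    storage = min(storage + q - demand, capacity)
    storage = max(storage, 0.0)
    if storage < 0.25 * capacity:                             # drought-state threshold
        drought_years.append(year)

print("years in drought state:", len(drought_years))
print("first drought years flagged:", drought_years[:5])
```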
Abstract:
Standard models of moral hazard predict a negative relationship between risk and incentives, but the empirical work has not confirmed this prediction. In this paper, we propose a model with adverse selection followed by moral hazard, where effort and the degree of risk aversion are private information of an agent who can control the mean and the variance of profits. For a given contract, more risk-averse agents supply more effort in risk reduction. If the marginal utility of incentives decreases with risk aversion, more risk-averse agents prefer lower-incentive contracts; thus, in the optimal contract, incentives are positively correlated with endogenous risk. In contrast, if risk aversion is high enough, the possibility of reduction in risk makes the marginal utility of incentives increasing in risk aversion and, in this case, risk and incentives are negatively related.
Abstract:
In recent years, regulating agencies of many countries in the world, following recommendations of the Basel Committee, have compelled financial institutions to maintain minimum capital requirements to cover market risk. This paper investigates the consequences of this kind of regulation for social welfare and for the soundness of financial institutions through an equilibrium model. We show that the optimal level of regulation for each financial institution (the level that maximizes its utility) depends on its appetite for risk, and some institutions can perform better in a regulated economy. In addition, another important result asserts that under certain market conditions the financial fragility of an institution can be greater in a regulated economy than in an unregulated one.
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have mainly been adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage affects forecasting. We construct cross-sectional (allowing arbitrages) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
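The out-of-sample evaluation described above boils down to computing bias and root mean square error by maturity and horizon; the sketch below does this for two hypothetical forecast sets standing in for the cross-sectional and arbitrage-free versions.

```python
# Sketch: bias and RMSE of out-of-sample yield forecasts per maturity.
# The arrays are synthetic stand-ins, not the paper's U.S. Treasury data.
import numpy as np

rng = np.random.default_rng(0)
maturities = [3, 12, 60, 120]          # months
realized = 4.0 + rng.normal(0.0, 0.2, (100, len(maturities)))      # realized yields (%)
f_cross_section = realized + rng.normal(0.05, 0.30, realized.shape)
f_no_arbitrage = realized + rng.normal(0.02, 0.22, realized.shape)

def bias_rmse(forecast, realized):
    err = forecast - realized
    return err.mean(axis=0), np.sqrt((err ** 2).mean(axis=0))

for name, f in [("cross-section", f_cross_section), ("no-arbitrage", f_no_arbitrage)]:
    bias, rmse = bias_rmse(f, realized)
    print(name, "bias:", bias.round(3), "RMSE:", rmse.round(3))
```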
Abstract:
This paper presents a theoretical model which discusses the role played by entrepreneurial risk in the distribution of national income. In a two-period general equilibrium framework with competitive risk-averse entrepreneurs, it is shown that the higher the risk borne by firms, the lower will be real wages, employment and labour's share in output. The model helps explain the fall of labour's share in Brazilian output during the 1980s.
Abstract:
Verdelhan (2009) shows that if one wishes to explain the behavior of the risk premium in foreign bond markets using the external habit formation model proposed by Campbell and Cochrane (1999), the equilibrium risk-free return must be specified pro-cyclically. We show that this specification is possible only under implausible calibration parameters. Moreover, during calibration, for most reasonable parameters the price-consumption ratio diverges. However, by adopting the suggestion proposed by Verdelhan (2009), namely fixing the sensitivity function (s_t) at its steady-state value during calibration and releasing it only when simulating the data so as to guarantee pro-cyclical risk-free rates, we are able to find a finite and well-behaved value for the equilibrium price-consumption ratio and to replicate the forward premium anomaly. Setting aside possible inconsistencies of this procedure, under pro-cyclical risk-free returns, as suggested by Wachter (2006), the model generates real yield curves that are decreasing in maturity regardless of the state of the economy, a result at odds with the related literature and with actual yield data.
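As a sketch of the calibration device described above, the snippet below evaluates a sensitivity function in the standard Campbell and Cochrane (1999) functional form, holds it at its steady-state value as one would during calibration, and lets it vary with the state during simulation; the parameter values are illustrative, not those used in the paper.

```python
# Sketch: the Campbell-Cochrane (1999) sensitivity function lambda(s_t) and the
# device of holding it at its steady-state value during calibration.
# Parameter values are illustrative assumptions.
import numpy as np

gamma, phi, sigma_c = 2.0, 0.87, 0.015          # risk aversion, persistence, cons. vol.
S_bar = sigma_c * np.sqrt(gamma / (1.0 - phi))  # steady-state surplus consumption ratio
s_bar = np.log(S_bar)
s_max = s_bar + 0.5 * (1.0 - S_bar ** 2)

def sensitivity(s_t):
    """Standard form: (1/S_bar)*sqrt(1 - 2*(s_t - s_bar)) - 1, and 0 above s_max."""
    if s_t >= s_max:
        return 0.0
    return (1.0 / S_bar) * np.sqrt(1.0 - 2.0 * (s_t - s_bar)) - 1.0

# During calibration, hold lambda at its steady-state value ...
lambda_calibration = sensitivity(s_bar)         # equals 1/S_bar - 1
# ... and let it vary with the state only when simulating data.
s_path = s_bar + np.random.default_rng(0).normal(0.0, 0.01, 5)
lambda_simulation = [sensitivity(s) for s in s_path]
print(round(lambda_calibration, 2), [round(x, 2) for x in lambda_simulation])
```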
Abstract:
This research project is part of the project entitled "Credibilidade de Políticas Monetárias e Fiscais para o Brasil: Risco Soberano, Instituições, Âncoras Nominais, e Acesso aos Mercados Financeiros Internacionais" (Credibility of Monetary and Fiscal Policies for Brazil: Sovereign Risk, Institutions, Nominal Anchors, and Access to International Financial Markets). Within the current stabilization plan, an empirical study of the Brazilian economy provides a vivid example of the impact of several factors, such as the degree of institutionalization of the monetary and budgetary policies in use since the implementation of the Plano Real, which would increase credibility by sustaining the exchange-rate policy and the positive inflow of international capital, on the market's perception of the risk of default on the external debt of a developing country. The focus of this research project will be the research question: "Risk premium on sovereign bonds and discretionary fiscal policy versus fiscal policy rules for a developing country: the case of Brazil".