981 results for default probability estimation


Relevance:

100.00%

Publisher:

Abstract:

Mendelian models can predict who carries an inherited deleterious mutation of known disease genes based on family history. For example, the BRCAPRO model is commonly used to identify families who carry mutations of BRCA1 and BRCA2, based on familial breast and ovarian cancers. These models incorporate the ages at which relatives were diagnosed with diseases, as well as their current ages or ages at death. We develop a rigorous foundation for handling multiple diseases with censoring. We prove that any disease unrelated to mutations can be excluded from the model, unless it is sufficiently common and dependent on a mutation-related disease time. Furthermore, if a family member has a disease with higher probability density among mutation carriers, but the model does not account for it, then the carrier probability is deflated. However, even if a family only has diseases the model accounts for, if the model excludes a mutation-related disease, then the carrier probability will be inflated. In light of these results, we extend BRCAPRO to account for surviving all non-breast/ovary cancers as a single outcome. The extension also enables BRCAPRO to extract more useful information from male relatives. Using 1,500 families from the Cancer Genetics Network, accounting for surviving other cancers improves BRCAPRO's concordance index from 0.758 to 0.762 (p = 0.046), improves its positive predictive value from 35% to 39% (p < 10⁻⁶) without impacting its negative predictive value, and improves its overall calibration, although calibration slightly worsens for those with carrier probability < 10%. Copyright © 2000 John Wiley & Sons, Ltd.
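The carrier probability these models output is a Bayesian update: the population prevalence of the mutation is the prior, and the likelihood of each relative's observed disease history (or censoring) under the carrier and non-carrier genotypes does the rest. A minimal sketch of that updating step, with purely hypothetical prevalence and likelihood values rather than BRCAPRO's actual penetrance functions:

```python
# Minimal sketch of Mendelian carrier-probability updating. The numbers are
# hypothetical, not BRCAPRO's penetrance estimates; the point is the Bayes step.

prior_carrier = 0.003   # assumed population prevalence of the mutation

# hypothetical densities of the observed family history under each genotype
lik_carrier = 0.12      # P(family's disease/censoring pattern | carrier)
lik_noncarrier = 0.02   # P(family's disease/censoring pattern | non-carrier)

posterior = (prior_carrier * lik_carrier) / (
    prior_carrier * lik_carrier + (1 - prior_carrier) * lik_noncarrier
)
print(f"carrier probability: {posterior:.3f}")  # ~0.018
```

The paper's inflation and deflation results concern what happens to these genotype-conditional likelihoods when a disease is wrongly included in, or excluded from, the model.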

Relevance:

100.00%

Publisher:

Abstract:

In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is regarded as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks, for which the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) incur a high computational cost when the number of connections is large. Two new mechanisms based on the Monte Carlo method (UMCA and ISCA) are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with only a small stochastic error in the probability estimation.
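The abstract does not spell out UMCA or ISCA, but the quantity all of these methods target is the same: the probability that the instantaneous aggregate rate of heterogeneous on/off connections exceeds a bufferless link's capacity. A generic Monte Carlo estimator of that overflow probability, with illustrative traffic classes, might look like:

```python
import random

# Generic Monte Carlo estimate of the overflow probability on a bufferless
# link with heterogeneous on/off connections -- the quantity the convolution
# approach computes exactly. Traffic parameters are illustrative only.

link_capacity = 5_000  # Mb/s
# (peak rate in Mb/s, activity probability, number of connections) per class
classes = [(100, 0.3, 40), (500, 0.1, 12), (50, 0.6, 80)]

def sample_load():
    """Sample the aggregate rate: each connection is independently 'on'
    with its class activity probability."""
    load = 0.0
    for peak, p_on, n in classes:
        load += peak * sum(random.random() < p_on for _ in range(n))
    return load

trials = 100_000
overflow = sum(sample_load() > link_capacity for _ in range(trials))
print(f"estimated overflow probability: {overflow / trials:.5f}")
```

The stochastic error the authors mention is the usual Monte Carlo sampling error, which shrinks as 1/√trials.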

Relevance:

90.00%

Publisher:

Abstract:

This paper studies the evolution of default risk premia for European firms during the years surrounding the recent credit crisis. We use the information embedded in Credit Default Swaps (CDS) and Moody's KMV EDF default probabilities to analyze the common factors driving these risk premia. The risk premium is characterized in several directions. First, we perform a panel data analysis to capture the relationship between CDS spreads and actual default probabilities. Second, we employ the intensity framework of Jarrow et al. (2005) to measure the theoretical effect of the risk premium on expected bond returns. Third, we carry out a dynamic panel data analysis to identify the macroeconomic sources of the risk premium. Finally, a vector autoregressive model analyzes what proportion of the co-movement is attributable to financial or macro variables. Our estimates yield risk premium coefficients substantially higher than those previously reported for US firms, with time-varying behavior. A dominant factor explains around 60% of the common movement in risk premia. Additionally, the empirical evidence suggests a public-to-private risk transfer between sovereign CDS spreads and corporate risk premia.
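For intuition on the premium being estimated: under a constant intensity, the credit-triangle approximation maps a CDS spread to a risk-neutral default intensity, and the ratio of that intensity to the physical (EDF) intensity is a crude one-number version of the risk premium. This back-of-the-envelope sketch, with illustrative values, is not the paper's panel estimator:

```python
# Crude default risk premium: ratio of the CDS-implied risk-neutral intensity
# to the actual (EDF) intensity. All values are illustrative.

cds_spread = 0.0150   # 150 bp annual CDS spread
recovery = 0.40       # assumed recovery rate
edf_1y = 0.0040       # 1-year EDF (physical default probability)

lambda_q = cds_spread / (1 - recovery)   # credit-triangle approximation
risk_premium = lambda_q / edf_1y         # Q-to-P intensity ratio
print(f"risk-neutral intensity: {lambda_q:.4f}, premium ratio: {risk_premium:.2f}")
```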

Relevance:

90.00%

Publisher:

Abstract:

This paper presents an empirical application of the Hull-White (2000) model to the Spanish fixed-income market. The model provides an expression for computing the payments made by the buyer of a credit default swap (CDS), under the assumption of no counterparty risk. It is further assumed that the zero-coupon curve, the (constant) recovery rate and the timing of the credit event are independent. Bonds issued by Banco Santander Central Hispano are used to measure the risk-neutral probability of default and, under a no-arbitrage assumption, the premia of a CDS are computed for an underlying bond with the same credit rating as the reference entity. The premia are found to fit well the credit spreads observed in the market, which are commonly used as an alternative to them.
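As a sketch of the pricing logic, the fair CDS premium equates the expected discounted protection payments to the expected discounted premium payments. Assuming, unlike the paper's market-calibrated curves, a flat risk-free rate and a constant risk-neutral default intensity:

```python
import math

# Fair CDS spread under a constant hazard rate h, recovery R and a flat
# zero curve -- a simplification of the Hull-White (2000) setting, which
# the paper applies with market zero-coupon curves. Values are illustrative.

r, h, R = 0.04, 0.02, 0.40
T, dt = 5.0, 0.25                      # 5-year CDS, quarterly premiums
grid = [dt * i for i in range(1, int(T / dt) + 1)]

# premium leg: pay dt * spread at each date if the reference entity survives
premium_leg = sum(dt * math.exp(-(r + h) * t) for t in grid)
# protection leg: (1 - R) paid at default, discretized over the grid
protection_leg = sum((1 - R) * h * math.exp(-(r + h) * t) * dt for t in grid)

fair_spread = protection_leg / premium_leg   # ~ (1 - R) * h = 120 bp
print(f"fair CDS spread: {fair_spread * 1e4:.1f} bp")
```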

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map indoor radon concentrations (IRC) in Switzerland, taking into account architectural factors, spatial relationships between the measurements, and geological information. METHODS: We examined about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m³. Additionally, we developed a map of a confidence index to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of the kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns obtained earlier. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps: maps corresponding to detached houses with concrete foundations indicate systematically lower IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC at a large scale as well as at a local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while accounting for geological information and spatial relations between IRC measurements.
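A minimal sketch of the kernel estimation idea: the prediction at a query location is a Gaussian-kernel weighted average of nearby measurements, and the exceedance probability is the same estimator applied to the indicator IRC > 300 Bq/m³. The study's models mix coordinates with categorical building variables, each with its own fitted bandwidth; this sketch uses two spatial coordinates and synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(500, 2))            # measurement coordinates (km)
y = 100 + 2 * X[:, 0] + rng.normal(0, 30, 500)    # synthetic IRC values (Bq/m3)

def nw_predict(x_query, X, y, bandwidth=10.0):
    """Nadaraya-Watson estimate: Gaussian-weighted average of measurements."""
    d2 = ((X - x_query) ** 2).sum(axis=1)
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    return (w * y).sum() / w.sum()

def nw_exceed(x_query, X, y, thr=300.0, bandwidth=10.0):
    """Local probability of exceeding thr: same weights, indicator response."""
    d2 = ((X - x_query) ** 2).sum(axis=1)
    w = np.exp(-0.5 * d2 / bandwidth ** 2)
    return (w * (y > thr)).sum() / w.sum()

q = np.array([50.0, 50.0])
print(nw_predict(q, X, y), nw_exceed(q, X, y))
```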

Relevance:

90.00%

Publisher:

Abstract:

In clinical trials, it may be of interest to take physical and emotional well-being into account, in addition to survival, when comparing treatments. Quality-adjusted survival time has the advantage of incorporating information about both survival time and quality of life. In this paper, we discuss the estimation of the expected value of quality-adjusted survival, based on multistate models for the sojourn times in health states. Semiparametric and parametric (exponential distribution) approaches are considered. A simulation study is presented to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to compute the bias and variance of the estimator. © 2007 Elsevier B.V. All rights reserved.
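The estimand can be written as E[Σₛ qₛ Tₛ], where Tₛ is the sojourn time in health state s and qₛ its utility weight. For a single fully observed subject, quality-adjusted survival reduces to a weighted sum, as in this sketch with hypothetical utilities (the paper's estimators additionally handle censoring across subjects, which this ignores):

```python
# Quality-adjusted survival for one fully observed subject: utility-weighted
# sum of sojourn times in each health state. Utilities are hypothetical.

utilities = {"toxicity": 0.5, "remission": 1.0, "relapse": 0.3}
sojourn_years = {"toxicity": 0.4, "remission": 3.1, "relapse": 0.8}

qas = sum(utilities[s] * sojourn_years[s] for s in utilities)
print(f"quality-adjusted survival: {qas:.2f} years")  # 3.54 years
```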

Relevance:

90.00%

Publisher:

Abstract:

Hazard models, also known as time-to-failure or duration models, are used to determine which independent variables have the greatest explanatory power in predicting corporate bankruptcy. They are an alternative to the binary logit and probit models and to discriminant analysis. Duration models should be more efficient than discrete-choice models, since they take survival time into account when estimating the instantaneous probability of bankruptcy from a set of observations on an independent variable. Discrete-choice models typically ignore the time-to-failure information and only provide an estimate of failure within a given time interval. The question discussed in this work is how to use hazard models to project default rates and to build migration matrices conditioned on the state of the economy. Conceptually, the model is closely analogous to the historical default and mortality rates used in the credit literature. The Cox semiparametric proportional hazards model is tested on Brazilian non-financial firms, and the probability of default is observed to fall markedly after the third year from loan issuance. The mean and standard deviation of the default probabilities are also found to be affected by economic cycles. We discuss how the Cox proportional hazards model can be incorporated into the four best-known credit risk management models in current use: CreditRisk+, KMV, CreditPortfolio View and CreditMetrics, and the improvements resulting from that incorporation.
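A minimal sketch of fitting a Cox proportional hazards model to time-to-default data, using the third-party lifelines package and synthetic firm data (the work's covariates and dataset are not reproduced here):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # third-party survival-analysis package

rng = np.random.default_rng(0)
n = 200
leverage = rng.uniform(0.1, 0.9, n)
# exponential default times whose hazard rises with leverage
t_event = rng.exponential(1.0 / (0.1 * np.exp(2.0 * leverage)))
t_censor = rng.uniform(0, 10, n)       # administrative censoring

df = pd.DataFrame({
    "duration": np.minimum(t_event, t_censor),
    "defaulted": (t_event <= t_censor).astype(int),
    "leverage": leverage,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="defaulted")
cph.print_summary()  # the leverage log-hazard-ratio should come out near 2
```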

Relevance:

90.00%

Publisher:

Abstract:

We develop a framework to explain private capital flows between the rest of the world and an emerging economy. The model, based on the monetary premium theory, relates an endogenous supply of foreign capital to an endogenous interest rate differential; its estimation uses the econometric techniques initiated by Heckman. Four questions regarding the capital flows phenomenon are explored, including the statistical process that governs default events and the impact of the probability of default on the interest rate differential. Using this methodology, we analyse the dynamics of foreign capital movements in Brazil during the 1991-1998 period.
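A sketch of Heckman's two-step estimator on synthetic data: a probit for whether a flow is observed (selection), then OLS for the interest rate differential with the inverse Mills ratio from the first step added as a regressor. Variable names and data are illustrative, not the paper's specification:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 500
z = rng.normal(size=n)                                   # selection covariate
u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], n).T
select = (0.5 + z + u > 0).astype(float)                 # flow observed or not
x = rng.normal(size=n)
y = 1.0 + 0.8 * x + e                                    # interest rate differential
y[select == 0] = np.nan                                  # unobserved without a flow

# Step 1: probit on the full sample, then the inverse Mills ratio
probit = sm.Probit(select, sm.add_constant(z)).fit(disp=0)
xb = probit.fittedvalues                                 # linear predictor
mills = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample with the Mills ratio as extra regressor
sel = select == 1
X2 = sm.add_constant(np.column_stack([x[sel], mills[sel]]))
print(sm.OLS(y[sel], X2).fit().params)                   # slope on x near 0.8
```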

Relevance:

90.00%

Publisher:

Abstract:

This work explores sovereign default as a function of the term structure of CDS (Credit Default Swap) spreads. The spreads can be said to reveal a country's probability of default. We apply the proposed methodology to Argentina, Korea, Ecuador, Indonesia, Mexico, Peru, Turkey, Ukraine, Venezuela and Russia. We show that a single-factor model following a lognormal process captures the probability of default. We also show that the macroeconomic variables inflation, unemployment and growth do not explain the dependent variable of the study (the probability of default). Each country reacts differently to the economic crisis that leads it to default on its debt obligations.
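For a sense of how spreads "reveal" the default probability: under a constant intensity and an assumed recovery rate, each tenor's spread maps to a cumulative risk-neutral default probability. A sketch with illustrative sovereign numbers (the study's lognormal single-factor model is richer than this):

```python
import math

recovery = 0.25                            # assumed sovereign recovery rate
spreads = {1: 0.030, 5: 0.045, 10: 0.050}  # tenor (years) -> CDS spread

for tenor, s in spreads.items():
    intensity = s / (1 - recovery)         # credit-triangle approximation
    pd_cum = 1 - math.exp(-intensity * tenor)
    print(f"{tenor}y: implied cumulative default probability {pd_cum:.1%}")
```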

Relevance:

90.00%

Publisher:

Abstract:

Through an empirical exercise, this work extracts the implied default probability curves of Brazilian corporate bonds (debentures). The construction proceeds in two steps. The first challenge is to obtain the term structures of Brazilian debentures; the Diebold and Li (2006) revision of the Nelson and Siegel (1987) model was used to build the yield curves. The second step extracts the default probability using the reduced form of the Duffie and Singleton (1999) model. The loss fraction in case of default was held constant, following the study by Xu and Nencioni (2000), and the decay rate was also kept constant, as proposed by Diebold and Li (2006) and Araújo (2012). The exercise was replicated for three distinct dates during the interest-rate-cutting cycle in Brazil. Among the results of this study, we find that market agents reduced the issuers' probability of default over this period, and that the reduction at the shorter maturities was more pronounced than at the longer ones.
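With the decay parameter held fixed, as the study does following Diebold and Li (2006), the Nelson-Siegel curve is linear in its three factors, so each date's term structure can be fitted by ordinary least squares. A sketch with illustrative yields (not the study's debenture data):

```python
import numpy as np

lam = 0.0609   # the fixed monthly decay value used by Diebold and Li (2006)
taus = np.array([3, 6, 12, 24, 36, 60, 120])                   # months
yields = np.array([11.0, 11.2, 11.5, 11.9, 12.1, 12.3, 12.4])  # % p.a., illustrative

f1 = np.ones_like(taus, dtype=float)              # level loading
f2 = (1 - np.exp(-lam * taus)) / (lam * taus)     # slope loading
f3 = f2 - np.exp(-lam * taus)                     # curvature loading
F = np.column_stack([f1, f2, f3])

beta, *_ = np.linalg.lstsq(F, yields, rcond=None)
print("level, slope, curvature:", beta.round(3))
```

In the Duffie-Singleton reduced form, the default intensity then comes from the spread between this corporate curve and the risk-free one, scaled by the (constant) loss fraction.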

Relevance:

90.00%

Publisher:

Abstract:

An extension of some standard likelihood-based procedures to heteroscedastic nonlinear regression models under scale mixtures of skew-normal (SMSN) distributions is developed. This novel class of models provides a useful generalization of the heteroscedastic symmetrical nonlinear regression models (Cysneiros et al., 2010), since the random-term distributions cover symmetric as well as asymmetric and heavy-tailed distributions such as the skew-t, skew-slash and skew-contaminated normal, among others. A simple EM-type algorithm for iteratively computing maximum likelihood estimates of the parameters is presented, and the observed information matrix is derived analytically. To examine the performance of the proposed methods, simulation studies are presented that show the robustness of this flexible class against outlying and influential observations, and that the maximum likelihood estimates based on the EM-type algorithm have good asymptotic properties. Furthermore, local influence measures and one-step approximations of the estimates in the case-deletion model are obtained. Finally, the methodology is illustrated on a data set previously analyzed under the homoscedastic skew-t nonlinear regression model. © 2012 Elsevier B.V. All rights reserved.
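The scale-mixture-of-skew-normal construction the class rests on is easy to state generatively: a skew-normal draw divided by the square root of a positive mixing variable; a Gamma(ν/2, ν/2) mixer gives the skew-t case. A sketch of that representation (sampling only, not the paper's EM algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, nu, n = 3.0, 4.0, 10_000
delta = alpha / np.sqrt(1 + alpha ** 2)

# skew-normal via its half-normal stochastic representation
u0 = np.abs(rng.normal(size=n))
u1 = rng.normal(size=n)
skew_normal = delta * u0 + np.sqrt(1 - delta ** 2) * u1

w = rng.gamma(nu / 2, 2 / nu, size=n)   # Gamma(nu/2, rate nu/2), E[w] = 1
skew_t = skew_normal / np.sqrt(w)       # asymmetric, heavy-tailed draws
print(skew_t.mean(), skew_t.std())
```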

Relevance:

80.00%

Publisher:

Abstract:

This paper uses the framework developed by Vrugt (2010) to extract the recovery rate and the term structure of risk-neutral default probabilities implied in the cross-section of Portuguese sovereign bonds outstanding between March and August 2011. During this period the expected recovery rate remains firmly anchored around 50 percent, while the instantaneous default probability increases steadily from 6 to above 30 percent. These parameters are then used to calculate the fair value of 5-year and 10-year CDS contracts. A credit-risk-neutral strategy is developed from the difference between the market price of CDS of the same tenors and the calculated fair value, yielding a Sharpe ratio of 3.2.
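The pricing equation being inverted can be sketched as follows: under a constant risk-neutral hazard rate and recovery-of-face, a bond's model price is the survival-weighted discounted coupons and principal, plus the recovery paid on default; the paper fits the recovery/intensity pair to the cross-section of bond prices. Parameters below are illustrative, not the paper's estimates:

```python
import math

r, h, R = 0.04, 0.20, 0.50   # risk-free rate, default intensity, recovery
coupon, T = 0.05, 5          # 5% annual coupon, 5-year maturity, unit face

def surv(t):
    """Risk-neutral survival probability to time t."""
    return math.exp(-h * t)

# coupons and principal are paid only on survival; recovery R of face is
# paid on default, here discretized to an annual grid
price = sum(coupon * math.exp(-r * t) * surv(t) for t in range(1, T + 1))
price += math.exp(-r * T) * surv(T)
price += sum(R * math.exp(-r * t) * (surv(t - 1) - surv(t))
             for t in range(1, T + 1))
print(f"model bond price: {price:.4f}")
```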

Relevance:

80.00%

Publisher:

Abstract:

Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. The zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for delimiting the runout zones. First, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and on the analysis of two sets of aerial photographs for the temporal probability estimation. The hazard initiation map was then used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of debris flow magnitude was not attempted, since the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used, together with land use, geology and debris flow hazard initiation maps, as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Runout areas were then calculated using multiple flow direction and energy-based algorithms, and maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared: the first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing the areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach, associating automatic detection of the source areas with a simple assessment of debris flow spreading, provided results for subsequent hazard and risk studies. However, more testing is needed to validate the parameters and to transfer the results to other study areas.
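As a toy illustration of the multiple-flow-direction idea behind the runout delimitation (not Flow-R's actual algorithm, which adds persistence and energy constraints): a source cell's weight is spread among its lower neighbours in proportion to the elevation drop.

```python
import numpy as np

# Toy multiple-flow-direction spreading step on a 3x3 DEM patch.
dem = np.array([[10.0, 9.0, 8.0],
                [ 9.5, 8.5, 7.0],
                [ 9.0, 8.0, 6.0]])
weights = np.zeros_like(dem)
r0, c0 = 0, 0
weights[r0, c0] = 1.0            # debris flow source cell

drops = {}
for dr in (-1, 0, 1):
    for dc in (-1, 0, 1):
        r, c = r0 + dr, c0 + dc
        if (dr, dc) != (0, 0) and 0 <= r < 3 and 0 <= c < 3:
            drop = dem[r0, c0] - dem[r, c]
            if drop > 0:
                drops[(r, c)] = drop

total = sum(drops.values())
for (r, c), drop in drops.items():
    weights[r, c] += weights[r0, c0] * drop / total
print(weights.round(2))          # weight shared among downslope neighbours
```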