936 results for Natural Catastrophe, Property Insurance, Loss Distribution, Truncated Data, Ruin Probability


Relevance:

100.00%

Publisher:

Abstract:

Catastrophe risk models used by the insurance industry are likely subject to significant uncertainty, but due to their proprietary nature and strict licensing conditions they are not available for experimentation. In addition, even if such experiments were conducted, these would not be repeatable by other researchers because commercial confidentiality issues prevent the details of proprietary catastrophe model structures from being described in public domain documents. However, such experimentation is urgently required to improve decision making in both insurance and reinsurance markets. In this paper we therefore construct our own catastrophe risk model for flooding in Dublin, Ireland, in order to assess the impact of typical precipitation data uncertainty on loss predictions. As we consider only a city region rather than a whole territory and have access to detailed data and computing resources typically unavailable to industry modellers, our model is significantly more detailed than most commercial products. The model consists of four components: a stochastic rainfall module, a hydrological and hydraulic flood hazard module, a vulnerability module, and a financial loss module. Using these we undertake a series of simulations to test the impact of driving the stochastic event generator with four different rainfall data sets: ground gauge data, gauge-corrected rainfall radar, meteorological reanalysis data (European Centre for Medium-Range Weather Forecasts Reanalysis-Interim; ERA-Interim) and a satellite rainfall product (the Climate Prediction Center morphing method; CMORPH). Catastrophe models are unusual because they use the upper three components of the modelling chain to generate a large synthetic database of unobserved and severe loss-driving events for which estimated losses are calculated. We find the loss estimates to be more sensitive to uncertainties propagated from the driving precipitation data sets than to other uncertainties in the hazard and vulnerability modules, suggesting that the range of uncertainty within catastrophe model structures may be greater than commonly believed.
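
To make the structure of such a modelling chain concrete, the following is a minimal, purely illustrative Python sketch of the four components; every function, distribution and parameter below is a hypothetical placeholder rather than the Dublin model itself.

import random

# Illustrative four-component chain: stochastic rainfall -> flood hazard
# -> vulnerability -> financial loss. All relations are invented.

def stochastic_rainfall_event(rng):
    # Draw a synthetic daily rainfall total (mm); placeholder distribution.
    return rng.lognormvariate(3.5, 0.6)

def flood_hazard(rainfall_mm):
    # Map rainfall to a flood depth (m) with an assumed threshold relation.
    return max(0.0, 0.01 * (rainfall_mm - 40.0))

def vulnerability(depth_m):
    # Assumed depth-damage curve: fraction of the property value lost.
    return min(1.0, 0.25 * depth_m)

def financial_loss(damage_fraction, insured_value=250_000.0):
    # Convert the damage fraction into a monetary loss for one property.
    return damage_fraction * insured_value

rng = random.Random(42)
losses = [financial_loss(vulnerability(flood_hazard(stochastic_rainfall_event(rng))))
          for _ in range(100_000)]
print("mean loss per synthetic event:", sum(losses) / len(losses))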

Relevance:

100.00%

Publisher:

Abstract:

Potential future changes in tropical cyclone (TC) characteristics are among the more serious regional threats of global climate change. Therefore, a better understanding of how anthropogenic climate change may affect TCs and how these changes translate into socio-economic impacts is required. Here, we apply a TC detection and tracking method that was developed for ERA-40 data to time-slice experiments of two atmospheric general circulation models, namely the fifth version of the European Centre Hamburg model (MPI, Hamburg, Germany, T213) and the Japan Meteorological Agency/Meteorological Research Institute model (MRI, Tsukuba, Japan, TL959). For each model, two climate simulations are available: a control simulation for present-day conditions to evaluate the model against observations, and a scenario simulation to assess future changes. The evaluation of the control simulations shows that the number of intense storms is underestimated due to the model resolution. To overcome this deficiency, simulated cyclone intensities are scaled to the best track data, leading to a better representation of the TC intensities. Both models project an increased number of major hurricanes and modified trajectories in their scenario simulations. These changes have an effect on the projected loss potentials. However, these state-of-the-art models still yield contradictory results, and therefore they are not yet suitable to provide robust estimates of losses due to uncertainties in simulated hurricane intensity, location and frequency.
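
One simple way to perform such a scaling of simulated intensities to best-track observations is empirical quantile mapping. The sketch below only illustrates the idea and makes no claim to reproduce the authors' procedure; all data and parameters are invented.

import numpy as np

def quantile_scale(simulated, observed):
    # Map each simulated intensity to the observed value at the same
    # empirical quantile, correcting the model's low bias for intense storms.
    sim_sorted = np.sort(simulated)
    obs_quantiles = np.quantile(observed, np.linspace(0, 1, sim_sorted.size))
    return np.interp(simulated, sim_sorted, obs_quantiles)

# Hypothetical example: simulated winds (m/s) biased low versus best track.
rng = np.random.default_rng(0)
simulated = rng.gamma(shape=6.0, scale=5.0, size=500)    # underestimated intensities
best_track = rng.gamma(shape=6.0, scale=7.0, size=800)   # observed proxy
scaled = quantile_scale(simulated, best_track)
print(f"max simulated: {simulated.max():.1f}  max scaled: {scaled.max():.1f}")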

Relevance:

100.00%

Publisher:

Abstract:

In the current discussion on the effective measurement of operational risk on the basis of internal models, the Loss Distribution Approach in particular has received special attention in the literature. This approach has its roots in a traditional framework of actuarial mathematics, collective risk theory. The present paper therefore presents the basic elements of collective risk theory, establishes the connection to the modelling of operational risk, and gives an overview of current developments within the Loss Distribution Approach.
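
At the core of collective risk theory, and hence of the Loss Distribution Approach, is the compound aggregate loss S = X_1 + ... + X_N with a random claim count N and i.i.d. severities X_i. The following is a minimal Monte Carlo sketch with assumed Poisson frequency and lognormal severity parameters; the numbers are illustrative only.

import numpy as np

rng = np.random.default_rng(1)
n_periods = 100_000             # simulated periods
freq_lambda = 3.0               # expected number of loss events per period (assumed)
sev_mu, sev_sigma = 10.0, 1.2   # lognormal severity parameters (assumed)

counts = rng.poisson(freq_lambda, size=n_periods)
aggregate = np.array([rng.lognormal(sev_mu, sev_sigma, size=n).sum()
                      for n in counts])

# Typical quantities of interest for operational-risk capital.
print("expected aggregate loss:", aggregate.mean())
print("99.9% quantile (OpVaR): ", np.quantile(aggregate, 0.999))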

Relevance:

100.00%

Publisher:

Abstract:

The performance of high-speed network communications frequently depends on how data streams are distributed. In this paper, a dynamic data-stream balancing architecture based on link information is first introduced and discussed. Algorithms are then proposed for rapidly and simultaneously acquiring the nodes and links traversed by a path between any two source-destination nodes, together with a dynamic data-stream distribution plan. Related topics such as data-fragment handling and fair service are also studied and discussed. In addition, the performance and efficiency of the proposed algorithms, especially with respect to fair service and convergence, are evaluated in a demonstration measuring the rate of bandwidth utilization. We hope the discussion presented here will help application developers select an effective strategy for planning the distribution of data streams.
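
As a purely illustrative sketch (not taken from the paper), one straightforward distribution plan assigns each candidate path a share of the stream proportional to the bandwidth reported by its link information; the path names and figures below are invented.

from dataclasses import dataclass

@dataclass
class Path:
    name: str
    available_mbps: float   # reported link information for this path

def plan_distribution(paths, stream_mbps):
    # Assign a share of the stream to each path, proportional to capacity.
    total = sum(p.available_mbps for p in paths)
    return {p.name: stream_mbps * p.available_mbps / total for p in paths}

paths = [Path("A", 400.0), Path("B", 250.0), Path("C", 100.0)]
print(plan_distribution(paths, stream_mbps=600.0))
# {'A': 320.0, 'B': 200.0, 'C': 80.0}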

Relevance:

100.00%

Publisher:

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44: 2, 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. Naive implementation of the procedure can lead to computationally inefficient results. To reduce the computational cost a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
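
The flavour of the approach can be illustrated in the univariate case: a two-component normal mixture is fitted to binned counts, with the E-step using bin probabilities computed from CDF differences. The sketch below approximates within-bin moments by bin midpoints, whereas the paper evaluates the required integrals over each bin exactly; it is an illustration, not the authors' algorithm, and the data are simulated.

import numpy as np
from scipy.stats import norm

def em_binned(edges, counts, n_iter=200):
    mids = 0.5 * (edges[:-1] + edges[1:])
    w = np.array([0.5, 0.5])                     # mixing weights
    mu = np.array([mids.min(), mids.max()])      # crude initial means
    sd = np.array([mids.std(), mids.std()]) + 1e-6
    for _ in range(n_iter):
        # E-step: probability mass of each component in each bin.
        pmass = np.stack([w[k] * (norm.cdf(edges[1:], mu[k], sd[k])
                                  - norm.cdf(edges[:-1], mu[k], sd[k]))
                          for k in range(2)])
        resp = pmass / pmass.sum(axis=0)         # responsibilities per bin
        nk = (resp * counts).sum(axis=1)
        # M-step: weighted moments using bin midpoints as an approximation.
        mu = (resp * counts * mids).sum(axis=1) / nk
        sd = np.sqrt((resp * counts * (mids - mu[:, None])**2).sum(axis=1) / nk)
        w = nk / counts.sum()
    return w, mu, sd

edges = np.linspace(-4, 8, 25)
rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(0, 1, 3000), rng.normal(5, 1.5, 2000)])
counts, _ = np.histogram(data, bins=edges)
print(em_binned(edges, counts))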

Relevance:

100.00%

Publisher:

Abstract:

Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damages and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs—to our knowledge for the first time—a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a European-wide damage function, an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte-Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses with a historic loss database. The climate models considered agree regarding an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark, northern Germany into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and Southern Europe. The highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of Eastern Europe. The resulting changes in European-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10-year loss), 50% (30-year loss), and 104% (100-year loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both severity and frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except for Ireland (−22%) experience some loss increases. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impact of climate change that is relevant for policy makers, scientists and economists.
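
Steps (ii) and (iii) of the strategy amount to computing an annual expected loss from a probabilistic event set and calibrating it against historic losses. The sketch below shows that arithmetic in its simplest form; the event losses, occurrence rates and historic figure are invented and the actual insurance model's hazard module is far richer.

import numpy as np

event_loss = np.array([5e6, 2e7, 8e7, 3e8, 1.2e9])    # simulated event losses (EUR, assumed)
annual_rate = np.array([0.5, 0.2, 0.05, 0.01, 0.002])  # occurrences per year (assumed)

ael_simulated = float(np.sum(annual_rate * event_loss))
ael_historic = 9.0e6                                    # from a historic loss database (assumed)

calibration = ael_historic / ael_simulated              # simple scaling calibration
print(f"simulated AEL: {ael_simulated:.3e}  calibration factor: {calibration:.3f}")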

Relevance:

100.00%

Publisher:

Abstract:

A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes of insured loss potentials over Europe in the 21st century. Losses are computed based on the daily maximum wind speed for each grid point. The calibration of the loss model is performed using wind data from the ERA40-Reanalysis and German loss data. The obtained annual losses for the present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES-Scenarios A1B and A2, and for each of them three realisations are considered. On average, insured loss potentials increase for all analysed European regions at the end of the 21st century. Changes are largest for Germany and France, and lowest for Portugal/Spain. Additionally, the spread between the single realisations is large, ranging e.g. for Germany from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show an increasing interannual variability of storm damage. This assessment is even more pronounced if no adaptation of building structure to climate change is considered. The increased loss potentials are linked with enhanced values for the high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with an enhanced number and increased intensity of extreme cyclones over the British Isles and the North Sea.
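
A grid-point loss index of the kind often used in storm-loss studies is a cubic exceedance of the daily maximum wind over a local high percentile. The abstract does not spell out the exact formulation or calibration, so the sketch below should be read purely as an illustration of such a wind-based loss index, with invented data.

import numpy as np

def loss_index(vmax, v98, population_weight=None):
    # Daily loss index per grid point: ((vmax/v98) - 1)^3 where vmax exceeds v98.
    excess = np.clip(vmax / v98 - 1.0, 0.0, None) ** 3
    if population_weight is not None:
        excess = excess * population_weight
    return excess.sum()

rng = np.random.default_rng(7)
v98 = rng.uniform(18.0, 26.0, size=(10, 10))         # local 98th percentile of wind (m/s)
vmax = v98 * rng.uniform(0.8, 1.3, size=(10, 10))    # one storm day's maxima
print("raw loss index:", loss_index(vmax, v98))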

Relevance:

100.00%

Publisher:

Abstract:

This report was prepared at the request of the United Nations Economic Commission for Latin America and the Caribbean (ECLAC) with support from the Caribbean Catastrophe Risk Insurance Facility (CCRIF) to assess strategies for linking the ECLAC Damage and Loss Assessment (DaLA) Methodology to the Post Disaster Needs Assessment (PDNA). Each methodology was individually outlined and its use in the Caribbean context was explored in detail to set the framework, or lens, through which their linking would be viewed. Other methodologies used within the recovery process were identified and outlined. A gap analysis was conducted on moving from the PDNA, with its focus on initial rapid response, to the DaLA. DaLA training materials were reviewed to assess where improvements could be made to move seamlessly from one methodology to the next. Additionally, both DaLA and PDNA reports were reviewed to identify specific areas of information which could serve as common data links, and to note how this linkage could inform the overall disaster assessments in the region, in addition to noting any similarities or variance in the application of both methodologies. Challenges to linking both methodologies were identified, such as countries lacking well-defined recovery frameworks and the ability to fund or finance recovery efforts, in addition to recurrent challenges in the Caribbean region such as inadequate baseline data, limited human resources and training, and difficulty identifying teams to conduct the data collection. Recommendations on the strategies to be employed for successfully linking the DaLA and PDNA methodologies included: creating and maintaining a recovery framework and baseline data; creating a minimum requirements list for successful PDNA and DaLA implementation; and increasing political will, in addition to identifying a champion to push the subject.

Relevance:

100.00%

Publisher:

Abstract:

Estimation of Taylor's power law for species abundance data may be performed by linear regression of the log empirical variances on the log means, but this method suffers from a problem of bias for sparse data. We show that the bias may be reduced by using a bias-corrected Pearson estimating function. Furthermore, we investigate a more general regression model allowing for site-specific covariates. This method may be efficiently implemented using a Newton scoring algorithm, with standard errors calculated from the inverse Godambe information matrix. The method is applied to a set of biomass data for benthic macrofauna from two Danish estuaries.
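
For orientation, the naive estimator that the paper improves upon fits Taylor's power law, variance = a * mean^b, by ordinary least squares of log variances on log means. The sketch below shows only that baseline on simulated data; the bias-corrected Pearson estimating function and Newton scoring themselves are not reproduced here.

import numpy as np

rng = np.random.default_rng(11)
true_a, true_b = 2.0, 1.6
means = rng.uniform(0.5, 50.0, size=40)
# Simulate empirical variances scattered around the power law.
variances = true_a * means**true_b * rng.lognormal(0.0, 0.3, size=40)

X = np.column_stack([np.ones_like(means), np.log(means)])
coef, *_ = np.linalg.lstsq(X, np.log(variances), rcond=None)
log_a_hat, b_hat = coef
print(f"a_hat = {np.exp(log_a_hat):.2f}, b_hat = {b_hat:.2f}")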

Relevance:

100.00%

Publisher:

Abstract:

Background & Aims: An elevated transferrin saturation is the earliest phenotypic abnormality in hereditary hemochromatosis. Determination of transferrin saturation remains the most useful noninvasive screening test for affected individuals, but there is debate as to the appropriate screening level. The aims of this study were to estimate the mean transferrin saturation in hemochromatosis heterozygotes and normal individuals and to evaluate potential transferrin saturation screening levels. Methods: Statistical mixture modeling was applied to data from a survey of asymptomatic Australians to estimate the mean transferrin saturation in hemochromatosis heterozygotes and normal individuals. To evaluate potential transferrin saturation screening levels, modeling results were compared with data from identified hemochromatosis heterozygotes and homozygotes. Results: After removal of hemochromatosis homozygotes, two populations of transferrin saturation were identified in asymptomatic Australians (P < 0.01). In men, 88.2% of the truncated sample had a lower mean transferrin saturation of 24.1%, whereas 11.8% had an increased mean transferrin saturation of 37.3%. Similar results were found in women. A transferrin saturation threshold of 45% identified 98% of homozygotes without misidentifying any normal individuals. Conclusions: The results confirm that hemochromatosis heterozygotes form a distinct transferrin saturation subpopulation and support the use of transferrin saturation as an inexpensive screening test for hemochromatosis. In practice, a fasting transferrin saturation of greater than or equal to 45% identifies virtually all affected homozygous subjects without necessitating further investigation of unaffected normal individuals.
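
The mixture modelling described above amounts to separating a "normal" from an "elevated" transferrin-saturation subpopulation with a two-component normal mixture. The sketch below shows that kind of fit on simulated saturation values; it is illustrative only and not the study's actual data or estimation procedure.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Hypothetical saturation values (%) mimicking two overlapping groups.
ts = np.concatenate([rng.normal(24.0, 5.0, 880), rng.normal(37.0, 6.0, 120)])

gm = GaussianMixture(n_components=2, random_state=0).fit(ts.reshape(-1, 1))
order = np.argsort(gm.means_.ravel())
print("component means (%):   ", gm.means_.ravel()[order])
print("component proportions: ", gm.weights_[order])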

Relevance:

100.00%

Publisher:

Abstract:

The male of Latonigena auricomis Simon, 1893 is described for the first time and the female is redescribed. New records are provided for Argentina, Brazil and Uruguay. Notes on the natural history and a potential distribution model of the species are presented in the Neotropical Region.

Relevance:

100.00%

Publisher:

Abstract:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means for enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general and spectral blueing in particular represent simple, yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
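
As one concrete example of a spectral balancing operation, the sketch below applies band-limited spectral whitening to a single synthetic trace by flattening the amplitude spectrum inside an assumed signal band while preserving phase. It is illustrative only; the sampling interval, band limits and water-level parameter are assumptions, not the processing flow used in the study.

import numpy as np

def spectral_whiten(trace, dt, f_lo, f_hi, eps=1e-3):
    # Return a trace whose amplitude spectrum is flattened in [f_lo, f_hi] Hz.
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    amp = np.abs(spec)
    whitened = np.zeros_like(spec)
    # Divide by the amplitude inside the band: unit amplitude, original phase.
    whitened[band] = spec[band] / (amp[band] + eps * amp.max())
    return np.fft.irfft(whitened, n=trace.size)

dt = 1e-9                                                    # 1 ns sampling (assumed)
t = np.arange(0, 200e-9, dt)
trace = np.sin(2 * np.pi * 100e6 * t) * np.exp(-t / 50e-9)   # toy decaying wavelet
print(spectral_whiten(trace, dt, f_lo=50e6, f_hi=250e6)[:5])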

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, compositional data has been identified with closed data, and the simplex has been considered as the natural sample space of this kind of data. In our opinion, the emphasis on the constrained nature of compositional data has contributed to masking its real nature. More crucial than the constraining property of compositional data is the scale-invariant property of this kind of data. Indeed, when we are considering only a few parts of a full composition we are not working with constrained data, but our data are still compositional. We believe that it is necessary to give a more precise definition of composition. This is the aim of this oral contribution.
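
In the usual notation of the compositional-data literature (a standard formulation, not necessarily the exact definition given in this contribution), scale invariance and the associated notion of composition can be written as follows.

% Two positive vectors carry the same compositional information when one is
% a positive multiple of the other:
\[
  \mathbf{x} \sim \mathbf{y}
  \iff
  \exists\, \lambda > 0 \ \text{such that}\ \mathbf{x} = \lambda\, \mathbf{y},
  \qquad \mathbf{x}, \mathbf{y} \in \mathbb{R}_{+}^{D}.
\]
% A composition is the equivalence class of such vectors; a convenient
% representative on the simplex is obtained with the closure operation:
\[
  \mathcal{C}(\mathbf{x})
  = \left( \frac{x_1}{\sum_{j=1}^{D} x_j}, \dots,
           \frac{x_D}{\sum_{j=1}^{D} x_j} \right).
\]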