50 results for Natural Catastrophe, Property Insurance, Loss Distribution, Truncated Data, Ruin Probability
Abstract:
Winter storms of the midlatitudes are an important factor for property losses caused by natural hazards over Europe. The storm series in early 1990 and late 1999 led to enormous economic damage and insured losses. Although significant trends in North Atlantic/European storm activity have not been identified for the last few decades, recent studies provide evidence that under anthropogenic climate change the number of extreme storms could increase, whereas the total number of cyclones may be slightly reduced. In this study, loss potentials derived from an ensemble of climate models using a simple storm damage model under climate change conditions are shown. For the United Kingdom and Germany, ensemble-mean storm-related losses are found to increase by up to 37%. Furthermore, the interannual variability of extreme events will increase, leading to a higher risk of extreme storm activity and related losses.
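The abstract does not specify the storm damage model. A form commonly used in this line of work is a cubic excess of daily maximum wind speed over its local 98th percentile, aggregated over a country; the sketch below assumes that form, with illustrative input arrays, and is not necessarily the study's actual model.

```python
import numpy as np

def storm_loss_index(daily_max_wind, pct=98.0):
    """Cubic excess-over-percentile loss index (assumed form, not
    necessarily the model used in the study).

    daily_max_wind : array (days, gridpoints) of daily maximum wind speed
    Returns one loss-index value per day.
    """
    v_thresh = np.percentile(daily_max_wind, pct, axis=0)       # local threshold
    excess = np.maximum(daily_max_wind / v_thresh - 1.0, 0.0)   # relative exceedance
    return (excess ** 3).sum(axis=1)                            # cubic damage proxy

# Illustrative comparison of two climates (control vs scenario wind fields):
# change_pct = 100 * (storm_loss_index(w_scen).mean()
#                     / storm_loss_index(w_ctrl).mean() - 1)
```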
Abstract:
The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979–2012. The catalogue is intended to be a valuable resource for both academia and industries such as (re)insurance, for example allowing users to characterise extreme European storms and to validate climate and catastrophe models. Several storm severity indices were investigated to find which could best represent a list of known high-loss (severe) storms. The best-performing index was Sft, which is a combination of storm area calculated from the storm footprint and maximum 925 hPa wind speed from the storm track. All the listed severe storms are included in the catalogue, and the remaining ones were selected using Sft. A comparison of the model footprints to station observations revealed that storms were generally well represented, although for some storms the highest gusts were underestimated. Possible reasons for this underestimation include the model failing to simulate strong enough pressure gradients and not representing convective gusts. A new recalibration method was developed to estimate the true distribution of gusts at each grid point and correct for this underestimation. The recalibration model allows for storm-to-storm variation, which is essential given that different storms have different degrees of model bias. The catalogue is available at www.europeanwindstorms.org.
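The abstract states only that Sft combines storm area from the footprint with the maximum 925 hPa wind speed from the track; the exact functional form is not given. The sketch below assumes a simple product of a gust-exceedance area and the peak track wind, with a hypothetical 25 m/s gust threshold, purely to illustrate how such an index can rank storms.

```python
import numpy as np

def severity_index(gust_footprint, cell_area_km2, track_wind_925,
                   gust_threshold=25.0):
    """Illustrative storm-severity index in the spirit of Sft: footprint
    area above a gust threshold multiplied by the maximum 925 hPa track
    wind. The XWS catalogue's exact definition may differ.
    """
    exceed_area = cell_area_km2[gust_footprint >= gust_threshold].sum()
    return exceed_area * track_wind_925.max()

# Ranking candidate storms by this index and keeping the top events would
# mimic the selection step described in the abstract.
```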
Abstract:
Since the advent of the internet in everyday life in the 1990s, the barriers to producing, distributing and consuming multimedia data such as videos, music, ebooks, etc. have steadily been lowered for most computer users, so that almost everyone with internet access can join the online communities that produce, consume and share media artefacts. Along with this trend, the violation of personal data privacy and copyright has increased, with illegal file sharing being rampant across many online communities, particularly for certain music genres and amongst the younger age groups. This has had a devastating effect on the traditional media distribution market, in most cases leaving the distribution companies and the content owners with huge financial losses. To prove that a copyright violation has occurred, one can deploy fingerprinting mechanisms to uniquely identify the property; however, such mechanisms are currently based only on uni-modal approaches. In this paper we describe some of the design challenges and architectural approaches to multi-modal fingerprinting currently being examined for evaluation studies within a PhD research programme on the optimisation of multi-modal fingerprinting architectures. Accordingly, we outline the available modalities being integrated through this research programme, which aims to establish the optimal architecture for multi-modal media security protection over the internet as the online distribution environment for both legal and illegal distribution of media products.
Abstract:
Understanding links between the El Niño–Southern Oscillation (ENSO) and snow would be useful for seasonal forecasting, but also for understanding natural variability and interpreting climate change predictions. Here, a 545-year run of the general circulation model HadCM3, with prescribed external forcings and fixed greenhouse gas concentrations, is used to explore the impact of ENSO on snow water equivalent (SWE) anomalies. In North America, positive ENSO events reduce the mean SWE and skew the distribution towards lower values, and vice versa during negative ENSO events. This is associated with a dipole SWE anomaly structure, with anomalies of opposite sign centered in western Canada and the central United States. In Eurasia, warm episodes lead to a more positively skewed distribution and a raised mean SWE; again, the opposite effect is seen during cold episodes. In Eurasia the largest anomalies are concentrated in the Himalayas. These correlations with the February SWE distribution are present from the previous June–July–August (JJA) ENSO index onwards, and are weakly detected in 50-year subsections of the control run, but only a shifted North American response can be detected in the analysis of 40 years of ERA40 reanalysis data. The ENSO signal in SWE from the long run could still contribute to regional predictions, although it would be only a weak indicator.
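As a rough illustration of the kind of diagnostic described above, the sketch below composites February SWE anomalies on the preceding JJA ENSO index and reports the mean and skewness for warm and cold phases. The aggregation, the 0.5 threshold and the function names are illustrative assumptions, not the study's procedure.

```python
import numpy as np
from scipy.stats import skew

def enso_swe_composites(jja_enso_index, feb_swe, threshold=0.5):
    """Mean and skewness of area-aggregated February SWE anomalies,
    composited on the preceding JJA ENSO index (illustrative only).

    jja_enso_index : array (years,)
    feb_swe        : array (years, gridpoints)
    """
    swe = feb_swe.mean(axis=1)            # simple area aggregate
    anom = swe - swe.mean()               # anomalies about the long-run mean
    masks = {"warm": jja_enso_index > threshold,
             "cold": jja_enso_index < -threshold}
    return {phase: (anom[m].mean(), skew(anom[m])) for phase, m in masks.items()}
```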
Abstract:
Particle size distribution (psd) is one of the most important features of the soil because it affects many of its other properties, and it determines how soil should be managed. To understand the properties of chalk soil, psd analyses should be based on the original material (including carbonates), and not just the acid-resistant fraction. Laser-based methods rather than traditional sedimentation methods are being used increasingly to determine particle size to reduce the cost of analysis. We give an overview of both approaches and the problems associated with them for analyzing the psd of chalk soil. In particular, we show that it is not appropriate to use the widely adopted 8 µm boundary between the clay and silt size fractions for samples determined by laser to estimate proportions of these size fractions that are equivalent to those based on sedimentation. We present data from field and national-scale surveys of soil derived from chalk in England. Results from both types of survey showed that laser methods tend to over-estimate the clay-size fraction compared to sedimentation for the 8 µm clay/silt boundary, and we suggest reasons for this. For soil derived from chalk, either the sedimentation methods need to be modified or it would be more appropriate to use a 4 µm threshold as an interim solution for laser methods. Correlations between the proportions of sand- and clay-sized fractions, and other properties such as organic matter and volumetric water content, were the opposite of what one would expect for soil dominated by silicate minerals. For water content, this appeared to be due to the predominance of porous chalk fragments in the sand-sized fraction rather than quartz grains, and the abundance of fine (<2 µm) calcite crystals rather than phyllosilicates in the clay-sized fraction. This was confirmed by scanning electron microscope (SEM) analyses. "Of all the rocks with which I am acquainted, there is none whose formation seems to tax the ingenuity of theorists so severely, as the chalk, in whatever respect we may think fit to consider it." Thomas Allan, FRS Edinburgh 1823, Transactions of the Royal Society of Edinburgh.
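To make the boundary issue concrete, the sketch below reads the clay-sized fraction off a cumulative particle-size curve at either an 8 µm or a 4 µm boundary by log-linear interpolation; the curve shown is hypothetical, not survey data, and the procedure is only an illustration of why the choice of boundary matters.

```python
import numpy as np

def fraction_finer(diam_um, cum_pct_finer, boundary_um):
    """Percentage finer than a size boundary, by log-linear interpolation
    along a cumulative particle-size curve (illustrative procedure only)."""
    return float(np.interp(np.log(boundary_um), np.log(diam_um), cum_pct_finer))

# Hypothetical laser-derived curve for a chalk soil sample:
diam = np.array([0.5, 1, 2, 4, 8, 16, 63, 250, 2000])     # particle diameter, um
finer = np.array([8, 14, 22, 30, 41, 55, 78, 92, 100])    # cumulative % finer
print(fraction_finer(diam, finer, 8.0))   # clay fraction with the 8 um boundary
print(fraction_finer(diam, finer, 4.0))   # clay fraction with the 4 um boundary
```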
Abstract:
While over-dispersion in capture–recapture studies is well known to lead to poor estimation of population size, current diagnostic tools to detect the presence of heterogeneity have not been specifically developed for capture–recapture studies. To address this, a simple and efficient method of testing for over-dispersion in zero-truncated count data is developed and evaluated. The proposed method generalizes an over-dispersion test previously suggested for untruncated count data and may also be used for testing residual over-dispersion in zero-inflated data. Simulations suggest that the asymptotic distribution of the test statistic is standard normal and that this approximation is also reasonable for small sample sizes. The method is also shown to be more efficient than an existing test for over-dispersion adapted for the capture–recapture setting. Studies with zero-truncated and zero-inflated count data are used to illustrate the test procedures.
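The abstract does not give the test statistic itself. As a stand-in, the sketch below implements a generic parametric-bootstrap check that compares the sample variance of zero-truncated counts with variances simulated under a fitted zero-truncated Poisson model; it illustrates the idea of detecting over-dispersion in truncated counts but is not the paper's proposed test.

```python
import numpy as np
from scipy.optimize import brentq

def ztp_rate(sample_mean):
    """ML Poisson rate under zero truncation: solve lam/(1-exp(-lam)) = mean."""
    return brentq(lambda lam: lam / (1 - np.exp(-lam)) - sample_mean, 1e-8, 1e3)

def overdispersion_check(counts, n_boot=2000, seed=0):
    """Parametric-bootstrap over-dispersion check for zero-truncated counts
    (illustrative stand-in, not the test proposed in the paper)."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts)
    lam = ztp_rate(counts.mean())
    obs_var = counts.var(ddof=1)

    def simulate(n):
        out = np.empty(0, dtype=int)
        while out.size < n:                      # rejection-sample zeros away
            draws = rng.poisson(lam, size=2 * n)
            out = np.concatenate([out, draws[draws > 0]])
        return out[:n]

    boot_var = np.array([simulate(counts.size).var(ddof=1) for _ in range(n_boot)])
    return obs_var, float((boot_var >= obs_var).mean())  # variance, bootstrap p-value
```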
Abstract:
In this paper we consider the estimation of population size from one-source capture–recapture data, that is, a list in which individuals can potentially be found repeatedly and where the question is how many individuals are missed by the list. As a typical example, we provide data from a drug user study in Bangkok from 2001, where the list consists of drug users who repeatedly contact treatment institutions. Drug users with 1, 2, 3, ... contacts occur, but drug users with zero contacts are not present, requiring the size of this group to be estimated. Statistically, these data can be considered as stemming from a zero-truncated count distribution. We revisit an estimator for the population size suggested by Zelterman that is known to be robust under potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a locally truncated Poisson likelihood, which is equivalent to a binomial likelihood. This result allows the extension of the Zelterman estimator by means of logistic regression to include observed heterogeneity in the form of covariates. We also review an estimator proposed by Chao and explain why we are not able to obtain similar results for this estimator. The Zelterman estimator is applied in two case studies, the first a drug user study from Bangkok, the second an illegal immigrant study in the Netherlands. Our results suggest the new estimator should be used, in particular, if substantial unobserved heterogeneity is present.
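The Zelterman and Chao estimators referred to above have simple closed forms based only on the frequencies of counts equal to one (f1) and two (f2). The sketch below implements both; the example counts at the end are made up for illustration and are not the Bangkok or Netherlands data.

```python
import numpy as np

def zelterman_estimate(counts):
    """Zelterman estimator of population size from zero-truncated counts:
    lambda_hat = 2 * f2 / f1,  N_hat = n / (1 - exp(-lambda_hat))."""
    counts = np.asarray(counts)
    n = counts.size
    f1, f2 = (counts == 1).sum(), (counts == 2).sum()
    lam = 2.0 * f2 / f1
    return n / (1.0 - np.exp(-lam))

def chao_lower_bound(counts):
    """Chao's lower-bound estimator: N_hat = n + f1**2 / (2 * f2)."""
    counts = np.asarray(counts)
    f1, f2 = (counts == 1).sum(), (counts == 2).sum()
    return counts.size + f1 ** 2 / (2.0 * f2)

# Made-up contact counts (numbers of 1s, 2s, 3s, ...), for illustration only:
counts = [1] * 500 + [2] * 120 + [3] * 40 + [4] * 10
print(zelterman_estimate(counts), chao_lower_bound(counts))
```

Restricting attention to the counts of one and two is what reduces the truncated Poisson likelihood to a binomial one, which is the property the abstract exploits to add covariates via logistic regression.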
Abstract:
There has been a clear lack of common data exchange semantics for inter-organisational workflow management systems, where research has focused mainly on technical issues rather than on language constructs. This paper presents the neutral data exchange semantics required for workflow integration within the AXAEDIS framework and presents the mechanism for object discovery from the object repository where little or no knowledge about the object is available. The paper also presents a workflow-independent integration architecture with the AXAEDIS framework.
Abstract:
Purpose – The purpose of this paper is to consider prospects for UK REITs, which were introduced on 1 January 2007. It specifically focuses on the potential influence of depreciation and expenditure on income and distributions. Design/methodology/approach – First, the ways in which depreciation can affect vehicle earnings and value are discussed. This is then set in the context of the specific rules and features of REITs. An analysis using property income and expenditure data from the Investment Property Databank (IPD) then assesses what gross and net income for a UK REIT might have been like for the period 1984-2003. Findings – A UK REIT must distribute at least 90 per cent of the net income from its property rental business. Expenditure therefore plays a significant part in determining what funds remain for distribution. Over 1984-2003, expenditure absorbed 20 per cent of gross income and was a source of earnings volatility, which would have been exacerbated by gearing. Practical implications – Expenditure must take place to help UK REITs maintain and renew their real estate portfolios. In view of this, investors should moderate expectations of a high and stable income return, although it may well still be so relative to alternative investments. Originality/value – Previous literature on depreciation has not quantified the amounts spent on portfolios to keep depreciation at those rates. Nor, to our knowledge, have its ideas been placed in the indirect investor context.
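The distribution rule and the expenditure finding above combine into a simple calculation: with expenditure absorbing 20 per cent of gross income, the 90 per cent payout requirement applies to the remaining 80 per cent. A minimal worked sketch with illustrative figures, not IPD data:

```python
def minimum_distribution(gross_income, expenditure, payout_ratio=0.90):
    """Minimum distribution for a UK REIT's property rental business:
    at least 90% of net income, where net = gross income - expenditure."""
    return payout_ratio * (gross_income - expenditure)

gross = 100.0                               # illustrative gross rental income
spend = 0.20 * gross                        # expenditure at the 20% average found
print(minimum_distribution(gross, spend))   # 0.9 * 80 = 72
```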