936 results for Natural Catastrophe, Property Insurance, Loss Distribution, Truncated Data, Ruin Probability
Abstract:
Spatial data on species distributions are available in two main forms: point locations and distribution maps (polygon ranges and grids). The first are often temporally and spatially biased, and too discontinuous, to be useful (untransformed) in spatial analyses. A variety of modelling approaches are used to transform point locations into maps. We discuss the attributes that point location data and distribution maps must satisfy in order to be useful in conservation planning. We recommend that before point location data are used to produce and/or evaluate distribution models, the dataset should be assessed against a set of criteria, including sample size, age of data, environmental/geographical coverage, independence, accuracy, time relevance and (often forgotten) representation of areas of permanent and natural presence of the species. Distribution maps must satisfy additional attributes if used for conservation analyses and strategies, including minimized commission and omission errors, credibility of the source/assessors and availability for public screening. We review the currently available global databases for mammals and show that they are highly variable in complying with these attributes. The heterogeneity and weakness of spatial data seriously constrain their utility for global and sub-global scale conservation analyses.
Abstract:
It has been suggested that converting, via a process of cross-coding, the listing used by the Swiss Disability Insurance (SDI) for their statistics into codes of the International Classification of Impairments, Disabilities, and Handicaps (ICIDH) would improve the quality and international comparability of disability statistics for Switzerland. Using two different methods, we tested the feasibility of this cross-coding on a consecutive sample of 204 insured persons examined at one of the medical observation centres of the SDI. Cross-coding is impossible, for all practical purposes, in a proportion varying between 30% and 100%, depending on the method of cross-coding, the level of disablement and the required quality of the resulting codes. Failure is due to the lack of validity of the SDI codes: diseases are poorly described, and the consequences of diseases (disability and handicap, including loss of earning capacity) are insufficiently described or not described at all. Assessment of disability and handicap would provide necessary information for the SDI. It is concluded that the SDI should promote the use of the ICIDH in Switzerland, especially among medical practitioners, whose assessment of work capacity is the key element in the decision to award benefits or propose rehabilitation.
Abstract:
The 1994 Northridge earthquake sent ripples to insurance companies everywhere. This was one in a series of natural disasters, such as Hurricane Andrew, which together with the problems at Lloyd's of London have insurance companies running for cover. This paper presents a calibration of the U.S. economy in a model with financial markets for insurance derivatives that suggests the U.S. economy can deal with the damage of natural catastrophes far better than one might think.
Abstract:
This paper focuses on the switching behaviour of enrolees in the Swiss basic health insurance system. Even though the new Federal Law on Social Health Insurance (LAMal) was implemented in 1996 to promote competition among health insurers in basic insurance, there is limited evidence of premium convergence within cantons. This indicates that competition has not been effective so far, and reveals some inertia among consumers, who seem reluctant to switch to less expensive funds. We investigate one possible barrier to switching, namely the influence of supplementary insurance. We use survey data on health plan choice (a sample of 1943 individuals whose switching behaviour was observed between 1997 and 2000) as well as administrative data on all insurance companies that operated in the 26 Swiss cantons between 1996 and 2005. The decision to switch and the decision to subscribe to a supplementary contract are jointly estimated. Our findings show that holding a supplementary insurance contract substantially decreases the propensity to switch. However, there is no negative impact of supplementary insurance on switching when the individual assesses his/her health as 'very good'. Our results give empirical support to one possible mechanism through which supplementary insurance might influence switching decisions: given that subscribing to basic and supplementary contracts with two different insurers may impose administrative costs on the subscriber, holding supplementary insurance acts as a barrier to switching if customers who consider themselves 'bad risks' also believe that insurers reject applications for supplementary insurance on these grounds. In comparison with previous research, our main contribution is to offer a possible explanation for consumer inertia. Our analysis illustrates how the choice of one's basic health plan interacts with the decision to subscribe to supplementary insurance.
Abstract:
Communications play a key role in modern smart grids. New functionalities that make the grids ‘smart’ require the communication network to function properly. Data transmission between intelligent electric devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM), and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution for implementing the communication medium in power distribution grids is power line communication (PLC). There are power cables in the distribution grids, and hence, they may be applied as a communication channel for the distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
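As a rough illustration of the kind of SNR and channel capacity analysis the thesis describes, the sketch below applies the Shannon-Hartley relation C = B log2(1 + SNR) over a set of sub-bands. The band split and the per-slice SNR figures are invented for illustration and are not taken from the thesis.

```python
import numpy as np

def channel_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + SNR), per sub-band."""
    snr_linear = 10 ** (np.asarray(snr_db) / 10.0)
    return bandwidth_hz * np.log2(1.0 + snr_linear)

# Hypothetical example: the 100 kHz-30 MHz band split into 30 slices of
# 1 MHz each, with assumed per-slice SNR values (illustrative only).
slices = np.full(30, 1e6)            # 30 sub-bands of 1 MHz
snr_db = np.linspace(30, 5, 30)      # attenuation grows with frequency
total_capacity = channel_capacity(slices, snr_db).sum()
print(f"Aggregate capacity: {total_capacity / 1e6:.1f} Mbit/s")
```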
Abstract:
In this doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security and power quality are evaluated by computational modelling and by measurements on an LVDC research platform. Computational models for LVDC network analysis are developed. Time-domain simulation models are implemented in the PSCAD/EMTDC simulation environment and applied to transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of the network and component power losses. The model integrates analytical equations that describe the power loss mechanisms of the network components with power flow calculations. For the LVDC network research platform, a monitoring and control software solution is developed. The solution is used to deliver measurement data for verification of the developed models and for analysis of the modelling results. In the work, the power loss mechanisms of the LVDC network components and their main dependencies are described. The energy loss distribution of the LVDC network components is presented. Power quality measurements and current spectra are provided, and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations. DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be the reduction of no-load losses and the improvement of converter efficiency at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality at the LVDC network point of common coupling (PCC) is discussed. The power quality standard requirements are shown to be met by the LVDC network. The network behaviour during transients is analysed by time-domain simulations, and the network is shown to remain stable during large-scale disturbances. Measurement results from the LVDC research platform that confirm this are presented in the work.
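The abstract describes a power loss model that combines analytical component loss equations with power flow calculations. A minimal sketch of one common analytical form (no-load, conduction-proportional and resistive terms) is given below; the polynomial form and the coefficient values are illustrative assumptions, not the dissertation's parameters.

```python
import numpy as np

def converter_losses(i_load, p0, a, b):
    """Generic converter loss model P = p0 + a*I + b*I**2 (per unit):
    p0 = no-load losses, a = conduction term, b = resistive term.
    A common analytical form; coefficients here are illustrative."""
    return p0 + a * i_load + b * i_load ** 2

i = np.linspace(0.05, 1.0, 20)                       # per-unit load current
p_loss = converter_losses(i, p0=0.01, a=0.005, b=0.02)
efficiency = i / (i + p_loss)                        # assumes 1 pu voltage
# No-load losses dominate at partial loads, which is why the dissertation
# names them as a main optimisation target.
print(f"Efficiency at 10% load: {efficiency[1]:.3f}")
print(f"Efficiency at full load: {efficiency[-1]:.3f}")
```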
Abstract:
For my Licentiate thesis, I conducted research on risk measures. Continuing with this research, I now focus on capital allocation. In the proportional capital allocation principle, the choice of risk measure plays a very important part. In the chapters Introduction and Basic concepts, we introduce three definitions of economic capital, discuss the purpose of capital allocation, give different viewpoints on capital allocation and present an overview of the relevant literature. Risk measures are defined and the concept of a coherent risk measure is introduced. Examples of important risk measures are given, e.g., Value at Risk (VaR) and Tail Value at Risk (TVaR). We also discuss the implications of dependence and review some important distributions. In the chapter on Capital allocation, we introduce different principles for allocating capital; we prefer to work with the proportional allocation method. In the following chapter, Capital allocation based on tails, we focus on insurance business lines with heavy-tailed loss distributions. To emphasize capital allocation based on tails, we define the following risk measures: Conditional Expectation, Upper Tail Covariance and Tail Covariance Premium Adjusted (TCPA). In the final chapter, an illustrative case study, we simulate two sets of data with five insurance business lines using Normal copulas and Cauchy copulas. The proportional capital allocation is calculated using TCPA as the risk measure and compared with the results when VaR is used as the risk measure and with covariance capital allocation. The thesis emphasizes that no single allocation principle is perfect for all purposes. When the focus is on the tail of the losses, allocation based on TCPA is a good choice, since TCPA in a sense combines features of TVaR and tail covariance.
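As a hedged sketch of the proportional allocation principle discussed here, the fragment below allocates total capital across five simulated business lines in proportion to a stand-alone risk measure (empirical TVaR). The lognormal stand-in for heavy-tailed losses and the 99% level are assumptions made for illustration; the thesis itself works with TCPA and copula-simulated data.

```python
import numpy as np

rng = np.random.default_rng(42)

def var(losses, alpha=0.99):
    """Empirical Value at Risk at level alpha."""
    return np.quantile(losses, alpha)

def tvar(losses, alpha=0.99):
    """Empirical Tail Value at Risk: mean loss beyond VaR."""
    return losses[losses >= var(losses, alpha)].mean()

# Hypothetical portfolio: five business lines with heavy-ish tails
# (lognormal stand-in; the thesis simulates via Normal/Cauchy copulas).
lines = rng.lognormal(mean=0.0, sigma=[0.5, 0.8, 1.0, 1.2, 1.5],
                      size=(100_000, 5))
total_capital = tvar(lines.sum(axis=1))

# Proportional allocation: each line receives capital in proportion to
# its stand-alone risk measure.
rho = np.array([tvar(lines[:, j]) for j in range(5)])
allocation = total_capital * rho / rho.sum()
print(allocation)
```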
Abstract:
Traditionally, compositional data have been identified with closed data, and the simplex has been considered the natural sample space for this kind of data. In our opinion, the emphasis on the constrained nature of compositional data has contributed to masking its real nature. More crucial than the constrained nature of compositional data is their scale invariance. Indeed, when we consider only a few parts of a full composition we are no longer working with constrained data, yet our data are still compositional. We believe that it is necessary to give a more precise definition of composition; this is the aim of this oral contribution.
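A minimal numerical illustration of the two properties contrasted here: the closure operation maps positive vectors onto the simplex, and rescaling the raw amounts changes neither the closed composition nor the ratios between parts, which is the scale-invariance property the authors emphasize.

```python
import numpy as np

def closure(x):
    """Project a positive vector onto the simplex by dividing by its sum."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

# Scale invariance: multiplying raw amounts by any positive constant
# leaves the composition (and all part ratios) unchanged.
raw = np.array([2.0, 3.0, 5.0])
print(closure(raw))        # [0.2 0.3 0.5]
print(closure(10 * raw))   # same composition
print(raw[0] / raw[1], (10 * raw)[0] / (10 * raw)[1])  # ratios preserved
```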
Abstract:
This paper discusses experimental and theoretical investigations and Computational Fluid Dynamics (CFD) modelling considerations to evaluate the performance of a square-section wind catcher system connected to the top of a test room for the purpose of natural ventilation. The magnitude and distribution of pressure coefficients (C-p) around the wind catcher and the air flow into the test room were analysed. The modelling results indicated that air was supplied into the test room through the wind catcher quadrants with positive external pressure coefficients and extracted from the test room through quadrants with negative pressure coefficients. The air flow achieved through the wind catcher depends on the speed and direction of the wind. The results obtained using the explicit and AIDA implicit calculation procedures and the CFX code correlate relatively well with the experimental results at lower wind speeds and with wind incident at an angle of 0 degrees. Variations in the C-p and air flow results were observed, particularly at a wind direction of 45 degrees. The explicit and implicit calculation procedures were found to be quick and easy to use in obtaining results, whereas the wind tunnel tests were more expensive in terms of effort, cost and time. CFD codes are developing rapidly and are widely available, especially with the decreasing prices of computer hardware. However, results obtained using CFD codes must be considered with care, particularly in the absence of empirical data.
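For reference, the pressure coefficient used throughout such studies is C_p = (p - p_ref) / (0.5 ρ U_ref²). The sketch below evaluates it for two hypothetical quadrant pressures; the numbers are illustrative, not the paper's measurements.

```python
def pressure_coefficient(p_surface, p_ref, rho, u_ref):
    """C_p = (p - p_ref) / (0.5 * rho * u_ref**2), the standard
    definition used when mapping pressures around a wind catcher."""
    return (p_surface - p_ref) / (0.5 * rho * u_ref ** 2)

# Illustrative numbers: air at ~1.2 kg/m^3, 4 m/s reference wind,
# assumed gauge pressures on a windward and a leeward quadrant.
cp_windward = pressure_coefficient(7.0, 0.0, 1.2, 4.0)    # positive: supply
cp_leeward = pressure_coefficient(-4.0, 0.0, 1.2, 4.0)    # negative: extract
print(cp_windward, cp_leeward)
```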
Abstract:
Natural gas, although basically composed of light hydrocarbons, also contains contaminant gases, such as CO2 (carbon dioxide) and H2S (hydrogen sulfide). H2S, which commonly occurs in oil and gas exploration and production activities, causes damage to oil and natural gas pipelines. Consequently, the removal of hydrogen sulfide results in an important reduction in operating costs. It is also essential to consider the better quality of the oil to be processed in the refinery, resulting in economic, environmental and social benefits. All these facts demonstrate the need for the development and improvement of hydrogen sulfide scavengers. Currently, the oil industry uses several processes for hydrogen sulfide removal from natural gas. However, these processes produce amine derivatives which can damage distillation towers, can clog pipelines through the formation of insoluble precipitates, and also produce residues with great environmental impact. Therefore, it is of great importance to obtain a stable system, in inorganic or organic reaction media, able to remove hydrogen sulfide without forming by-products that can affect the quality and cost of the natural gas processing, transport, and distribution steps. To study, evaluate and model the mass transfer and kinetics of hydrogen sulfide removal, this study used an absorption column packed with Raschig rings, in which the natural gas, with H2S as contaminant, passed through an aqueous solution of inorganic compounds as a stagnant liquid, the contaminant gas being absorbed by the liquid phase. The absorption column was coupled to an H2S detection system interfaced with a computer. The model equations were fitted to the data by the least squares method with the Levenberg-Marquardt modification. In this study, in addition to water, the following solutions were used: sodium hydroxide, potassium permanganate, ferric chloride, copper sulfate, zinc chloride, potassium chromate, and manganese sulfate, all at low concentrations (≈10 ppm). These solutions were used to evaluate the interplay between physical and chemical absorption parameters, or even to obtain a better mass transfer coefficient, as in mixing reactors and absorption columns operating in counterflow. In this context, the evaluation of H2S removal arises as a valuable procedure for the treatment of natural gas and the destination of process by-products. The study of the obtained absorption curves makes it possible to determine the predominant mass transfer stage in the processes involved, the volumetric mass transfer coefficients, and the equilibrium concentrations. A kinetic study was also performed. The results showed that the H2S removal kinetics is fastest for NaOH. Since the study was performed at low concentrations of chemical reagents, it was possible to observe the effect of secondary reactions for the other chemicals, especially in the case of KMnO4, whose by-product, MnO2, takes part in the H2S absorption process. In addition, CuSO4 and FeCl3 also demonstrated good efficiency in H2S removal.
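As a sketch of the curve-fitting step described above, the fragment below fits a first-order absorption model C(t) = C_eq(1 - exp(-kLa·t)) to synthetic data with SciPy's Levenberg-Marquardt solver. The model form and the data are assumptions for illustration; the thesis's actual model equations may differ.

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, t):
    """First-order approach to equilibrium, C(t) = C_eq * (1 - exp(-kLa*t)),
    a common form for gas absorption into a stagnant liquid (assumed here)."""
    c_eq, kla = params
    return c_eq * (1.0 - np.exp(-kla * t))

def residuals(params, t, c_obs):
    return model(params, t) - c_obs

# Synthetic "measured" absorption curve (illustrative data only).
t = np.linspace(0, 60, 30)   # minutes
c_obs = model([0.8, 0.12], t) \
    + 0.01 * np.random.default_rng(0).normal(size=t.size)

# Levenberg-Marquardt least squares, as in the study's fitting procedure.
fit = least_squares(residuals, x0=[1.0, 0.05], args=(t, c_obs), method="lm")
print("C_eq, kLa =", fit.x)
```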
Abstract:
The municipality of Petrolina, located in the semi-arid region of Brazil, stands out as an important agricultural region; however, the irrigated areas have cleared natural vegetation, inducing a loss of biodiversity. To analyse the contrast between these two ecosystems, the large-scale values of biomass production (BIO), evapotranspiration (ET) and water productivity (WP) were quantified. Monteith's equation was applied to estimate the absorbed photosynthetically active radiation (APAR), while the new SAFER (Simple Algorithm For Evapotranspiration Retrieving) algorithm was used to retrieve ET. Water productivity was analysed as the ratio of BIO to ET at a monthly time scale, using four bands of MODIS satellite images together with agrometeorological data for the year 2011. The period with the highest water productivity values was from March to April, in the rainy period, for both irrigated and non-irrigated conditions. However, the largest ET rates were in November for irrigated crops and in April for natural vegetation. The vegetation and water variables are more uniform in natural vegetation, evidenced by lower standard deviations than in irrigated crops, which differ in crop stages and in cultural and irrigation management. The models applied with MODIS satellite images on a large scale are considered suitable for water productivity assessments and for quantifying the effects of expanding irrigated areas over natural vegetation on regional water consumption in situations of quickly changing land use patterns.
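A minimal sketch of the WP = BIO/ET ratio computed per pixel at a monthly time scale, as described above. The grids and units are hypothetical stand-ins for the MODIS-derived maps.

```python
import numpy as np

def water_productivity(bio, et):
    """WP = BIO / ET per pixel per month (e.g. kg of dry matter per m^3
    of water, assuming BIO in kg/ha and ET converted to m^3/ha)."""
    et = np.where(et > 0, et, np.nan)   # mask dry pixels to avoid dividing by zero
    return bio / et

# Hypothetical monthly 2x2 grids standing in for MODIS-derived maps.
bio = np.array([[1200.0, 900.0], [300.0, 0.0]])   # kg/ha/month
et = np.array([[800.0, 700.0], [400.0, 0.0]])     # m^3/ha/month
print(water_productivity(bio, et))
```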
Abstract:
The link between urban areas and environmental issues grows closer as global awareness increases of the need to conserve, improve and value the environmental services provided by nature for the sustainability of life, inside and outside the city. Vegetation cover (or green cover) is among the main sources of such services. Since the urbanization process has proved irreversible and urban environmental problems are spreading in size and extent, the presence of green areas is directly related to indicators of urban quality of life. As a consequence of urbanization, the city of Belém has lost a large percentage of its natural ecosystems, so this work focused on analysing some ecosystem services (air quality, air pollution and climate regulation) provided by the quality and quantity of local vegetation cover, considering changes in its spatio-temporal distribution across three administrative districts. A theoretical framework was built and analysed; vegetation cover was calculated using NDVI and Fractional Vegetation Cover on LANDSAT 5 images over a period of 23 years. Based on a proposed finer-grained NDVI scale, quantitative and qualitative analyses of green cover showed a significant loss of very dense, dense and moderate cover and an increase in areas with little or no vegetation. Moreover, the degradation of green areas signalled trends of increasing air pollution, noise pollution and temperature. The scarcity of environment-related data leaves no doubt about the urgency of investing in the environmental services provided by vegetation cover for urban sustainability in Belém, for which the projected scenarios involve drastic losses of green area. More research and initiatives from public and private institutions are needed to support environmental services in Belém and, consequently, public well-being.
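As a sketch of the vegetation-cover step described above, the fragment below computes NDVI from red and near-infrared reflectances and derives a fractional vegetation cover by linear scaling between bare-soil and full-vegetation endmembers. The reflectance values and endmembers are illustrative assumptions, and the linear scaling (rather than the quadratic variant) is one common choice, not necessarily the one used in this work.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def fractional_cover(ndvi_img, ndvi_soil=0.05, ndvi_veg=0.85):
    """Fractional vegetation cover by linear scaling of NDVI between
    bare-soil and full-vegetation endmembers (endmembers assumed here)."""
    fc = (ndvi_img - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(fc, 0.0, 1.0)

# Hypothetical LANDSAT 5 reflectances (band 4 = NIR, band 3 = Red).
nir = np.array([[0.45, 0.30], [0.20, 0.10]])
red = np.array([[0.08, 0.10], [0.12, 0.09]])
v = ndvi(nir, red)
print(v)
print(fractional_cover(v))
```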
Abstract:
The beta-Birnbaum-Saunders (Cordeiro and Lemonte, 2011) and Birnbaum-Saunders (Birnbaum and Saunders, 1969a) distributions have been used quite effectively to model failure times for materials subject to fatigue and lifetime data. We define the log-beta-Birnbaum-Saunders distribution as the distribution of the logarithm of a beta-Birnbaum-Saunders random variable. Explicit expressions for its generating function and moments are derived. We propose a new log-beta-Birnbaum-Saunders regression model that can be applied to censored data and used more effectively in survival analysis. We obtain the maximum likelihood estimates of the model parameters for censored data and investigate influence diagnostics. The new location-scale regression model is modified to allow for the possibility that long-term survivors may be present in the data. Its usefulness is illustrated by means of two real data sets.
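For readers who want to experiment with the base distribution, the Birnbaum-Saunders distribution is available in SciPy as fatiguelife. The beta-Birnbaum-Saunders generalization of the paper is not in SciPy, so the sketch below only illustrates the base lifetime distribution and the log transform underlying the paper's location-scale regression model.

```python
import numpy as np
from scipy import stats

# Birnbaum-Saunders in SciPy: `fatiguelife`, shape c = alpha, scale = beta.
alpha, beta = 0.5, 2.0
bs = stats.fatiguelife(c=alpha, scale=beta)

samples = bs.rvs(size=10_000, random_state=1)
log_samples = np.log(samples)   # support of the log-BS model

# The log transform turns the positive lifetime into a location-scale
# variable, which is the basis for log-BS-type regression models.
print("mean lifetime:", samples.mean(), "~ E[T] =", bs.mean())
print("log-lifetime mean/sd:", log_samples.mean(), log_samples.std())
```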