995 results for Nuclear hazards insurance


Relevance: 20.00%

Publisher:

Abstract:

European writers on strategy (in French, stratégistes, as opposed to practitioners, stratèges) developed their thoughts on the best strategies and postures of nuclear deterrence against their beliefs about the identities of their own countries: were these seen as "European", or as nation-states that must under no condition surrender their sovereignty?


If Britain wants to stem the tide of nuclear proliferation, it must continue to assume "the nuclear man's burden" and guarantee the security of non-nuclear allies, as it did in the Cold War.


A mononuclear octahedral nickel(II) complex, [Ni(HL¹)₂](SCN)₂ (1), and an unusual penta-nuclear complex, [{(NiL²)(μ-SCN)}₄Ni(NCS)₂]·2CH₃CN (2), where HL¹ = 3-(2-aminoethylimino)butan-2-one oxime and HL² = 3-((3-((3-(hydroxyimino)butan-2-ylidene)amino)propyl)imino)butan-2-one oxime, have been prepared and characterized by X-ray crystallography. The mono-condensed ligand, HL¹, was prepared by 1:1 condensation of 1,2-diaminoethane with diacetylmonoxime in methanol under high dilution. Complex 1 is found to be the mer isomer, and the amine hydrogen atoms are involved in extensive hydrogen bonding with the thiocyanate anions. The di-condensed ligand, HL², was prepared by 1:2 condensation of 1,3-diaminopropane with diacetylmonoxime in methanol. The central nickel(II) in 2 is coordinated by six nitrogen atoms of six thiocyanate groups, four of which use their sulphur atoms to connect four NiL² moieties, forming a penta-nuclear complex; it is unique in being the first thiocyanato-bridged penta-nuclear nickel(II) compound with Schiff base ligands.


A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes in insured loss potentials over Europe in the 21st century. Losses are computed from the daily maximum wind speed at each grid point. The loss model is calibrated using wind data from the ERA40 reanalysis and German loss data. The annual losses obtained for present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES scenarios A1B and A2, with three realisations considered for each. On average, insured loss potentials increase for all analysed European regions towards the end of the 21st century. Changes are largest for Germany and France, and smallest for Portugal/Spain. The spread between individual realisations is nevertheless large, ranging for Germany, for example, from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show increasing interannual variability of storm damage. These effects are even more pronounced if no adaptation of building structures to climate change is assumed. The increased loss potentials are linked to enhanced high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with a greater number and increased intensity of extreme cyclones over the British Isles and the North Sea.
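The per-grid-point loss calculation described above can be sketched as follows. The cubic excess over the local 98th wind-speed percentile, weighted by an exposure proxy, is a common formulation for storm loss models of this type in the windstorm literature; the function, variable names and sample values below are illustrative assumptions, not the paper's actual implementation.

```python
def storm_loss(max_wind, v98, exposure):
    """Toy storm loss index: at each grid point, loss grows with the cube
    of the daily maximum wind's excess over the local 98th percentile,
    weighted by an exposure proxy (e.g. population or insured values)."""
    loss = 0.0
    for v, v_ref, exp_w in zip(max_wind, v98, exposure):
        if v > v_ref:  # only winds beyond the local high percentile cause damage
            loss += exp_w * ((v - v_ref) / v_ref) ** 3
    return loss

# Hypothetical three-grid-point example (all values invented)
daily_max = [32.0, 24.0, 41.0]    # m/s, daily maximum wind per grid point
v98       = [25.0, 26.0, 30.0]    # m/s, local 98th percentile of wind speed
exposure  = [1.0e6, 5.0e5, 2.0e6] # exposure weights per grid point

print(round(storm_loss(daily_max, v98, exposure), 1))
```

The cube reflects the strongly non-linear growth of damage with wind speed, and the local percentile threshold accounts for building structures being adapted to the local wind climate.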


Although over a hundred thermal indices can be used for assessing thermal health hazards, many ignore the human heat budget, physiology and clothing. The Universal Thermal Climate Index (UTCI) addresses these shortcomings by using an advanced thermo-physiological model. This paper assesses the potential of using the UTCI for forecasting thermal health hazards. Traditionally, such hazard forecasting has had two further limitations: it has been narrowly focused on a particular region or nation, and it has relied on single ‘deterministic’ forecasts. Here, the UTCI is computed on a global scale, which is essential for international health-hazard warnings and disaster preparedness, and it is provided as a probabilistic forecast. It is shown that probabilistic UTCI forecasts are superior in skill to deterministic forecasts and that, despite global variations, the UTCI forecast is skilful for lead times up to 10 days. The paper also demonstrates the utility of probabilistic UTCI forecasts using the example of the 2010 heat wave in Russia.
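At each grid point and lead time, a probabilistic heat-stress forecast of this kind reduces to an exceedance probability across ensemble members. The 32 °C "strong heat stress" threshold below is taken from the standard UTCI assessment scale; the ensemble values are invented for illustration.

```python
def exceedance_probability(ensemble_utci, threshold=32.0):
    """Fraction of ensemble members whose forecast UTCI exceeds a
    heat-stress threshold (32 degC marks the onset of 'strong heat
    stress' on the UTCI assessment scale)."""
    return sum(1 for u in ensemble_utci if u > threshold) / len(ensemble_utci)

# Hypothetical 10-member ensemble for one grid point and lead time (degC)
members = [33.1, 30.5, 34.2, 31.9, 35.0, 29.8, 32.4, 33.7, 30.1, 36.2]
print(exceedance_probability(members))
```

A deterministic forecast would give only a single yes/no answer at this grid point; the ensemble fraction conveys how confident the warning should be.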


During the last decades, several windstorm series hit Europe, leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones and pose a considerable risk for the insurance industry. Clustering of events and return periods of storm series for Germany are quantified based on potential losses using empirical models. Two reanalysis data sets and observations from German weather stations are considered for 30 winters. Histograms of events exceeding selected return levels (1-, 2- and 5-year) are derived. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Over 4000 years of general circulation model (GCM) simulations forced with current climate conditions are analysed to provide a better assessment of historical return periods. Estimates differ between the distributions, for example 40 to 65 years for the 1990 series. For such less frequent series, estimates obtained with the Poisson distribution clearly deviate from the empirical data. The negative binomial distribution provides better estimates, even though a sensitivity to return level and data set is identified. The consideration of GCM data permits a substantial reduction of uncertainties. These results underline the importance of explicitly considering the clustering of losses for an adequate risk assessment in economic applications.
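The distinction between the two count distributions rests on overdispersion: clustered storm counts have variance above their mean, which the Poisson (variance = mean) cannot capture, while the negative binomial can. A minimal method-of-moments check, with invented per-winter counts, looks like this:

```python
def dispersion_and_nb_fit(counts):
    """Dispersion index (variance/mean) of per-winter event counts plus a
    method-of-moments negative binomial fit. A Poisson process implies a
    dispersion index of 1; serial clustering shows up as a value above 1."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    dispersion = var / mean
    # Negative binomial moments: mean = r(1-p)/p, var = r(1-p)/p**2
    # => p = mean/var, r = mean**2/(var - mean); requires var > mean
    if var > mean:
        p = mean / var
        r = mean ** 2 / (var - mean)
    else:
        p = r = None  # no overdispersion: Poisson is adequate
    return dispersion, r, p

# Hypothetical counts of storms exceeding a 1-year return level, 10 winters
counts = [0, 1, 0, 4, 1, 0, 0, 3, 0, 1]
d, r, p = dispersion_and_nb_fit(counts)
print(round(d, 2), round(r, 2), round(p, 2))
```

The winters with 3 and 4 events pull the variance above the mean, which is exactly the signature of clustered series like 1990 that the Poisson assumption misses.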


The XWS (eXtreme WindStorms) catalogue consists of storm tracks and model-generated maximum 3 s wind-gust footprints for 50 of the most extreme winter windstorms to hit Europe in the period 1979–2012. The catalogue is intended to be a valuable resource for both academia and industries such as (re)insurance, for example allowing users to characterise extreme European storms, and validate climate and catastrophe models. Several storm severity indices were investigated to find which could best represent a list of known high-loss (severe) storms. The best-performing index was Sft, which is a combination of storm area calculated from the storm footprint and maximum 925 hPa wind speed from the storm track. All the listed severe storms are included in the catalogue, and the remaining ones were selected using Sft. A comparison of the model footprint to station observations revealed that storms were generally well represented, although for some storms the highest gusts were underestimated. Possible reasons for this underestimation include the model failing to simulate strong enough pressure gradients and not representing convective gusts. A new recalibration method was developed to estimate the true distribution of gusts at each grid point and correct for this underestimation. The recalibration model allows for storm-to-storm variation which is essential given that different storms have different degrees of model bias. The catalogue is available at www.europeanwindstorms.org.
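The catalogue paper specifies only that Sft combines storm area from the footprint with the maximum 925 hPa wind speed from the track; the product form, the 25 m/s gust threshold and the grid values in the sketch below are assumptions made for illustration, not the published definition.

```python
def severity_index(footprint_gusts, cell_area_km2, v925_max, gust_threshold=25.0):
    """Illustrative storm severity index: area of the footprint where the
    maximum 3 s gust exceeds a threshold, scaled by the storm's maximum
    925 hPa wind speed taken from the storm track."""
    exceed_area = sum(cell_area_km2 for g in footprint_gusts if g > gust_threshold)
    return exceed_area * v925_max

# Hypothetical coarse footprint: five grid cells of 2500 km2 each
gusts = [22.0, 27.0, 31.0, 19.0, 26.0]  # max 3 s gust per cell, m/s
print(severity_index(gusts, 2500.0, 45.0))
```

An index of this shape rewards both widespread and intense storms, which is why it can separate known high-loss events from merely deep but compact cyclones.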


This chapter considers the possible use in armed conflict of low-yield (also known as tactical) nuclear weapons. The Legality of the Threat or Use of Nuclear Weapons Advisory Opinion maintained that it is a cardinal principle that a State must never make civilians an object of attack and must consequently never use weapons that are incapable of distinguishing between civilian and military targets. As international humanitarian law applies equally to any use of nuclear weapons, it is argued that no use of nuclear weapons could spare civilian casualties, particularly in view of the long-term health and environmental effects of such weaponry.


Catastrophe risk models used by the insurance industry are likely subject to significant uncertainty, but due to their proprietary nature and strict licensing conditions they are not available for experimentation. In addition, even if such experiments were conducted, they would not be repeatable by other researchers, because commercial confidentiality prevents the details of proprietary catastrophe model structures from being described in public-domain documents. However, such experimentation is urgently required to improve decision making in both insurance and reinsurance markets. In this paper we therefore construct our own catastrophe risk model for flooding in Dublin, Ireland, in order to assess the impact of typical precipitation data uncertainty on loss predictions. As we consider only a city region rather than a whole territory, and have access to detailed data and computing resources typically unavailable to industry modellers, our model is significantly more detailed than most commercial products. The model consists of four components: a stochastic rainfall module, a hydrological and hydraulic flood hazard module, a vulnerability module, and a financial loss module. Using these, we undertake a series of simulations to test the impact of driving the stochastic event generator with four different rainfall data sets: ground gauge data, gauge-corrected rainfall radar, meteorological reanalysis data (European Centre for Medium-Range Weather Forecasts Reanalysis-Interim; ERA-Interim) and a satellite rainfall product (the Climate Prediction Center morphing method; CMORPH). Catastrophe models are unusual because they use the upper three components of the modelling chain to generate a large synthetic database of unobserved and severe loss-driving events for which estimated losses are calculated.
We find the loss estimates to be more sensitive to uncertainties propagated from the driving precipitation data sets than to other uncertainties in the hazard and vulnerability modules, suggesting that the range of uncertainty within catastrophe model structures may be greater than commonly believed.
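The four-module chain can be sketched as a composition of stages. Every function body below is a toy stand-in (the real stochastic rainfall, hydraulic and vulnerability models are far more elaborate); the thresholds, distributions and exposure figure are invented, and the sketch shows only how a synthetic event set propagates to a loss distribution.

```python
import random

def stochastic_rainfall(rng):
    """Rainfall module: draw a synthetic event rainfall depth (mm)."""
    return rng.expovariate(1 / 40.0)  # toy heavy-tailed rain distribution

def flood_depth(rainfall_mm):
    """Hazard module: map rainfall to inundation depth (m) above a toy
    30 mm runoff threshold."""
    return max(0.0, (rainfall_mm - 30.0) / 50.0)

def damage_fraction(depth_m):
    """Vulnerability module: depth-damage curve saturating at total loss."""
    return min(1.0, 0.4 * depth_m)

def financial_loss(frac, exposure=1.0e8):
    """Financial module: apply the damage fraction to insured exposure."""
    return frac * exposure

rng = random.Random(42)
events = [stochastic_rainfall(rng) for _ in range(10_000)]
losses = [financial_loss(damage_fraction(flood_depth(r))) for r in events]
annual_average = sum(losses) / len(losses)
print(f"average event loss: {annual_average:.3e}")
```

Swapping the rainfall stage for a generator calibrated on a different precipitation data set, as in the paper's experiments, changes the whole loss distribution downstream, which is how driving-data uncertainty propagates to the loss estimates.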


The nuclear time-dependent Hartree-Fock model formulated in three-dimensional space, based on the full standard Skyrme energy density functional complemented with the tensor force, is presented. The model achieves full self-consistency. Its application to the isovector giant dipole resonance is discussed in the linear limit, ranging from spherical nuclei (16O and 120Sn) to systems displaying axial or triaxial deformation (24Mg, 28Si, 178Os, 190W and 238U). Particular attention is paid to the spin-dependent terms from the central sector of the functional, recently included together with the tensor terms. They turn out to be capable of producing a qualitative change in the strength distribution in this channel. The effect on deformation properties is also discussed. The quantitative effects on the linear response are small and, overall, the giant dipole energy remains unaffected. Calculations are compared with predictions from the (quasi)particle random-phase approximation and with experimental data where available, finding good agreement.


The role of the tensor terms in the Skyrme interaction is studied through their effect in dynamic calculations, where non-zero contributions to the mean field may arise even when the starting nucleus or nuclei are even-even and have no active time-odd potentials in the ground state. We study collisions in the 16O-16O test-bed system and give a qualitative analysis of the behaviour of the time-odd tensor-kinetic density, which appears in the mean-field Hamiltonian only in the presence of the tensor force. We find that an axial excitation of this density is induced by a collision.