75 results for Model risk
Abstract:
Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and estimate the flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably well with a benchmark data set of global flood hazard. The developed methodology can be applied to other data sets on a global or regional scale.
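The return-period step described above lends itself to a short illustration. The sketch below, in Python, fits a Gumbel distribution to annual maximum flows and reads off flood magnitudes for a set of return periods; the sample data, variable names and the use of `scipy.stats.gumbel_r` are assumptions for demonstration, not the study's own code or data.

```python
import numpy as np
from scipy.stats import gumbel_r

# Illustrative annual maximum flows (m^3/s) for one 25x25 km grid cell.
annual_maxima = np.array([2100., 1850., 2400., 1950., 2800., 2250., 1700.,
                          2600., 2050., 2300., 1900., 2750., 2150., 2500.])

# Fit a Gumbel (extreme value type I) distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_maxima)

# The T-year flood is the (1 - 1/T) quantile of the fitted distribution.
return_periods = np.array([2, 5, 10, 25, 50, 100, 250, 500])
flood_flows = gumbel_r.ppf(1.0 - 1.0 / return_periods, loc=loc, scale=scale)

for T, q in zip(return_periods, flood_flows):
    print(f"{T:4d}-yr flood: {q:7.0f} m^3/s")
```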
Abstract:
The wood mouse is a common and abundant species in agricultural landscapes and is a focal species in pesticide risk assessment. Empirical studies on the ecology of the wood mouse have provided sufficient information for the species to be modelled mechanistically. An individual-based model was constructed to explicitly represent the locations and movement patterns of individual mice. This, together with the schedule of pesticide application, allows prediction of the risk to the population from pesticide exposure. The model included life-history traits of wood mice as well as typical landscape dynamics in agricultural farmland in the UK. The model obtains a good fit to the available population data and is fit for risk assessment purposes. It can help identify spatio-temporal situations with the largest potential risk of exposure and enables extrapolation from individual-level endpoints to population-level effects. The largest risk of exposure to pesticides was found when good crop growth in the “sink” fields coincided with high “source” population densities in the hedgerows. Keywords: Population dynamics, Pesticides, Ecological risk assessment, Habitat choice, Agent-based model, NetLogo
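The source-sink exposure mechanism can be illustrated with a toy individual-based sketch. The Python below is only a minimal caricature of that idea (the published model is a far more detailed NetLogo IBM); the movement probability, crop-growth date, spray day and exposure rule are all hypothetical assumptions.

```python
import random

FIELD, HEDGEROW = "field", "hedgerow"
SPRAY_DAY = 150          # hypothetical pesticide application day
CROP_GOOD_AFTER = 140    # hypothetical day when crop cover becomes attractive

class Mouse:
    def __init__(self):
        self.habitat = HEDGEROW   # mice start in the hedgerow "source"
        self.exposed = False

    def step(self, day):
        # Move into fields once crop growth makes them an attractive "sink".
        if self.habitat == HEDGEROW and day >= CROP_GOOD_AFTER and random.random() < 0.3:
            self.habitat = FIELD
        # Individuals that are in the field on the spray day count as exposed.
        if self.habitat == FIELD and day == SPRAY_DAY:
            self.exposed = True

population = [Mouse() for _ in range(500)]
for day in range(365):
    for mouse in population:
        mouse.step(day)

exposed_fraction = sum(m.exposed for m in population) / len(population)
print(f"Fraction of population exposed: {exposed_fraction:.2f}")
```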
Abstract:
Following a workshop exercise, two models, an individual-based landscape model (IBLM) and a non-spatial life-history model, were used to assess the impact of a fictitious insecticide on populations of skylarks in the UK. The chosen population endpoints were abundance, population growth rate, and the chances of population persistence. Both models used the same life-history descriptors and toxicity profiles as the basis for their parameter inputs. The models differed in that exposure was a pre-determined parameter in the life-history model, but an emergent property of the IBLM, and the IBLM required a landscape structure as an input. The model outputs were qualitatively similar between the two models. Under conditions dominated by winter wheat, both models predicted a population decline that was worsened by the use of the insecticide. Under broader habitat conditions, population declines were only predicted for the scenarios where the insecticide was added. Inputs to the models are very different, with the IBLM requiring a large volume of data in order to achieve the flexibility of being able to integrate a range of environmental and behavioural factors. The life-history model has very few explicit data inputs, but some of these relied on extensive prior modelling needing additional data, as described in Roelofs et al. (2005, this volume). Both models have strengths and weaknesses; hence the ideal approach is to combine the use of both simple and comprehensive modelling tools.
Abstract:
Objective To model the overall and income specific effect of a 20% tax on sugar sweetened drinks on the prevalence of overweight and obesity in the UK. Design Econometric and comparative risk assessment modelling study. Setting United Kingdom. Population Adults aged 16 and over. Intervention A 20% tax on sugar sweetened drinks. Main outcome measures The primary outcomes were the overall and income specific changes in the number and percentage of overweight (body mass index ≥25) and obese (≥30) adults in the UK following the implementation of the tax. Secondary outcomes were the effect by age group (16-29, 30-49, and ≥50 years) and by UK constituent country. The revenue generated from the tax and the income specific changes in weekly expenditure on drinks were also estimated. Results A 20% tax on sugar sweetened drinks was estimated to reduce the number of obese adults in the UK by 1.3% (95% credible interval 0.8% to 1.7%) or 180 000 (110 000 to 247 000) people and the number who are overweight by 0.9% (0.6% to 1.1%) or 285 000 (201 000 to 364 000) people. The predicted reductions in prevalence of obesity for income thirds 1 (lowest income), 2, and 3 (highest income) were 1.3% (0.3% to 2.0%), 0.9% (0.1% to 1.6%), and 2.1% (1.3% to 2.9%). The effect on obesity declined with age. Predicted annual revenue was £276m (£272m to £279m), with estimated increases in total expenditure on drinks for income thirds 1, 2, and 3 of 2.1% (1.4% to 3.0%), 1.7% (1.2% to 2.2%), and 0.8% (0.4% to 1.2%). Conclusions A 20% tax on sugar sweetened drinks would lead to a reduction in the prevalence of obesity in the UK of 1.3% (around 180 000 people). The greatest effects may occur in young people, with no significant differences between income groups. Both effects warrant further exploration. Taxation of sugar sweetened drinks is a promising population measure to target population obesity, particularly among younger adults.
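The econometric-to-epidemiological chain in this kind of study can be sketched in a few lines: a price change is translated into a consumption change via an own-price elasticity, then into an energy-intake change and a steady-state weight change. Every number below (pass-through, elasticity, energy density, the energy-to-weight rule of thumb) is an illustrative assumption, not an estimate from the paper.

```python
# Illustrative chain from a 20% price rise to a steady-state weight change.
TAX_RATE = 0.20                  # 20% ad valorem tax
PASS_THROUGH = 1.0               # assume the tax is fully passed on to prices
OWN_PRICE_ELASTICITY = -0.9      # hypothetical own-price elasticity of demand
BASELINE_INTAKE_ML_DAY = 150.0   # hypothetical per-capita consumption
ENERGY_KJ_PER_ML = 1.8           # hypothetical energy density of the drinks
KJ_PER_DAY_PER_KG = 94.0         # rule-of-thumb steady-state energy/weight ratio

price_change = TAX_RATE * PASS_THROUGH
consumption_change = OWN_PRICE_ELASTICITY * price_change * BASELINE_INTAKE_ML_DAY
energy_change_kj_day = consumption_change * ENERGY_KJ_PER_ML
weight_change_kg = energy_change_kj_day / KJ_PER_DAY_PER_KG

print(f"Change in consumption: {consumption_change:.1f} ml/day")
print(f"Change in energy intake: {energy_change_kj_day:.0f} kJ/day")
print(f"Predicted steady-state weight change: {weight_change_kg:.2f} kg")
```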
Abstract:
Objectives To model the impact on chronic disease of a tax on UK food and drink that internalises the wider costs to society of greenhouse gas (GHG) emissions and to estimate the potential revenue. Design An econometric and comparative risk assessment modelling study. Setting The UK. Participants The UK adult population. Interventions Two tax scenarios are modelled: (A) a tax of £2.72/tonne carbon dioxide equivalents (tCO2e)/100 g product applied to all food and drink groups with above average GHG emissions. (B) As with scenario (A) but food groups with emissions below average are subsidised to create a tax neutral scenario. Outcome measures Primary outcomes are change in UK population mortality from chronic diseases following the implementation of each taxation strategy, the change in the UK GHG emissions and the predicted revenue. Secondary outcomes are the changes to the micronutrient composition of the UK diet. Results Scenario (A) results in 7770 (95% credible intervals 7150 to 8390) deaths averted and a reduction in GHG emissions of 18 683 (14 665 to 22 889) ktCO2e/year. Estimated annual revenue is £2.02 (£1.98 to £2.06) billion. Scenario (B) results in 2685 (1966 to 3402) extra deaths and a reduction in GHG emissions of 15 228 (11 245 to 19 492) ktCO2e/year. Conclusions Incorporating the societal cost of GHG into the price of foods could save 7770 lives in the UK each year, reduce food-related GHG emissions and generate substantial tax revenue. The revenue neutral scenario (B) demonstrates that sustainability and health goals are not always aligned. Future work should focus on investigating the health impact by population subgroup and on designing fiscal strategies to promote both sustainable and healthy diets.
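The mortality arm of a comparative risk assessment of this kind typically rests on a potential-impact-fraction (PIF) calculation: a shift in the population's exposure distribution is combined with relative risks to estimate deaths averted. The sketch below shows only that generic calculation; the exposure categories, relative risks, distributions and baseline death count are invented placeholders, not the study's inputs.

```python
import numpy as np

# Minimal potential-impact-fraction (PIF) calculation.
relative_risk = np.array([1.0, 1.2, 1.5])      # RR by exposure category (hypothetical)
baseline_prev = np.array([0.50, 0.30, 0.20])   # exposure distribution before the tax
taxed_prev = np.array([0.55, 0.30, 0.15])      # exposure distribution after the tax

pif = (np.sum(baseline_prev * relative_risk) - np.sum(taxed_prev * relative_risk)) \
      / np.sum(baseline_prev * relative_risk)

BASELINE_DEATHS = 100_000                      # hypothetical annual deaths from the disease
deaths_averted = pif * BASELINE_DEATHS
print(f"PIF = {pif:.3%}, deaths averted = {deaths_averted:.0f}")
```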
Abstract:
Climate model ensembles are widely heralded for their potential to quantify uncertainties and generate probabilistic climate projections. However, such technical improvements to modeling science will do little to deliver on their ultimate promise of improving climate policymaking and adaptation unless the insights they generate can be effectively communicated to decision makers. While some of these communicative challenges are unique to climate ensembles, others are common to hydrometeorological modeling more generally, and to the tensions arising between the imperatives for saliency, robustness, and richness in risk communication. The paper reviews emerging approaches to visualizing and communicating climate ensembles and compares them to the more established and thoroughly evaluated communication methods used in the numerical weather prediction domains of day-to-day weather forecasting (in particular probabilities of precipitation), hurricane and flood warning, and seasonal forecasting. This comparative analysis informs recommendations on best practice for climate modelers, as well as prompting some further thoughts on key research challenges to improve the future communication of climate change uncertainties.
Abstract:
An online national survey among the Spanish population (n = 602) was conducted to examine the factors underlying a person’s support for commitments to global climate change reductions. Multiple hierarchical regression analysis was conducted in four steps and a structural equation model was tested. A survey tool designed by the Yale Project on Climate Change Communication was applied in order to build scales for the variables introduced in the study. The results show that perceived consumer effectiveness and risk perception are determinant factors of commitment to mitigating global climate change. However, other factors, such as socio-demographics, view of nature and cultural cognition, differ in the influence they have on the final predicted variable.
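Hierarchical (block-wise) regression of the sort mentioned above enters predictor blocks in successive steps and inspects the gain in explained variance. The Python sketch below shows that mechanic on simulated data; the variable names, block order and simulated relationships are placeholders, not the survey's variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated respondents; names and effect sizes are illustrative only.
rng = np.random.default_rng(0)
n = 602
df = pd.DataFrame({
    "age": rng.normal(45, 15, n),
    "risk_perception": rng.normal(0, 1, n),
    "consumer_effectiveness": rng.normal(0, 1, n),
})
df["support"] = (0.5 * df["risk_perception"] + 0.6 * df["consumer_effectiveness"]
                 + rng.normal(0, 1, n))

blocks = [["age"],                                               # step 1: socio-demographics
          ["age", "risk_perception"],                            # step 2: + risk perception
          ["age", "risk_perception", "consumer_effectiveness"]]  # step 3: + effectiveness

for i, cols in enumerate(blocks, start=1):
    model = sm.OLS(df["support"], sm.add_constant(df[cols])).fit()
    print(f"Step {i}: R^2 = {model.rsquared:.3f}")
```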
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
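The persistence and capital-requirement ideas can be sketched briefly: fit a GARCH(1,1), check the implied volatility persistence (alpha + beta), and obtain a capital requirement as a lower quantile of simulated multi-day returns. The Python below, using the `arch` package, is a generic Monte Carlo illustration with placeholder data and an assumed 10-day/99% setting; it is not the paper's estimation design or its proposed modification.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=6, size=2000)   # placeholder daily returns (percent)

# Fit a GARCH(1,1) and report the volatility persistence alpha + beta.
res = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
omega, alpha, beta = res.params["omega"], res.params["alpha[1]"], res.params["beta[1]"]
print(f"Persistence alpha + beta = {alpha + beta:.3f}")

# Monte Carlo of 10-day returns under the fitted GARCH(1,1).
horizon, n_sims = 10, 20_000
h = np.full(n_sims, res.conditional_volatility[-1] ** 2)   # start from latest variance
cum = np.zeros(n_sims)
for _ in range(horizon):
    eps = np.sqrt(h) * rng.standard_normal(n_sims)
    cum += eps
    h = omega + alpha * eps**2 + beta * h

mcrr = -np.percentile(cum, 1)   # loss not exceeded with 99% confidence (% of position)
print(f"10-day 99% minimum capital risk requirement ~ {mcrr:.2f}%")
```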
Abstract:
Using a linear factor model, we study the behaviour of French, German, Italian and British sovereign yield curves in the run-up to EMU. This allows us to determine which of these yield curves might best approximate a benchmark yield curve post EMU. We find that the best approximation for the risk-free yield is the UK three-month T-bill yield, followed by the German three-month T-bill yield. As no one sovereign yield curve dominates all others, we find that a composite yield curve, consisting of French, Italian and UK bonds at different maturity points along the yield curve, should be the benchmark post EMU.
Abstract:
Cholesterol is one of the key constituents for maintaining the cellular membrane and thus the integrity of the cell itself. In contrast, high levels of cholesterol in the blood are known to be a major risk factor in the development of cardiovascular disease. We formulate a deterministic nonlinear ordinary differential equation model of the sterol regulatory element binding protein 2 (SREBP-2) cholesterol genetic regulatory pathway in a hepatocyte. The mathematical model includes a description of gene transcription by SREBP-2, with the resulting mRNA subsequently translated to form 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR), a main precursor of cholesterol synthesis. Cholesterol synthesis subsequently leads to the regulation of SREBP-2 via a negative feedback formulation. Parameterised with data from the literature, the model is used to understand how SREBP-2 transcription and regulation affects cellular cholesterol concentration. Model stability analysis shows that the only positive steady state of the system exhibits purely oscillatory, damped oscillatory or monotonic behaviour under certain parameter conditions. In light of our findings we postulate how cholesterol homeostasis is maintained within the cell, and the advantages of our model formulation are discussed with respect to other models of genetic regulation within the literature.
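A minimal negative-feedback ODE in the spirit of the pathway described above can be written in a few lines: SREBP-2 activity drives HMGCR mRNA, mRNA drives enzyme, enzyme drives cholesterol synthesis, and cholesterol represses SREBP-2 activation via a Hill term. The equations, parameter values and initial conditions below are illustrative assumptions, not the published model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def srebp2_model(t, y, k_s=1.0, k_m=1.0, k_e=1.0, k_c=1.0,
                 d_s=0.5, d_m=0.8, d_e=0.3, d_c=0.2, K=1.0, n=2):
    s, m, e, c = y                               # SREBP-2, mRNA, HMGCR, cholesterol
    ds = k_s * K**n / (K**n + c**n) - d_s * s    # cholesterol represses SREBP-2
    dm = k_m * s - d_m * m                       # transcription of HMGCR mRNA
    de = k_e * m - d_e * e                       # translation into HMGCR enzyme
    dc = k_c * e - d_c * c                       # cholesterol synthesis and turnover
    return [ds, dm, de, dc]

sol = solve_ivp(srebp2_model, (0.0, 100.0), y0=[0.1, 0.0, 0.0, 0.0],
                dense_output=True, max_step=0.1)

t = np.linspace(0, 100, 500)
s, m, e, c = sol.sol(t)
print(f"Cholesterol approaches a steady state near {c[-1]:.2f} (arbitrary units)")
```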
Abstract:
This paper presents an assessment of the implications of climate change for global river flood risk. It is based on the estimation of flood frequency relationships at a grid resolution of 0.5 × 0.5°, using a global hydrological model with climate scenarios derived from 21 climate models, together with projections of future population. Four indicators of the flood hazard are calculated: change in the magnitude and return period of flood peaks, flood-prone population and cropland exposed to substantial change in flood frequency, and a generalised measure of regional flood risk based on combining frequency curves with generic flood damage functions. Under one climate model, emissions and socioeconomic scenario (HadCM3 and SRES A1b), in 2050 the current 100-year flood would occur at least twice as frequently across 40% of the globe, approximately 450 million flood-prone people and 430 thousand km2 of flood-prone cropland would be exposed to a doubling of flood frequency, and global flood risk would increase by approximately 187% over the risk in 2050 in the absence of climate change. There is strong regional variability (the most adverse impacts would be in Asia), and considerable variability between climate models. In 2050, the range in increased exposure across the 21 climate models under SRES A1b is 31 to 450 million people and 59 to 430 thousand km2 of cropland, and the change in risk varies between −9% and +376%. The paper presents impacts by region, and also presents relationships between change in global mean surface temperature and impacts on the global flood hazard. There are a number of caveats with the analysis: it is based on one global hydrological model only, the climate scenarios are constructed using pattern scaling, and the precise impacts are sensitive to some of the assumptions in the definition and application of the indicators.
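The "current 100-year flood becomes an X-year flood" indicator can be illustrated with a single Gumbel calculation: take the magnitude of today's 100-year flood from the baseline flood-frequency curve and look up its exceedance probability under a shifted future curve. The distributions and the assumed upward shift below are illustrative only, not the study's fitted curves.

```python
from scipy.stats import gumbel_r

baseline = gumbel_r(loc=1000.0, scale=250.0)   # present-day flood peaks (hypothetical)
future = gumbel_r(loc=1150.0, scale=250.0)     # assumed wetter future climate

q100 = baseline.ppf(1.0 - 1.0 / 100.0)                 # magnitude of today's 100-yr flood
future_return_period = 1.0 / (1.0 - future.cdf(q100))  # its return period in the future

print(f"Current 100-yr flood peak: {q100:.0f} m^3/s")
print(f"Return period of that flow in the future climate: {future_return_period:.0f} yr")
```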
Abstract:
In this paper, we study the role of the volatility risk premium for the forecasting performance of implied volatility. We introduce a non-parametric and parsimonious approach to adjust the model-free implied volatility for the volatility risk premium and implement this methodology using more than 20 years of options and futures data on three major energy markets. Using regression models and statistical loss functions, we find compelling evidence to suggest that the risk premium adjusted implied volatility significantly outperforms other models, including its unadjusted counterpart. Our main finding holds for different choices of volatility estimators and competing time-series models, underlining the robustness of our results.
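One simple way to convey the adjustment idea is to estimate the volatility risk premium as the trailing average gap between implied and subsequently realised variance, and subtract it from today's implied variance before using it as a forecast. The Python below is a generic sketch of that idea on simulated data; the window length, data-generating process and error metric are assumptions, not the authors' non-parametric procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
realised_var = np.abs(rng.normal(0.04, 0.01, n))              # ex-post realised variance
implied_var = realised_var + 0.01 + rng.normal(0, 0.003, n)   # implied variance carries a premium

WINDOW = 60   # trailing window used to estimate the premium
premium = np.array([np.mean(implied_var[t - WINDOW:t] - realised_var[t - WINDOW:t])
                    for t in range(WINDOW, n)])

unadjusted_forecast = implied_var[WINDOW:]
adjusted_forecast = unadjusted_forecast - premium             # premium-adjusted implied variance
actual = realised_var[WINDOW:]

def rmse(forecast, actual):
    return np.sqrt(np.mean((forecast - actual) ** 2))

print(f"RMSE, unadjusted implied variance: {rmse(unadjusted_forecast, actual):.5f}")
print(f"RMSE, premium-adjusted variance:   {rmse(adjusted_forecast, actual):.5f}")
```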
Abstract:
We present and experimentally test a theoretical model of majority threshold determination as a function of voters’ risk preferences. The experimental results confirm the theoretical prediction of a positive correlation between the voter's risk aversion and the corresponding preferred majority threshold. Furthermore, the experimental results show that a voter's preferred majority threshold negatively relates to the voter's confidence about how others will vote. Moreover, in a treatment in which individuals receive a private signal about others’ voting behaviour, the confidence-related motivation of behaviour loses ground to the signal's strength.
Abstract:
Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
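The bookkeeping behind these percentages reduces to classifying grid cells and summing weights: flag cells where the precipitation change under geoengineering exceeds a "substantial change" threshold while the change under elevated CO2 alone does not, then sum area, population and GDP weights over the flagged cells. The fields, threshold and weights in the Python sketch below are randomly generated placeholders, not the model output used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 10_000
dprecip_co2 = rng.normal(0.0, 0.15, n_cells)   # fractional precip change, high CO2 only
dprecip_geo = rng.normal(0.0, 0.10, n_cells)   # fractional precip change, geoengineered
area = rng.random(n_cells)                     # placeholder per-cell weights
population = rng.random(n_cells)
gdp = rng.random(n_cells)

THRESHOLD = 0.10   # illustrative definition of a "substantial" change
worse_under_geo = (np.abs(dprecip_geo) > THRESHOLD) & (np.abs(dprecip_co2) <= THRESHOLD)

for name, w in [("area", area), ("population", population), ("GDP", gdp)]:
    share = w[worse_under_geo].sum() / w.sum()
    print(f"Share of {name} with greater risk under geoengineering: {share:.1%}")
```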
Abstract:
Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as the woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without flocking mechanisms. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distributions of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce these three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence. We conclude that, with the memory-based foraging strategy including a flocking mechanism, our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
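The contrast between random and memory-based-plus-flocking field choice can be caricatured in a few lines of Python. The sketch below shows only the daily field-choice step for a small flock; the field qualities, memory rule, flocking weight and day count are illustrative assumptions and bear no relation to the published IBM's structure or parameters.

```python
import random

N_FIELDS = 20
field_quality = [random.uniform(0.0, 1.0) for _ in range(N_FIELDS)]  # food intake rate
birds_on_field = [0] * N_FIELDS

def random_choice():
    # Random foraging strategy: pick any field with equal probability.
    return random.randrange(N_FIELDS)

def memory_flocking_choice(memory):
    # Memory-based strategy with flocking: score fields by remembered intake
    # plus a small bonus for fields already holding many birds.
    def score(f):
        return memory.get(f, 0.0) + 0.1 * birds_on_field[f]
    return max(range(N_FIELDS), key=score)

memories = [dict() for _ in range(30)]     # one memory per bird
for day in range(30):
    birds_on_field = [0] * N_FIELDS        # reset daily occupancy
    for memory in memories:
        f = memory_flocking_choice(memory) if memory else random_choice()
        birds_on_field[f] += 1
        memory[f] = field_quality[f]       # remember the intake achieved there

print("Birds per field on the final day:", birds_on_field)
```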