869 results for Risk model


Relevance:

30.00%

Publisher:

Abstract:

This paper presents an assessment of the implications of climate change for global river flood risk. It is based on the estimation of flood frequency relationships at a grid resolution of 0.5 × 0.5°, using a global hydrological model with climate scenarios derived from 21 climate models, together with projections of future population. Four indicators of the flood hazard are calculated: the change in the magnitude and return period of flood peaks, the flood-prone population and cropland exposed to substantial change in flood frequency, and a generalised measure of regional flood risk based on combining frequency curves with generic flood damage functions. Under one climate model, emissions and socioeconomic scenario (HadCM3 and SRES A1b), in 2050 the current 100-year flood would occur at least twice as frequently across 40% of the globe, approximately 450 million flood-prone people and 430 thousand km² of flood-prone cropland would be exposed to a doubling of flood frequency, and global flood risk would increase by approximately 187% over the risk in 2050 in the absence of climate change. There is strong regional variability (the most adverse impacts would be in Asia) and considerable variability between climate models. In 2050, the range in increased exposure across 21 climate models under SRES A1b is 31–450 million people and 59–430 thousand km² of cropland, and the change in risk varies between −9% and +376%. The paper presents impacts by region, and also presents relationships between change in global mean surface temperature and impacts on the global flood hazard. There are a number of caveats to the analysis: it is based on only one global hydrological model, the climate scenarios are constructed using pattern-scaling, and the precise impacts are sensitive to some of the assumptions made in the definition and application of the indicators.
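The headline figure — the current 100-year flood recurring at least twice as often — comes from return-period arithmetic on shifted flood frequency curves. A minimal sketch of that arithmetic, assuming a Gumbel distribution for annual flood peaks and entirely illustrative parameter values (the paper's actual frequency relationships are grid-cell specific):

```python
import math

def gumbel_return_period(x, loc, scale):
    """Return period (years) of annual-maximum flow x under a Gumbel fit."""
    # Non-exceedance probability F(x); return period T = 1 / (1 - F(x))
    F = math.exp(-math.exp(-(x - loc) / scale))
    return 1.0 / (1.0 - F)

def magnitude_for_return_period(T, loc, scale):
    """Flow magnitude with return period T under the same Gumbel fit."""
    F = 1.0 - 1.0 / T
    return loc - scale * math.log(-math.log(F))

# Illustrative (hypothetical) baseline fit of annual flood peaks
loc, scale = 1000.0, 200.0
q100 = magnitude_for_return_period(100.0, loc, scale)

# Hypothetical future climate: the location of the distribution shifts up,
# so the magnitude that used to be a 100-year event recurs more often
T_future = gumbel_return_period(q100, loc * 1.1, scale)
print(round(T_future, 1))  # well below 100 years
```

The same magnitude-to-frequency inversion, applied per grid cell, is what lets a shift in flood peaks be reported as a change in the return period of a fixed benchmark flood.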


In this paper, we study the role of the volatility risk premium for the forecasting performance of implied volatility. We introduce a non-parametric and parsimonious approach to adjust the model-free implied volatility for the volatility risk premium and implement this methodology using more than 20 years of options and futures data on three major energy markets. Using regression models and statistical loss functions, we find compelling evidence to suggest that the risk premium adjusted implied volatility significantly outperforms other models, including its unadjusted counterpart. Our main finding holds for different choices of volatility estimators and competing time-series models, underlining the robustness of our results.
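One simple non-parametric reading of such a risk-premium adjustment — not necessarily the authors' exact estimator — is to subtract the historical average gap between implied and subsequently realized volatility from today's implied quote:

```python
def adjusted_implied_vol(iv_history, rv_history, iv_today):
    """Adjust today's implied volatility for the volatility risk premium,
    here proxied non-parametrically by the historical average gap between
    implied and subsequently realized volatility (an illustrative sketch)."""
    premia = [iv - rv for iv, rv in zip(iv_history, rv_history)]
    avg_premium = sum(premia) / len(premia)
    return iv_today - avg_premium

# Hypothetical annualized volatilities for an energy futures market
iv_hist = [0.32, 0.35, 0.30, 0.33]
rv_hist = [0.28, 0.30, 0.27, 0.29]
print(adjusted_implied_vol(iv_hist, rv_hist, 0.34))
```

Because implied volatility typically embeds a positive premium over realized volatility, the adjusted forecast sits below the raw implied quote.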


We present and experimentally test a theoretical model of majority threshold determination as a function of voters’ risk preferences. The experimental results confirm the theoretical prediction of a positive correlation between the voter's risk aversion and the corresponding preferred majority threshold. Furthermore, the experimental results show that a voter's preferred majority threshold negatively relates to the voter's confidence about how others will vote. Moreover, in a treatment in which individuals receive a private signal about others’ voting behaviour, the confidence-related motivation of behaviour loses ground to the signal's strength.


Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
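The classification logic behind those percentages can be sketched per region: compare whether a "substantial" precipitation change occurs under enhanced CO2 versus under geoengineering. The threshold and values below are our own illustrative assumptions, not the paper's definitions:

```python
def classify_region(p_change_co2, p_change_geo, threshold=0.1):
    """Sketch of the risk-based classification: does geoengineering
    increase or decrease the risk of a 'substantial' precipitation change
    (here, |fractional change| > threshold, an assumed criterion)?"""
    risk_co2 = abs(p_change_co2) > threshold
    risk_geo = abs(p_change_geo) > threshold
    if risk_geo and not risk_co2:
        return "risk increased by geoengineering"
    if risk_co2 and not risk_geo:
        return "risk decreased by geoengineering"
    return "risk unchanged"

# Hypothetical region: large change under high CO2, small under geoengineering
print(classify_region(0.25, 0.05))
```

Aggregating such per-region classifications by area, population, and GDP yields summary statistics of the kind the abstract reports.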


Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without flocking mechanisms. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distributions of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce these three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence.
We conclude that, with the memory-based foraging strategy including a flocking mechanism, our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
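A memory-plus-flocking decision rule of the general kind the abstract describes might look like the sketch below. This is our own toy rule, not the published model: the join probability, the memory structure, and the field names are all assumptions:

```python
import random

def choose_field(memory, fields, flock_locations, p_join=0.6, rng=random):
    """Hypothetical memory-based foraging with flocking: with probability
    p_join the bird joins an existing flock; otherwise it returns to the
    remembered field with the highest past intake rate."""
    if flock_locations and rng.random() < p_join:
        return rng.choice(flock_locations)
    return max(fields, key=lambda f: memory.get(f, 0.0))

memory = {"stubble": 5.0, "pasture": 2.0, "clover": 3.5}
rng = random.Random(1)  # seeded for reproducibility
print(choose_field(memory, list(memory), ["clover"], rng=rng))
```

The interplay of the two branches is what produces revisitation of profitable fields alongside aggregation into flocks — the two field patterns the pure optimal-foraging and random strategies each failed to reproduce together.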


Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However, current risk assessments of soil invertebrates are based on short-term laboratory studies of limited ecological relevance, supplemented if necessary by site-specific field trials, which are sometimes challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm Eisenia fetida in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation.
Although we incorporate toxic effects only at the individual level, we show how ABMs can readily extrapolate to larger scales by providing good model fits to field population data. The ability of the presented model to fit the available field and laboratory data for E. fetida demonstrates the promise of the agent-based approach in ecology, by showing how biological knowledge can be used to make ecological inferences. Further work is required to extend the approach to populations of more ecologically relevant species studied at the field scale. Such a model could help extrapolate from laboratory to field conditions and from one set of field conditions to another, or from species to species.
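The core of such an energy budget step — maintenance paid first, any surplus split between growth and reproduction — can be sketched in a few lines. The priority rule and the parameter names below are assumptions in the spirit of the abstract, not the paper's exact algorithm:

```python
def allocate_energy(intake, maintenance_cost, growth_fraction=0.5):
    """Minimal energy-budget step (illustrative sketch): maintenance is
    paid first; surplus energy is split between growth and reproduction;
    a shortfall is met from body reserves (negative growth = starvation)."""
    surplus = intake - maintenance_cost
    if surplus <= 0:
        # Starvation: the deficit is drawn from body reserves
        return {"growth": surplus, "reproduction": 0.0}
    return {"growth": surplus * growth_fraction,
            "reproduction": surplus * (1.0 - growth_fraction)}

print(allocate_energy(10.0, 4.0))   # surplus split between growth and eggs
print(allocate_energy(3.0, 4.0))    # shortfall drawn from reserves
```

Hypotheses about toxicity then correspond to multiplying individual terms (ingestion, maintenance, growth, or reproduction) by a dose-dependent stress factor and seeing which variant reproduces the observed data.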


The potential risk of agricultural pesticides to mammals typically depends on internal concentrations within individuals, and these are determined by the amount ingested and by absorption, distribution, metabolism, and excretion (ADME). Pesticide residues ingested depend, amongst other things, on individual spatial choices which determine how much and when feeding sites and areas of pesticide application overlap, and can be calculated using individual-based models (IBMs). Internal concentrations can be calculated using toxicokinetic (TK) models, which are quantitative representations of ADME processes. Here we provide a population model for the wood mouse (Apodemus sylvaticus) in which TK submodels were incorporated into an IBM representation of individuals making choices about where to feed. This allows us to estimate the contribution of individual spatial choice and TK processes to risk. We compared the risk predicted by four IBMs: (i) “AllExposed-NonTK”: assuming no spatial choice so all mice have 100% exposure, no TK; (ii) “AllExposed-TK”: identical to (i) except that TK processes are included, where individuals vary because they have different temporal patterns of ingestion in the IBM; (iii) “Spatial-NonTK”: individual spatial choice, no TK; and (iv) “Spatial-TK”: individual spatial choice and TK. The TK parameters for the hypothetical pesticides used in this study were selected such that a conventional risk assessment would fail. Exposures were standardised using risk quotients (RQ; exposure divided by LD50 or LC50). We found that, for the exposed sub-population, including either spatial choice or TK reduced the RQ by 37–85%, and for the total population the reduction was 37–94%. However, spatial choice and TK together had little further effect in reducing RQ.
The reasons for this are that when the proportion of time spent in treated crop (PT) approaches 1, TK processes dominate and spatial choice has very little effect, and conversely, if PT is small, spatial choice dominates and TK makes little contribution to exposure reduction. The latter situation means that a short time spent in the pesticide-treated field mimics exposure from a small gavage dose, whereas TK only makes a substantial difference when the dose is consumed over a longer period. We conclude that a combined TK-IBM is most likely to bring added value to the risk assessment process when the temporal pattern of feeding, time spent in the exposed area and TK parameters are at an intermediate level; for instance, wood mice in foliar spray scenarios spending more time in crop fields because of better plant cover.
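The RQ standardisation is a simple ratio, and the role of TK can be seen with a one-compartment elimination sketch (our illustrative stand-in for the paper's TK submodel, with assumed rate constants): the same total dose produces a lower peak internal concentration when spread over time.

```python
import math

def risk_quotient(exposure, ld50):
    """RQ as defined in the paper: exposure divided by LD50 (or LC50)."""
    return exposure / ld50

def peak_internal_conc(doses, k_elim, dt=1.0):
    """One-compartment toxicokinetic sketch: first-order elimination
    between equally spaced ingestion events; returns the peak internal
    concentration reached over the dosing sequence."""
    c, peak = 0.0, 0.0
    for dose in doses:
        c = c * math.exp(-k_elim * dt) + dose
        peak = max(peak, c)
    return peak

# Same total dose (6 units), different temporal pattern of ingestion
bolus = peak_internal_conc([6.0, 0.0, 0.0], k_elim=0.5)
spread = peak_internal_conc([2.0, 2.0, 2.0], k_elim=0.5)
print(risk_quotient(bolus, ld50=10.0), risk_quotient(spread, ld50=10.0))
```

This is the mechanism behind the abstract's observation: a brief visit to a treated field acts like a small gavage dose (TK changes little), while prolonged feeding lets elimination between meals reduce the peak, so TK matters.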


The performance of rank dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity plus three ways of linking deterministic and stochastic models: the differences in utilities, the differences in certainty equivalents, and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
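The "inverse S-shape commonly employed" refers to weighting functions such as the standard Tversky–Kahneman (1992) form, which overweights small probabilities and underweights large ones — the curvature the abstract contrasts with the S-shaped, concave, or convex fits found for many individuals:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function with its
    commonly cited gamma = 0.61; this is the textbook inverse-S form,
    not a parameter estimated in the paper under discussion."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Inverse-S shape: small probabilities overweighted, large underweighted
print(tk_weight(0.05) > 0.05, tk_weight(0.95) < 0.95)
```

Estimating gamma (or a more flexible curvature) individual by individual is what allows the shape of w(p) to be classified as inverse-S, S-shaped, concave, or convex.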


Current feed evaluation systems for ruminants are too imprecise to describe diets in terms of their acidosis risk. The dynamic mechanistic model described herein arises from the integration of a lactic acid (La) metabolism module into an extant model of whole-rumen function. The model was evaluated using published data from cows and sheep fed a range of diets or infused with various doses of La. The model performed well in simulating peak rumen La concentrations (coefficient of determination = 0.96; root mean square prediction error = 16.96% of observed mean), although frequency of sampling for the published data prevented a comprehensive comparison of prediction of time to peak La accumulation. The model showed a tendency for increased La accumulation following feeding of diets rich in nonstructural carbohydrates, although less-soluble starch sources such as corn tended to limit rumen La concentration. Simulated La absorption from the rumen remained low throughout the feeding cycle. The competition between bacteria and protozoa for rumen La suggests a variable contribution of protozoa to total La utilization. However, the model was unable to simulate the effects of defaunation on rumen La metabolism, indicating a need for a more detailed description of protozoal metabolism. The model could form the basis of a feed evaluation system with regard to rumen La metabolism.
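The dynamics described — La inflow from fermentation, saturable microbial utilization, and absorption through the rumen wall — can be sketched as a single-pool Euler integration. All parameter values and the Michaelis-Menten form for utilization are our own illustrative assumptions, not the published module:

```python
def simulate_la(production, v_max=10.0, k_m=5.0, k_abs=0.05,
                la0=0.0, dt=0.1):
    """Euler sketch of a rumen lactic acid (La) pool: inflow from
    fermentation, saturable microbial utilization (Michaelis-Menten),
    and first-order absorption through the rumen wall. Returns the final
    and peak La pool sizes (arbitrary units)."""
    la, peak = la0, la0
    for inflow in production:
        utilization = v_max * la / (k_m + la)
        la += dt * (inflow - utilization - k_abs * la)
        peak = max(peak, la)
    return la, peak

# A pulse of rapidly fermentable carbohydrate after feeding, then decay
production = [20.0] * 30 + [2.0] * 170
final, peak = simulate_la(production)
print(round(peak, 2), round(final, 2))
```

A peak shortly after feeding followed by decline toward a low equilibrium mirrors the simulated behaviour the abstract reports, with absorption remaining a minor outflow relative to microbial utilization.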


This article critically reflects on the widely held view of a causal chain in which trust in public authorities impacts technology acceptance via perceived risk. It first puts forward conceptual reasons against this view, as the presence of risk is a precondition for trust playing a role in decision making. Second, results from consumer surveys in Italy and Germany are presented that support the associationist model as a counter hypothesis. In that view, trust and risk judgments are driven by, and thus simply indicators of, higher-order attitudes toward a certain technology, which determine acceptance instead. The implications of these findings are discussed.


Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to an increase in CO2, which would be expected to increase primary production and therefore fuel loads even in the absence of climate change, vs. climate change effects. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C year⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning. Both effects need to be included in models to project future fire risks.
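The per-biome bias correction mentioned in the abstract can be read as scaling each palaeo flux by the observed/modelled ratio for the present day. This multiplicative form is one plausible reading of the correction described (the paper may use a different form), and all the flux values below are hypothetical:

```python
def bias_correct(modelled_flux, modelled_modern, observed_modern):
    """Multiplicative per-biome bias correction (illustrative sketch):
    scale each biome's palaeo flux by the observed/modelled ratio for
    contemporary conditions."""
    return {biome: flux * observed_modern[biome] / modelled_modern[biome]
            for biome, flux in modelled_flux.items()}

lgm = {"tropical_forest": 0.5, "savanna": 0.3}            # Pg C/yr, hypothetical
modern_model = {"tropical_forest": 0.8, "savanna": 0.6}   # model, today
modern_obs = {"tropical_forest": 0.6, "savanna": 0.9}     # observed, today
print(bias_correct(lgm, modern_model, modern_obs))
```

Correcting where the model systematically over- or under-predicts contemporary burning keeps the LGM-vs-PI comparison focused on the climate and CO2 signals rather than on model bias.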


Real estate securities have a number of distinct characteristics that differentiate them from stocks generally. Key amongst them is that the firms are underpinned by both real and investment assets. The connections between the underlying macro-economy and listed real estate firms are therefore clearly demonstrated and of heightened importance. To consider the linkages with the underlying macro-economic fundamentals we extract the ‘low-frequency’ volatility component from aggregate volatility shocks in 11 international markets over the 1990–2014 period. This is achieved using Engle and Rangel’s (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effect pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong and positive association with most of the macroeconomic risk proxies examined. These include interest rates, inflation, GDP and foreign exchange rates.


This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the context of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in OLTP databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk creating gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context; the encoding of facts with context into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated context.
The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement multi-purpose databases and business-rule stores shared by OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them. This ensures that information recalled via OLAP systems preserves the context of the transactions as captured by the respective OLTP system.
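The core idea — storing each fact together with its transactional context and a reference to the business rule in force when it was captured — can be sketched as a small data structure. The field names, rule store, and recall function below are our own illustration, not the UBIRQ schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fact:
    """A transaction fact stored with its capture context, in the spirit
    of co-storing data with business rules (illustrative field names)."""
    value: float
    context: str        # e.g. the business transaction it came from
    rule_id: str        # identifies the business rule used at capture

# Hypothetical versioned business-rule store shared by OLTP and OLAP sides
RULES = {"vat-2014": "gross = net * 1.20"}

def recall(fact, rules):
    """OLAP-side recall that always retrieves the capture-time rule,
    so the fact keeps its transactional semantics."""
    return fact.value, fact.context, rules[fact.rule_id]

f = Fact(120.0, "invoice#7", "vat-2014")
print(recall(f, RULES))
```

Because the rule reference travels with the fact, later ETL steps interpreting the data can consult the same rule that produced it, closing the semantic gap the paper identifies.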


1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species’ traits and/or extrinsic factors. This approach requires comprehensive species data, but information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall studied group. 2. Our study challenges this assumption and quantifies the taxonomic, spatial and data-type biases associated with the quantity of data available for 5415 mammalian species using the freely available life-history database PanTHERIA. 3. Moreover, we explore how existing biases influence the results of comparative analyses of extinction risk by using subsets of data that attempt to correct for the detected biases. In particular, we focus on links between four species’ traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions. 4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear not to be missing at random, occur frequently in all traits (14–99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and changing which traits are identified as important predictors. 5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction.
Missing data represent a prevalent problem in comparative analyses and, unfortunately, because data are not missing at random, conventional approaches to filling data gaps are either not valid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches to effectively correct existing biases.
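Why data that are "not missing at random" bias comparative estimates can be shown with a toy example (entirely hypothetical numbers): if small-bodied species are the ones lacking data, any trait summary computed from the studied subset is pulled toward large species.

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical body masses (kg); data tend to exist only for large species
masses = [0.02, 0.05, 0.1, 5.0, 80.0, 150.0]
studied = [m for m in masses if m >= 1.0]   # "not missing at random"

print(round(mean(masses), 2), round(mean(studied), 2))
```

The studied-subset mean greatly exceeds the true mean, illustrating how non-random missingness distorts which traits appear to predict vulnerability.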


BACKGROUND: Previous data support the benefits of reducing dietary saturated fatty acids (SFAs) on insulin resistance (IR) and other metabolic risk factors. However, whether the IR status of those suffering from metabolic syndrome (MetS) affects this response is not established. OBJECTIVE: Our objective was to determine whether the degree of IR influences the effect of substituting high-saturated fatty acid (HSFA) diets by isoenergetic alterations in the quality and quantity of dietary fat on MetS risk factors. DESIGN: In this single-blind, parallel, controlled, dietary intervention study, MetS subjects (n = 472) from 8 European countries classified by different IR levels according to homeostasis model assessment of insulin resistance (HOMA-IR) were randomly assigned to 4 diets: an HSFA diet; a high-monounsaturated fatty acid (HMUFA) diet; a low-fat, high-complex carbohydrate (LFHCC) diet supplemented with long-chain n-3 polyunsaturated fatty acids (1.2 g/d); or an LFHCC diet supplemented with placebo for 12 wk (control). Anthropometric, lipid, inflammatory, and IR markers were determined. RESULTS: Insulin-resistant MetS subjects with the highest HOMA-IR improved IR, with reduced insulin and HOMA-IR concentrations after consumption of the HMUFA and LFHCC n-3 diets (P < 0.05). In contrast, subjects with lower HOMA-IR showed reduced body mass index and waist circumference after consumption of the LFHCC control and LFHCC n-3 diets and increased HDL cholesterol concentrations after consumption of the HMUFA and HSFA diets (P < 0.05). MetS subjects with a low to medium HOMA-IR exhibited reduced blood pressure, triglyceride, and LDL cholesterol levels after the LFHCC n-3 diet and increased apolipoprotein A-I concentrations after consumption of the HMUFA and HSFA diets (all P < 0.05).
CONCLUSIONS: Insulin-resistant MetS subjects with more metabolic complications responded differently to dietary fat modification, being more susceptible to a health effect from the substitution of SFAs in the HMUFA and LFHCC n-3 diets. Conversely, MetS subjects without IR may be more sensitive to the detrimental effects of HSFA intake. The metabolic phenotype of subjects clearly determines response to the quantity and quality of dietary fat on MetS risk factors, which suggests that targeted and personalized dietary therapies may be of value for its different metabolic features.
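The HOMA-IR index used to stratify subjects is the standard Matthews et al. formula: fasting insulin (µU/mL) times fasting glucose (mmol/L), divided by 22.5. The subject values below are illustrative only:

```python
def homa_ir(fasting_insulin_uU_ml, fasting_glucose_mmol_l):
    """Standard HOMA-IR formula (Matthews et al.):
    insulin (uU/mL) x glucose (mmol/L) / 22.5."""
    return fasting_insulin_uU_ml * fasting_glucose_mmol_l / 22.5

# Hypothetical subject: fasting insulin 12 uU/mL, fasting glucose 5.5 mmol/L
print(round(homa_ir(12.0, 5.5), 2))
```

Ranking subjects by this index is what defines the "highest HOMA-IR" and "low to medium HOMA-IR" strata whose differential diet responses the abstract reports.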