82 results for Capture probability


Relevance: 20.00%

Abstract:

Black carbon aerosol plays a unique and important role in Earth's climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed-phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻².
Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon. In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility of near-term mitigation.
This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.

Relevance: 20.00%

Abstract:

Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as in the case of the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, recourse cannot be made to the method of construction in the evaluation of the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
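The probability integral transform (PIT) step referred to above can be sketched in a few lines; the normal forecast densities and simulated outcomes below are illustrative stand-ins, not SPF data.

```python
# Sketch of the probability integral transform (PIT) check: if the forecast
# densities are correct, z_t = F_t(y_t) should be uniform on (0, 1).
import math
import random

def norm_cdf(x, mu, sigma):
    """CDF of a normal forecast density."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

random.seed(0)
# Simulate outcomes that truly follow each forecast density N(mu_t, 1).
mus = [random.uniform(-1, 1) for _ in range(2000)]
outcomes = [random.gauss(mu, 1.0) for mu in mus]

# PIT values for each (outcome, forecast) pair.
pits = [norm_cdf(y, mu, 1.0) for y, mu in zip(outcomes, mus)]

# Crude uniformity check: compare decile frequencies with the expected 10%.
bins = [0] * 10
for z in pits:
    bins[min(int(z * 10), 9)] += 1
freqs = [b / len(pits) for b in bins]
max_dev = max(abs(f - 0.1) for f in freqs)
print(round(max_dev, 3))
```

In practice one would complement this eyeball check with a formal uniformity test on the PIT values.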

Relevance: 20.00%

Abstract:

We consider different methods for combining probability forecasts. In empirical exercises, the data generating process of the forecasts and the event being forecast is not known, and therefore the optimal form of combination will also be unknown. We consider the properties of various combination schemes for a number of plausible data generating processes, and indicate which types of combinations are likely to be useful. We also show that whether forecast encompassing is found to hold between two rival sets of forecasts or not may depend on the type of combination adopted. The relative performances of the different combination methods are illustrated, with an application to predicting recession probabilities using leading indicators.
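Two widely used combination schemes, the linear and logarithmic opinion pools, can be sketched as follows; the forecast values and equal weights are hypothetical.

```python
# Two common schemes for combining probability forecasts of a binary event.
def linear_pool(probs, weights):
    """Linear opinion pool: weighted average of forecast probabilities."""
    return sum(w * p for w, p in zip(weights, probs))

def log_pool(probs, weights):
    """Logarithmic pool: weighted geometric average, renormalised."""
    num, den = 1.0, 1.0
    for w, p in zip(weights, probs):
        num *= p ** w
        den *= (1.0 - p) ** w
    return num / (num + den)

rival = [0.8, 0.4]       # two rival recession-probability forecasts
weights = [0.5, 0.5]     # equal weights, for illustration only
print(round(linear_pool(rival, weights), 3))  # 0.6
print(round(log_pool(rival, weights), 3))
```

Note that the two pools generally disagree, which is one way the choice of combination scheme can affect conclusions such as forecast encompassing.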

Relevance: 20.00%

Abstract:

We consider whether survey respondents' probability distributions, reported as histograms, provide reliable and coherent point predictions when viewed through the lens of a Bayesian learning model. We argue that a role remains for eliciting directly reported point predictions in surveys of professional forecasters.

Relevance: 20.00%

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large to make it usable. However, particle filters can be formulated using proposal densities, which give greater freedom in how particles are sampled and allow for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This opens up the possibility of non-linear data assimilation in high-dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available every time step, both of those schemes become degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
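The basic SIR weighting-and-resampling step the text refers to can be sketched for a scalar state; the Gaussian prior ensemble, observation and noise levels are illustrative, not the Lorenz (1963) setup.

```python
# One sequential importance resampling (SIR) analysis step for a scalar state.
import math
import random

random.seed(1)
n = 1000
particles = [random.gauss(0.0, 2.0) for _ in range(n)]   # prior ensemble
obs, obs_std = 1.5, 0.5                                  # observation y

# Importance weights: likelihood of the observation given each particle.
w = [math.exp(-0.5 * ((obs - x) / obs_std) ** 2) for x in particles]
total = sum(w)
w = [wi / total for wi in w]

# Stratified resampling: duplicate high-weight particles, drop low-weight.
positions = [(i + random.random()) / n for i in range(n)]
cum, j, resampled = w[0], 0, []
for u in positions:
    while u > cum and j < n - 1:
        j += 1
        cum += w[j]
    resampled.append(particles[j])

post_mean = sum(resampled) / n
print(round(post_mean, 2))
```

With many independent observations per step the weights of a plain SIR filter collapse onto a handful of particles, which is the degeneracy problem that proposal-density formulations are designed to avoid.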

Relevance: 20.00%

Abstract:

During the winter of 2013/14, much of the UK experienced repeated intense rainfall events and flooding. This had a considerable impact on property and transport infrastructure. A key question is whether the burning of fossil fuels is changing the frequency of extremes, and if so to what extent. We assess the scale of the winter flooding before reviewing a broad range of Earth system drivers affecting UK rainfall. Some drivers can be potentially disregarded for these specific storms whereas others are likely to have increased their risk of occurrence. We discuss the requirements of hydrological models to transform rainfall into river flows and flooding. To determine any general changing flood risk, we argue that accurate modelling needs to capture evolving understanding of UK rainfall interactions with a broad set of factors. This includes changes to multiscale atmospheric, oceanic, solar and sea-ice features, and land-use and demographics. Ensembles of such model simulations may be needed to build probability distributions of extremes for both pre-industrial and contemporary concentration levels of atmospheric greenhouse gases.

Relevance: 20.00%

Abstract:

Windstorms are a main feature of the European climate and exert strong socioeconomic impacts. Large effort has been made in developing and enhancing models to simulate the intensification of windstorms, the resulting footprints, and the associated impacts. Simulated wind or gust speeds usually differ from observations, as regional climate models have biases and cannot capture all local effects. An approach is introduced to adjust regional climate model (RCM) simulations of wind and wind gust toward observations. For this purpose, 100 windstorms are selected, and observations from 173 (111) test sites of the German Weather Service are considered for wind (gust) speed. Theoretical Weibull distributions are fitted to observed and simulated wind and gust speeds, and the distribution parameters of the observations are interpolated onto the RCM computational grid. A probability mapping approach is applied to relate the distributions and to correct the modeled footprints. Results are achieved not only for single test sites but for an area-wide regular grid. The approach is validated using root-mean-square errors on an event and site basis, documenting that the method is generally able to adjust the RCM output toward observations. For gust speeds, an improvement is reached on 88 of 100 events and at about 64% of the test sites. For wind, 99 of 100 improved events and roughly 84% improved sites are obtained. This gives confidence in the potential of the introduced approach for many applications, in particular those considering wind data.
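The probability mapping step can be sketched as a quantile mapping through two Weibull CDFs; the shape and scale parameters below are hypothetical, not the fitted values from the study.

```python
# Probability (quantile) mapping of a simulated gust speed: pass it through
# the model Weibull CDF, then through the inverse of the observed Weibull CDF.
import math

def weibull_cdf(x, shape, scale):
    return 1.0 - math.exp(-((x / scale) ** shape))

def weibull_inv(p, shape, scale):
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Hypothetical fitted parameters (shape k, scale lam) at one grid point.
k_mod, lam_mod = 1.8, 9.0    # RCM-simulated gusts
k_obs, lam_obs = 2.1, 11.0   # station observations, interpolated to the grid

def correct(x_mod):
    """Map a simulated gust speed onto the observed distribution."""
    p = weibull_cdf(x_mod, k_mod, lam_mod)
    return weibull_inv(p, k_obs, lam_obs)

print(round(correct(12.0), 2))
```

By construction the mapping preserves the non-exceedance probability: a gust at the model's 80th percentile is replaced by the observed 80th percentile.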

Relevance: 20.00%

Abstract:

We discuss the characteristics of magnetosheath plasma precipitation in the “cusp” ionosphere for the case in which reconnection at the dayside magnetopause takes place only in a series of pulses. It is shown that even in this special case, the low-altitude cusp precipitation is continuous, unless the intervals between the pulses are longer than the observed intervals between magnetopause flux transfer event (FTE) signatures. We use FTE observation statistics to predict, for this case of entirely pulsed reconnection, the occurrence frequency, the distribution of latitudinal widths, and the number of ion dispersion steps of the cusp precipitation for a variety of locations of the reconnection site and a range of values of the local de Hoffmann–Teller velocity. It is found that the cusp occurrence frequency is comparable with observed values for virtually all possible locations of the reconnection site. The distribution of cusp width is also comparable with observations and is shown to depend largely on the distribution of the mean reconnection rate, although pulsing the reconnection does very slightly increase the width of that distribution compared with the steady-state case. We conclude that neither cusp occurrence probability nor width can be used to evaluate the relative occurrence of reconnection behaviors that are entirely pulsed, pulsed but continuous, and quasi-steady. We show that the best test of the relative frequency of these three types of reconnection is to survey the distribution of steps in the cusp ion dispersion characteristics.

Relevance: 20.00%

Abstract:

There is an ongoing debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; second, an analysis is conducted to examine which crop type is less harmful to the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite-indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops in three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. The comparison between the three composite indicators for both crop types allows us to establish not only a ranking showing which crop type is less harmful to the environment but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
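The normalisation-and-aggregation step behind a composite indicator can be sketched as follows; the indicator values and equal weights are invented for illustration.

```python
# Min-max normalisation followed by weighted aggregation into a composite
# indicator; lower scores mean lower environmental impact.
def minmax(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical impact scores, one row per indicator, columns: [GM, conventional].
indicators = {
    "species_richness_loss": [0.2, 0.5],
    "pesticide_use":         [1.1, 2.4],
    "aggregate_impact":      [3.0, 4.5],
}
weights = {"species_richness_loss": 1 / 3, "pesticide_use": 1 / 3,
           "aggregate_impact": 1 / 3}

# Normalise each indicator across crop types, then aggregate per crop type.
norm = {k: minmax(v) for k, v in indicators.items()}
composite = [sum(weights[k] * norm[k][i] for k in indicators) for i in (0, 1)]
print([round(c, 3) for c in composite])
```

With real data the rows would be study-level measurements, and resampling over studies would give the outperformance probability the paper describes.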

Relevance: 20.00%

Abstract:

While state-of-the-art models of Earth's climate system have improved tremendously over the last 20 years, nontrivial structural flaws still hinder their ability to forecast the decadal dynamics of the Earth system realistically. Contrasting the skill of these models not only with each other but also with empirical models can reveal the space and time scales on which simulation models exploit their physical basis effectively and quantify their ability to add information to operational forecasts. The skill of decadal probabilistic hindcasts for annual global-mean and regional-mean temperatures from the EU Ensemble-Based Predictions of Climate Changes and Their Impacts (ENSEMBLES) project is contrasted with several empirical models. Both the ENSEMBLES models and a “dynamic climatology” empirical model show probabilistic skill above that of a static climatology for global-mean temperature. The dynamic climatology model, however, often outperforms the ENSEMBLES models. The fact that empirical models display skill similar to that of today's state-of-the-art simulation models suggests that empirical forecasts can improve decadal forecasts for climate services, just as in weather, medium-range, and seasonal forecasting. It is suggested that the direct comparison of simulation models with empirical models becomes a regular component of large model forecast evaluations. Doing so would clarify the extent to which state-of-the-art simulation models provide information beyond that available from simpler empirical models and clarify current limitations in using simulation forecasting for decision support. Ultimately, the skill of simulation models based on physical principles is expected to surpass that of empirical models in a changing climate; their direct comparison provides information on progress toward that goal, which is not available in model–model intercomparisons.
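The contrast between a static and a "dynamic climatology" benchmark can be sketched on synthetic data; the warming series and ten-year window are illustrative assumptions, not the ENSEMBLES setup.

```python
# Compare a static climatology (long-term mean) with a dynamic climatology
# (mean of the most recent years) on a synthetic warming temperature series.
import random

random.seed(3)
# Synthetic annual global-mean temperature anomalies with a warming trend.
temps = [0.01 * t + random.gauss(0.0, 0.1) for t in range(100)]

train, test = temps[:70], temps[70:]
static = sum(train) / len(train)                 # long-term mean

sq_static, sq_dynamic = 0.0, 0.0
for i, obs in enumerate(test):
    window = temps[70 + i - 10 : 70 + i]         # mean of the last 10 years
    dynamic = sum(window) / 10.0
    sq_static += (obs - static) ** 2
    sq_dynamic += (obs - dynamic) ** 2

print(sq_dynamic < sq_static)  # dynamic climatology tracks the trend
```

This is the sense in which an empirical benchmark can be hard to beat: a simulation model only adds value if it outperforms such simple reference forecasts.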

Relevance: 20.00%

Abstract:

A new class of parameter estimation algorithms is introduced for Gaussian process regression (GPR) models. It is shown that integrating the GPR model with either of two probability distance measures, (i) the integrated square error and (ii) the Kullback–Leibler (K–L) divergence, is analytically tractable. An efficient coordinate descent algorithm is proposed that iteratively estimates the kernel width using a golden section search and includes a fast gradient descent algorithm as an inner loop to estimate the noise variance. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
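The golden section search used for the kernel width can be sketched with a generic one-dimensional minimiser; the objective below is a stand-in with a known minimiser, not the paper's probability-distance criterion.

```python
# Generic golden section search for a unimodal 1-D objective, of the kind
# used to tune a kernel width inside a coordinate descent loop.
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimise a unimodal function f on [a, b]."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0      # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2.0

# Stand-in objective with a known minimiser at width = 2.0.
objective = lambda w: (math.log(w) - math.log(2.0)) ** 2 + 0.1
width = golden_section(objective, 0.1, 10.0)
print(round(width, 3))  # ~2.0
```

In the coordinate descent scheme described above, each such line search would alternate with an inner gradient descent step for the noise variance.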

Relevance: 20.00%

Abstract:

This paper examines the auction process for residential properties that, whilst unsuccessful at auction, sold subsequently. The empirical analysis considers both the probability of sale and the premium of the subsequent sale price over the guide price, the reserve and the opening bid. The findings highlight that the final achieved sale price is influenced by key price variables revealed both prior to and during the auction itself. Factors such as auction participation, the number of individual bidders and the number of bids are significant in a number of the alternative specifications.

Relevance: 20.00%

Abstract:

Two experiments were undertaken to examine whether there is an age-related change in the speed with which readers can capture visual information during fixations in reading. Children’s and adults’ eye movements were recorded as they read sentences that were presented either normally or as “disappearing text”. The disappearing text manipulation had a surprisingly small effect on the children, inconsistent with the notion of an age-related change in the speed with which readers can capture visual information from the page. Instead, we suggest that differences between adults and children are related to the level of difficulty of the sentences for readers of different ages.

Relevance: 20.00%

Abstract:

This article provides new insights into the dependence of firm growth on age along the entire distribution of growth rates, conditional on survival. Using data from the European Firms in a Global Economy survey and adopting a quantile regression approach, we uncover evidence for a sample of French, Italian and Spanish manufacturing firms with more than ten employees over the period 2001 to 2008. We find that: (1) young firms grow faster than old firms, especially in the highest growth quantiles; (2) young firms face the same probability of declining as their older counterparts; (3) the results are robust to the inclusion of other firm characteristics such as labor productivity, capital intensity and financial structure; (4) high growth is associated with younger chief executive officers and other attributes that capture the firm's attitude toward growth and change. The effect of age on firm growth is rather similar across countries.
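The quantile-regression idea rests on the "check" (pinball) loss; the sketch below shows that minimising it over a constant recovers a sample quantile. The growth-rate data are simulated, not from the survey.

```python
# The pinball (check) loss of quantile regression: minimising it over a
# constant recovers the sample quantile at level tau.
import random

def pinball(residual, tau):
    return tau * residual if residual >= 0 else (tau - 1.0) * residual

random.seed(2)
growth = [random.gauss(0.05, 0.2) for _ in range(501)]   # simulated growth rates

tau = 0.9                          # a high growth quantile
candidates = sorted(growth)
losses = [sum(pinball(g - c, tau) for g in growth) for c in candidates]
best = candidates[losses.index(min(losses))]

# The minimiser coincides with the empirical 90th-percentile observation.
empirical = sorted(growth)[int(tau * (len(growth) - 1))]
print(round(best, 3), round(empirical, 3))
```

Full quantile regression replaces the constant with a linear function of covariates (such as firm age) and minimises the same loss.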

Relevance: 20.00%

Abstract:

We report between-subject results on the effect of monetary stakes on risk attitudes. While we find the typical risk seeking for small probabilities, risk seeking is reduced under high stakes. This suggests that utility is not consistently concave.