19 results for 2-sigma error


Relevance: 30.00%

Abstract:

Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the climate policy process, and project future climate change. Present-day analysis requires the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. Here we describe datasets and a methodology developed by the global carbon cycle science community to quantify all major components of the global carbon budget, including their uncertainties. We discuss changes compared to previous estimates, consistency within and among components, and methodology and data limitations. CO2 emissions from fossil fuel combustion and cement production (EFF) are based on energy statistics, while emissions from Land-Use Change (ELUC), including deforestation, are based on combined evidence from land cover change data, fire activity in regions undergoing deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. Finally, the global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms. For the last decade available (2002–2011), EFF was 8.3 ± 0.4 PgC yr−1, ELUC 1.0 ± 0.5 PgC yr−1, GATM 4.3 ± 0.1 PgC yr−1, SOCEAN 2.5 ± 0.5 PgC yr−1, and SLAND 2.6 ± 0.8 PgC yr−1. For year 2011 alone, EFF was 9.5 ± 0.5 PgC yr−1, 3.0% above 2010, reflecting a continued trend in these emissions; ELUC was 0.9 ± 0.5 PgC yr−1, approximately constant throughout the decade; GATM was 3.6 ± 0.2 PgC yr−1, SOCEAN was 2.7 ± 0.5 PgC yr−1, and SLAND was 4.1 ± 0.9 PgC yr−1. GATM was low in 2011 compared to the 2002–2011 average because of high uptake by the land, probably in response to natural climate variability associated with La Niña conditions in the Pacific Ocean. The global atmospheric CO2 concentration reached 391.31 ± 0.13 ppm at the end of year 2011. We estimate that EFF will have increased by 2.6% (1.9–3.5%) in 2012 based on projections of gross world product and recent changes in the carbon intensity of the economy. All uncertainties are reported as ±1 sigma (68% confidence, assuming Gaussian error distributions, that the real value lies within the given interval), reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. This paper is intended to provide a baseline to keep track of annual carbon budgets in the future.
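
As a quick consistency check of the budget identity described above (the land sink estimated as the residual of the other terms), the following sketch recomputes the 2011 SLAND value from the quoted numbers; combining the 1-sigma uncertainties in quadrature is an assumption on our part and is not stated in the abstract.

# Sketch: reproduce the residual land sink (SLAND) from the other budget terms.
# Assumes independent Gaussian errors combined in quadrature (not stated in the abstract).
import math

def residual_land_sink(eff, eluc, gatm, socean):
    """Each argument is a (value, 1-sigma) pair in PgC yr^-1."""
    value = eff[0] + eluc[0] - gatm[0] - socean[0]
    sigma = math.sqrt(eff[1]**2 + eluc[1]**2 + gatm[1]**2 + socean[1]**2)
    return value, sigma

# Year 2011 values quoted in the abstract
sland, err = residual_land_sink((9.5, 0.5), (0.9, 0.5), (3.6, 0.2), (2.7, 0.5))
print(f"SLAND = {sland:.1f} +/- {err:.1f} PgC/yr")  # -> 4.1 +/- 0.9, as reported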

Relevance: 30.00%

Abstract:

We derive the fermion loop formulation for the supersymmetric nonlinear O(N) sigma model by performing a hopping expansion using Wilson fermions. In this formulation the fermionic contribution to the partition function becomes a sum over all possible closed non-oriented fermion loop configurations. The interaction between the bosonic and fermionic degrees of freedom is encoded in the constraints arising from the supersymmetry and induces flavour-changing fermion loops. For N ≥ 3 this leads to fermion loops which are no longer self-avoiding and hence to a potential sign problem. Since we use Wilson fermions, the bare mass needs to be tuned to the chiral point. For N = 2 we determine the critical point and present boson and fermion masses in the critical regime.

Relevance: 30.00%

Abstract:

Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the climate policy process, and project future climate change. Present-day analysis requires the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. Here we describe datasets and a methodology developed by the global carbon cycle science community to quantify all major components of the global carbon budget, including their uncertainties. We discuss changes compared to previous estimates, consistency within and among components, and methodology and data limitations. Based on energy statistics, we estimate that the global emissions of CO2 from fossil fuel combustion and cement production were 9.5 ± 0.5 PgC yr−1 in 2011, 3.0% above 2010 levels. We project these emissions will increase by 2.6% (1.9–3.5%) in 2012 based on projections of Gross World Product and recent changes in the carbon intensity of the economy. Global net CO2 emissions from Land-Use Change, including deforestation, are more difficult to update annually because of data availability, but combined evidence from land cover change data, fire activity in regions undergoing deforestation, and models suggests those net emissions were 0.9 ± 0.5 PgC yr−1 in 2011. The global atmospheric CO2 concentration is measured directly and reached 391.38 ± 0.13 ppm at the end of year 2011, increasing by 1.70 ± 0.09 ppm yr−1 or 3.6 ± 0.2 PgC yr−1 in 2011. Estimates from four ocean models suggest that the ocean CO2 sink was 2.6 ± 0.5 PgC yr−1 in 2011, implying a global residual terrestrial CO2 sink of 4.1 ± 0.9 PgC yr−1. All uncertainties are reported as ±1 sigma (68% confidence, assuming Gaussian error distributions, that the real value lies within the given interval), reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. This paper is intended to provide a baseline to keep track of annual carbon budgets in the future.
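
The atmospheric growth rate above is quoted both in ppm yr−1 and in PgC yr−1; the two are related by a constant conversion factor of roughly 2.12 PgC per ppm of CO2. That factor is a commonly used value assumed here for illustration and is not given in the abstract itself. A minimal sketch of the conversion:

# Sketch: convert the quoted atmospheric CO2 growth rate from ppm/yr to PgC/yr.
# The factor of ~2.12 PgC per ppm is an assumed, commonly used value (not from the abstract).
PGC_PER_PPM = 2.12

growth_ppm, sigma_ppm = 1.70, 0.09       # ppm/yr, from the abstract
growth_pgc = growth_ppm * PGC_PER_PPM    # ~3.6 PgC/yr
sigma_pgc = sigma_ppm * PGC_PER_PPM      # ~0.2 PgC/yr
print(f"GATM ~ {growth_pgc:.1f} +/- {sigma_pgc:.1f} PgC/yr")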

Relevance: 30.00%

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
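
A minimal sketch of the workflow described above: both solvers are run on a learning set, the two sets of response curves are reduced to principal-component scores, and a machine-learning model maps proxy scores to exact scores so that the exact response of any new realization can be predicted from its proxy response alone. Ordinary PCA on discretised curves stands in for FPCA, and the random-forest regressor and the synthetic data are illustrative choices; none of these specific tools are prescribed by the abstract.

# Sketch of the proxy-error-model workflow (illustrative choices throughout):
# PCA stands in for FPCA on discretised curves; a random forest maps proxy scores to exact scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: responses of n_learn + n_new realizations, discretised on n_t points.
n_learn, n_new, n_t = 200, 50, 100
t = np.linspace(0, 1, n_t)
params = rng.normal(size=(n_learn + n_new, 3))   # toy "geostatistical realizations"
exact = np.array([np.exp(-(t - 0.5 - 0.1 * p[0])**2 / (0.05 + 0.02 * p[1]**2)) for p in params])
proxy = exact * 0.9 + 0.05 * params[:, 2:3] + 0.02 * rng.normal(size=exact.shape)  # biased, noisy proxy

# Learning set: both solvers run; prediction set: proxy response only.
proxy_learn, exact_learn = proxy[:n_learn], exact[:n_learn]
proxy_new, exact_new = proxy[n_learn:], exact[n_learn:]

# Dimensionality reduction of both sets of curves (FPCA analogue).
pca_proxy = PCA(n_components=5).fit(proxy_learn)
pca_exact = PCA(n_components=5).fit(exact_learn)

# Error model: learn the map from proxy scores to exact scores.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(pca_proxy.transform(proxy_learn), pca_exact.transform(exact_learn))

# Predict the exact response of new realizations from the proxy response alone.
exact_pred = pca_exact.inverse_transform(model.predict(pca_proxy.transform(proxy_new)))
rmse = np.sqrt(np.mean((exact_pred - exact_new) ** 2))
print(f"RMSE of predicted exact curves: {rmse:.3f}")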