825 results for confidence in sentencing


Relevance:

80.00%

Publisher:

Abstract:

Sea-level rise is an important aspect of climate change because of its impact on society and ecosystems. Here we present an intercomparison of results from ten coupled atmosphere-ocean general circulation models (AOGCMs) for sea-level changes simulated for the twentieth century and projected to occur during the twenty-first century in experiments following scenario IS92a for greenhouse gases and sulphate aerosols. The model results suggest that the rate of sea-level rise due to thermal expansion of sea water has increased during the twentieth century, but the small set of tide gauges with long records might not be adequate to detect this acceleration. The rate of sea-level rise due to thermal expansion continues to increase throughout the twenty-first century, and the projected total is consequently larger than in the twentieth century; for 1990-2090 it amounts to 0.20-0.37 m. This wide range results from systematic uncertainty in the modelling of climate change and of heat uptake by the ocean. The AOGCMs agree that sea-level rise is expected to be geographically non-uniform, with some regions experiencing as much as twice the global average and others practically zero, but they do not agree about the geographical pattern. The lack of agreement indicates that we cannot currently have confidence in projections of local sea-level changes, and reveals a need for detailed analysis and intercomparison in order to understand and reduce the disagreements.
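
The thermal-expansion contribution described above can be illustrated with a minimal steric-height calculation: integrating the product of a thermal expansion coefficient and the layer warming over an ocean column. The sketch below uses invented layer values purely for illustration; it is not output from any of the AOGCMs discussed.

```python
import numpy as np

# Illustrative thermosteric sea-level change: dh = sum over layers of alpha * dT * dz,
# where alpha is the thermal expansion coefficient of sea water and dT the warming
# of each layer. All numbers below are assumed for illustration only.
layer_thickness_m = np.array([100.0, 200.0, 700.0, 1000.0, 2000.0])    # layers summing to 4000 m
warming_K         = np.array([0.60, 0.35, 0.15, 0.05, 0.01])           # assumed warming per layer
alpha_per_K       = np.array([2.5e-4, 2.0e-4, 1.6e-4, 1.3e-4, 1.1e-4]) # expansion coefficient, decreasing with depth

# Steric height change of the column (m)
dh_m = np.sum(alpha_per_K * warming_K * layer_thickness_m)
print(f"Thermosteric rise for this illustrative column: {dh_m * 1000:.1f} mm")
```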

Relevance:

80.00%

Publisher:

Abstract:

What impact do international state-building missions have on the domestic politics of the states they seek to build, and how can we measure this impact with confidence? This article seeks to address these questions and to challenge existing approaches that often appear to assume that state-builders leave lasting legacies rather than demonstrating such influence with carefully chosen empirical evidence. Too often, domestic conditions that follow in the wake of international state-building are assumed to result from international intervention, usually because of insufficient attention to the causal processes that link international actions to domestic outcomes. The article calls for greater appreciation of the methodological challenges of establishing causal inferences about the legacies of state-building and identifies three qualitative methodological strategies (process tracing, counterfactual analysis, and the use of control cases) that can be used to improve confidence in causal claims about state-building legacies. The article concludes with a case study of international state-building in East Timor, highlighting several flaws in existing evaluations of the United Nations' role in East Timor and identifying the critical role that domestic actors play even in the context of authoritative international intervention.

Relevance:

80.00%

Publisher:

Abstract:

We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the North Atlantic and the USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly, by about 0.5 °C, over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the North Atlantic sub-polar gyre and cooler in the North Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50% chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
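
The pattern correlations quoted above (0.62 for initialized forecasts versus 0.31 for uninitialized projections) are area-weighted correlations between predicted and observed anomaly maps. A minimal sketch of such a calculation is given below; the grid and the random test fields are assumptions for illustration, not forecast data.

```python
import numpy as np

def pattern_correlation(forecast, observed, lat_deg):
    """Area-weighted (cos-latitude) pattern correlation between two anomaly maps
    of shape (nlat, nlon). Both fields are centred on their weighted means first."""
    w = np.cos(np.deg2rad(lat_deg))[:, None] * np.ones_like(forecast)
    fw = forecast - np.average(forecast, weights=w)
    ow = observed - np.average(observed, weights=w)
    cov = np.average(fw * ow, weights=w)
    return cov / np.sqrt(np.average(fw**2, weights=w) * np.average(ow**2, weights=w))

# Illustrative use with random fields on a 2.5-degree grid (assumed data, not real forecasts)
rng = np.random.default_rng(0)
lat = np.arange(-88.75, 90, 2.5)
obs = rng.standard_normal((lat.size, 144))
fcst = 0.6 * obs + 0.8 * rng.standard_normal(obs.shape)   # a field partly correlated with obs
print(f"pattern correlation: {pattern_correlation(fcst, obs, lat):.2f}")
```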

Relevance:

80.00%

Publisher:

Abstract:

In order to validate the reported precision of space‐based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single‐profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short‐term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE‐FTS). Tropical scatter in the ACE‐FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single‐profile measurements.
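
A minimal sketch of the comparison described above: the scatter of tropical single-profile retrievals at one altitude is estimated robustly (a median-absolute-deviation based standard deviation, so that outliers carry little weight) and compared with the reported random errors. The arrays are invented placeholders, not ACE-FTS data.

```python
import numpy as np

def robust_scatter(values):
    """Gaussian-equivalent standard deviation from the median absolute deviation,
    which limits the influence of outliers in the sample."""
    med = np.median(values)
    return 1.4826 * np.median(np.abs(values - med))

# Hypothetical single-profile retrievals at one altitude in the tropics, plus their
# reported random errors (RREs); values are illustrative only.
rng = np.random.default_rng(1)
retrievals = 5.0 + 0.25 * rng.standard_normal(300)   # e.g. volume mixing ratio in ppmv
retrievals[:5] += 3.0                                 # a few outliers
reported_rre = np.full(300, 0.20)                     # reported 1-sigma random error

scatter = robust_scatter(retrievals)
mean_rre = np.sqrt(np.mean(reported_rre**2))
print(f"robust scatter = {scatter:.3f}, mean RRE = {mean_rre:.3f}")
# Scatter close to the RRE supports the reported precision; scatter well above it
# points to natural variability or underestimated errors.
```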

Relevance:

80.00%

Publisher:

Abstract:

Evaluating CCMs with the presented framework will increase our confidence in predictions of stratospheric ozone change.

Relevance:

80.00%

Publisher:

Abstract:

We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). SDBM reproduces observed CO2 seasonal cycles, but its simulated net primary production (NPP) is too high compared with independent measurements. The two DGVMs show little difference for most benchmarks (including the inter-annual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.

Relevance:

80.00%

Publisher:

Abstract:

Sea surface temperature (SST) measurements are required by operational ocean and atmospheric forecasting systems to constrain modeled upper ocean circulation and thermal structure. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution SST Pilot Project (GHRSST-PP) was initiated to address these needs by coordinating the provision of accurate, high-resolution, SST products for the global domain. The pilot project is now complete, but activities continue within the Group for High Resolution SST (GHRSST). The pilot project focused on harmonizing diverse satellite and in situ data streams that were indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework implemented in an internationally distributed manner. Data with meaningful error estimates developed within GHRSST are provided by services within R/GTS. Currently, several terabytes of data are processed at international centers daily, creating more than 25 gigabytes of product. Ensemble SST analyses together with anomaly SST outputs are generated each day, providing confidence in SST analyses via diagnostic outputs. Diagnostic data sets are generated and Web interfaces are provided to monitor the quality of observation and analysis products. GHRSST research and development projects continue to tackle problems of instrument calibration, algorithm development, diurnal variability, skin temperature deviation, and validation/verification of GHRSST products. GHRSST also works closely with applications and users, providing a forum for discussion and feedback between SST users and producers on a regular basis. All data within the GHRSST R/GTS framework are freely available. This paper reviews the progress of GHRSST-PP, highlighting achievements that have been fundamental to the success of the pilot project.

Relevance:

80.00%

Publisher:

Abstract:

Future climate change projections are often derived from ensembles of simulations from multiple global circulation models using heuristic weighting schemes. This study provides a more rigorous justification for this by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
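
As a rough illustration of the kind of framework described above, the sketch below tests whether the climate change response is model dependent (a one-way ANOVA across models on the per-run responses) and, if it is not, forms the "one model, one vote" multimodel mean with a confidence interval. The ensemble values are invented and the code is not the authors' implementation.

```python
import numpy as np
from scipy import stats

# Hypothetical responses (future minus historical cyclone frequency, per run) for
# several models with different numbers of runs; values are illustrative only.
responses = {
    "model_A": [-0.8, -1.1, -0.9],
    "model_B": [-1.0, -0.7],
    "model_C": [-0.9, -1.2, -0.8, -1.0],
}

# Test for a model-dependent response: one-way ANOVA across models.
f_stat, p_value = stats.f_oneway(*responses.values())
print(f"model dependence: F = {f_stat:.2f}, p = {p_value:.2f}")

# If the response is not strongly model dependent, the "one model, one vote" mean
# (average of the per-model means) estimates the common response.
model_means = np.array([np.mean(r) for r in responses.values()])
mmm = model_means.mean()
se = model_means.std(ddof=1) / np.sqrt(model_means.size)
ci = stats.t.interval(0.95, df=model_means.size - 1, loc=mmm, scale=se)
print(f"multimodel mean response = {mmm:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```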

Relevance:

80.00%

Publisher:

Abstract:

We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). In general, the SDBM performs better than either of the DGVMs. It reproduces independent measurements of net primary production (NPP) but underestimates the amplitude of the observed CO2 seasonal cycle. The two DGVMs show little difference for most benchmarks (including the inter-annual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.
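
One way to read the benchmark scores mentioned above is as a normalised error metric for the model, judged against the score of a mean-only model and of a "random" model built by bootstrap resampling of the observations. The sketch below illustrates that comparison with invented data; the metric (a normalised mean error) and all values are assumptions for illustration, not the published benchmark code.

```python
import numpy as np

def nme(sim, obs):
    """Normalised mean error: mean absolute error divided by the mean absolute
    deviation of the observations (0 is perfect; 1 matches the mean-only model)."""
    return np.mean(np.abs(sim - obs)) / np.mean(np.abs(obs - obs.mean()))

rng = np.random.default_rng(2)
obs = rng.gamma(shape=2.0, scale=300.0, size=500)     # e.g. site NPP observations (invented)
sim = obs * 1.1 + rng.normal(0.0, 80.0, size=500)     # a model that is slightly biased high

score_model  = nme(sim, obs)
score_mean   = nme(np.full_like(obs, obs.mean()), obs)   # mean-only benchmark, 1 by construction
score_random = np.mean([nme(rng.choice(obs, size=obs.size, replace=True), obs)
                        for _ in range(200)])            # bootstrap-resampled "random" model

print(f"model NME = {score_model:.2f}, mean model = {score_mean:.2f}, random model = {score_random:.2f}")
```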

Relevance:

80.00%

Publisher:

Abstract:

The effects of auditory distraction in memory tasks have to date been examined with procedures that minimize participants' control over their own memory processes. Surprisingly little attention has been paid to the metacognitive control factors that might affect memory performance. In this study, we investigate the effects of auditory distraction on metacognitive control of memory, examining recognition performance within the metacognitive framework of Koriat and Goldsmith (1996) to determine whether strategic regulation of memory accuracy is impacted by auditory distraction. Results replicated previous findings in showing that auditory distraction impairs memory performance in tasks minimizing participants' metacognitive control (a forced-report test). However, the results also revealed that when metacognitive control is allowed (free-report tests), auditory distraction affects a range of metacognitive indices. In the present study, auditory distraction undermined the accuracy of metacognitive monitoring (resolution), reduced confidence in the responses provided and, correspondingly, increased participants' propensity to withhold responses in free-report recognition. Crucially, these changes in metacognitive processes were related to impairment in free-report recognition performance, as the use of the 'don't know' option under distraction led to a reduction in the number of correct responses volunteered in free-report tests. Overall, the present results show how auditory distraction exerts its influence on memory performance via both memory and metamemory processes.
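
Two of the quantities mentioned above, monitoring resolution and the propensity to volunteer responses, are commonly summarised as a Goodman-Kruskal gamma correlation between item-level confidence and accuracy, and as a report rate under free report. The sketch below computes both from invented trial data; it is not the study's analysis code.

```python
import numpy as np

def goodman_kruskal_gamma(confidence, accuracy):
    """Gamma correlation between confidence ratings and accuracy (0/1) over items:
    (concordant - discordant) / (concordant + discordant), tied pairs ignored."""
    conc = disc = 0
    n = len(confidence)
    for i in range(n):
        for j in range(i + 1, n):
            d = (confidence[i] - confidence[j]) * (accuracy[i] - accuracy[j])
            if d > 0:
                conc += 1
            elif d < 0:
                disc += 1
    return (conc - disc) / (conc + disc) if (conc + disc) else np.nan

# Invented recognition data: per-item confidence (0-100), correctness, and whether
# the participant volunteered the answer in free report.
confidence  = np.array([90, 80, 75, 60, 55, 40, 35, 20])
correct     = np.array([ 1,  1,  1,  1,  0,  0,  1,  0])
volunteered = confidence >= 50                      # a simple report criterion

print(f"resolution (gamma)   = {goodman_kruskal_gamma(confidence, correct):.2f}")
print(f"report rate          = {volunteered.mean():.2f}")
print(f"free-report accuracy = {correct[volunteered].mean():.2f}")
```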

Relevance:

80.00%

Publisher:

Abstract:

Decades of research attest that memory processes suffer under conditions of auditory distraction. What is less well understood is whether people are able to modify how their memory processes are deployed in order to compensate for the disruptive effects of distraction. The metacognitive approach to memory describes a variety of ways in which people can exert control over their cognitive processes to optimize performance. Here we describe our recent investigations into how these control processes change under conditions of auditory distraction. We looked specifically at control of encoding, in the form of decisions about how long to study a word when it is presented, and control of memory reporting, in the form of decisions about whether to volunteer or withhold retrieved details. Regarding control of encoding, we expected that people would compensate for the disruptive effects of distraction by extending study time under noise. Our results revealed, however, that when exposed to irrelevant speech, people curtail rather than extend study. Regarding control of memory reporting, we expected that people would compensate for the loss of access to memory records by volunteering responses held with lower confidence. Our results revealed, however, that people's reporting strategies do not differ when the memory task is performed in silence or under auditory distraction, although distraction seriously undermines people's confidence in their own responses. Together, our studies reveal novel avenues for investigating the psychological effects of auditory distraction within a metacognitive framework.

Relevance:

80.00%

Publisher:

Abstract:

To date there has been no systematic study of the relationship between individuals' opinions of different institutions and their perceptions of world affairs. This article tries to fill this gap by using a large cross-country data set covering nine EU members and seven Asian nations, together with instrumental variable bivariate probit regression analysis. Controlling for a host of factors, the article shows that individuals' confidence in multilateral institutions affects their perceptions of whether or not their country is being treated fairly in international affairs. This finding expands upon both theoretical work on multilateral institutions that has focused on state actors' rationale for engaging in multilateral cooperation and empirical work that has treated confidence in multilateral institutions as a dependent variable. The article also shows that individuals' confidence in different international organizations has undifferentiated effects on their perceptions of whether or not their country is being treated fairly in international affairs, though individuals more knowledgeable about international affairs exhibit slightly different attitudes. Finally, the article demonstrates significant differences in opinion across Europe and Asia.
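
The article's estimator is an instrumental variable bivariate probit; as a simplified stand-in, the sketch below shows a two-stage control-function (residual inclusion) approach to an endogenous regressor in a probit, using statsmodels. All variable names, the instrument, and the data are invented for illustration and do not reproduce the article's model or results.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative control-function approach for an endogenous regressor in a probit
# model (a simplified stand-in for an IV bivariate probit). All data are invented.
rng = np.random.default_rng(3)
n = 2000
instrument = rng.standard_normal(n)            # assumed instrument
controls   = rng.standard_normal(n)            # assumed control variable
u          = rng.standard_normal(n)            # unobserved factor creating endogeneity

confidence_in_ios = 0.8 * instrument + 0.5 * controls + 0.7 * u + rng.standard_normal(n)
fair_treatment    = (0.6 * confidence_in_ios + 0.4 * controls - 0.9 * u
                     + rng.standard_normal(n) > 0).astype(int)

# Stage 1: regress the endogenous variable on the instrument and controls, keep residuals.
X1 = sm.add_constant(np.column_stack([instrument, controls]))
stage1 = sm.OLS(confidence_in_ios, X1).fit()
residual = stage1.resid

# Stage 2: probit of the outcome on the endogenous variable, controls, and the
# first-stage residual; the residual term absorbs the endogenous variation.
X2 = sm.add_constant(np.column_stack([confidence_in_ios, controls, residual]))
stage2 = sm.Probit(fair_treatment, X2).fit(disp=False)
print(stage2.summary())
```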

Relevance:

80.00%

Publisher:

Abstract:

Following recent findings, the interaction between resolved (Rossby) wave drag and parameterized orographic gravity wave drag (OGWD) is investigated, in terms of their driving of the Brewer–Dobson circulation (BDC), in a comprehensive climate model. To this end, the parameter that effectively determines the strength of OGWD in present-day and doubled-CO2 simulations is varied. The authors focus on the Northern Hemisphere during winter, when the largest response of the BDC to climate change is predicted to occur. It is found that increases in OGWD are to a remarkable degree compensated by a reduction in midlatitude resolved wave drag, thereby reducing the impact of changes in OGWD on the BDC. This compensation is also found for the response to climate change: changes in the OGWD contribution to the BDC response are compensated by opposite changes in the resolved wave drag contribution, thereby reducing the impact of changes in OGWD on the BDC response to climate change. By contrast, compensation does not occur at northern high latitudes, where resolved wave driving and the associated downwelling increase with increasing OGWD, both for the present-day climate and for the response to climate change. These findings raise confidence in the credibility of climate model projections of the strengthened BDC.

Relevance:

80.00%

Publisher:

Abstract:

We evaluate the ability of process-based models to reproduce observed global mean sea-level change. When the models are forced by changes in natural and anthropogenic radiative forcing of the climate system and anthropogenic changes in land-water storage, the average of the modelled sea-level change for the periods 1900–2010, 1961–2010 and 1990–2010 is about 80%, 85% and 90% of the observed rise. The modelled rate of rise is over 1 mm yr⁻¹ prior to 1950, decreases to less than 0.5 mm yr⁻¹ in the 1960s, and increases to 3 mm yr⁻¹ by 2000. When observed regional climate changes are used to drive a glacier model and an allowance is included for an ongoing adjustment of the ice sheets, the modelled sea-level rise is about 2 mm yr⁻¹ prior to 1950, similar to the observations. The model results encompass the observed rise, and the model average is within 20% of the observations, and within about 10% when the observed ice-sheet contributions since 1993 are added, increasing confidence in future projections for the 21st century. The increased rate of rise since 1990 is not part of a natural cycle but a direct response to increased radiative forcing (both anthropogenic and natural), which will continue to grow with ongoing greenhouse gas emissions.

Relevance:

80.00%

Publisher:

Abstract:

Confidence in projections of global-mean sea level rise (GMSLR) depends on an ability to account for GMSLR during the twentieth century. There are contributions from ocean thermal expansion, mass loss from glaciers and ice sheets, groundwater extraction, and reservoir impoundment. Progress has been made toward solving the “enigma” of twentieth-century GMSLR, which is that the observed GMSLR has previously been found to exceed the sum of estimated contributions, especially for the earlier decades. The authors propose the following: thermal expansion simulated by climate models may previously have been underestimated because of their not including volcanic forcing in their control state; the rate of glacier mass loss was larger than previously estimated and was not smaller in the first half than in the second half of the century; the Greenland ice sheet could have made a positive contribution throughout the century; and groundwater depletion and reservoir impoundment, which are of opposite sign, may have been approximately equal in magnitude. It is possible to reconstruct the time series of GMSLR from the quantified contributions, apart from a constant residual term, which is small enough to be explained as a long-term contribution from the Antarctic ice sheet. The reconstructions account for the observation that the rate of GMSLR was not much larger during the last 50 years than during the twentieth century as a whole, despite the increasing anthropogenic forcing. Semiempirical methods for projecting GMSLR depend on the existence of a relationship between global climate change and the rate of GMSLR, but the implication of the authors' closure of the budget is that such a relationship is weak or absent during the twentieth century.
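
The budget closure described above amounts to summing the estimated contribution time series and asking how large a constant residual rate is needed to match the observed GMSLR. The sketch below shows that bookkeeping with invented rates; none of the numbers are the paper's estimates.

```python
import numpy as np

# Invented annual contribution rates to GMSLR (mm/yr) over a hypothetical century;
# the real series come from models and observations, these are placeholders.
years             = np.arange(1901, 2001)
thermal_expansion = np.linspace(0.3, 1.0, years.size)
glaciers          = np.full(years.size, 0.6)
greenland         = np.full(years.size, 0.15)
groundwater       = np.full(years.size, 0.3)      # depletion adds to sea level
reservoirs        = np.full(years.size, -0.3)     # impoundment removes water from the ocean

summed_rate   = thermal_expansion + glaciers + greenland + groundwater + reservoirs
observed_rate = summed_rate + 0.1                 # pretend observations sit slightly above the sum

# Constant residual rate needed to close the budget (interpreted in the paper as a
# possible long-term Antarctic contribution), and the reconstructed GMSLR series.
residual_rate  = np.mean(observed_rate - summed_rate)
reconstruction = np.cumsum(summed_rate + residual_rate)   # cumulative rise (mm)
print(f"residual rate = {residual_rate:.2f} mm/yr, total reconstructed rise = {reconstruction[-1]:.0f} mm")
```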