173 results for Asymptotic Mean Squared Errors


Abstract:

Simulations of the last 500 yr carried out using the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3) with anthropogenic and natural (solar and volcanic) forcings have been analyzed. Global-mean surface temperature change during the twentieth century is well reproduced. Simulated contributions to global-mean sea level rise during recent decades due to thermal expansion (the largest term) and to mass loss from glaciers and ice caps agree within uncertainties with observational estimates of these terms, but their sum falls short of the observed rate of sea level rise. This discrepancy has been discussed by previous authors; a completely satisfactory explanation of twentieth-century sea level rise is lacking. The model suggests that the apparent onset of sea level rise and glacier retreat during the first part of the nineteenth century was due to natural forcing. The rate of sea level rise was larger during the twentieth century than during the previous centuries because of anthropogenic forcing, but decreasing natural forcing during the second half of the twentieth century tended to offset the anthropogenic acceleration in the rate. Volcanic eruptions cause rapid falls in sea level, followed by recovery over several decades. The model shows substantially less decadal variability in sea level and its thermal expansion component than twentieth-century observations indicate, either because it does not generate sufficient ocean internal variability, or because the observational analyses overestimate the variability.
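The budget comparison described above (thermal expansion plus glacier and ice-cap mass loss versus the observed rate of sea level rise) is a simple sum-and-residual check; a minimal sketch follows. The function and all numerical rates are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of a sea level budget check of the kind discussed above:
# sum the simulated contributions and compare with the observed rate.
# All numbers are placeholders for illustration, not values from the paper.

def budget_residual(thermal_expansion_rate, glacier_rate, observed_rate):
    """Rates in consistent units (e.g. mm/yr); returns (simulated sum, residual)."""
    simulated = thermal_expansion_rate + glacier_rate
    return simulated, observed_rate - simulated

# Hypothetical example values (mm/yr):
simulated, residual = budget_residual(thermal_expansion_rate=0.8,
                                      glacier_rate=0.4,
                                      observed_rate=1.8)
print(f"simulated sum = {simulated:.1f} mm/yr, unexplained residual = {residual:.1f} mm/yr")
```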

Abstract:

This paper investigates the impact of aerosol forcing uncertainty on the robustness of estimates of the twentieth-century warming attributable to anthropogenic greenhouse gas emissions. Attribution analyses are carried out on three coupled climate models with very different sensitivities and aerosol forcings. The Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), the Parallel Climate Model (PCM), and the GFDL R30 model all provide good simulations of twentieth-century global mean temperature changes when they include both anthropogenic and natural forcings. Such good agreement could result from a fortuitous cancellation of errors, for example, by balancing too much (or too little) greenhouse warming with too much (or too little) aerosol cooling. Despite the very large uncertainty in estimates of sulfate aerosol forcing obtained from measurement campaigns, results show that the spatial and temporal nature of observed twentieth-century temperature change constrains the component of past warming attributable to anthropogenic greenhouse gases to be significantly greater (at the 5% level) than the observed warming over the twentieth century. The cooling effects of aerosols are detected in all three models. Both spatial and temporal aspects of observed temperature change are responsible for constraining the relative roles of greenhouse warming and sulfate cooling over the twentieth century, because there are distinctive temporal structures in differential warming rates between the hemispheres, between land and ocean, and between mid- and low latitudes. As a result, consistent estimates of warming attributable to greenhouse gas emissions are obtained from all three models, and predictions are relatively robust to the use of more or less sensitive models. The transient climate response following a 1% yr⁻¹ increase in CO2 is estimated to lie between 2.2 and 4 K century⁻¹ (5th-95th percentiles).
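The attribution analyses summarized above rest on regressing the observed space-time pattern of temperature change onto model-simulated response patterns ("fingerprints") to estimate a scaling factor for each forcing. The sketch below is a minimal ordinary-least-squares illustration of that idea, not the optimal-fingerprinting method of the paper (which also accounts for internal-variability covariance); all arrays and names are illustrative stand-ins.

```python
import numpy as np

# Minimal OLS sketch of fingerprint scaling: y ≈ X @ beta, where the columns of X
# are model-simulated response patterns (e.g. greenhouse gas, aerosol, natural)
# and y is the observed temperature-change pattern. Real attribution studies use
# optimal fingerprinting, pre-whitening with internal variability from control runs.

rng = np.random.default_rng(0)
n = 200                                 # number of space-time elements (illustrative)
X = rng.standard_normal((n, 3))         # stand-in fingerprints: [GHG, aerosol, natural]
true_beta = np.array([1.0, 0.9, 1.1])   # hypothetical scaling factors
y = X @ true_beta + 0.3 * rng.standard_normal(n)   # stand-in "observations" plus noise

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated scaling factors [GHG, aerosol, natural]:", beta_hat.round(2))

# The warming attributable to forcing i is beta_hat[i] times that model's simulated
# warming for forcing i; a scaling factor significantly above zero indicates detection.
```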

Abstract:

The performance of boreal winter forecasts made with the European Centre for Medium-Range Weather Forecasts (ECMWF) System II Seasonal Forecasting System is investigated through analyses of ensemble hindcasts for the period 1987-2001. The predictability, or signal-to-noise ratio, associated with the forecasts, and the forecast skill, are examined. On average, forecasts of 500 hPa geopotential height (GPH) have skill in most of the Tropics and in a few regions of the extratropics. There is broad, but not perfect, agreement between regions of high predictability and regions of high skill. However, model errors are also identified, in particular regions where the forecast ensemble spread appears too small. For individual winters, the information provided by t-values, a simple measure of the forecast signal-to-noise ratio, is investigated. For 2 m surface air temperature (T2m), the highest t-values are found in the Tropics, but there is considerable interannual variability, and in the tropical Atlantic and Indian basins this variability is not directly tied to the El Niño-Southern Oscillation. For GPH there is also large interannual variability in t-values, but these variations cannot easily be predicted from the strength of the tropical sea-surface-temperature anomalies. It is argued that the t-values for 500 hPa GPH can give valuable insight into the oceanic forcing of the atmosphere that generates predictable signals in the model. Consequently, t-values may be a useful tool for understanding, at a mechanistic level, forecast successes and failures. Lastly, the extent to which t-values are useful as a predictor of forecast skill is investigated. For T2m, t-values provide a useful predictor of forecast skill in both the Tropics and extratropics. Except in the equatorial east Pacific, most of the information in t-values is associated with interannual variability of the ensemble-mean forecast rather than interannual variability of the ensemble spread. For GPH, however, t-values provide a useful predictor of forecast skill only in the tropical Pacific region.
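A forecast t-value of the kind used above can be read as the ensemble-mean anomaly relative to climatology divided by the standard error of that mean. The sketch below uses one plausible definition (the paper's exact convention may differ), with random numbers standing in for hindcast members.

```python
import numpy as np

# Signal-to-noise t-value for an ensemble forecast: ensemble-mean anomaly divided
# by the standard error of the ensemble mean. One plausible definition only; the
# exact convention in the paper may differ.

def forecast_t_value(ensemble, climatology):
    """ensemble: array of shape (n_members, ...); climatology: broadcastable mean state."""
    anomalies = ensemble - climatology
    n_members = ensemble.shape[0]
    signal = anomalies.mean(axis=0)                              # ensemble-mean anomaly
    noise = anomalies.std(axis=0, ddof=1) / np.sqrt(n_members)   # standard error of the mean
    return signal / noise

# Illustrative use, with random numbers standing in for, say, T2m hindcast members:
rng = np.random.default_rng(1)
members = 0.5 + rng.standard_normal((40, 10, 10))   # 40 members on a 10 x 10 grid
t_values = forecast_t_value(members, climatology=0.0)
print(t_values.round(1))
```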

Abstract:

Using the Met Office large-eddy model (LEM), we simulate a mixed-phase altocumulus cloud that was observed from Chilbolton in southern England by a 94 GHz Doppler radar, a 905 nm lidar, a dual-wavelength microwave radiometer and four radiosondes. It is important to test and evaluate such simulations with observations, since there are significant differences between results from different cloud-resolving models for ice clouds. Simulating the Doppler radar and lidar data within the LEM allows us to compare observed and modelled quantities directly, and to explore the relationships between observed and unobserved variables. For general-circulation models, which currently tend to give poor representations of mixed-phase clouds, the case shows the importance of using: (i) separate prognostic ice and liquid water, (ii) a vertical resolution that captures the thin layers of liquid water, and (iii) an accurate representation of the subgrid vertical velocities that allow liquid water to form. It is shown that large-scale ascents and descents are significant for this case, and so the horizontally averaged LEM profiles are relaxed towards observed profiles to account for these. The LEM simulation then gives a reasonable cloud, with an ice-water path (IWP) approximately two-thirds of that observed, and with liquid water at the cloud top, as observed. However, the liquid-water cells that form in the updraughts at cloud top in the LEM have liquid-water paths (LWPs) up to half those observed, and there are too few cells, giving a mean LWP five to ten times smaller than observed. In reality, ice nucleation and fallout may deplete ice-nuclei concentrations at the cloud top, allowing more liquid water to form there, but this process is not represented in the model. Decreasing the heterogeneous nucleation rate in the LEM increased the LWP, which supports this hypothesis. The LEM captures the increase in the standard deviation of Doppler velocities (and so of vertical winds) with height, but values are 1.5 to 4 times smaller than observed (although values are larger in an unforced model run, this only increases the modelled LWP by a factor of approximately two). The LEM data show that, for values larger than approximately 12 cm s⁻¹, the standard deviation of Doppler velocities provides an almost unbiased estimate of the standard deviation of vertical winds, but provides an overestimate for smaller values. Time-smoothing the observed Doppler velocities and modelled mass-squared-weighted fallspeeds shows that observed fallspeeds are approximately two-thirds of the modelled values. Decreasing the modelled fallspeeds to those observed increases the modelled ice water content, giving an IWP 1.6 times that observed.
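The "mass-squared-weighted fallspeed" referred to above arises because, for Rayleigh-scattering ice at 94 GHz, the radar weights each particle roughly by its mass squared, so the measured Doppler velocity combines the vertical air motion with a mass²-weighted particle fall speed. The sketch below illustrates that weighting; the particle masses, fall speeds and sign convention are wholly hypothetical.

```python
import numpy as np

# Reflectivity-style (mass-squared) weighting of particle fall speeds, as discussed
# above for the 94 GHz Doppler radar. All particle values are hypothetical.

def mass_squared_weighted_fallspeed(mass, fallspeed):
    """Return sum(m**2 * v) / sum(m**2) for arrays of particle mass and fall speed."""
    weights = mass ** 2
    return np.sum(weights * fallspeed) / np.sum(weights)

mass = np.array([1e-9, 5e-9, 2e-8])       # kg, hypothetical ice particles
fallspeed = np.array([0.3, 0.5, 0.8])     # m/s, downward positive, hypothetical

v_weighted = mass_squared_weighted_fallspeed(mass, fallspeed)
w_air = 0.1                               # m/s, hypothetical updraught
doppler_velocity = w_air - v_weighted     # convention used here: updraughts positive
print(f"mass^2-weighted fall speed = {v_weighted:.2f} m/s, "
      f"implied Doppler velocity = {doppler_velocity:.2f} m/s")
```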

Abstract:

Direct numerical simulations of turbulent flow over regular arrays of urban-like, cubical obstacles are reported. Results are analysed in terms of a formal spatial averaging procedure to enable interpretation of the flow within the arrays as a canopy flow, and of the flow above as a rough-wall boundary layer. Spatial averages of the mean velocity, turbulent stresses and pressure drag are computed. The statistics compare very well with data from wind-tunnel experiments. Within the arrays, the time-averaged flow structure gives rise to significant 'dispersive stress', whereas above the arrays the Reynolds stress dominates. The mean flow structure and turbulence statistics depend significantly on the layout of the cubes. Unsteady effects are important, especially in the lower canopy layer, where turbulent fluctuations dominate over the mean flow.
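The spatial-averaging procedure mentioned above separates the spatially averaged momentum flux into a Reynolds-stress part (temporal fluctuations) and a dispersive-stress part (spatial deviations of the time-mean flow from its horizontal average). A minimal sketch of that decomposition on a single horizontal plane follows; the arrays are random stand-ins for DNS output.

```python
import numpy as np

# Decomposition behind "dispersive stress": on one horizontal plane, time-average
# first, then average horizontally. Fluctuations about the time mean give the
# Reynolds stress; spatial deviations of the time mean give the dispersive stress.

def stresses_on_plane(u, w):
    """u, w: velocity components with shape (n_time, n_x, n_y) at one height."""
    u_bar, w_bar = u.mean(axis=0), w.mean(axis=0)        # time means
    u_prime, w_prime = u - u_bar, w - w_bar              # turbulent fluctuations
    reynolds = (u_prime * w_prime).mean()                # <overline(u'w')>
    u_tilde = u_bar - u_bar.mean()                       # deviation of time mean from spatial mean
    w_tilde = w_bar - w_bar.mean()
    dispersive = (u_tilde * w_tilde).mean()              # <u_tilde w_tilde>
    return reynolds, dispersive

# Illustrative random stand-ins for DNS fields (100 samples on a 32 x 32 plane):
rng = np.random.default_rng(2)
u = rng.standard_normal((100, 32, 32))
w = rng.standard_normal((100, 32, 32))
print(stresses_on_plane(u, w))
```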