14 results for Inhomogeneity
in CentAUR: Central Archive University of Reading - UK
Abstract:
Radiation schemes in general circulation models currently make a number of simplifications when accounting for clouds, one of the most important being the removal of horizontal inhomogeneity. A new scheme is presented that attempts to account for the neglected inhomogeneity by using two regions of cloud in each vertical level of the model as opposed to one. One of these regions is used to represent the optically thinner cloud in the level, and the other represents the optically thicker cloud. So, along with the clear-sky region, the scheme has three regions in each model level and is referred to as “Tripleclouds.” In addition, the scheme has the capability to represent arbitrary vertical overlap between the three regions in pairs of adjacent levels. This scheme is implemented in the Edwards–Slingo radiation code and tested on 250 h of data from 12 different days. The data are derived from cloud retrievals using radar, lidar, and a microwave radiometer at Chilbolton, southern United Kingdom. When the data are grouped into periods equivalent in size to general circulation model grid boxes, the shortwave plane-parallel albedo bias is found to be 8%, while the corresponding bias is found to be less than 1% using Tripleclouds. Similar results are found for the longwave biases. Tripleclouds is then compared to a more conventional method of accounting for inhomogeneity that multiplies optical depths by a constant scaling factor, and Tripleclouds is seen to improve on this method both in terms of top-of-atmosphere radiative flux biases and internal heating rates.
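The core idea of Tripleclouds, splitting the cloudy part of a model level into an optically thinner and an optically thicker region, can be sketched as follows. This is a minimal illustration only, not the Edwards–Slingo implementation; the function name and the median-based split of the in-cloud distribution are assumptions for the sketch:

```python
import numpy as np

def tripleclouds_split(water_content):
    """Split the cloudy pixels of a gridbox into two regions:
    the optically thinner half and the optically thicker half of
    the in-cloud water-content distribution. Assumes the gridbox
    contains at least one cloudy pixel.
    Returns (clear_fraction, thin_region_mean, thick_region_mean)."""
    wc = np.asarray(water_content, dtype=float)
    cloudy = wc[wc > 0.0]                      # in-cloud samples only
    clear_fraction = 1.0 - cloudy.size / wc.size
    median = np.median(cloudy)
    thin_mean = cloudy[cloudy <= median].mean()   # thinner-cloud region
    thick_mean = cloudy[cloudy > median].mean()   # thicker-cloud region
    return clear_fraction, thin_mean, thick_mean
```

Together with the clear-sky fraction, the two returned means define the three regions per level that the scheme then overlaps between adjacent layers.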
Abstract:
In order to calculate unbiased microphysical and radiative quantities in the presence of a cloud, it is necessary to know not only the mean water content but also the distribution of this water content. This article describes a study of the in-cloud horizontal inhomogeneity of ice water content, based on CloudSat data. In particular, by focusing on the relations with variables that are already available in general circulation models (GCMs), a parametrization of inhomogeneity that is suitable for inclusion in GCM simulations is developed. Inhomogeneity is defined in terms of the fractional standard deviation (FSD), which is given by the standard deviation divided by the mean. The FSD of ice water content is found to increase with the horizontal scale over which it is calculated and also with the thickness of the layer. The connection to cloud fraction is more complicated: for small cloud fractions, FSD increases as cloud fraction increases, while FSD decreases sharply for overcast scenes. The relations to horizontal scale, layer thickness and cloud fraction are parametrized in a relatively simple equation. The performance of this parametrization is tested on an independent set of CloudSat data. The parametrization is shown to be a significant improvement on the assumption of a single-valued global FSD.
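The FSD definition above (standard deviation divided by the mean, over in-cloud samples) is simple enough to state in a few lines of code. A minimal sketch, assuming the function name and the convention that zero values mark clear sky:

```python
import numpy as np

def fractional_std_dev(ice_water_content):
    """Fractional standard deviation (FSD) of in-cloud ice water
    content: the standard deviation divided by the mean, computed
    over cloudy (non-zero) samples only."""
    iwc = np.asarray(ice_water_content, dtype=float)
    incloud = iwc[iwc > 0.0]          # restrict to in-cloud samples
    return incloud.std() / incloud.mean()
```

A perfectly homogeneous cloud gives FSD = 0; larger values mean the mean water content is an increasingly poor summary of the distribution.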
Abstract:
The impact of selected observing systems on the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA40) is explored by mimicking observational networks of the past. This is accomplished by systematically removing observations from the present observational database used by ERA40. The observing systems considered are a surface-based system typical of the period prior to 1945/50, obtained by only retaining the surface observations, a terrestrial-based system typical of the period 1950-1979, obtained by removing all space-based observations, and finally a space-based system, obtained by removing all terrestrial observations except those for surface pressure. Experiments using these different observing systems have been limited to seasonal periods selected from the last 10 yr of ERA40. The results show that the surface-based system has severe limitations in reconstructing the atmospheric state of the upper troposphere and stratosphere. The terrestrial system has major limitations in generating the circulation of the Southern Hemisphere, with considerable errors in the position and intensity of individual weather systems. The space-based system is able to analyse the larger-scale aspects of the global atmosphere almost as well as the present observing system but performs less well in analysing the smaller-scale aspects as represented by the vorticity field. Here, terrestrial data such as radiosondes and aircraft observations are of paramount importance. The terrestrial system, in the form of a limited number of radiosondes in the tropics, is also required to properly analyse the quasi-biennial oscillation phenomenon. The results also show the dominance of the satellite observing system in the Southern Hemisphere. These results all indicate that care is required in using current reanalyses in climate studies due to the large inhomogeneity of the available observations, in particular in time.
Abstract:
A poor representation of cloud structure in a general circulation model (GCM) is widely recognised as a potential source of error in the radiation budget. Here, we develop a new way of representing both horizontal and vertical cloud structure in a radiation scheme. This combines the ‘Tripleclouds’ parametrization, which introduces inhomogeneity by using two cloudy regions in each layer as opposed to one, each with different water content values, with ‘exponential-random’ overlap, in which clouds in adjacent layers are not overlapped maximally, but according to a vertical decorrelation scale. This paper, Part I of two, aims to parametrize the two effects such that they can be used in a GCM. To achieve this, we first review a number of studies for a globally applicable value of fractional standard deviation of water content for use in Tripleclouds. We obtain a value of 0.75 ± 0.18 from a variety of different types of observations, with no apparent dependence on cloud type or gridbox size. Then, through a second short review, we create a parametrization of decorrelation scale for use in exponential-random overlap, which varies the scale linearly with latitude from 2.9 km at the Equator to 0.4 km at the poles. When applied to radar data, both components are found to have radiative impacts capable of offsetting biases caused by cloud misrepresentation. Part II of this paper implements Tripleclouds and exponential-random overlap into a radiation code and examines both their individual and combined impacts on the global radiation budget using re-analysis data.
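The two numbers quoted above fully determine the latitude-dependent decorrelation scale, and the exponential decay of overlap with layer separation is the standard form of "exponential-random" overlap. A minimal sketch of both, assuming the function names and the exponential form (the abstract itself only states that overlap follows a vertical decorrelation scale):

```python
import math

def decorrelation_scale_km(latitude_deg):
    """Cloud-overlap decorrelation scale varying linearly with
    latitude: 2.9 km at the Equator down to 0.4 km at the poles."""
    frac = abs(latitude_deg) / 90.0
    return 2.9 + (0.4 - 2.9) * frac

def overlap_parameter(dz_km, latitude_deg):
    """Exponential-random overlap sketch: the degree to which cloud
    in two layers separated by dz_km remains maximally overlapped
    decays exponentially with separation, at a rate set by the
    decorrelation scale (1 = maximum overlap, -> 0 = random)."""
    return math.exp(-dz_km / decorrelation_scale_km(latitude_deg))
```

Adjacent layers (small dz) thus overlap nearly maximally, while widely separated layers overlap almost randomly, with the transition happening faster at high latitudes.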
Abstract:
Reliably representing both horizontal cloud inhomogeneity and vertical cloud overlap is fundamentally important for the radiation budget of a general circulation model. Here, we build on the work of Part One of this two-part paper by applying a pair of parameterisations that account for horizontal inhomogeneity and vertical overlap to global re-analysis data. These are applied both together and separately in an attempt to quantify the effects of poor representation of the two components on the radiation budget. Horizontal inhomogeneity is accounted for using the “Tripleclouds” scheme, which uses two regions of cloud in each layer of a gridbox as opposed to one; vertical overlap is accounted for using “exponential-random” overlap, which aligns vertically continuous cloud according to a decorrelation height. These are applied to a sample of scenes from a year of ERA-40 data. The largest radiative effect of horizontal inhomogeneity is found to be in areas of marine stratocumulus; the effect of vertical overlap is found to be fairly uniform, but with larger individual short-wave and long-wave effects in areas of deep, tropical convection. The combined effect of the two parameterisations is found to reduce the magnitude of the net top-of-atmosphere cloud radiative forcing (CRF) by 2.25 W m−2, with shifts of up to 10 W m−2 in areas of marine stratocumulus. The effect of the uncertainty in our parameterisations on the radiation budget is also investigated. It is found that the uncertainty in the impact of horizontal inhomogeneity is of order ±60%, while the uncertainty in the impact of vertical overlap is much smaller. This suggests an insensitivity of the radiation budget to the exact nature of the global decorrelation height distribution derived in Part One.
Abstract:
Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
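The building block of the geostatistical workflow described above is the empirical semivariogram, which quantifies how the squared difference in rainfall between station pairs grows with their separation. A minimal sketch, assuming the function name, planar coordinates, and simple distance binning (a real climatological variogram would pool many days and handle the rain/no-rain mixture separately):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Empirical semivariogram: for each distance bin, half the mean
    squared difference of the observed values over all station pairs
    whose separation falls in that bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    n = values.size
    # pairwise separations and squared value differences
    d = np.hypot(coords[:, None, 0] - coords[None, :, 0],
                 coords[:, None, 1] - coords[None, :, 1])
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)       # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (d >= lo) & (d < hi)
        gamma.append(0.5 * sq[in_bin].mean() if in_bin.any() else np.nan)
    return np.array(gamma)
```

A variogram model fitted to these binned values then supplies the spatial-correlation structure that kriging needs to weight nearby gauges.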
Abstract:
Road transport and shipping are copious sources of aerosols, which exert a significant radiative forcing compared to, for example, the CO2 emitted by these sectors. An advanced atmospheric general circulation model, coupled to a mixed-layer ocean, is used to calculate the climate response to the direct radiative forcing from such aerosols. The cases considered include imposed distributions of black carbon and sulphate aerosols from road transport, and sulphate aerosols from shipping; these are compared to the climate response due to CO2 increases. The difficulties in calculating the climate response due to small forcings are discussed, as the actual forcings have to be scaled by large amounts to enable a climate response to be easily detected. Despite the much greater geographical inhomogeneity in the sulphate forcing, the patterns of zonal and annual-mean surface temperature response (although opposite in sign) closely resemble those resulting from homogeneous changes in CO2. The surface temperature response to black carbon aerosols from road transport is shown to be notably non-linear in the scaling applied, probably due to the semi-direct response of clouds to these aerosols. For the aerosol forcings considered here, the most widespread method of calculating radiative forcing significantly overestimates their effect, relative to CO2, compared to surface temperature changes calculated using the climate model.
Abstract:
Satellite data are used to quantify and examine the bias in the outgoing long-wave (LW) radiation over North Africa during May–July simulated by a range of climate models and the Met Office global numerical weather prediction (NWP) model. Simulations from an ensemble-mean of multiple climate models overestimate outgoing clear-sky long-wave radiation (LWc) by more than 20 W m−2 relative to observations from Clouds and the Earth's Radiant Energy System (CERES) for May–July 2000 over parts of the west Sahara, and by 9 W m−2 for the North Africa region (20°W–30°E, 10–40°N). Experiments with the atmosphere-only version of the High-resolution Hadley Centre Global Environment Model (HiGEM), suggest that including mineral dust radiative effects removes this bias. Furthermore, only by reducing surface temperature and emissivity by unrealistic amounts is it possible to explain the magnitude of the bias. Comparing simulations from the Met Office NWP model with satellite observations from Geostationary Earth Radiation Budget (GERB) instruments suggests that the model overestimates the LW by 20–40 W m−2 during North African summer. The bias declines over the period 2003–2008, although this is likely to relate to improvements in the model and inhomogeneity in the satellite time series. The bias in LWc coincides with high aerosol dust loading estimated from the Ozone Monitoring Instrument (OMI), including during the GERBILS field campaign (18–28 June 2007) where model overestimates in LWc greater than 20 W m−2 and OMI-estimated aerosol optical depth (AOD) greater than 0.8 are concurrent around 20°N, 0–20°W. A model-minus-GERB LW bias of around 30 W m−2 coincides with high AOD during the period 18–21 June 2007, although differences in cloud cover also impact the model–GERB differences. Copyright © Royal Meteorological Society and Crown Copyright, 2010
Abstract:
The theory of homogeneous barotropic beta-plane turbulence is here extended to include effects arising from spatial inhomogeneity in the form of a zonal shear flow. Attention is restricted to the geophysically important case of zonal flows that are barotropically stable and are of larger scale than the resulting transient eddy field. Because of the presumed scale separation, the disturbance enstrophy is approximately conserved in a fully nonlinear sense, and the (nonlinear) wave-mean-flow interaction may be characterized as a shear-induced spectral transfer of disturbance enstrophy along lines of constant zonal wavenumber k. In this transfer the disturbance energy is generally not conserved. The nonlinear interactions between different disturbance components are turbulent for scales smaller than the inverse of Rhines's cascade-arrest scale Kβ ≡ (β₀/2u_rms)½, and in this regime their leading-order effect may be characterized as a tendency to spread the enstrophy (and energy) along contours of constant total wavenumber K ≡ (k² + l²)½. Insofar as this process of turbulent isotropization involves spectral transfer of disturbance enstrophy across lines of constant zonal wavenumber k, it can be readily distinguished from the shear-induced transfer which proceeds along them. However, an analysis in terms of total wavenumber K alone, which would be justified if the flow were homogeneous, would tend to mask the differences. The foregoing theoretical ideas are tested by performing direct numerical simulation experiments.
It is found that the picture of classical beta-plane turbulence is altered, through the effect of the large-scale zonal flow, in the following ways: (i) while the turbulence is still confined to K > Kβ, the disturbance field penetrates to the largest scales of motion; (ii) the larger disturbance scales K < Kβ exhibit a tendency to meridional rather than zonal anisotropy, namely towards v² > u² rather than vice versa; (iii) the initial spectral transfer rate away from an isotropic intermediate-scale source is significantly enhanced by the shear-induced transfer associated with straining by the zonal flow. This last effect occurs even when the large-scale shear appears weak to the energy-containing eddies, in the sense that dU/dy ≪ uK for typical eddy length and velocity scales.
Abstract:
Faced with the strongly nonlinear and apparently random behaviour of the energy-containing scales in the atmosphere, geophysical fluid dynamicists have attempted to understand the synoptic-scale atmospheric flow within the context of two-dimensional homogeneous turbulence theory (e.g. Fjørtoft [1]; Leith [2]). However, atmospheric observations (Boer and Shepherd [3] and Fig. 1) show that the synoptic-scale transient flow evolves in the presence of a planetary-scale, quasi-stationary background flow which is approximately zonal (east-west). Classical homogeneous 2-D turbulence theory is therefore not strictly applicable to the transient flow. One is led instead to study 2-D turbulence in the presence of a large-scale (barotropically stable) zonal jet inhomogeneity.
Abstract:
EVENT has been used to examine the effects of 3D cloud structure, distribution, and inhomogeneity on the scattering of visible solar radiation and the resulting 3D radiation field. Large eddy simulation and aircraft measurements are used to create realistic cloud fields which are continuous or broken with smooth or uneven tops. The values, patterns and variance in the resulting downwelling and upwelling radiation from incident visible solar radiation at different angles are then examined and compared to measurements. The results from EVENT confirm that 3D cloud structure is important in determining the visible radiation field, and that these results are strongly influenced by the solar zenith angle. The results match those from other models using visible solar radiation, and are supported by aircraft measurements of visible radiation, providing confidence in the new model.
Abstract:
The subgrid-scale spatial variability in cloud water content can be described by a parameter f called the fractional standard deviation. This is equal to the standard deviation of the cloud water content divided by the mean. This parameter is an input to schemes that calculate the impact of subgrid-scale cloud inhomogeneity on gridbox-mean radiative fluxes and microphysical process rates. A new regime-dependent parametrization of the spatial variability of cloud water content is derived from CloudSat observations of ice clouds. In addition to the dependencies on horizontal and vertical resolution and cloud fraction included in previous parametrizations, the new parametrization includes an explicit dependence on cloud type. The new parametrization is then implemented in the Global Atmosphere 6 (GA6) configuration of the Met Office Unified Model and used to model the effects of subgrid variability of both ice and liquid water content on radiative fluxes and on autoconversion and accretion rates in three 20-year atmosphere-only climate simulations. These three simulations show the impact of the new regime-dependent parametrization in, respectively, diagnostic radiation calculations, interactive radiation calculations, and interactive radiation calculations combined with a new warm microphysics scheme. The control simulation uses a globally constant f value of 0.75 to model the effect of cloud water content variability on radiative fluxes. The use of the new regime-dependent parametrization in the model results in a global mean f which is higher than the control's fixed value and a global distribution of f which is closer to CloudSat observations. When the new regime-dependent parametrization is used in radiative transfer calculations only, the magnitudes of short-wave and long-wave top-of-atmosphere cloud radiative forcing are reduced, increasing the existing global mean biases in the control. When also applied in a new warm microphysics scheme, the short-wave global mean bias is reduced.
Abstract:
The Monte Carlo Independent Column Approximation (McICA) is a flexible method for representing subgrid-scale cloud inhomogeneity in radiative transfer schemes. It does, however, introduce conditional random errors, but these have been shown to have little effect on climate simulations, where the spatial and temporal scales of interest are large enough for the effects of noise to be averaged out. This article considers the effect of McICA noise on a numerical weather prediction (NWP) model, where the time and spatial scales of interest are much closer to those at which the errors manifest themselves; this, as we show, means that the noise is more significant. We suggest methods for efficiently reducing the magnitude of McICA noise and test these methods in a global NWP version of the UK Met Office Unified Model (MetUM). The resultant errors are put into context by comparison with errors due to the widely used assumption of maximum-random overlap of plane-parallel homogeneous cloud. For a simple implementation of the McICA scheme, forecasts of near-surface temperature are found to be worse than those obtained using the plane-parallel, maximum-random-overlap representation of clouds. However, by applying the methods suggested in this article, we can reduce the noise enough to give forecasts of near-surface temperature that are an improvement on the plane-parallel, maximum-random-overlap forecasts. We conclude that the McICA scheme can be used to improve the representation of clouds in NWP models, with the proviso that the associated noise is sufficiently small.
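The sampling step at the heart of McICA can be sketched in a few lines: each spectral quadrature ("g") point sees a single randomly drawn cloud subcolumn rather than the full subgrid distribution, and the conditional random error comes from averaging these single-sample estimates. This is a minimal illustration under stated assumptions, not the MetUM scheme: the function name is hypothetical and, for brevity, random overlap is drawn independently per layer rather than the maximum-random or generalized overlap a real generator would use:

```python
import numpy as np

def mcica_cloud_mask(cloud_fraction, n_gpoints, seed=None):
    """McICA-style sampling sketch: for each spectral quadrature
    (g) point, draw one binary cloud-occurrence subcolumn
    consistent with the per-layer cloud fractions. Random overlap
    is assumed here for brevity.
    Returns a boolean array of shape (n_gpoints, n_layers)."""
    rng = np.random.default_rng(seed)
    cf = np.asarray(cloud_fraction, dtype=float)
    # one independent draw per g-point and per layer;
    # a layer is cloudy where the draw falls below its fraction
    return rng.random((n_gpoints, cf.size)) < cf
```

Averaging per-g-point fluxes computed on these subcolumns gives an unbiased but noisy gridbox-mean flux; the conditional random error shrinks as the number of g-points (or the degree of averaging in space and time) grows, which is why it matters more at NWP scales than in climate runs.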