468 results for Krook, Lennart
Abstract:
The recent global tropospheric temperature trend can be reproduced by climate models that are forced only by observed sea surface temperature (SST) anomalies. In this study, simulations with the Hamburg climate model (ECHAM) are compared to temperatures from microwave sounding units (MSU) and to reanalyses from the European Centre for Medium-Range Weather Forecasts. There is overall agreement between observed and simulated tropospheric temperature anomalies in many regions, in particular in the tropics and over the oceans, which lack conventional observing systems. This provides the opportunity to link physically different quantities, such as surface observations or analyses (SST) and satellite soundings (MSU), by means of a general circulation model. The proposed method can indicate inconsistencies between MSU temperatures and SSTs and has apparently done so. Differences between observed and simulated tropospheric temperature anomalies can partly be attributed to stratospheric aerosol variations due to major volcanic eruptions.
Abstract:
The use of a high resolution atmospheric model at T106 resolution for studying the influence of greenhouse warming on tropical storm climatology is investigated. The same method for identifying the storms has been used as in a previous study by Bengtsson et al. The sea surface temperature anomalies have been taken from a previous transient climate change experiment, obtained with a low resolution ocean-atmosphere coupled model. The global distribution of the storms, at the time when the CO2 concentration in the atmosphere had doubled, agrees in geographical position and seasonal variability with that of the present climate, but the number of storms is significantly reduced, particularly in the Southern Hemisphere. The main reason for this appears to be connected to changes in the large scale circulation, such as a weaker Hadley circulation and stronger upper air westerlies. The low level vorticity in the hurricane genesis regions is generally reduced compared to the present climate, while the vertical tropospheric wind shear is somewhat increased. Most tropical storm regions indicate reduced surface wind speeds and a slightly weaker hydrological cycle.
Abstract:
A simple four-dimensional assimilation technique, called Newtonian relaxation, has been applied to the Hamburg climate model (ECHAM) to enable comparison of model output with observations for short periods of time. The prognostic model variables vorticity, divergence, temperature, and surface pressure have been relaxed toward European Centre for Medium-Range Weather Forecasts (ECMWF) global meteorological analyses. Several experiments have been carried out in which the values of the relaxation coefficients have been varied to find out which values are most suitable for our purpose. To be able to use the method for validation of model physics or chemistry, good agreement of the model-simulated mass and wind fields with observations is required. In addition, the model physics should not be disturbed too strongly by the relaxation forcing itself. Both aspects have been investigated. Good agreement with basic observed quantities, like wind, temperature, and pressure, is obtained for most simulations in the extratropics. Derived variables, like precipitation and evaporation, have been compared with ECMWF forecasts and observations. Agreement for these variables is weaker than for the basic observed quantities. Nevertheless, considerable improvement is obtained relative to a control run without assimilation. Differences between tropics and extratropics are smaller than for the basic observed quantities. Results also show that precipitation and evaporation are affected by a sort of continuous spin-up which is introduced by the relaxation: the bias (ECMWF − ECHAM) increases with increasing relaxation forcing. In agreement with this result, we found that with increasing relaxation forcing the vertical exchange of tracers by turbulent boundary layer mixing and, to a lesser extent, by convection is reduced.
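
As an illustration of the Newtonian relaxation (nudging) technique referred to above, the short Python sketch below advances a single prognostic field one time step while relaxing it toward an analysed field. The tendency function, time step, and relaxation coefficient are placeholder values chosen for illustration, not the ECHAM configuration tested in the study.

    import numpy as np

    def nudged_step(x, x_analysis, model_tendency, dt, g_relax):
        # One time step with Newtonian relaxation (nudging):
        #   dx/dt = F(x) + g_relax * (x_analysis - x)
        # x              : model state (e.g. vorticity, temperature, ...)
        # x_analysis     : analysed field valid at the current time
        # model_tendency : function returning the model tendency F(x)
        # g_relax        : relaxation coefficient [1/s]; larger values pull
        #                  the model harder toward the analyses
        return x + dt * (model_tendency(x) + g_relax * (x_analysis - x))

    # Illustrative use with a toy tendency (weak damping, ~12-day time scale)
    x = np.array([1.0, 0.5, -0.3])
    x_analysis = np.array([0.8, 0.6, -0.1])
    tendency = lambda state: -1.0e-6 * state
    for _ in range(24):
        x = nudged_step(x, x_analysis, tendency, dt=3600.0, g_relax=1.0e-4)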
Abstract:
The climate and natural variability of the large-scale stratospheric circulation simulated by a newly developed general circulation model are evaluated against available global observations. The simulation consisted of a 30-year annual cycle integration performed with a comprehensive model of the troposphere and stratosphere. The observations consisted of a 15-year dataset from global operational analyses of the troposphere and stratosphere. The model evaluation concentrates on the simulation of the evolution of the extratropical stratospheric circulation in both hemispheres. The December–February climatology of the observed zonal mean winter circulation is found to be reasonably well captured by the model, although in the Northern Hemisphere upper stratosphere the simulated westerly winds are systematically stronger and a cold bias is apparent in the polar stratosphere. This Northern Hemisphere stratospheric cold bias virtually disappears during spring (March–May), consistent with a realistic simulation of the spring weakening of the mean westerly winds in the model. A considerable amount of monthly interannual variability is also found in the simulation in the Northern Hemisphere in late winter and early spring. The simulated interannual variability is predominantly caused by polar warmings of the stratosphere, in agreement with observations. The breakdown of the Northern Hemisphere stratospheric polar vortex therefore appears to occur in a realistic way in the model. However, in early winter the model severely underestimates the interannual variability, especially in the upper troposphere. The Southern Hemisphere winter (June–August) zonal mean temperature is systematically colder in the model, and the simulated winds are somewhat too strong in the upper stratosphere. Contrary to the results for the Northern Hemisphere spring, this model cold bias worsens during the Southern Hemisphere spring (September–November). Significant discrepancies between the model results and the observations are therefore found during the breakdown of the Southern Hemisphere polar vortex. For instance, the simulated Southern Hemisphere stratospheric westerly jet continuously decreases in intensity more or less in situ from June to November, while the observed stratospheric jet moves downward and poleward.
Abstract:
A high-resolution GCM is found to simulate precipitation and the surface energy balance of high latitudes with high accuracy. This opens new possibilities to investigate the future mass balance of polar glaciers and its effect on sea level. The surface mass balance of the Greenland and Antarctic ice sheets is simulated using the ECHAM3 GCM with T106 horizontal resolution. With this model, two 5-year integrations for present and doubled carbon dioxide conditions, based on the boundary conditions provided by the ECHAM1/T21 transient experiment, have been conducted. A comparison of the two experiments over Greenland and Antarctica shows to what extent the effect of climate change on the mass balance of the two largest glaciers of the world can differ. On Greenland one sees a slight decrease in accumulation and a substantial increase in melt, while on Antarctica a large increase in accumulation without melt is projected. Translating the mass balances into terms of sea-level equivalent, the Greenland discharge causes a sea level rise of 1.1 mm yr−1, while the accumulation on Antarctica tends to lower it by 0.9 mm yr−1. The change in the combined mass balance of the two continents is almost zero. Sea level change over the next century is thus likely to be affected more strongly by the thermal expansion of seawater and by the mass balance of smaller glaciers outside Greenland and Antarctica.
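
For reference, translating an ice-sheet mass balance into a sea-level equivalent, as in the figures quoted above, amounts to dividing the corresponding water volume by the global ocean area. The Python sketch below uses standard reference constants and an illustrative mass flux, not values taken from these experiments.

    # Back-of-envelope conversion of an ice-sheet mass flux into a
    # sea-level equivalent (mm/yr).  Constants are standard reference
    # values; the example input is illustrative only.
    OCEAN_AREA_M2 = 3.62e14    # global ocean surface area [m^2]
    WATER_DENSITY = 1.0e3      # [kg/m^3]

    def sea_level_equivalent_mm(mass_flux_gt_per_yr):
        # Positive flux = ice mass lost to the ocean.
        volume_m3 = mass_flux_gt_per_yr * 1.0e12 / WATER_DENSITY  # Gt -> kg -> m^3
        return volume_m3 / OCEAN_AREA_M2 * 1.0e3                  # m -> mm

    # Roughly 360 Gt/yr of net loss corresponds to about 1 mm/yr of rise.
    print(sea_level_equivalent_mm(360.0))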
Abstract:
A very high resolution atmospheric general circulation model, T106-L19, has been used for the simulation of hurricanes in a multi-year numerical experiment. Individual storms as well as their geographical and seasonal distribution agree remarkably well with observations. In spite of the fact that only the thermal and dynamical structure of the storms has been used as the criterion for their identification, practically all of them occur in areas where the sea surface temperature is higher than or equal to 26 °C. There are considerable variations from year to year in the number of storms, in spite of the fact that there are no interannual variations in the SST pattern. It is found that the number of storms in particular areas appears to depend on the intensity of the Hadley-Walker cell. The result is clearly resolution-dependent. At lower horizontal resolution, T42 for example, the intensity of the storms is significantly reduced and their overall structure is less realistic, including their vertical form and extent.
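
The kind of thermal and dynamical identification criteria mentioned above can be sketched as follows; the threshold values are hypothetical placeholders, not the actual criteria of the published scheme.

    # Hypothetical thresholds for a model tropical storm detection check;
    # the numerical values are illustrative, not those of the study.
    VORT_850_MIN = 3.5e-5    # low-level relative vorticity maximum [1/s]
    WIND_850_MIN = 15.0      # maximum wind speed at 850 hPa [m/s]
    WARM_CORE_MIN = 1.0      # upper-level temperature anomaly vs. environment [K]

    def is_model_storm(vort_850, wind_850, t_anom_upper):
        # A candidate vortex counts as a model storm only if it satisfies
        # all three (illustrative) structural criteria.
        return (vort_850 > VORT_850_MIN and
                wind_850 > WIND_850_MIN and
                t_anom_upper > WARM_CORE_MIN)

    print(is_model_storm(5.0e-5, 22.0, 2.3))   # True for a well-developed vortex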
Abstract:
Long-range global climate forecasts have been made by using a model for predicting tropical Pacific sea surface temperature (SST) in tandem with an atmospheric general circulation model. The SST is predicted first at long lead times into the future. These ocean forecasts are then used to force the atmospheric model and so produce climate forecasts at the lead times of the SST forecasts. Predictions of the wintertime 500 mb height, surface air temperature, and precipitation for seven large climatic events of the 1970s–1990s by this two-tiered technique agree well in general with observations over many regions of the globe. The levels of agreement are high enough in some regions to have practical utility.
Abstract:
The warm event which spread in the tropical Atlantic during spring and summer 1984 is assumed to be partially initiated by atmospheric disturbances, themselves related to the major 1982–1983 El Niño which occurred one year earlier in the Pacific. This paper tests such a hypothesis. For that purpose, an atmospheric general circulation model (AGCM) is forced by different conditions of climatological and observed sea surface temperature, and an Atlantic ocean general circulation model (OGCM) is subsequently forced by the outputs of the AGCM. It is first shown that both the AGCM and the OGCM behave correctly when globally observed SSTs are used: the strengthening of the trades over the tropical Atlantic during 1983 and their subsequent weakening at the beginning of 1984 are well captured by the AGCM, and so is the spring 1984 deepening of the thermocline in the eastern equatorial Atlantic, simulated by the OGCM. As assumed, the SST anomalies located in the El Niño Pacific area are partly responsible for the anomalous wind signal in the tropical Atlantic. Though this remotely forced atmospheric signal has a small amplitude, it can generate, in the OGCM run, an anomalous sub-surface signal leading to a flattening of the thermocline in the equatorial Atlantic. This forced oceanic experiment cannot explain the amplitude and phase of the observed sub-surface oceanic anomaly: part of the Atlantic ocean response, due to local interaction between ocean and atmosphere, requires a coupled approach. Nevertheless, this experiment showed that anomalous conditions in the Pacific during 1982–83 created favorable conditions for anomaly development in the Atlantic.
Abstract:
ECHO is a new global coupled ocean-atmosphere general circulation model (GCM), consisting of the Hamburg version of the European Centre atmospheric GCM (ECHAM) and the Hamburg Primitive Equation ocean GCM (HOPE). We performed a 20-year integration with ECHO. Climate drift is significant, but typical annual mean errors in sea surface temperature (SST) do not exceed 2 °C in the open oceans. Near the boundaries, however, SST errors are considerably larger. The coupled model simulates an irregular ENSO cycle in the tropical Pacific, with spatial patterns similar to those observed. The variability, however, is somewhat weaker than observed. ECHO also simulates significant interannual variability in mid-latitudes. Consistent with observations, variability over the North Pacific can be partly attributed to remote forcing from the tropics. In contrast, the interannual variability over the North Atlantic appears to be generated locally.
Abstract:
The ECMWF operational grid point model (with a resolution of 1.875° of latitude and longitude) and its limited area version (with a resolution of 0.47° of latitude and longitude), with boundary values from the global model, have been used to study the simulation of the typhoon Tip. The fine-mesh model was capable of simulating the main structural features of the typhoon and predicting a fall in central pressure of 60 mb in 3 days. The structure of the forecast typhoon, with a warm core (maximum potential temperature anomaly of 17 K), intense swirling wind (maximum 55 m s−1 at 850 mb), and spiralling precipitation patterns, is characteristic of a tropical cyclone. Comparison with the lower resolution forecast shows that the horizontal resolution is a determining factor in predicting not only the structure and intensity but even the movement of these vortices. However, an accurate and refined initial analysis is considered to be a prerequisite for a correct forecast of this phenomenon.
Abstract:
A study of intense hurricane-type vortices in the ECMWF operational model is reported. These vortices develop around day 4 of the forecast and occur in the tropical belt in areas and at times where intense tropical cyclones normally occur. The frequency resembles that observed over most tropical regions, with a pronounced maximum in the western North Pacific. The lifetime of the vortices and their three-dimensional structure agree in some fundamental ways with observations, although, because of the resolution, the systems are less intense than the observed ones. The general large-scale conditions for active and inactive cyclone periods are discussed. The model cyclones are sensitive to the sea surface temperature and do not develop with sea surface temperatures lower than 28–29 °C. The dynamical conditions favouring cyclone development are characterized by intense large-scale divergence in the upper troposphere. Cyclogenesis appears to take place when these conditions are found outside the equatorial zone and over oceans where the water is sufficiently warm.
Abstract:
The possibility of using a time sequence of surface pressure observations in four-dimensional data assimilation is investigated. It is shown that a linear multilevel quasi-geostrophic model can be updated successfully with surface data alone, provided the number of time levels is at least as large as the number of vertical levels. It is further demonstrated that current statistical analysis procedures are very inefficient at assimilating surface observations, and it is shown by numerical experiments that the vertical interpolation must be carried out using the structure of the most dominant baroclinic mode in order to obtain a satisfactory updating. Different possible ways towards finding a practical solution are discussed.
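
The requirement that the number of time levels be at least as large as the number of vertical levels is, in essence, an observability condition for a linear system observed only at the surface. The toy sketch below illustrates this with a random linear model standing in for the quasi-geostrophic dynamics; it is not the model used in the study.

    import numpy as np

    n = 5                                   # number of vertical levels
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))         # linear model dynamics (illustrative)
    H = np.zeros((1, n)); H[0, 0] = 1.0     # observe the lowest (surface) level only

    # The state can be recovered from a time sequence of surface observations
    # exactly when the observability matrix built from n time levels has rank n.
    obs_matrix = np.vstack([H @ np.linalg.matrix_power(A, k) for k in range(n)])
    print(np.linalg.matrix_rank(obs_matrix))   # = n, so n time levels suffice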
Abstract:
The present study investigates the growth of error in baroclinic waves. It is found that stable or neutral waves are particularly sensitive to errors in the initial condition. Short stable waves are mainly sensitive to phase errors and the ultra-long waves to amplitude errors. Analysis simulation experiments have indicated that the amplitudes of the very long waves usually become too small in the free atmosphere, due to the sparse and very irregular distribution of upper-air observations. This also applies to the four-dimensional data assimilation experiments, since the amplitudes of the very long waves are usually underpredicted. The numerical experiments reported here show that if the very long waves have these kinds of amplitude errors in the upper troposphere or lower stratosphere, the error is rapidly propagated (within a day or two) to the surface and to the lower troposphere.
Abstract:
A system for continuous data assimilation described recently (Bengtsson & Gustavsson, 1971) has been further developed and tested under more realistic conditions. A balanced barotropic model is used, and the integration is performed over an octagon covering the area to the north of 20° N. Comparisons have been made between using data from the actual aerological network and data from a satellite in a polar orbit. The results of the analyses have been studied in different subregions situated in data-sparse as well as data-dense areas. The errors of the analysis have also been studied in the wave spectrum domain. Updating is performed using data generated by the model as well as model-independent data. Rather great differences are obtained between the two experiments, especially with respect to the ultra-long waves. The more realistic approach gives a much larger analysis error. In general, the satellite updating yields a somewhat better result than updating from the conventional aerological network, especially in the data-sparse areas over the oceans. Most of the experiments are performed with a satellite making 200 observations per track, with a side-scan capability of 40° and an RMS error of 20 m. It is found that the effect of increasing the number of satellite observations from 100 to 200 per orbit is almost negligible. Similarly, the effect of improving the observations by reducing the RMS error below a certain value is small. An observing system using two satellites 90° out of phase has also been investigated. This is found to imply a substantial improvement. Finally, an experiment has been performed using actual SIRS soundings from NIMBUS IV. Given the very small number of soundings at 500 mb (142 during 48 hours), the result can be regarded as quite satisfactory.
Abstract:
A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as the data become available. In the first experiment the updating is performed every 24, 12, and 6 hours with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different station densities. Optimum interpolation is performed for the difference between the forecast and the valid observations. The RMS error of the analyses is reduced in time, the error being smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error is reduced below the RMS error of the observations after only 24 hours, and this approach yields as a whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to that obtained when the data have been analysed.
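
A minimal sketch of the optimum-interpolation step applied to the forecast-minus-observation differences is given below. The background and observation error covariances are generic illustrative choices, not the statistics used in these experiments.

    import numpy as np

    def oi_update(x_b, y, H, B, R):
        # Optimum interpolation: correct the forecast x_b using observations y.
        # H : observation operator, B : background error covariance,
        # R : observation error covariance.
        innovation = y - H @ x_b                        # forecast-minus-observation difference
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # optimal weight (gain) matrix
        return x_b + K @ innovation

    # Illustrative use: 4 grid points, height observations at 2 of them
    x_b = np.array([5400.0, 5420.0, 5450.0, 5470.0])
    y = np.array([5410.0, 5460.0])
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0]])
    B = 100.0 * np.exp(-0.5 * np.abs(np.subtract.outer(np.arange(4), np.arange(4))))
    R = 20.0 * np.eye(2)
    print(oi_update(x_b, y, H, B, R))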