45 results for Fokker-Planck Equation

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into of order 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. In this presentation it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
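
The reordering step at the heart of the CKD and FSCK methods can be illustrated with a short sketch. The following Python fragment is a minimal illustration, not the authors' code: the array names, the equal-width g-intervals and the choice of a Planck-weighted mean as the representative absorption per interval are all assumptions. It reorders absorption coefficients over the whole longwave spectrum and collapses each interval of the cumulative distribution into one quadrature point; in the FSCK method these representative values would additionally be optimized against line-by-line calculations for training profiles.

    import numpy as np

    def full_spectrum_quadrature(k_abs, planck_weight, n_points):
        """Reorder absorption coefficients over the entire longwave spectrum
        (no bands) and collapse each g-interval into one quadrature point.

        k_abs         : absorption coefficient at each spectral point
        planck_weight : Planck-function weight at each spectral point
        n_points      : number of quadrature points
        Returns representative k values and the weight (dg) of each interval.
        """
        order = np.argsort(k_abs)                       # the reordering step
        k_sorted = k_abs[order]
        w_sorted = planck_weight[order] / planck_weight.sum()
        g = np.cumsum(w_sorted)                         # cumulative ("g") coordinate
        edges = np.linspace(0.0, 1.0, n_points + 1)
        k_rep, dg = [], []
        for g0, g1 in zip(edges[:-1], edges[1:]):
            mask = (g > g0) & (g <= g1)
            if mask.any():
                # simple Planck-weighted mean as the representative k (untuned)
                k_rep.append(np.average(k_sorted[mask], weights=w_sorted[mask]))
                dg.append(w_sorted[mask].sum())
        return np.array(k_rep), np.array(dg)

    # For a homogeneous path with absorber amount u and band-integrated Planck
    # emission B, a quadrature estimate of the transmitted flux would be
    #   F ~ sum_i dg_i * B * exp(-k_i * u).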

Relevance:

20.00%

Publisher:

Abstract:

The predictability of high impact weather events on multiple time scales is a crucial issue both in scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max-Planck-Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979–2005. The SDD approach is trained with COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990–1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95 % peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in reanalysis data for this period. To evaluate both the skill of the decadal predictability system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD-simulated regional peak winds. Skill scores are predominantly positive for the decadal predictability system, with the highest values for short lead times and for (peak) wind speeds equal to or above the 75 % quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD-simulated peak winds are slightly lower than those for large-scale wind speeds. This behavior can be largely attributed to the fact that peak winds are a proxy for gusts and thus have a higher variability than wind speeds. The introduced cost-efficient downscaling technique has the advantage of estimating not only wind speeds but also peak winds (a proxy for gusts), and can be easily applied to large ensemble datasets like operational decadal prediction systems.
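
The abstract does not spell out the exact quantile verification score used, so the following sketch shows one common form, the quantile (pinball-loss) skill score, as a plausible stand-in; the function and variable names are illustrative only.

    import numpy as np

    def quantile_loss(obs, pred, tau):
        """Check (pinball) loss for quantile level tau (e.g. 0.75, 0.95); lower is better."""
        err = np.asarray(obs) - np.asarray(pred)
        return np.mean(np.maximum(tau * err, (tau - 1.0) * err))

    def quantile_skill_score(obs, pred, ref, tau):
        """Skill of `pred` relative to a reference forecast `ref` (e.g. the
        climatological quantile); positive values mean the prediction beats
        the reference at that quantile level."""
        return 1.0 - quantile_loss(obs, pred, tau) / quantile_loss(obs, ref, tau)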

Relevance:

10.00%

Publisher:

Abstract:

Tropical cyclones (TC) under different climate conditions in the Northern Hemisphere have been investigated with the Max Planck Institute (MPI) coupled (ECHAM5/MPIOM) and atmosphere-only (ECHAM5) climate models. The intensity and size of the TC depend crucially on resolution, with higher wind speeds and smaller scales at the higher resolutions. The typical size of the TC is reduced by a factor of 2.3 from T63 to T319, using the distance of the maximum wind speed from the centre of the storm as a measure. The full three-dimensional structure of the storms becomes increasingly more realistic as the resolution is increased. For the T63 resolution, three ensemble runs are explored for the period 1860 until 2100 using the IPCC SRES scenario A1B and evaluated for three 30-year periods at the end of the 19th, 20th and 21st century, respectively. While there is no significant change between the 19th and the 20th century, there is a considerable reduction in the number of TC by some 20% in the 21st century, but no change in the number of the more intense storms. The reduction in the number of storms occurs in all regions. A single additional experiment at T213 resolution was run for the two latter 30-year periods. The T213 experiment is atmosphere-only, using the transient Sea Surface Temperatures (SST) of the T63 experiment. Also in this case, there is a reduction by some 10% in the number of simulated TC in the 21st century compared to the 20th century, but a marked increase in the number of intense storms: the number of storms with maximum wind speeds greater than 50 m s-1 increases by a third. Most of the intensification takes place in the Eastern Pacific and in the Atlantic, where the number of storms more or less stays the same. We identify two competing processes affecting TC in a warmer climate. First, the increase in static stability and the reduced vertical circulation are suggested to contribute to the reduction in the number of storms. Second, the increase in temperature and water vapor provides more energy for the storms, so that when favorable conditions occur, the higher SST and higher specific humidity will contribute to more intense storms. As the maximum intensity depends crucially on resolution, higher resolution is required for this effect to appear in full. The distribution of storms between different regions does not, to a first approximation, depend on the temperature itself but on the distribution of the SST anomalies and their influence on the atmospheric circulation. Two additional transient experiments at T319 resolution were run for 20 years at the end of the 20th and 21st century, respectively, using the same conditions as in the T213 experiments. The results are consistent with the T213 study: the total number of tropical cyclones was similar to the T213 experiment, but the cyclones were generally more intense. The change from the 20th to the 21st century was also similar, with fewer TC in total but with more intense cyclones.
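
As a hedged illustration of the size measure mentioned above (the distance of the maximum wind speed from the storm centre), the following sketch computes a radius of maximum wind on a model grid; it is not the tracking algorithm used in the study, and all names are illustrative.

    import numpy as np

    def radius_of_max_wind(wind_speed, lat, lon, centre_lat, centre_lon):
        """Distance (km) from the storm centre to the grid point of maximum
        wind speed; `wind_speed`, `lat` and `lon` are 2-D arrays on the same grid."""
        r_earth = 6371.0  # km
        dlat = np.radians(lat - centre_lat)
        dlon = np.radians(lon - centre_lon)
        a = (np.sin(dlat / 2.0) ** 2
             + np.cos(np.radians(centre_lat)) * np.cos(np.radians(lat))
             * np.sin(dlon / 2.0) ** 2)
        dist = 2.0 * r_earth * np.arcsin(np.sqrt(a))    # haversine distance
        i = np.unravel_index(np.argmax(wind_speed), wind_speed.shape)
        return dist[i]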

Relevance:

10.00%

Publisher:

Abstract:

The International System of Units (SI) is founded on seven base units, the metre, kilogram, second, ampere, kelvin, mole and candela corresponding to the seven base quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance and luminous intensity. At its 94th meeting in October 2005, the International Committee for Weights and Measures (CIPM) adopted a recommendation on preparative steps towards redefining the kilogram, ampere, kelvin and mole so that these units are linked to exactly known values of fundamental constants. We propose here that these four base units should be given new definitions linking them to exactly defined values of the Planck constant h, elementary charge e, Boltzmann constant k and Avogadro constant NA, respectively. This would mean that six of the seven base units of the SI would be defined in terms of true invariants of nature. In addition, not only would these four fundamental constants have exactly defined values but also the uncertainties of many of the other fundamental constants of physics would be either eliminated or appreciably reduced. In this paper we present the background and discuss the merits of these proposed changes, and we also present possible wordings for the four new definitions. We also suggest a novel way to define the entire SI explicitly using such definitions without making any distinction between base units and derived units. We list a number of key points that should be addressed when the new definitions are adopted by the General Conference on Weights and Measures (CGPM), possibly by the 24th CGPM in 2011, and we discuss the implications of these changes for other aspects of metrology.
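
As a rough worked illustration of what linking the kilogram to an exactly known value of the Planck constant means (the symbol h_0 below stands for whatever exact numerical value is adopted; it is not taken from the paper): since h has the unit kg m^2 s^-1, fixing its value exactly turns the relation around so that the kilogram is expressed in terms of h, the metre and the second,

    h = h_0 \times 10^{-34}\ \mathrm{kg\,m^2\,s^{-1}} \;(\text{exact})
    \quad\Longrightarrow\quad
    1\ \mathrm{kg} = \frac{h}{h_0 \times 10^{-34}\ \mathrm{m^2\,s^{-1}}},

and the metre and second are themselves already defined through the speed of light and the caesium hyperfine frequency.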

Relevance:

10.00%

Publisher:

Abstract:

The kilogram, the base unit of mass in the International System of Units (SI), is defined as the mass m(K) of the international prototype of the kilogram. Clearly, this definition has the effect of fixing the value of m(K) to be one kilogram exactly. In this paper, we review the benefits that would accrue if the kilogram were redefined so as to fix the value of either the Planck constant h or the Avogadro constant NA instead of m(K), without waiting for the experiments to determine h or NA currently underway to reach their desired relative standard uncertainty of about 10−8. A significant reduction in the uncertainties of the SI values of many other fundamental constants would result from either of these new definitions, at the expense of making the mass m(K) of the international prototype a quantity whose value would have to be determined by experiment. However, by assigning a conventional value to m(K), the present highly precise worldwide uniformity of mass standards could still be retained. The advantages of redefining the kilogram immediately outweigh any apparent disadvantages, and we review the alternative forms that a new definition might take.
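
For the alternative of fixing the Avogadro constant, a short worked relation (stated under the then-current SI convention that the molar mass of carbon-12 is exactly 12 g/mol; this illustration is not taken from the paper) is

    m(^{12}\mathrm{C}) = \frac{0.012\ \mathrm{kg\,mol^{-1}}}{N_\mathrm{A}},

so an exact value of N_A fixes the mass of a carbon-12 atom in kilograms, and the kilogram could in principle be realized by assembling a counted number of atoms (about 5.0 x 10^25 carbon-12 atoms), which is what atom-counting experiments such as the silicon-sphere determinations of N_A aim to do.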

Relevance:

10.00%

Publisher:

Abstract:

The High Resolution Dynamics Limb Sounder is described, with particular reference to the atmospheric measurements to be made and the rationale behind the measurement strategy. The demands this strategy places on the filters to be used in the instrument, and the designs to which this leads, are described. A second set of filters at an intermediate image plane, introduced to reduce "ghost imaging", is discussed together with their required spectral properties. A method is described in which the spectral characteristics of the primary and secondary filters in each channel are combined with the spectral responses of the detectors and other optical elements to obtain the system spectral response, weighted appropriately for the Planck function and atmospheric limb absorption. This method is used to demonstrate whether the out-of-band spectral blocking requirement for a channel is being met, and an example calculation showing how the blocking is built up for a representative channel is presented. Finally, the techniques used to produce filters of the necessary sub-millimetre sizes are discussed, together with the testing methods and procedures used to assess environmental durability and establish space-flight quality.
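
The described combination of filter, detector and optics responses, weighted by the Planck function and an atmospheric limb spectrum to check out-of-band blocking, can be sketched as follows. This is an illustrative outline only, not the HIRDLS processing code: the variable names, the use of a single representative atmospheric temperature, and the use of a limb transmission spectrum as the atmospheric weighting are assumptions.

    import numpy as np

    H = 6.62607015e-34   # Planck constant (J s)
    C = 2.99792458e8     # speed of light (m/s)
    KB = 1.380649e-23    # Boltzmann constant (J/K)

    def planck_radiance(wavenumber_m, temperature):
        """Planck spectral radiance per unit wavenumber (wavenumber in m^-1)."""
        return (2.0 * H * C ** 2 * wavenumber_m ** 3
                / np.expm1(H * C * wavenumber_m / (KB * temperature)))

    def out_of_band_fraction(primary, secondary, detector, optics,
                             wavenumber_cm, band_lo, band_hi, t_atm, limb_trans):
        """Fraction of the Planck- and limb-weighted system signal that arrives
        from outside the nominal passband [band_lo, band_hi] (wavenumbers in cm^-1).
        All inputs are arrays on the same wavenumber grid."""
        response = primary * secondary * detector * optics      # system spectral response
        weighted = response * planck_radiance(wavenumber_cm * 100.0, t_atm) * limb_trans
        in_band = (wavenumber_cm >= band_lo) & (wavenumber_cm <= band_hi)
        total = np.trapz(weighted, wavenumber_cm)
        inside = np.trapz(np.where(in_band, weighted, 0.0), wavenumber_cm)
        return 1.0 - inside / total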

Relevance:

10.00%

Publisher:

Abstract:

A series of model experiments with the coupled Max-Planck-Institute ECHAM5/OM climate model have been investigated and compared with microwave measurements from the Microwave Sounding Unit (MSU) and reanalysis data for the period 1979–2008. The evaluation is carried out by computing the Temperature in the Lower Troposphere (TLT) and the Temperature in the Middle Troposphere (TMT) using the MSU weights from both the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS), restricting the study primarily to the tropical oceans. When forced by analysed sea surface temperatures, the model accurately reproduces the time evolution of the mean outgoing tropospheric microwave radiation, especially over the tropical oceans, but with a minor bias towards higher temperatures in the upper troposphere. The latest reanalysis data from the 25-year Japanese Re-Analysis (JRA-25) and the European Centre for Medium-Range Weather Forecasts Interim Reanalysis are in very close agreement with the time evolution of the MSU data, with correlations of 0.98 and 0.96, respectively. The reanalysis trends are similar to the trends obtained from UAH but smaller than the trends from RSS. Comparison of TLT, computed from observations from UAH and RSS, with sea surface temperature indicates that RSS has a warm bias after 1993. In order to assess the significance of the tropospheric linear temperature trends, we determined the natural variability of 30-year trends from a 500-year control integration of the coupled ECHAM5 model. The model exhibits natural unforced variations of the 30-year tropospheric trend within ±0.2 K/decade for the tropical oceans. This general result is supported by similar results from the Geophysical Fluid Dynamics Laboratory (GFDL) coupled climate model. Present MSU observations from UAH for the period 1979–2008 are well within this range, but RSS is close to the upper positive limit of this variability. We have also compared the trend of the vertical lapse rate over the tropical oceans, assuming that the difference between TLT and TMT is an approximate measure of the lapse rate. The TLT–TMT trend is larger in both the measurements and in JRA-25 than in the model runs, by 0.04–0.06 K/decade. Furthermore, all 30-year TLT–TMT trends of the unforced 500-year integration vary within ±0.03 K/decade, suggesting that the models have a minor systematic warm bias in the upper troposphere.
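
The MSU-equivalent temperatures (TLT, TMT) and their trends described above amount to a vertically weighted average of model temperatures followed by a least-squares fit; a minimal sketch is given below. The actual UAH and RSS weighting functions are not reproduced here, and the array names are illustrative.

    import numpy as np

    def msu_equivalent_temperature(temp_profiles, weights):
        """Vertically weighted model temperature emulating an MSU channel
        (TLT or TMT): `temp_profiles` has shape (time, level) and `weights`
        is the per-level weighting function, normalised here to sum to one."""
        return temp_profiles @ (weights / weights.sum())

    def trend_k_per_decade(monthly_series):
        """Least-squares linear trend of a monthly time series, in K/decade."""
        t_years = np.arange(monthly_series.size) / 12.0
        slope, _ = np.polyfit(t_years, monthly_series, 1)
        return slope * 10.0

    # The lapse-rate proxy discussed above is then simply
    #   trend_k_per_decade(tlt_series) - trend_k_per_decade(tmt_series).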

Relevance:

10.00%

Publisher:

Abstract:

We review the proposal of the International Committee for Weights and Measures (Comité International des Poids et Mesures, CIPM), currently being considered by the General Conference on Weights and Measures (Conférence Générale des Poids et Mesures, CGPM), to revise the International System of Units (Le Système International d'Unités, SI). The proposal includes new definitions for four of the seven base units of the SI, and a new form of words to present the definitions of all the units. The objective of the proposed changes is to adopt definitions referenced to constants of nature, taken in the widest sense, so that the definitions may be based on what are believed to be true invariants. In particular, whereas in the current SI the kilogram, ampere, kelvin and mole are linked to exact numerical values of the mass of the international prototype of the kilogram, the magnetic constant (permeability of vacuum), the triple-point temperature of water and the molar mass of carbon-12, respectively, in the new SI these units are linked to exact numerical values of the Planck constant, the elementary charge, the Boltzmann constant and the Avogadro constant, respectively. The new wording expresses the definitions in a simple and unambiguous manner, without the need for the distinction between base and derived units. The importance of relations among the fundamental constants to the definitions, and the importance of establishing a mise en pratique for the realization of each definition, are also discussed.

Relevance:

10.00%

Publisher:

Abstract:

Model differences in projections of extratropical regional climate change due to increasing greenhouse gases are investigated using two atmospheric general circulation models (AGCMs): ECHAM4 (Max Planck Institute, version 4) and CCM3 (National Center for Atmospheric Research Community Climate Model version 3). Sea-surface temperature (SST) fields calculated from observations and coupled versions of the two models are used to force each AGCM in experiments based on time-slice methodology. Results from the forced AGCMs are then compared to coupled model results from the Coupled Model Intercomparison Project 2 (CMIP2) database. The time-slice methodology is verified by showing that the response of each model to doubled CO2 and SST forcing from the CMIP2 experiments is consistent with the results of the coupled GCMs. The differences in the responses of the models are attributed to (1) the different tropical SST warmings in the coupled simulations and (2) the different atmospheric model responses to the same tropical SST warmings. Both are found to have important contributions to differences in implied Northern Hemisphere (NH) winter extratropical regional 500 mb height and tropical precipitation climate changes. Forced teleconnection patterns from tropical SST differences are primarily responsible for sensitivity differences in the extratropical North Pacific, but have relatively little impact on the North Atlantic. There are also significant differences in the extratropical response of the models to the same tropical SST anomalies due to differences in numerical and physical parameterizations. Differences due to parameterizations dominate in the North Atlantic. Differences in the control climates of the two coupled models from the current climate, in particular for the coupled model containing CCM3, are also demonstrated to be important in leading to differences in extratropical regional sensitivity.