206 results for Numerical weather prediction


Relevance:

90.00%

Publisher:

Abstract:

Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
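Stochastic parameterizations of the kind surveyed above often take the form of multiplicative noise with temporal memory. As a minimal sketch (not any operational scheme), the following perturbs a parameterized tendency with a mean-zero AR(1) process; the function name and all parameter values are illustrative assumptions:

```python
import numpy as np

# Minimal sketch of a stochastically perturbed tendency: the deterministic
# tendency is multiplied by (1 + r), where r follows a mean-zero AR(1)
# process with decorrelation time tau. Parameter values are illustrative,
# not taken from any operational model.
def ar1_multiplier(n_steps, tau=6.0, dt=1.0, sigma=0.3, seed=0):
    """Return the factor (1 + r_t) for an AR(1) process r."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)                    # autoregressive coefficient
    noise_std = sigma * np.sqrt(1.0 - phi**2)  # keeps Var(r) = sigma**2
    r = np.empty(n_steps)
    r[0] = rng.normal(0.0, sigma)
    for t in range(1, n_steps):
        r[t] = phi * r[t - 1] + rng.normal(0.0, noise_std)
    return 1.0 + r

# Perturb a hypothetical deterministic tendency time series:
det_tendency = np.full(100, 2.0)               # e.g. a heating rate in K/day
stoch_tendency = det_tendency * ar1_multiplier(100)
```

The exponential decay of the AR(1) coefficient gives the perturbations a finite memory, one of the non-Markovian ingredients the abstract mentions in its simplest form.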

Relevance:

90.00%

Publisher:

Abstract:

The Helsinki Urban Boundary-Layer Atmosphere Network (UrBAN: http://urban.fmi.fi) is a dedicated research-grade observational network where the physical processes in the atmosphere above the city are studied. Helsinki UrBAN is the most poleward intensive urban research observation network in the world and thus will allow studying some unique features such as strong seasonality. The network's key purpose is for the understanding of the physical processes in the urban boundary layer and associated fluxes of heat, momentum, moisture, and other gases. A further purpose is to secure a research-grade database, which can be used internationally to validate and develop numerical models of air quality and weather prediction. Scintillometers, a scanning Doppler lidar, ceilometers, a sodar, eddy-covariance stations, and radiometers are used. This equipment is supplemented by auxiliary measurements, which were primarily set up for general weather and/or air-quality mandatory purposes, such as vertical soundings and the operational Doppler radar network. Examples are presented as a testimony to the potential of the network for urban studies, such as (i) evidence of a stable boundary layer possibly coupled to an urban surface, (ii) the comparison of scintillometer data with sonic anemometry above an urban surface, (iii) the application of scanning lidar over a city, and (iv) combination of sodar and lidar to give a fuller range of sampling heights for boundary layer profiling.

Relevance:

80.00%

Publisher:

Abstract:

Analysis of the vertical velocity of ice crystals observed with a 1.5 μm Doppler lidar from a continuous sample of stratiform ice clouds over 17 months shows that the distribution of Doppler velocity varies strongly with temperature, with mean velocities of 0.2 m/s at −40 °C, increasing to 0.6 m/s at −10 °C due to particle growth and broadening of the size spectrum. We examine the likely influence of crystals smaller than 60 μm by forward modelling their effect on the area-weighted fall speed, and comparing the results to the lidar observations. The comparison strongly suggests that the concentration of small crystals in most clouds is much lower than measured in situ by some cloud droplet probes. We argue that the discrepancy is likely due to shattering of large crystals on the probe inlet, and that numerous small particles should not be included in numerical weather and climate model parameterizations.
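The area-weighted fall speed central to the forward modelling above can be sketched as follows. The crystal sizes, fall speeds and number concentrations below are invented for illustration, not taken from the paper; the point is only that a large population of small crystals pulls the area-weighted mean down:

```python
import numpy as np

# Illustrative sketch (not the authors' code): area-weighted fall speed of a
# crystal size distribution, weighting each size bin by projected area times
# number concentration. All numbers below are made-up examples.
def area_weighted_fall_speed(diam_um, number_conc, fall_speed_ms):
    area = np.pi * (np.asarray(diam_um) / 2.0) ** 2   # projected area ~ D^2
    w = area * np.asarray(number_conc)
    return float(np.sum(w * np.asarray(fall_speed_ms)) / np.sum(w))

diam = np.array([30.0, 100.0, 300.0])       # crystal diameters (micron)
v    = np.array([0.05, 0.3, 0.8])           # fall speeds (m/s)
n_few_small  = np.array([1e3, 1e2, 1e1])    # modest small-crystal count
n_many_small = np.array([1e6, 1e2, 1e1])    # probe-like small-crystal count

v_few  = area_weighted_fall_speed(diam, n_few_small, v)
v_many = area_weighted_fall_speed(diam, n_many_small, v)
# v_many < v_few: numerous small crystals lower the mean Doppler velocity
```

Comparing such forward-modelled values against the lidar-observed velocity distribution is how the abstract constrains the plausible small-crystal concentration.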

Relevance:

80.00%

Publisher:

Abstract:

A life cycle of the Madden–Julian oscillation (MJO) was constructed, based on 21 years of outgoing long-wave radiation data. Regression maps of NCEP–NCAR reanalysis data for the northern winter show statistically significant upper-tropospheric equatorial wave patterns linked to the tropical convection anomalies, and extratropical wave patterns over the North Pacific, North America, the Atlantic, the Southern Ocean and South America. To assess the cause of the circulation anomalies, a global primitive-equation model was initialized with the observed three-dimensional (3D) winter climatological mean flow and forced with a time-dependent heat source derived from the observed MJO anomalies. A model MJO cycle was constructed from the global response to the heating, and both the tropical and extratropical circulation anomalies generally matched the observations well. The equatorial wave patterns are established in a few days, while it takes approximately two weeks for the extratropical patterns to appear. The model response is robust and insensitive to realistic changes in damping and basic state. The model tropical anomalies are consistent with a forced equatorial Rossby–Kelvin wave response to the tropical MJO heating, although it is shifted westward by approximately 20° longitude relative to observations. This may be due to a lack of damping processes (cumulus friction) in the regions of convective heating. Once this shift is accounted for, the extratropical response is consistent with theories of Rossby wave forcing and dispersion on the climatological flow, and the pattern correlation between the observed and modelled extratropical flow is up to 0.85. The observed tropical and extratropical wave patterns account for a significant fraction of the intraseasonal circulation variance, and this reproducibility as a response to tropical MJO convection has implications for global medium-range weather prediction. Copyright © 2004 Royal Meteorological Society
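The pattern correlation quoted above (up to 0.85) is a standard centered spatial correlation between two fields. A minimal sketch, with toy arrays standing in for the observed and modelled extratropical flow anomalies:

```python
import numpy as np

# Centered pattern correlation between two gridded fields: remove each
# field's spatial mean, then correlate. A value near 1 means the spatial
# patterns agree closely. Fields here are toy data, not MJO output.
def pattern_correlation(obs, model):
    a = np.ravel(obs) - np.mean(obs)       # spatial anomalies of observation
    b = np.ravel(model) - np.mean(model)   # spatial anomalies of model
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))
```

Note that area weighting (e.g. by cos(latitude)) is normally applied on a latitude–longitude grid; it is omitted here for brevity.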

Relevance:

80.00%

Publisher:

Abstract:

Real-time rainfall monitoring in Africa is of great practical importance for operational applications in hydrology and agriculture. Satellite data have been used in this context for many years because of the lack of surface observations. This paper describes an improved artificial neural network algorithm for operational applications. The algorithm combines numerical weather model information with the satellite data. Using this algorithm, daily rainfall estimates were derived for 4 yr of the Ethiopian and Zambian main rainy seasons and were compared with two other algorithms: a multiple linear regression making use of the same information as the neural network, and a satellite-only method. All algorithms were validated against rain gauge data. Overall, the neural network performs best, but the extent to which it does so depends on the calibration/validation protocol. The advantages of the neural network are most evident when calibration data are numerous and close in space and time to the validation data. This result emphasizes the importance of a real-time calibration system.
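The gain from combining weather-model information with satellite data can be illustrated with synthetic data. The predictors, coefficients and noise level below are invented; a plain least-squares regression stands in for both the linear and neural-network estimators:

```python
import numpy as np

# Toy illustration of the comparison in the abstract: a satellite-only linear
# estimate versus a multiple regression that also uses a weather-model
# predictor. The synthetic "truth" mixes both signals, so the combined
# regression fits better. All data and coefficients are invented.
rng = np.random.default_rng(42)
n = 500
sat = rng.random(n)           # satellite predictor (e.g. cloud-top information)
nwp = rng.random(n)           # weather-model predictor (e.g. humidity)
rain = 2.0 * sat + 1.5 * nwp + rng.normal(0.0, 0.2, n)  # synthetic daily rain

def lstsq_rmse(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sqrt(np.mean((X @ coef - y) ** 2)))

X_sat  = np.column_stack([np.ones(n), sat])
X_both = np.column_stack([np.ones(n), sat, nwp])
rmse_sat  = lstsq_rmse(X_sat, rain)
rmse_both = lstsq_rmse(X_both, rain)
# rmse_both < rmse_sat: adding model information improves the estimate
```

A neural network would replace the linear map with a nonlinear one, but the validation logic against gauge data is the same.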

Relevance:

80.00%

Publisher:

Abstract:

Three existing models of Interplanetary Coronal Mass Ejection (ICME) transit between the Sun and the Earth are compared to coronagraph and in situ observations: all three models are found to perform with a similar level of accuracy (i.e. an average error between observed and predicted 1 AU transit times of approximately 11 h). To improve long-term space weather prediction, factors influencing CME transit are investigated. Both the removal of the plane-of-sky projection (as suffered by coronagraph-derived speeds of Earth-directed CMEs) and the use of observed values of solar wind speed fail to significantly improve transit time prediction. However, a correlation is found to exist between the late/early arrival of an ICME and the width of the preceding sheath region, suggesting that the error is a geometrical effect that can only be removed by a more accurate determination of a CME trajectory and expansion. The correlation between magnetic field intensity and speed of ejecta at 1 AU is also investigated. It is found to be weak in the body of the ICME, but strong in the sheath, if the upstream solar wind conditions are taken into account.

Relevance:

80.00%

Publisher:

Abstract:

The one-dimensional variational assimilation of vertical temperature information in the presence of a boundary-layer capping inversion is studied. For an optimal analysis of the vertical temperature profile, an accurate representation of the background error covariances is essential. The background error covariances are highly flow-dependent due to the variability in the presence, structure and height of the boundary-layer capping inversion. Flow-dependent estimates of the background error covariances are shown by studying the spread in an ensemble of forecasts. A forecast of the temperature profile (used as a background state) may have a significant error in the position of the capping inversion with respect to observations. It is shown that the assimilation of observations may weaken the inversion structure in the analysis if only magnitude errors are accounted for as is the case for traditional data assimilation methods used for operational weather prediction. The positional error is treated explicitly here in a new data assimilation scheme to reduce positional error, in addition to the traditional framework to reduce magnitude error. The distribution of the positional error of the background inversion is estimated for use with the new scheme.
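A flow-dependent background-error covariance can be estimated directly from ensemble spread, as described above. A minimal sketch with a synthetic ensemble of temperature profiles (the profile shape, spread and ensemble size are illustrative assumptions):

```python
import numpy as np

# Sketch of a flow-dependent background-error covariance estimated from an
# ensemble of temperature-profile forecasts: the sample covariance over
# members. Near a capping inversion, member-to-member displacement of the
# inversion would inflate this covariance. The ensemble here is synthetic.
def ensemble_covariance(members):
    """members: (n_members, n_levels) array of temperature profiles."""
    X = np.asarray(members, float)
    anom = X - X.mean(axis=0)                  # deviations from ensemble mean
    return anom.T @ anom / (X.shape[0] - 1)    # (n_levels, n_levels) B matrix

rng = np.random.default_rng(0)
base = np.linspace(288.0, 278.0, 20)                # idealized profile (K)
ens = base + rng.normal(0.0, 0.5, size=(50, 20))    # 50 perturbed members
B = ensemble_covariance(ens)   # diag(B) gives level-by-level error variance
```

The abstract's key point is that such magnitude statistics alone miss positional error in the inversion height, which the new scheme treats explicitly.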

Relevance:

80.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and intermittent sampling. In a second step the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at seasonal scale, but the models generally manage to well reproduce the observed seasonal variations in cloud occurrence.
Overall, models do not generate the same cloud fraction distributions and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.

Relevance:

80.00%

Publisher:

Abstract:

A Kriging interpolation method is combined with an object-based evaluation measure to assess the ability of the UK Met Office's dispersion and weather prediction models to predict the evolution of a plume of tracer as it was transported across Europe. The object-based evaluation method, SAL, considers aspects of the Structure, Amplitude and Location of the pollutant field. The SAL method is able to quantify errors in the predicted size and shape of the pollutant plume, through the structure component, the over- or under-prediction of the pollutant concentrations, through the amplitude component, and the position of the pollutant plume, through the location component. The quantitative results of the SAL evaluation are similar for both models and close to a subjective visual inspection of the predictions. A negative structure component for both models, throughout the entire 60 hour plume dispersion simulation, indicates that the modelled plumes are too small and/or too peaked compared to the observed plume at all times. The amplitude component for both models is strongly positive at the start of the simulation, indicating that surface concentrations are over-predicted by both models for the first 24 hours, but modelled concentrations are within a factor of 2 of the observations at later times. Finally, for both models, the location component is small for the first 48 hours after the start of the tracer release, indicating that the modelled plumes are situated close to the observed plume early on in the simulation, but this plume location error grows at later times. The SAL methodology has also been used to identify differences in the transport of pollution in the dispersion and weather prediction models. 
The convection scheme in the weather prediction model is found to transport more pollution vertically out of the boundary layer into the free troposphere than the dispersion model convection scheme, resulting in lower pollutant concentrations near the surface and hence a better forecast for this case study.
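The amplitude and location components of the SAL measure described above can be sketched directly. This simplified version (assumed gridded concentration fields, unit grid spacing) omits the object-based structure component for brevity:

```python
import numpy as np

# Simplified sketch of the amplitude (A) and location (L) components of SAL.
# A compares domain-mean concentrations; L compares centre-of-mass positions,
# normalized by the domain diagonal. The object-based structure component is
# omitted, and the fields used in the test are toy data.
def sal_amplitude(mod, obs):
    m, o = np.mean(mod), np.mean(obs)
    return float((m - o) / (0.5 * (m + o)))    # in [-2, 2]; > 0 over-prediction

def sal_location(mod, obs):
    def centre_of_mass(f):
        f = np.asarray(f, float)
        iy, ix = np.indices(f.shape)
        return np.array([np.sum(iy * f), np.sum(ix * f)]) / np.sum(f)
    ny, nx = np.shape(mod)
    diag = np.hypot(ny - 1, nx - 1)            # domain diagonal length
    return float(np.linalg.norm(centre_of_mass(mod) - centre_of_mass(obs)) / diag)
```

With these definitions, the negative structure and positive amplitude values reported above translate directly into "plumes too small/peaked" and "surface concentrations over-predicted".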

Relevance:

80.00%

Publisher:

Abstract:

In most near-infrared atmospheric windows, absorption of solar radiation is dominated by the water vapor self-continuum, and yet there is a paucity of measurements in these windows. We report new laboratory measurements of the self-continuum absorption at temperatures between 293 and 472 K and pressures from 0.015 to 5 atm in four near-infrared windows between 1 and 4 μm (10000–2500 cm−1); the measurements are made over a wider range of wavenumbers, temperatures and pressures than any previous measurements. They show that the self-continuum in these windows is typically one order of magnitude stronger than given in representations of the continuum widely used in climate and weather prediction models. These results are also not consistent with current theories attributing the self-continuum within windows to the far wings of strong spectral lines in the nearby water vapor absorption bands; we suggest that they are more consistent with water dimers being the major contributor to the continuum. The calculated global-average clear-sky atmospheric absorption of solar radiation is increased by 0.75 W m−2 (about 1% of the total clear-sky absorption) by using these new measurements as compared to calculations with the MT_CKD-2.5 self-continuum model.

Relevance:

80.00%

Publisher:

Abstract:

For a long time, it has been believed that atmospheric absorption of radiation within wavelength regions of relatively high infrared transmittance (so-called ‘windows’) was dominated by the water vapour self-continuum, that is, spectrally smooth absorption caused by H2O−H2O pair interaction. Absorption due to the foreign continuum (i.e. caused mostly by H2O−N2 bimolecular absorption in the Earth's atmosphere) was considered to be negligible in the windows. We report new retrievals of the water vapour foreign continuum from high-resolution laboratory measurements at temperatures between 350 and 430 K in four near-infrared windows between 1.1 and 5 μm (9000–2000 cm−1). Our results indicate that the foreign continuum in these windows has a very weak temperature dependence and is typically between one and two orders of magnitude stronger than that given in representations of the continuum currently used in many climate and weather prediction models. This indicates that absorption owing to the foreign continuum may be comparable to the self-continuum under atmospheric conditions in the investigated windows. The calculated global-average clear-sky atmospheric absorption of solar radiation is increased by approximately 0.46 W m−2 (or 0.6% of the total clear-sky absorption) by using these new measurements when compared with calculations applying the widely used MTCKD (Mlawer–Tobin–Clough–Kneizys–Davies) foreign-continuum model.

Relevance:

80.00%

Publisher:

Abstract:

The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess statistical consistency of a historical dataset is the rank histogram, which uses as a criterion the number of times that the verification falls between pairs of members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles that satisfy a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent. The apparent inconsistency is merely a result of stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove artifacts that stratification induces in the rank histogram are suggested.
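The rank histogram described above can be sketched in a few lines: for each case, count how many ensemble members fall below the verification, then histogram those ranks. For a statistically consistent ensemble the histogram is approximately flat; the data below are synthetic:

```python
import numpy as np

# Sketch of a rank histogram: the verification's rank within each ordered
# ensemble, accumulated over many cases. For a consistent Monte Carlo
# ensemble the ranks are uniform on 0..n_members. Synthetic data; ties are
# ignored, which is safe for continuous variables.
def rank_histogram(ensembles, verifications):
    """ensembles: (n_cases, n_members); verifications: (n_cases,)."""
    E = np.asarray(ensembles)
    v = np.asarray(verifications)[:, None]
    ranks = np.sum(E < v, axis=1)          # rank of verification in each case
    return np.bincount(ranks, minlength=E.shape[1] + 1)

rng = np.random.default_rng(0)
ens = rng.normal(size=(5000, 9))           # consistent 9-member ensemble
obs = rng.normal(size=5000)                # drawn from the same distribution
hist = rank_histogram(ens, obs)            # 10 bins, approximately flat
```

Stratifying the 5000 cases by some condition on the ensemble and histogramming each subset separately would illustrate the paper's point: the stratified histograms need not be flat even though the full one is.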

Relevance:

80.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. 
Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level, an investigation that is badly needed but severely constrained by current computing power. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.