966 results for Numerical Weather Prediction


Relevance: 100.00%

Abstract:

In many problems in spatial statistics it is necessary to infer a global problem solution by combining local models. A principled approach to this problem is to develop a global probabilistic model for the relationships between local variables and to use this as the prior in a Bayesian inference procedure. We show how a Gaussian process with hyper-parameters estimated from Numerical Weather Prediction Models yields meteorologically convincing wind fields. We use neural networks to make local estimates of wind vector probabilities. The resulting inference problem cannot be solved analytically, but Markov Chain Monte Carlo methods allow us to retrieve accurate wind fields.

Relevance: 100.00%

Abstract:

The ERS-1 satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscattered microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snapshots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish whether the wind is blowing toward or away from the sensor. This ambiguity implies a one-to-many mapping between scatterometer data and wind direction. Current operational methods retrieve wind vectors from satellite scatterometer data and then apply a disambiguation and filtering process that relies on numerical weather prediction models. The wind vectors are retrieved by local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data using mixture density networks, a principled method for modelling multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, which incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed.
The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
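The bi-modal conditional density at the heart of the thesis can be sketched with a two-component Gaussian mixture whose modes sit roughly 180 degrees apart, mirroring the toward/away ambiguity. The weights, means and spreads below are illustrative values, not the output of a trained mixture density network.

```python
import numpy as np

# Hedged sketch: p(direction | scatterometer data) as a two-component
# Gaussian mixture, mirroring the predominantly bi-modal structure the
# thesis exploits. All parameters here are invented for illustration.
def mixture_pdf(theta, weights, means, sds):
    comps = [w * np.exp(-0.5 * ((theta - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
             for w, m, s in zip(weights, means, sds)]
    return sum(comps)

theta = np.linspace(-np.pi, np.pi, 721)
# The two modes ~pi radians apart reflect the toward/away ambiguity.
pdf = mixture_pdf(theta, weights=[0.6, 0.4], means=[0.5, 0.5 - np.pi], sds=[0.3, 0.3])

# A MAP-style choice picks the heavier mode; disambiguation in the thesis
# instead uses the wind-field prior and posterior mode masses.
map_direction = theta[np.argmax(pdf)]
print(round(float(map_direction), 2))
```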

Relevance: 100.00%

Abstract:

Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, a Censored GP was used to model the relationship between the corrected wind speed and power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about 11% improvement in forecasting accuracy (Mean Absolute Error) over the ANN model on one dataset, and nearly 5% improvement on the other.
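The censoring issue the paper addresses can be shown in a few lines: observed power is capped at rated power by the turbine controller, so a plain regression on the observations is biased near the cap, while a censored likelihood treats capped points as inequality observations. The latent power curve and rated power below are invented, and no actual GP is fitted; this only illustrates the data structure a Censored GP is designed for.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): a wind-to-power
# mapping with an upper censoring limit at rated power.
rated_power = 1.0  # normalized rated power (assumed)

def latent_power(speed):
    """Hypothetical smooth latent curve a GP might learn (cubic rise)."""
    return 0.02 * speed ** 3

speeds = np.linspace(0.0, 6.0, 13)
latent = latent_power(speeds)
observed = np.minimum(latent, rated_power)  # turbine control caps the output

# A plain GP regressing on `observed` would flatten its estimate near the
# cap; a Censored GP keeps the latent curve and censors the likelihood.
capped = observed >= rated_power
print(int(capped.sum()), float(observed.max()))
```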

Relevance: 100.00%

Abstract:

Ocean wind retrievals from satellite sensors are typically performed for the standard level of 10 m. This restricts their full exploitation for wind energy planning, which requires wind information at much higher levels where wind turbines operate. A new method is presented for the vertical extrapolation of satellite-based wind maps. Winds near the sea surface are obtained from satellite data and used together with an adaptation of the Monin–Obukhov similarity theory to estimate the wind speed at higher levels. The thermal stratification of the atmosphere is taken into account through a long-term stability correction that is based on numerical weather prediction (NWP) model outputs. The effect of the long-term stability correction on the wind profile is significant. The method is applied to Envisat Advanced Synthetic Aperture Radar scenes acquired over the south Baltic Sea. This leads to maps of the long-term stability correction and wind speed at a height of 100 m with a spatial resolution of 0.02°. Calculations of the corresponding wind power density and Weibull parameters are shown. Comparisons with mast observations reveal that NWP model outputs can correct successfully for long-term stability effects and also, to some extent, for the limited number of satellite samples. The satellite-based and NWP-simulated wind profiles are almost equally accurate with respect to those from the mast. However, the satellite-based maps have a higher spatial resolution, which is particularly important in nearshore areas where most offshore wind farms are built.
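The vertical extrapolation step described above follows the Monin-Obukhov form u(z) = (u*/kappa)[ln(z/z0) - psi(z/L)]. The sketch below assumes a constant sea-surface roughness length and folds the long-term stability correction into a single psi value, as a crude stand-in for the NWP-derived correction in the paper; the numbers are illustrative only.

```python
import numpy as np

KAPPA = 0.4  # von Karman constant

def extrapolate_wind(u10, z_target=100.0, z_ref=10.0, z0=0.0002, psi=0.0):
    """
    Extrapolate a 10 m satellite wind to z_target via a logarithmic profile
    plus a stability correction psi (a single long-term value standing in
    for the paper's NWP-based correction; psi=0 recovers neutral conditions).
    z0 is a typical open-sea roughness length in metres (assumed constant).
    """
    u_star = KAPPA * u10 / np.log(z_ref / z0)          # friction velocity from u10
    return (u_star / KAPPA) * (np.log(z_target / z0) - psi)

u100_neutral = extrapolate_wind(8.0)
u100_stable = extrapolate_wind(8.0, psi=-1.0)  # stable: psi < 0 increases shear
print(round(u100_neutral, 2), u100_stable > u100_neutral)
```

The sign convention matters: in stable conditions psi is negative, so the 100 m wind exceeds the neutral estimate, which is why ignoring long-term stability biases offshore resource estimates.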

Relevance: 100.00%

Abstract:

Increases in the resolution of numerical weather prediction models have allowed increasingly realistic forecasts of atmospheric parameters. Because of the growing variability of the predicted fields, traditional verification methods, which are based on grid-point-by-grid-point matching between observation and prediction, are not always able to describe model skill. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the MesoVICT international project, the initial aim of this work is to compare these new techniques, highlighting their advantages and disadvantages. First, the basic MesoVICT examples, represented by synthetic precipitation fields, were examined. Providing an error evaluation in terms of structure, amplitude and location of the precipitation fields, the SAL method was studied more thoroughly than the other approaches, through its implementation in the core cases of the project. The verification procedure concerned precipitation fields over central Europe: comparisons were made between forecasts from the 00z COSMO-2 model and the VERA (Vienna Enhanced Resolution Analysis). The study of these cases revealed some weaknesses of the methodology; in particular, a correlation between the optimal domain size and the extent of the precipitation systems was highlighted. In order to increase the ability of SAL, the original domain was subdivided into three subdomains and the method was applied again. Some limits were found in cases in which at least one of the two fields shows no precipitation. The overall results for the subdomains were summarized in scatter plots. To identify systematic errors of the model, the variability of the three parameters was studied for each subdomain.
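Two of the three SAL components can be computed directly from gridded fields: the amplitude component A = 2(F - O)/(F + O) from the domain-mean precipitation, and the first location component L1 from the normalized distance between the centres of mass. The sketch below uses synthetic fields in that spirit; the structure component S requires object identification and is omitted.

```python
import numpy as np

# Minimal sketch of the SAL amplitude (A) and first location (L1) components
# for a forecast/observation pair on a common grid; synthetic fields only.
def sal_amplitude(fcst, obs):
    f, o = fcst.mean(), obs.mean()
    return 2.0 * (f - o) / (f + o)  # in [-2, 2]; 0 means equal domain means

def sal_location1(fcst, obs, d_max):
    def com(field):  # precipitation centre of mass
        total = field.sum()
        idx = np.indices(field.shape)
        return np.array([(i * field).sum() / total for i in idx])
    return np.linalg.norm(com(fcst) - com(obs)) / d_max

obs = np.zeros((50, 50)); obs[20:30, 20:30] = 1.0
fcst = np.zeros((50, 50)); fcst[25:35, 25:35] = 2.0  # shifted and too intense
d_max = np.hypot(50, 50)  # largest distance in the domain

a_val = sal_amplitude(fcst, obs)
l1 = sal_location1(fcst, obs, d_max)
print(round(float(a_val), 3), round(float(l1), 3))
```

The zero-precipitation limitation noted in the abstract is visible in the formulas: both A and the centre of mass are undefined when one field sums to zero.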

Relevance: 100.00%

Abstract:

One of the fundamental questions in dynamical meteorology, and one of the basic objectives of GARP, is to determine the predictability of the atmosphere. In the early planning stage and preparation for GARP a number of theoretical and numerical studies were undertaken, indicating that there existed an inherent unpredictability in the atmosphere which even with the most ideal observing system would limit useful weather forecasting to 2-3 weeks.

Relevance: 100.00%

Abstract:

The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km2) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product, to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show the TIGGE database is a promising tool for forecasting flood inundation, comparable with forecasts driven by raingauge observations.
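The cascade idea (each ensemble precipitation member driving a hydrological model to yield a discharge distribution) can be sketched with a toy linear-reservoir rainfall-runoff model. The member count, gamma-distributed rainfall, and reservoir constant below are all invented; the paper's actual chain couples atmospheric, hydrologic and hydraulic models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sketch: propagate an ensemble precipitation forecast (e.g. TIGGE
# members) through a single linear reservoir to get a probabilistic
# discharge forecast. Catchment constants are invented.
def linear_reservoir(rain_mm, k=0.3, q0=0.0):
    """q[t] = (1-k)*q[t-1] + k*rain[t]: a crude storage-outflow model."""
    q = np.empty(len(rain_mm))
    state = q0
    for t, r in enumerate(rain_mm):
        state = (1.0 - k) * state + k * r
        q[t] = state
    return q

n_members, horizon = 20, 72  # hourly steps (assumed)
members = rng.gamma(shape=2.0, scale=1.5, size=(n_members, horizon))  # mm/h
discharge = np.array([linear_reservoir(m) for m in members])

# The member-to-member spread at the end of the horizon is the kind of
# uncertainty measure the abstract discusses.
spread = discharge[:, -1].max() - discharge[:, -1].min()
print(discharge.shape, spread > 0.0)
```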

Relevance: 90.00%

Abstract:

This article develops methods for spatially predicting daily change of dissolved oxygen (Dochange) at both sampled locations (134 freshwater sites in 2002 and 2003) and other locations of interest throughout a river network in South East Queensland, Australia. In order to deal with the relative sparseness of the monitoring locations in comparison to the number of locations where one might want to make predictions, we make a classification of the river and stream locations. We then implement optimal spatial prediction (ordinary and constrained kriging) from geostatistics. Because of their directed-tree structure, rivers and streams offer special challenges. A complete approach to spatial prediction on a river network is given, with special attention paid to environmental exceedances. The methodology is used to produce a map of Dochange predictions for 2003. Dochange is one of the variables measured as part of the Ecosystem Health Monitoring Program conducted within the Moreton Bay Waterways and Catchments Partnership.
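Ordinary kriging, one of the two predictors the article uses, solves a small linear system built from a variogram, with a Lagrange multiplier enforcing that the weights sum to one. The bare-bones 1-D version below uses Euclidean distance and an invented exponential variogram; the article's method additionally uses river-network structure and site classification, which are omitted here.

```python
import numpy as np

# Bare-bones ordinary kriging in 1-D with an exponential variogram.
def variogram(h, sill=1.0, rng_par=2.0):
    return sill * (1.0 - np.exp(-h / rng_par))

def ordinary_krige(xs, zs, x0):
    n = len(xs)
    # Kriging system: variogram matrix bordered by the unbiasedness constraint.
    G = np.ones((n + 1, n + 1))
    G[:n, :n] = variogram(np.abs(xs[:, None] - xs[None, :]))
    G[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = variogram(np.abs(xs - x0))
    w = np.linalg.solve(G, rhs)[:n]   # drop the Lagrange multiplier
    return w @ zs, w

xs = np.array([0.0, 1.0, 3.0])   # monitoring locations (illustrative)
zs = np.array([2.0, 2.5, 1.0])   # observed values (illustrative)
pred, w = ordinary_krige(xs, zs, 0.5)
print(round(float(pred), 3), round(float(w.sum()), 6))  # weights sum to 1
```

Constrained kriging, also used in the article, adds further constraints to reduce the smoothing bias that matters when predicting exceedances of environmental thresholds.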

Relevance: 90.00%

Abstract:

This thesis covers three subject areas concerning particulate matter in urban air quality: 1) analysis of measured particulate matter mass concentrations in the Helsinki Metropolitan Area (HMA) at different locations relative to traffic sources, and at different times of year and day; 2) the evolution of the number concentrations and sizes of traffic-exhaust-originated particulate matter at the local street scale, studied with a combination of a dispersion model and an aerosol process model; 3) analysis of some high particulate matter concentration situations with regard to their meteorological origins, especially temperature inversions, in the HMA and three other European cities. The prediction of the occurrence of meteorological conditions conducive to elevated particulate matter concentrations in the studied cities is examined, and the performance of current numerical weather forecasting models in air pollution episode situations is considered. The study of the ambient measurements revealed a clear diurnal variation of the PM10 concentrations at the HMA measurement sites, irrespective of the year and the season. The diurnal variation of local vehicular traffic flows showed no substantial correlation with the PM2.5 concentrations, indicating that the PM10 concentrations originated mainly from local vehicular traffic (direct emissions and suspension), while the PM2.5 concentrations were mostly of regional and long-range transported origin. The modelling study of traffic exhaust dispersion and transformation showed that the number concentrations of particles originating from street traffic exhaust undergo a substantial change during the first tens of seconds after being emitted from the vehicle tailpipe. The dilution process was shown to dominate the total number concentrations; condensation and coagulation had only a minimal effect on the Aitken-mode number concentrations.
The air pollution episodes included were chosen on the basis of occurring in either winter or spring and having an at least partly local origin. In the HMA, air pollution episodes were shown to be linked to predominantly stable atmospheric conditions with high atmospheric pressure and low wind speeds, in conjunction with relatively low ambient temperatures. For the other European cities studied, the best meteorological predictors for elevated PM10 concentrations were shown to be the temporal (hourly) evolution of temperature inversions, stable atmospheric stratification and, in some cases, wind speed. Concerning weather prediction during particulate-matter-related air pollution episodes, the studied models were found to overpredict pollutant dispersion, leading to underprediction of pollutant concentration levels.

Relevance: 90.00%

Abstract:

Many meteorological phenomena occur at different locations simultaneously and vary temporally and spatially. Tracking these multiple phenomena is essential for accurate weather prediction. Efficient analysis requires high-resolution simulations, which can be conducted by introducing finer-resolution nested simulations (nests) at the locations of these phenomena. Simultaneously tracking multiple weather phenomena requires simultaneous execution of the nests on different subsets of the processors available to the main weather simulation. Dynamic variation in the number of these nests requires efficient processor-reallocation strategies. In this paper, we develop strategies for efficient partitioning and repartitioning of the nests among the processors. As a case study, we consider tracking multiple organized cloud clusters in tropical weather systems. We first present a parallel data analysis algorithm to detect such clouds. We then develop a tree-based hierarchical diffusion method that reallocates processors to the nests at low redistribution cost, achieved by a novel tree-reorganization approach. We show that our approach exhibits up to 25% lower redistribution cost and 53% fewer hop-bytes than a processor-reallocation strategy that does not consider the existing processor allocation.

Relevance: 90.00%

Abstract:

I. Foehn winds of southern California.
An investigation of the hot, dry, dust-laden winds occurring in the late fall and early winter in the Los Angeles Basin, attributed in the past to the influence of the desert regions to the north, revealed that these currents are of a foehn nature. Their properties were found to be entirely due to dynamical heating produced in the descent from the high-level areas of the interior to the lower Los Angeles Basin. Any dust associated with the phenomenon was found to be acquired from the Los Angeles area rather than transported from the desert. It was found that the frequency of occurrence of a mild foehn of this type during this season is sufficient to warrant its classification as a winter monsoon. This results from the topography of the Los Angeles region, which allows easy entrance of air from the interior through the low-level mountain passes north of the area. This monsoon provides the mild winter climate of southern California, since temperatures associated with the foehn currents are far higher than those experienced when maritime air from the adjacent Pacific Ocean occupies the region.

II. Foehn wind cyclogenesis.
Intense anticyclones frequently build up over the high-level regions of the Great Basin and Columbia Plateau, which lie between the Sierra Nevada and Cascade Mountains to the west and the Rocky Mountains to the east. The outflow from these anticyclones produces extensive foehns east of the Rockies in the comparatively low-level areas of the middle west and the Canadian provinces of Alberta and Saskatchewan. Normally at this season of the year very cold polar continental air masses are present over this territory, and with the occurrence of these foehns marked discontinuity surfaces arise between the warm foehn current, which is obliged to slide over a colder mass, and the Pc air to the east. Cyclones are easily produced from this phenomenon and take the form of unstable waves which propagate along the discontinuity surface between the two dissimilar masses. A continual series of such cyclones was found to occur as long as the Great Basin anticyclone is maintained with undiminished intensity.

III. Weather conditions associated with the Akron disaster.
This situation illustrates the speedy development and propagation of young disturbances in the eastern United States during the spring of the year under the influence of the conditionally unstable tropical maritime air masses which characterise the region. It also furnishes an excellent example of the superiority of air mass and frontal methods of weather prediction for aircraft operation over the older methods based upon pressure distribution.

IV. The Los Angeles storm of December 30, 1933 to January 1, 1934.
This discussion points out some of the fundamental interactions occurring between air masses of the North Pacific Ocean in connection with Pacific Coast storms and the value of topographic and aerological considerations in predicting them. Estimates of rainfall intensity and duration from analyses of this type may be made and would prove very valuable in the Los Angeles area in connection with flood control problems.

Relevance: 90.00%

Abstract:

This dissertation presents results from the application of adaptive filters, using the NLMS (Normalized Least Mean Square) and RLS (Recursive Least Square) algorithms, to reduce biases in climate forecasts. The discrepancies between the true state of the atmosphere and the state predicted by a numerical model tend to grow over the integration period. The Eta atmospheric model is used operationally for numerical prediction at CPTEC/INPE and, like other atmospheric models, shows inaccuracies in its climate forecasts. Some research aims to improve the Eta model itself, while other work evaluates its forecasts and identifies model errors so that its products can be used appropriately. Accordingly, this work filters the Eta model output and adjusts it so as to minimize the errors between the Eta results and the NCEP reanalyses. We thus employ digital signal and image processing techniques to reduce the errors of the Eta model's climate forecasts. The adaptive filters in this dissertation adjust the series over the forecast period. To train the filters, region-grouping techniques such as the k-means clustering algorithm were used to select climate series with similar behaviour. The climate variables studied are the meridional wind and the geopotential height over the region covered by the Eta atmospheric prediction model at 40 km resolution, at a pressure level of 250 hPa. Finally, the results show that the filter with 4 coefficients, adapted by the RLS algorithm together with region selection via the k-means algorithm, performs best, reducing both the mean error and the error dispersion for the meridional wind and the geopotential height variables.
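A minimal NLMS sketch in the spirit of the dissertation: a 4-tap filter adapts so that a filtered signal tracks a reference series. The synthetic signals below merely stand in for Eta forecasts and NCEP reanalyses, and the RLS variant the dissertation prefers is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

# NLMS adaptive filter: 4 taps, normalized step size.
def nlms(x, d, n_taps=4, mu=0.5, eps=1e-6):
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for t in range(n_taps, len(x)):
        u = x[t - n_taps + 1:t + 1][::-1]   # most recent sample first
        y[t] = w @ u
        e = d[t] - y[t]                      # error against the reference
        w += mu * e * u / (eps + u @ u)      # normalized LMS update
    return y, w

x = rng.normal(size=2000)                    # stand-in "forecast" signal
# Stand-in "reanalysis": the forecast passed through an unknown 4-tap system.
d = np.convolve(x, [0.4, -0.2, 0.1, 0.05], mode="full")[:len(x)]
y, w = nlms(x, d)

mse = float(np.mean((d[1000:] - y[1000:]) ** 2))
print(mse < 0.01, np.round(w, 2))
```

With white input the NLMS weights converge toward the unknown system's taps, which is the mechanism the dissertation relies on to track the drift between model output and reanalysis over the forecast horizon.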

Relevance: 90.00%

Abstract:

There is a growing interest in using stochastic parametrizations in numerical weather and climate prediction models. Previously, Palmer (2001) outlined the issues that give rise to the need for a stochastic parametrization and the forms such a parametrization could take. In this article a method is presented that uses a comparison between a standard-resolution version and a high-resolution version of the same model to gain information relevant for a stochastic parametrization in that model. A correction term that could be used in a stochastic parametrization is derived from the thermodynamic equations of both models. The origin of the components of this term is discussed. It is found that the component related to unresolved wave-wave interactions is important and can act to compensate for large parametrized tendencies. The correction term is not proportional to the parametrized tendency. Finally, it is explained how the correction term could be used to give information about the shape of the random distribution to be used in a stochastic parametrization. Copyright © 2009 Royal Meteorological Society
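The comparison the article describes can be caricatured in a few lines: coarse-grain a high-resolution tendency field onto the standard-resolution grid and take its difference from the coarse model's parametrized tendency as a candidate correction term. All fields below are synthetic placeholders; the article derives the term from the thermodynamic equations of two versions of the same model, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(3)

# Block-average a 1-D high-resolution field onto a coarser grid.
def coarse_grain(field, factor):
    n = len(field) // factor
    return field[:n * factor].reshape(n, factor).mean(axis=1)

hires_tendency = rng.normal(0.0, 1.0, size=512)        # "truth" proxy
# Invented stand-in for the coarse model's parametrized tendency.
coarse_param_tendency = coarse_grain(hires_tendency, 8) + rng.normal(0.0, 0.3, size=64)

# Candidate correction term: what the parametrization missed.
correction = coarse_grain(hires_tendency, 8) - coarse_param_tendency
print(correction.shape)
```

The empirical distribution of such a correction term is what the article proposes to use when choosing the shape of the random distribution in a stochastic parametrization.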

Relevance: 90.00%

Abstract:

Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme, developed in this paper, solves this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state, as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the space-time averaging required to produce a good representation of the input large-scale state is not in conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes, the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
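The core stochastic step can be sketched as follows: given a large-scale mean mass flux for a grid column, draw a random number of plumes and random individual plume mass fluxes, so the same large-scale state yields different sub-grid responses from draw to draw. The Poisson/exponential choice below follows the general spirit of such schemes, but the distributions and parameters are assumptions of this sketch, not a statement of the PC scheme's exact closure.

```python
import numpy as np

rng = np.random.default_rng(4)

# One random convective response for a given large-scale state (assumed form).
def sample_convective_response(mean_total_flux, mean_plume_flux):
    expected_n = mean_total_flux / mean_plume_flux
    n_plumes = rng.poisson(expected_n)                       # random plume count
    plume_fluxes = rng.exponential(mean_plume_flux, size=n_plumes)
    return plume_fluxes.sum(), n_plumes

# Repeated draws for the SAME large-scale state fluctuate about the mean,
# with relative variance shrinking as the expected plume count grows --
# which is why the scheme adapts automatically to grid length.
totals = np.array([sample_convective_response(10.0, 1.0)[0] for _ in range(2000)])
print(totals.shape)
```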

Relevance: 90.00%

Abstract:

Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid-scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods in use at present. Our discussions identified several issues that will arise.