279 results for Resolution algorithm


Relevance: 20.00%

Abstract:

The development of NWP models with grid spacing down to 1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales in conjunction with preexisting errors on larger scales may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillful over all but the smallest scales (approximately <10–15 km). A measure of acceptable skill was defined; this was attained by the 1-km model at scales around 40–70 km, some 10–20 km less than that of the 12-km model. The biggest improvement occurred for heavier, more localized rain, despite it being more difficult to predict. The 4-km model did not improve much on the 12-km model because of the difficulties of representing convection at that resolution, which was accentuated by the spinup from 12-km fields.
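
As a rough illustration of the fractional-coverage comparison described above, the sketch below compares forecast and radar grids through the fraction of rainy points within square neighbourhoods of a chosen size. The field names, rain threshold, neighbourhood size and the exact skill formulation are illustrative assumptions rather than the paper's implementation.

```python
# Minimal sketch of neighbourhood (fractional-coverage) verification.
# Threshold, neighbourhood size and skill formulation are illustrative.
import numpy as np
from scipy.ndimage import uniform_filter

def fractions_skill(forecast_mm, radar_mm, threshold_mm=1.0, neighbourhood=5):
    """Compare forecast and radar rain fields via the fractional coverage
    of rain above `threshold_mm` within square neighbourhoods."""
    # Binary rain / no-rain fields at the chosen threshold.
    fcst_exceed = (forecast_mm >= threshold_mm).astype(float)
    obs_exceed = (radar_mm >= threshold_mm).astype(float)

    # Fraction of wet points in each n x n neighbourhood.
    fcst_frac = uniform_filter(fcst_exceed, size=neighbourhood, mode="constant")
    obs_frac = uniform_filter(obs_exceed, size=neighbourhood, mode="constant")

    # Brier-type score on the fractions, normalised so the result rises
    # towards 1 as the fractional coverages agree.
    fbs = np.mean((fcst_frac - obs_frac) ** 2)
    worst = np.mean(fcst_frac ** 2) + np.mean(obs_frac ** 2)
    return 1.0 - fbs / worst if worst > 0 else np.nan

# Random fields stand in for the model and radar grids in this example.
rng = np.random.default_rng(0)
fcst = rng.gamma(0.5, 2.0, size=(200, 200))
radar = rng.gamma(0.5, 2.0, size=(200, 200))
print(fractions_skill(fcst, radar, threshold_mm=1.0, neighbourhood=11))
```

Repeating the calculation over a range of neighbourhood sizes gives the variation of skill with spatial scale that the abstract refers to.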

Relevance: 20.00%

Abstract:

The realistic representation of rainfall on the local scale in climate models remains a key challenge. Realism encompasses the full spatial and temporal structure of rainfall, and is a key indicator of model skill in representing the underlying processes. In particular, if rainfall is more realistic in a climate model, there is greater confidence in its projections of future change. In this study, the realism of rainfall in a very high-resolution (1.5 km) regional climate model (RCM) is compared to a coarser-resolution 12-km RCM. This is the first time a convection-permitting model has been run for an extended period (1989–2008) over a region of the United Kingdom, allowing the characteristics of rainfall to be evaluated in a climatological sense. In particular, the duration and spatial extent of hourly rainfall across the southern United Kingdom is examined, with a key focus on heavy rainfall. Rainfall in the 1.5-km RCM is found to be much more realistic than in the 12-km RCM. In the 12-km RCM, heavy rain events are not heavy enough, and tend to be too persistent and widespread. While the 1.5-km model does have a tendency for heavy rain to be too intense, it still gives a much better representation of its duration and spatial extent. Long-standing problems in climate models, such as the tendency for too much persistent light rain and errors in the diurnal cycle, are also considerably reduced in the 1.5-km RCM. Biases in the 12-km RCM appear to be linked to deficiencies in the representation of convection.
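
The sketch below illustrates the kind of duration and spatial-extent diagnostics referred to above, applied to an hourly rainfall array with dimensions (time, y, x). The threshold and the two metrics are assumptions for demonstration, not the study's exact evaluation.

```python
# Illustrative duration / spatial-extent diagnostics for hourly rainfall.
import numpy as np

def wet_spell_durations(series_mm, threshold_mm=4.0):
    """Lengths (in hours) of consecutive runs above the threshold at one point."""
    wet = series_mm >= threshold_mm
    durations, run = [], 0
    for w in wet:
        if w:
            run += 1
        elif run:
            durations.append(run)
            run = 0
    if run:
        durations.append(run)
    return durations

def mean_wet_area_fraction(cube_mm, threshold_mm=4.0):
    """Average fraction of the domain exceeding the threshold each hour."""
    return float(np.mean(cube_mm >= threshold_mm))

# A random cube stands in for two decades of hourly model rainfall here.
rng = np.random.default_rng(1)
cube = rng.gamma(0.3, 3.0, size=(1000, 50, 50))
print(mean_wet_area_fraction(cube), wet_spell_durations(cube[:, 25, 25])[:5])
```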

Relevance: 20.00%

Abstract:

On 8 January 2005 the city of Carlisle in north-west England was severely flooded following 2 days of almost continuous rain over the nearby hills. Orographic enhancement of the rain through the seeder–feeder mechanism led to the very high rainfall totals. This paper shows the impact of running the Met Office Unified Model (UM) with a grid spacing of 4 and 1 km compared to the 12 km available at the time of the event. These forecasts, and forecasts from the Nimrod nowcasting system, were fed into the Probability Distributed Model (PDM) to predict river flow at the outlets of two catchments important for flood warning. The results show the benefit of increased resolution in the UM, the benefit of coupling the high-resolution rainfall forecasts to the PDM and the improvement in timeliness of flood warning that might have been possible. Copyright © 2008 Royal Meteorological Society
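
For orientation, here is a highly simplified probability-distributed store step in the spirit of the PDM mentioned above. The single-store structure, parameter values and the crude treatment of evaporation are assumptions, not the operational configuration used in the study.

```python
# Toy probability-distributed soil-moisture store driven by hourly rainfall.
def pdm_step(storage, rain, pet, c_max=100.0, b=0.5):
    """One time step of a simple probability-distributed store.
    Point storage capacities across the catchment are assumed Pareto-distributed
    with maximum c_max and shape b; returns (new_storage, fast_runoff)."""
    s_max = c_max / (b + 1.0)                       # total available storage
    frac = min(storage / s_max, 1.0)
    # Critical capacity implied by the current basin storage.
    c_star = c_max * (1.0 - (1.0 - frac) ** (1.0 / (b + 1.0)))
    net = max(rain - pet, 0.0)                      # crude net rainfall input
    c_new = min(c_star + net, c_max)
    s_new = s_max * (1.0 - (1.0 - c_new / c_max) ** (b + 1.0))
    runoff = max(net - (s_new - storage), 0.0)      # rain the stores cannot hold
    return s_new, runoff

# Drive the store with an illustrative two-day rainfall sequence (mm per hour).
rain_series = [0.5] * 24 + [4.0] * 24               # persistent heavy rain on day 2
storage, flows = 20.0, []
for rain in rain_series:
    storage, q = pdm_step(storage, rain, pet=0.05)
    flows.append(q)
print("peak fast runoff (mm/h):", round(max(flows), 2))
```

In an application like the one described, the rainfall series would come from the UM or Nimrod forecasts and the fast runoff would be routed to give river flow at the catchment outlet.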

Relevance: 20.00%

Abstract:

Evolutionary meta-algorithms for pulse shaping of broadband femtosecond-duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes consists of a population of waveforms (genes), each made from two concatenated vectors specifying phases and magnitudes, respectively, over a range of frequencies. Frequency-domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism and linear fitness scaling to the gene population. A differential evolution (DE) operator, which provides a source of directed mutation, and new wavelet operators are also proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform-matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
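
A compact sketch of a genetic algorithm of this general shape is given below: each gene concatenates a phase vector and a magnitude vector over a set of frequencies, and the fitness rewards matching a target waveform. The operators shown (roulette-wheel selection, elitism, two-point crossover, random mutation and a DE-style directed move) and all parameter values are simplified assumptions, not the paper's tuned meta-algorithm.

```python
# Toy genetic algorithm for a waveform-matching problem with phase/magnitude genes.
import numpy as np

rng = np.random.default_rng(42)
N_FREQ, POP, GENS, ELITE = 32, 40, 200, 2

def waveform(gene):
    """Synthesize a time-domain waveform from concatenated phases and magnitudes."""
    phases, mags = gene[:N_FREQ], gene[N_FREQ:]
    spectrum = mags * np.exp(1j * phases)
    return np.fft.irfft(spectrum, n=2 * N_FREQ)

target_gene = np.concatenate([rng.uniform(-np.pi, np.pi, N_FREQ),
                              rng.uniform(0.0, 1.0, N_FREQ)])
target = waveform(target_gene)

def fitness(gene):
    return -np.mean((waveform(gene) - target) ** 2)    # higher is better

def roulette(pop, fit):
    w = fit - fit.min() + 1e-12                         # shift so weights are positive
    return pop[rng.choice(len(pop), p=w / w.sum())]

def two_point_crossover(a, b):
    i, j = sorted(rng.choice(2 * N_FREQ, 2, replace=False))
    child = a.copy()
    child[i:j] = b[i:j]
    return child

def mutate(gene, rate=0.05, scale=0.3):
    mask = rng.random(gene.size) < rate
    gene = gene.copy()
    gene[mask] += rng.normal(0.0, scale, mask.sum())
    return gene

def de_mutate(gene, best, other, f=0.5):
    """DE-style directed mutation: move along the difference towards the best gene."""
    return gene + f * (best - other)

pop = [np.concatenate([rng.uniform(-np.pi, np.pi, N_FREQ),
                       rng.uniform(0.0, 1.0, N_FREQ)]) for _ in range(POP)]
for _ in range(GENS):
    fit = np.array([fitness(g) for g in pop])
    order = np.argsort(fit)[::-1]
    elite = [pop[i].copy() for i in order[:ELITE]]      # elitism: keep the best genes
    children = []
    while len(children) < POP - ELITE:
        a, b = roulette(pop, fit), roulette(pop, fit)
        child = mutate(two_point_crossover(a, b))
        if rng.random() < 0.2:                          # occasional directed DE move
            child = de_mutate(child, elite[0], roulette(pop, fit))
        children.append(child)
    pop = elite + children
print("best fitness:", max(fitness(g) for g in pop))
```

Raising the DE rate and shrinking the mutation scale biases the search towards a greedy refinement of the best known solution; the opposite settings give a broader exploration of the parameter space.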

Relevance: 20.00%

Abstract:

The ability to retrieve information from different layers within a stratified sample using terahertz pulsed reflection imaging and spectroscopy has traditionally been resolution limited by the pulse width available. In this paper, a deconvolution algorithm is presented which circumvents this resolution limit, enabling deep sub-wavelength and sub-pulse width depth resolution. The algorithm is explained through theoretical investigation, and demonstrated by reconstructing signals reflected from boundaries in stratified materials that cannot be resolved directly from the unprocessed time-domain reflection signal. Furthermore, the deconvolution technique has been used to recreate sub-surface images from a stratified sample: imaging the reverse side of a piece of paper.
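
As an illustration of the general idea, the sketch below deconvolves a synthetic reflection signal by a reference pulse using a Wiener-type frequency-domain filter, separating two boundary echoes that overlap in the raw trace. The pulse shape, layer delays and regularisation are assumptions for demonstration and do not reproduce the paper's algorithm.

```python
# Frequency-domain deconvolution of overlapping reflections by a reference pulse.
import numpy as np

t = np.arange(0, 512, dtype=float)                  # sample index (arbitrary units)
ref = np.exp(-((t - 50.0) ** 2) / (2 * 4.0 ** 2)) * np.cos(0.5 * (t - 50.0))

# Two closely spaced boundaries: reflections separated by less than the
# pulse width, so they overlap in the raw time-domain signal.
impulse = np.zeros_like(t)
impulse[[200, 206]] = [1.0, -0.6]
measured = np.convolve(impulse, ref)[: len(t)]
measured += 0.01 * np.random.default_rng(3).normal(size=len(t))

# Wiener-style deconvolution: spectral division with a noise-dependent floor.
R = np.fft.rfft(ref)
M = np.fft.rfft(measured)
eps = 0.01 * np.max(np.abs(R)) ** 2
recovered = np.fft.irfft(M * np.conj(R) / (np.abs(R) ** 2 + eps), n=len(t))

# The two boundary responses, unresolvable in `measured`, show up as
# distinct peaks in `recovered` (expected near samples 200 and 206).
print("largest recovered samples:", sorted(np.argsort(np.abs(recovered))[-5:]))
```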

Relevance: 20.00%

Abstract:

The Hamburg atmospheric general circulation model ECHAM3 at T106 resolution (1.125° lat./lon.) has considerable skill in reproducing the observed seasonal reversal of mean sea level pressure, the location of the summer heat low and the position of the monsoon trough over the Indian subcontinent. The present-day climate and its seasonal cycle are realistically simulated by the model over this region. The model simulates the structure, intensity, frequency, movement and lifetime of monsoon depressions remarkably well. The number of monsoon depressions/storms simulated by the model in a year ranged from 5 to 12, with an average frequency of 8.4 yr⁻¹, not significantly different from the observed climatology. The model also simulates the interannual variability in the formation of depressions over the north Bay of Bengal during the summer monsoon season. In the warmer atmosphere under doubled CO2 conditions, the number of monsoon depressions/cyclonic storms forming in the Indian seas in a year ranged from 5 to 11, with an average frequency of 7.6 yr⁻¹, not significantly different from that of the control run of the model. However, under doubled CO2 conditions, fewer depressions formed in the month of June. Neither the lowest central pressure nor the maximum wind speed changes appreciably in the monsoon depressions identified under simulated enhanced greenhouse conditions. The analysis suggests there will be no significant changes in the number and intensity of monsoon depressions in a warmer atmosphere.

Relevance: 20.00%

Abstract:

This paper analyzes a pervasive computing system for tracking people in a mining environment based on RFID (radio frequency identification) technology. First, we explain the RFID fundamentals and the LANDMARC (location identification based on dynamic active RFID calibration) algorithm; then we present the proposed algorithm, which combines LANDMARC with a trilateration technique to estimate the coordinates of people inside the mine; next we generalize a pervasive computing system that can be implemented in mining; and finally we present the results and conclusions.
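
The sketch below shows a LANDMARC-style k-nearest-reference estimate followed by a trilateration refinement, in the spirit of the combination described. The reader layout, the log-distance RSSI model and the inverse-square weighting are illustrative assumptions, not the deployed system.

```python
# LANDMARC-style reference-tag weighting plus least-squares trilateration.
import numpy as np

def landmarc_estimate(tag_rssi, ref_rssi, ref_xy, k=3):
    """Weight the k reference tags whose RSSI vectors (one value per reader)
    are most similar to the tracked tag's, and average their coordinates."""
    e = np.linalg.norm(ref_rssi - tag_rssi, axis=1)     # distance in signal space
    nearest = np.argsort(e)[:k]
    w = 1.0 / (e[nearest] ** 2 + 1e-9)
    return (w[:, None] * ref_xy[nearest]).sum(axis=0) / w.sum()

def trilaterate(reader_xy, dists):
    """Least-squares trilateration from estimated reader-to-tag distances."""
    x0, y0 = reader_xy[0]
    d0 = dists[0]
    A, b = [], []
    for (xi, yi), di in zip(reader_xy[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

# Toy layout: three readers, a grid of reference tags, one tracked tag.
readers = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0]])
refs_xy = np.array([[x, y] for x in range(0, 31, 10)
                    for y in range(0, 31, 10)], dtype=float)

def rssi(xy):   # simple log-distance path-loss stand-in for real readings
    d = np.linalg.norm(readers - xy, axis=1) + 1e-9
    return -40.0 - 20.0 * np.log10(d)

ref_rssi = np.array([rssi(p) for p in refs_xy])
true_xy = np.array([12.0, 17.0])
coarse = landmarc_estimate(rssi(true_xy), ref_rssi, refs_xy)

# Convert RSSI to rough distances and refine with trilateration.
est_d = 10 ** ((-40.0 - rssi(true_xy)) / 20.0)
print("LANDMARC estimate:", coarse, " trilateration:", trilaterate(readers, est_d))
```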