1000 results for Intervalos de credibilidade e intervalos H.P.D.


Relevance:

100.00%

Publisher:

Abstract:

A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period using 1-h forecasts of the Rapid Update Cycle 2 (RUC2) 1500 UTC model run leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
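The leading-order source term identified in this abstract, advection of relative vorticity, can be sketched numerically. The following is an illustrative centred finite-difference computation on a regular grid, not the operational forecast algorithm; the function name, grid spacings, and input fields are assumptions:

```python
import numpy as np

def vorticity_advection(u, v, dx, dy):
    """Sketch of the leading-order Lighthill-Ford source term:
    magnitude of the advection of relative vorticity, |V . grad(zeta)|.

    u, v   : 2D wind components on a regular grid (axis 0 = y, axis 1 = x)
    dx, dy : grid spacings in the x and y directions
    """
    # relative vorticity zeta = dv/dx - du/dy (centred differences)
    zeta = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)
    # horizontal gradient of zeta
    dzdx = np.gradient(zeta, dx, axis=1)
    dzdy = np.gradient(zeta, dy, axis=0)
    return np.abs(u * dzdx + v * dzdy)
```

As a sanity check, both a uniform flow (zero vorticity) and solid-body rotation (constant vorticity) give zero vorticity advection, so the diagnostic highlights only flows in which vorticity varies along trajectories.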

Relevance:

100.00%

Publisher:

Abstract:

Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways that conventional bulk-formula representations cannot. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes laboratory observations of inertia–gravity waves emitted from balanced fluid flow. In a rotating two-layer annulus experiment, the wavelength of the inertia–gravity waves is very close to the deformation radius. Their amplitude varies linearly with Rossby number in the range 0.05–0.14, at constant Burger number (or rotational Froude number). This linear scaling challenges the notion, suggested by several dynamical theories, that inertia–gravity waves generated by balanced motion will be exponentially small. It is estimated that the balanced flow leaks roughly 1% of its energy each rotation period into the inertia–gravity waves at the peak of their generation. The findings of this study imply an inevitable emission of inertia–gravity waves at Rossby numbers similar to those of the large-scale atmospheric and oceanic flow. Extrapolation of the results suggests that inertia–gravity waves might make a significant contribution to the energy budgets of the atmosphere and ocean. In particular, emission of inertia–gravity waves from mesoscale eddies may be an important source of energy for deep interior mixing in the ocean.

Relevance:

100.00%

Publisher:

Abstract:

A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made more straightforward the synoptic measurement of water surface elevations along flood waterlines, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. A result of this was that there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. 
The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
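The two performance measures compared in this abstract can be sketched as follows. This is a minimal illustration, using a common areal overlap statistic and an r.m.s. of waterline height differences; the paper's exact formulations may differ, and the function names and inputs are assumptions:

```python
import numpy as np

def areal_fit(obs_wet, mod_wet):
    """Areal pattern-matching measure: overlap of observed and modelled
    flood extents, F = |obs AND mod| / |obs OR mod|, over boolean
    wet/dry pixel masks (one common choice of areal statistic)."""
    inter = np.logical_and(obs_wet, mod_wet).sum()
    union = np.logical_or(obs_wet, mod_wet).sum()
    return inter / union

def height_rmse(obs_heights, mod_heights):
    """Height-based measure: r.m.s. difference between observed and
    modelled water-surface elevations at corresponding waterline points."""
    d = np.asarray(obs_heights) - np.asarray(mod_heights)
    return float(np.sqrt(np.mean(d ** 2)))
```

In a GLUE-style analysis, either measure would be evaluated for each model run in the parameter ensemble; the abstract's finding is that the height-based one discriminates between friction parameter values more sharply.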

Relevance:

100.00%

Publisher:

Abstract:

Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high resolution TerraSAR-X Synthetic Aperture Radar (SAR) data to detect flooded regions in urban areas is described. The study uses a TerraSAR-X image of a 1 in 150 year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SAR End-To-End simulator (SETES) was used in conjunction with airborne scanning laser altimetry (LiDAR) data to estimate regions of the image in which water would not be visible due to shadow or layover caused by buildings and taller vegetation. A semi-automatic algorithm for the detection of floodwater in urban areas is described, together with its validation using the aerial photographs. 76% of the urban water pixels visible to TerraSAR-X were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19% respectively. The algorithm is aimed at producing urban flood extents with which to calibrate and validate urban flood inundation models, and these findings indicate that TerraSAR-X is capable of providing useful data for this purpose.
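The summary statistics quoted in this abstract can be reproduced from pixel counts. Below is a minimal sketch using one plausible definition of the false-positive rate (false detections as a fraction of all flagged pixels); the definition and the counts in the usage note are illustrative, not the study's:

```python
def detection_stats(true_pos, false_pos, false_neg):
    """Summarise a pixel-level flood-map validation.

    true_pos  : water pixels correctly flagged as flooded
    false_pos : dry pixels wrongly flagged as flooded
    false_neg : water pixels missed by the detector
    """
    detected = true_pos / (true_pos + false_neg)       # hit rate
    false_rate = false_pos / (true_pos + false_pos)    # one plausible definition
    return detected, false_rate
```

For example, with 75 hits, 25 false alarms and 25 misses this yields a 75% detection rate and a 25% false-positive rate, the same kind of pair reported above for the TerraSAR-X flood map.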

Relevance:

100.00%

Publisher:

Abstract:

The performance of a 2D numerical model of flood hydraulics is tested for a major event in Carlisle, UK, in 2005. This event is associated with a unique data set, with GPS surveyed wrack lines and flood extent surveyed 3 weeks after the flood. The Simple Finite Volume (SFV) model is used to solve the 2D Saint-Venant equations over an unstructured mesh of 30000 elements representing channel and floodplain, and allowing detailed hydraulics of flow around bridge piers and other influential features to be represented. The SFV model is also used to corroborate flows recorded for the event at two gauging stations. Calibration of Manning's n is performed with a two stage strategy, with channel values determined by calibration of the gauging station models, and floodplain values determined by optimising the fit between model results and observed water levels and flood extent for the 2005 event. RMS error for the calibrated model compared with surveyed water levels is ~±0.4m, the same order of magnitude as the estimated error in the survey data. The study demonstrates the ability of unstructured mesh hydraulic models to represent important hydraulic processes across a range of scales, with potential applications to flood risk management.

Relevance:

100.00%

Publisher:

Abstract:

The paper discusses the wide variety of ways in which remotely sensed data are being utilized in river flood inundation modeling. Model parameterization is being aided using airborne LiDAR data to provide topography of the floodplain for use as model bathymetry, and vegetation heights in the floodplain for use in estimating floodplain friction factors. Model calibration and validation are being aided by comparing the flood extent observed in SAR images with the extent predicted by the model. The recent extension of this to the observation of urban flooding using high resolution TerraSAR-X data is described. Possible future research directions are considered.

Relevance:

100.00%

Publisher:

Abstract:

Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high resolution TerraSAR-X data to detect flooded regions in urban areas is described. An important application for this would be the calibration and validation of the flood extent predicted by an urban flood inundation model. To date, research on such models has been hampered by lack of suitable distributed validation data. The study uses a 3m resolution TerraSAR-X image of a 1-in-150 year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with airborne LiDAR data to estimate regions of the TerraSAR-X image in which water would not be visible due to radar shadow or layover caused by buildings and taller vegetation, and these regions were masked out in the flood detection process. A semi-automatic algorithm for the detection of floodwater was developed, based on a hybrid approach. Flooding in rural areas adjacent to the urban areas was detected using an active contour model (snake) region-growing algorithm seeded using the un-flooded river channel network, which was applied to the TerraSAR-X image fused with the LiDAR DTM to ensure the smooth variation of heights along the reach. A simpler region-growing approach was used in the urban areas, which was initialized using knowledge of the flood waterline in the rural areas. Seed pixels having low backscatter were identified in the urban areas using supervised classification based on training areas for water taken from the rural flood, and non-water taken from the higher urban areas. Seed pixels were required to have heights less than a spatially-varying height threshold determined from nearby rural waterline heights. 
Seed pixels were clustered into urban flood regions based on their close proximity, rather than requiring that all pixels in the region should have low backscatter. This approach was taken because it appeared that urban water backscatter values were corrupted in some pixels, perhaps due to contributions from side-lobes of strong reflectors nearby. The TerraSAR-X urban flood extent was validated using the flood extent visible in the aerial photos. It turned out that 76% of the urban water pixels visible to TerraSAR-X were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19% respectively. These findings indicate that TerraSAR-X is capable of providing useful data for the calibration and validation of urban flood inundation models.
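The proximity-based clustering of seed pixels described above can be sketched as a simple breadth-first grouping of pixel coordinates. `max_gap` (the largest Chebyshev distance allowed between seeds in one region) is an assumed illustrative parameter, not a value from the study, and the linear neighbour scan is kept simple rather than efficient:

```python
from collections import deque

def cluster_seeds(seeds, max_gap=2):
    """Group flood seed pixels into regions by proximity alone,
    mirroring the choice described above of not requiring every pixel
    in a region to have low backscatter.

    seeds   : iterable of (row, col) pixel coordinates
    max_gap : assumed Chebyshev-distance threshold for grouping
    """
    unvisited = set(seeds)
    regions = []
    while unvisited:
        start = unvisited.pop()
        region, queue = [start], deque([start])
        while queue:
            r, c = queue.popleft()
            # gather remaining seeds within max_gap of the current pixel
            near = [p for p in unvisited
                    if max(abs(p[0] - r), abs(p[1] - c)) <= max_gap]
            for p in near:
                unvisited.remove(p)
                region.append(p)
                queue.append(p)
        regions.append(sorted(region))
    return regions
```

Because regions grow transitively, a chain of seeds each within `max_gap` of the next forms a single flood region even when the intervening pixels have corrupted backscatter.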

Relevance:

100.00%

Publisher:

Abstract:

We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.

Relevance:

100.00%

Publisher:

Abstract:

We describe numerical simulations designed to elucidate the role of mean ocean salinity in climate. Using a coupled atmosphere-ocean general circulation model, we study a 100-year sensitivity experiment in which the global-mean salinity is approximately doubled from its present observed value, by adding 35 psu everywhere in the ocean. The salinity increase produces a rapid global-mean sea-surface warming within a few years, caused by reduced vertical mixing associated with changes in cabbeling. The warming is followed by a gradual global-mean sea-surface cooling within a few decades, caused by an increase in the vertical (downward) component of the isopycnal diffusive heat flux. We find no evidence of impacts on the variability of the thermohaline circulation (THC) or El Niño/Southern Oscillation (ENSO). The mean strength of the Atlantic meridional overturning is reduced by 20% and the North Atlantic Deep Water penetrates less deeply. Nevertheless, our results dispute claims that higher salinities for the world ocean have profound consequences for the thermohaline circulation. In additional experiments with doubled atmospheric carbon dioxide, we find that the amplitude and spatial pattern of the global warming signal are modified in the hypersaline ocean. In particular, the equilibrated global-mean sea-surface temperature increase caused by doubling carbon dioxide is reduced by 10%. We infer the existence of a non-linear interaction between the climate responses to modified carbon dioxide and modified salinity.

Relevance:

100.00%

Publisher:

Abstract:

The Robert–Asselin time filter is widely used in numerical models of weather and climate. It successfully suppresses the spurious computational mode associated with the leapfrog time-stepping scheme. Unfortunately, it also weakly suppresses the physical mode and severely degrades the numerical accuracy. These two concomitant problems are shown to occur because the filter does not conserve the mean state, averaged over the three time slices on which it operates. The author proposes a simple modification to the Robert–Asselin filter, which does conserve the three-time-level mean state. When used in conjunction with the leapfrog scheme, the modification vastly reduces the impacts on the physical mode and increases the numerical accuracy for amplitude errors by two orders, yielding third-order accuracy. The modified filter could easily be incorporated into existing general circulation models of the atmosphere and ocean. In principle, it should deliver more faithful simulations at almost no additional computational expense. Alternatively, it may permit the use of longer time steps with no loss of accuracy, reducing the computational expense of a given simulation.
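The modified filter described in this abstract can be sketched in a few lines. The displacement form below is one common way of writing such a modification (sometimes called the RAW filter): the classical Robert-Asselin filter is recovered at alpha = 1, while alpha = 0.5 conserves the three-time-level mean exactly. The parameter values are illustrative, not prescriptions from the paper:

```python
def raw_filter_step(x_prev, x_now, x_next, nu=0.2, alpha=0.53):
    """Apply one filtering step to three consecutive leapfrog time levels.

    d is the standard Robert-Asselin displacement; instead of adding all
    of it to the middle level, a fraction alpha goes to the middle level
    and (alpha - 1) to the newest level, so the three-time-level mean is
    changed by (2*alpha - 1)*d, which vanishes when alpha = 0.5.
    Returns the filtered middle and newest levels.
    """
    d = 0.5 * nu * (x_prev - 2.0 * x_now + x_next)
    return x_now + alpha * d, x_next + (alpha - 1.0) * d
```

In a leapfrog integration this would be called once per step, after the new time level is computed, replacing the plain Robert-Asselin update at essentially no extra cost.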

Relevance:

100.00%

Publisher:

Abstract:

The possibility of future rapid climatic changes is a pressing concern amongst climate scientists. For example, an abrupt collapse of the ocean's Thermohaline Circulation (THC) would rapidly cool the northern hemisphere and reduce the net global primary productivity of vegetation, according to computer models. It is unclear how to incorporate such low-probability, high-impact events into the development of economic policies. This paper reviews the salient aspects of rapid climate change relevant to economists and policy makers. The main scientific certainties and uncertainties are clearly delineated, with the aim of guiding economic goals and ensuring that they retain fidelity to their scientific underpinnings.