279 results for Resolution algorithm


Relevance: 20.00%

Abstract:

Flow in the world's oceans occurs at a wide range of spatial scales, from a fraction of a metre up to many thousands of kilometres. In particular, regions of intense flow are often highly localised, for example western boundary currents, equatorial jets, overflows and convective plumes. Conventional numerical ocean models generally use static meshes. The use of dynamically adaptive meshes has many potential advantages but needs to be guided by an error measure reflecting the underlying physics. A method of defining an error measure to guide an adaptive meshing algorithm for unstructured tetrahedral finite elements, utilising an adjoint or goal-based method, is described here. This method is based upon a functional encompassing important features of the flow structure. The sensitivity of this functional with respect to the solution variables is used as the basis from which an error measure is derived. This error measure acts to predict those areas of the domain where resolution should be changed. A barotropic wind-driven gyre problem is used to demonstrate the capabilities of the method. The overall objective of this work is to develop robust error measures for use in an oceanographic context which will ensure areas of fine mesh resolution are used only where and when they are required. (c) 2006 Elsevier Ltd. All rights reserved.
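To make the goal-based idea concrete, the sketch below shows the standard adjoint (dual-weighted residual) form of such an error measure; it is a generic illustration under the usual assumptions, not necessarily the exact functional or weighting used in this work.

```latex
% Generic goal-based (adjoint) error measure -- illustrative sketch only.
% J(u)  : functional of interest (e.g. a diagnostic of the flow structure)
% u_h   : discrete solution on the current mesh, R(u_h) its residual
% \psi  : adjoint solution, i.e. the sensitivity of J with respect to the state
J(u) - J(u_h) \approx \bigl(\psi,\, R(u_h)\bigr)
              = \sum_{e \in \mathcal{T}_h} \bigl(\psi,\, R(u_h)\bigr)_e,
\qquad
\eta_e = \bigl|\bigl(\psi,\, R(u_h)\bigr)_e\bigr|.
```

Elements with large values of the indicator are flagged for refinement and those with small values for coarsening, so fine resolution is concentrated only where it affects the chosen functional.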

Relevance: 20.00%

Abstract:

HFC-134a (CF3CH2F) is the most rapidly growing hydrofluorocarbon in terms of atmospheric abundance. It is currently used in a large number of household refrigerators and air-conditioning systems, and its concentration in the atmosphere is forecast to increase substantially over the next 50–100 years. Previous estimates of its radiative forcing per unit concentration have differed significantly, by as much as 25%. This paper uses a two-step approach to resolve this discrepancy. In the first step, six independent absorption cross-section datasets are analysed. We find that, for the integrated cross-section in the spectral bands that contribute most to the radiative forcing, the differences between the various datasets are typically smaller than 5% and that the dependence on pressure and temperature is not significant. A "recommended" HFC-134a infrared absorption spectrum was obtained based on the average band intensities of the strongest bands. In the second step, the "recommended" HFC-134a spectrum was used in six different radiative transfer models to calculate the HFC-134a radiative forcing efficiency. The clear-sky instantaneous radiative forcing, using a single global and annual mean profile, differed by 8% between the six models, and the latitudinally resolved adjusted cloudy-sky radiative forcing estimates differed by a similar amount.
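A minimal numerical sketch of the two-step bookkeeping described above is given below; the band intensities and forcing efficiencies are hypothetical placeholders, not the paper's data, and the real analysis works on full laboratory spectra rather than three numbers per dataset.

```python
import numpy as np

# Hypothetical integrated band intensities for the strongest HFC-134a bands,
# one row per laboratory dataset (6 datasets x 3 bands); arbitrary units.
band_intensities = np.array([
    [1.02, 2.10, 0.55],
    [1.00, 2.05, 0.54],
    [0.99, 2.12, 0.56],
    [1.01, 2.08, 0.54],
    [1.03, 2.07, 0.55],
    [0.99, 2.11, 0.54],
])

# Step 1: spread between datasets for each band (range relative to the mean);
# the paper reports this is typically below ~5% for the strongest bands.
band_spread = np.ptp(band_intensities, axis=0) / band_intensities.mean(axis=0)
print("per-band spread:", np.round(100 * band_spread, 1), "%")

# The "recommended" spectrum is built from the average band intensities.
recommended_intensities = band_intensities.mean(axis=0)

# Step 2: hypothetical forcing efficiencies (W m^-2 ppbv^-1) returned by six
# radiative transfer models driven by the recommended spectrum.
forcing_efficiency = np.array([0.167, 0.172, 0.160, 0.165, 0.170, 0.168])
model_spread = np.ptp(forcing_efficiency) / forcing_efficiency.mean()
print("inter-model spread: %.1f %%" % (100 * model_spread))
```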

Relevance: 20.00%

Abstract:

Calculations of the absorption of solar radiation by atmospheric gases, and water vapor in particular, are dependent on the quality of databases of spectral line parameters. There has been increasing scrutiny of databases such as HITRAN in recent years, but this has mostly been performed on a band-by-band basis. We report nine high-spectral-resolution (0.03 cm⁻¹) measurements of the solar radiation reaching the surface in southern England over the wave number range 2000 to 12,500 cm⁻¹ (0.8 to 5 µm) that allow a unique assessment of the consistency of the spectral line databases over this entire spectral region. The data are assessed in terms of the modeled water vapor column that is required to bring calculations and observations into agreement; for an entirely consistent database, this water vapor column should be constant with frequency. For the HITRAN01 database, the spread in water vapor column is about 11%, with distinct shifts between different spectral regions. The HITRAN04 database is in significantly better agreement (about 5% spread) in the completely updated 3000 to 8000 cm⁻¹ spectral region, but inconsistencies between individual spectral regions remain: for example, in the 8000 to 9500 cm⁻¹ spectral region, the results indicate an 18% (±1%) underestimate in line intensities with respect to the 3000 to 8000 cm⁻¹ region. These measurements also indicate the impact of isotopic fractionation of water vapor in the 2500 to 2900 cm⁻¹ range, where HDO lines dominate over the lines of the most abundant isotope of H2O.
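The consistency test can be illustrated with a toy per-band retrieval, sketched below. The band optical depths, "observed" transmittances and simple Beer-Lambert forward model are hypothetical stand-ins for a full line-by-line calculation; only the logic (scale the water vapor column until model matches observation in each band, then compare bands) mirrors the assessment described above.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical band-mean water vapour optical depth per unit column (one value
# per spectral band, toy units) and "observed" band-mean transmittances.
tau_per_unit_column = np.array([0.80, 0.45, 0.20, 0.05])
observed_transmittance = np.array([0.20, 0.41, 0.67, 0.905])

def retrieved_column(tau_unit, t_obs):
    """Scale the water vapour column until the modelled band transmittance
    (simple Beer-Lambert forward model) matches the observation."""
    cost = lambda w: (np.exp(-tau_unit * w) - t_obs) ** 2
    return minimize_scalar(cost, bounds=(0.1, 10.0), method="bounded").x

columns = np.array([retrieved_column(t, o)
                    for t, o in zip(tau_per_unit_column, observed_transmittance)])

# For a perfectly consistent line database the retrieved column is identical in
# every band; the percentage spread quantifies the inconsistency (about 11% for
# HITRAN01 and about 5% for HITRAN04 in the paper).
spread = 100 * np.ptp(columns) / columns.mean()
print("retrieved columns:", np.round(columns, 2), "spread: %.1f %%" % spread)
```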

Relevance: 20.00%

Abstract:

This paper presents the major characteristics of the Institut Pierre Simon Laplace (IPSL) coupled ocean–atmosphere general circulation model. The model components and the coupling methodology are described, as well as the main characteristics of the climatology and interannual variability. The results of the standard version, used for IPCC climate projections and for intercomparison projects such as the Paleoclimate Modeling Intercomparison Project (PMIP 2), are compared to those of a version with higher resolution in the atmosphere. A focus on the North Atlantic and on the tropics is used to address the impact of atmospheric resolution on processes and feedbacks. In the North Atlantic, the resolution change leads to an improved representation of the storm tracks and the North Atlantic Oscillation. The better representation of the wind structure increases the northward salt transports, the deep-water formation and the Atlantic meridional overturning circulation. In the tropics, the ocean–atmosphere dynamical coupling, or Bjerknes feedback, improves with resolution. The amplitude of ENSO (El Niño–Southern Oscillation) consequently increases, as the damping processes are left unchanged.

Relevance: 20.00%

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate digital terrain models (DTMs) of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a digital surface model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the Environment Agency (EA), which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc.

Most vegetation removal software ignores short vegetation less than, say, 1 m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying (a simple height-to-friction mapping is sketched after this abstract). This obviates the need to calibrate a global floodplain friction coefficient. It's not clear at present if the method is useful, but it's worth testing further.

The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (MasterMap structured topography data) to help to distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered.

LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that, for example, hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model.

A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m wide embankment within a raster grid model with a 15 m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment, but how could a 5 m wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
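The sketch below illustrates the height-to-friction idea mentioned above: a LiDAR-derived vegetation-height grid is mapped to a spatially varying Manning's n grid. The height thresholds and roughness values are illustrative assumptions only, not calibrated values from this work.

```python
import numpy as np

def manning_n_from_vegetation(height_m: np.ndarray) -> np.ndarray:
    """Map a LiDAR-derived vegetation-height grid (metres) to a spatially
    varying Manning's n friction grid for a floodplain flow model.

    The piecewise values below are placeholders; in practice they would come
    from published roughness tables or be calibrated against observed floods.
    """
    n = np.full(height_m.shape, 0.03)          # bare soil / very short grass
    n = np.where(height_m > 0.1, 0.05, n)      # short vegetation (grass, crops)
    n = np.where(height_m > 1.0, 0.08, n)      # shrubs and hedges
    n = np.where(height_m > 5.0, 0.12, n)      # trees
    return n

# Example: a small 0.5 m resolution vegetation-height tile (values in metres).
veg_height = np.array([[0.05, 0.3, 0.9],
                       [1.5,  6.0, 0.0],
                       [0.2,  0.4, 7.5]])
print(manning_n_from_vegetation(veg_height))
```

Because the friction grid is derived cell by cell from the vegetation-height map, it can be resampled to the flood model's mesh or raster resolution, which is the sense in which a single global floodplain friction coefficient no longer needs to be calibrated.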