940 results for MESH
Abstract:
Accuracy and mesh generation are key issues for high-resolution hydrodynamic modelling of the whole Great Barrier Reef. Our objective is to generate suitable unstructured grids that can resolve topological and dynamical features like tidal jets and recirculation eddies in the wake of islands. A new strategy is suggested to refine the mesh in areas of interest, taking into account the bathymetric field and an approximate distance to islands and reefs. Such a distance is obtained by solving an elliptic differential equation with specific boundary conditions. The meshes produced illustrate both the validity and the efficiency of the adaptive strategy. The selection of refinement and geometrical parameters is discussed.
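A hedged sketch of the distance computation described above: one standard way to approximate a distance-to-obstacle field with an elliptic solve is the screened Poisson problem u - t*Laplacian(u) = 0 with u = 1 on islands and reefs, recovering d ≈ -sqrt(t)·log(u) (a Varadhan-type transform). The paper's exact operator and boundary conditions are not given here, so the grid, parameter values, periodic boundaries and island geometry below are hypothetical.

```python
import numpy as np

def approx_distance(mask, t=1.0, h=1.0, n_iter=3000):
    """Approximate distance to cells where mask is True.

    Solves u - t*Laplacian(u) = 0 with u = 1 on the mask by Jacobi
    iteration (periodic boundaries, a toy simplification), then applies
    the Varadhan-type transform d ~ -sqrt(t)*log(u).
    """
    u = np.where(mask, 1.0, 0.0)
    alpha = t / h**2
    for _ in range(n_iter):
        nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
              np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, 1.0, alpha * nb / (1.0 + 4.0 * alpha))
    return -np.sqrt(t) * np.log(np.maximum(u, 1e-300))

# Toy domain: a single circular "island" on a 64 x 64 grid.
yy, xx = np.mgrid[0:64, 0:64]
island = (xx - 32)**2 + (yy - 32)**2 < 25
d = approx_distance(island)
print(d.min(), d.max())   # distance grows away from the island
```

Such a field can then drive a mesh-size prescription that shrinks elements near islands and reefs.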
Abstract:
Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc.

Most vegetation removal software ignores short vegetation less than about 1m high, yet typically most of a floodplain may be covered in such vegetation. We have attempted to extend vegetation height measurement to short vegetation using local height texture. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying (a minimal sketch of such a mapping follows this abstract). This obviates the need to calibrate a global floodplain friction coefficient. It is not clear at present whether the method is useful, but it is worth testing further.

The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (Mastermap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered.

LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model.

A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: e.g. for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment. But how could a 5m-wide ditch be represented?
Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
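A minimal sketch of the friction mapping mentioned above, assuming a simple saturating relation between LiDAR vegetation height and Manning's n; the mapping and its parameter values are illustrative, not the authors' calibrated choices.

```python
import numpy as np

def manning_n_from_vegetation(h_veg, n_bare=0.025, n_max=0.15, h_ref=1.0):
    """Map local vegetation height (m) to a Manning's n per node,
    rising linearly from bare-earth roughness and saturating at n_max."""
    frac = np.clip(np.asarray(h_veg) / h_ref, 0.0, 1.0)
    return n_bare + (n_max - n_bare) * frac

veg_heights = [0.0, 0.2, 0.5, 1.0, 3.0]   # per-node LiDAR heights (m)
print(manning_n_from_vegetation(veg_heights))
```

With such a mapping, friction varies node by node instead of being one calibrated floodplain-wide constant.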
Abstract:
Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining this information with environmental models will be discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with a dense and accurate floodplain topography together with vegetation heights for parameterisation of model friction. The vegetation height data can be used to specify a friction factor at each node of a model's finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model's finite element mesh to reflect floodplain features such as hedges and trees that have different frictional properties from their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data.

The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone, and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections. The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, then a high-level processing stage improves the network using domain knowledge. The approach adopted at the low level uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges together to form channels. The higher-level processing includes a channel repair mechanism.
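A hedged sketch of the low-level stage only: detect candidate channel edges at several smoothing scales and keep pixels that respond at any scale. Pairing adjacent anti-parallel edges into channels and the knowledge-based repair stage are not shown; the scales and threshold are hypothetical.

```python
import numpy as np
from scipy import ndimage

def multiscale_edges(dtm, sigmas=(1.0, 2.0, 4.0), thresh=0.3):
    """Union of strong gradient-magnitude responses across scales."""
    edges = np.zeros(dtm.shape, dtype=bool)
    for s in sigmas:
        gm = ndimage.gaussian_gradient_magnitude(dtm, sigma=s)
        edges |= gm > thresh * gm.max()
    return edges

# Toy inter-tidal surface: a gentle slope with one incised "channel".
yy, xx = np.mgrid[0:128, 0:128].astype(float)
dtm = 0.01 * yy - 0.5 * np.exp(-((xx - 64) / 4.0)**2)
print(multiscale_edges(dtm).sum(), "candidate edge pixels")
```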
Abstract:
A primary objective of agri-environment schemes is the conservation of biodiversity; in addition to increasing the value of farmland for wildlife, these schemes also aim to restore natural ecosystem functioning. The management of scheme options can influence their value for delivering ecosystem services by modifying the composition of floral and faunal communities. This study examines the impact of an agri-environment scheme prescription on ecosystem functioning by testing the hypothesis that vegetation management influences decomposition rates in grassy arable field margins. The effects of two vegetation management practices in arable field margins - cutting and soil disturbance (scarification) - on litter decomposition were compared using a litterbag experimental approach in early April 2006. Bags had either a small mesh designed to restrict access by soil macrofauna, or a large mesh that would allow macrofauna to enter. Bags were positioned on the soil surface or inserted into the soil in cut and scarified margins, retrieved after 44, 103 and 250 days, and the amount of litter mass remaining was calculated. Litter loss from the litterbags with large mesh was greater than from the small mesh bags, providing evidence that soil macrofauna accelerate rates of litter decomposition. In the large mesh bags, the proportion of litter remaining in bags above and belowground in the cut plots was similar, while in the scarified plots there was significantly more litter left in the aboveground bags than in the belowground bags. This loss of balance between decomposition rates above and belowground in scarified margins may have implications for the development and maintenance of grassy arable field margins by influencing nutrient availability for plant communities.
Abstract:
Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision-making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information.
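To make the net benefit parameter concrete: for a willingness-to-pay lambda per unit of health benefit, a patient's net benefit is NB = lambda * effect - cost, and the new intervention is judged cost-effective if the mean incremental net benefit (new minus standard) is positive. Below is a minimal frequentist-style sketch with a bootstrap interval; the data are simulated, not the hernia-trial data, and lambda is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 20000.0   # willingness to pay per unit of health benefit (arbitrary)

# Simulated per-patient effects and costs for two arms of 200 patients.
eff_new, cost_new = rng.normal(0.80, 0.1, 200), rng.normal(1500, 300, 200)
eff_std, cost_std = rng.normal(0.75, 0.1, 200), rng.normal(1200, 300, 200)

nb_new = lam * eff_new - cost_new
nb_std = lam * eff_std - cost_std
inb = nb_new.mean() - nb_std.mean()   # incremental net benefit

# Nonparametric bootstrap interval for the incremental net benefit.
boot = [nb_new[rng.integers(0, 200, 200)].mean()
        - nb_std[rng.integers(0, 200, 200)].mean()
        for _ in range(2000)]
print(inb, np.percentile(boot, [2.5, 97.5]))
```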
Abstract:
As Virtual Reality pushes the boundaries of the human-computer interface, new ways of interaction are emerging. One such technology is the integration of haptic interfaces (force-feedback devices) into virtual environments. This modality offers an improved sense of immersion over that achieved when relying only on audio and visual modalities. The paper introduces some of the technical obstacles, such as latency and network traffic, that must be overcome to maintain a high degree of immersion during haptic tasks. The paper describes the advantages of integrating haptic feedback into systems, and presents some of the technical issues inherent in a networked haptic virtual environment. A generic control interface has been developed to mesh seamlessly with existing networked VR development libraries.
Abstract:
Significant performance gains can potentially be achieved by employing distributed space-time block coding (D-STBC) in ad hoc or mesh networks. So far, however, most research on D-STBC has assumed that cooperative relay nodes are perfectly synchronized. Considering the difficulty of meeting such an assumption in many practical systems, this paper proposes a simple and near-optimum detection scheme for the case of two relay nodes, which proves able to handle far greater timing misalignment than the conventional STBC detector.
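For context, a hedged sketch of the baseline such work improves on: conventional Alamouti-type STBC detection, which assumes the two relay transmissions are perfectly symbol-synchronized. The toy channel and symbols are illustrative, and noise is omitted.

```python
import numpy as np

def alamouti_detect(r1, r2, h1, h2):
    """Linear Alamouti combining over two symbol periods; returns
    decision statistics proportional to (|h1|^2 + |h2|^2) * s1, s2."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    return s1_hat, s2_hat

rng = np.random.default_rng(1)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)   # QPSK symbols
h1, h2 = rng.normal(size=2) + 1j * rng.normal(size=2)   # flat fading

r1 = h1 * s1 + h2 * s2                        # first symbol period
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)     # second symbol period
print(alamouti_detect(r1, r2, h1, h2))
```

With timing misalignment between relays these two received-signal equations no longer hold exactly, which is what motivates the modified detector.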
Abstract:
An efficient algorithm is presented for the solution of the steady Euler equations of gas dynamics. The scheme is based on solving linearised Riemann problems approximately and, in more than one dimension, incorporates operator splitting. The scheme is applied to a standard test problem of flow down a channel containing a circular arc bump, for three different mesh sizes.
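A minimal sketch of the core ingredient named above, reduced for clarity to the scalar Burgers equation u_t + (u^2/2)_x = 0 with a Roe-type linearised wave speed; the paper's scheme for the full Euler system (with operator splitting in several dimensions) is more involved.

```python
import numpy as np

def roe_flux(uL, uR):
    """Upwind flux from an approximately solved, linearised Riemann
    problem: the Roe-type speed a = (uL + uR)/2 selects the upwind state."""
    f = lambda u: 0.5 * u * u
    a = 0.5 * (uL + uR)
    return np.where(a >= 0.0, f(uL), f(uR))

x = np.linspace(0.0, 1.0, 201)
dx = x[1] - x[0]
u = np.where(x < 0.5, 1.0, 0.0)        # right-moving shock
for _ in range(100):                   # dt/dx * max|u| = 0.8 (CFL-stable)
    F = roe_flux(u[:-1], u[1:])        # interface fluxes
    u[1:-1] -= 0.004 / dx * (F[1:] - F[:-1])
print(u[135:145])                      # shock has moved to x ~ 0.7
```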
Abstract:
We present a finite difference scheme, with the TVD (total variation diminishing) property, for scalar conservation laws. The scheme applies to non-uniform meshes, allowing for variable mesh spacing, and does not use upstream weighting. When applied to systems of conservation laws, the scheme requires no scalar decomposition and no artificial tuning parameters, which leads to an efficient, robust algorithm.
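A hedged sketch of one TVD ingredient on a non-uniform mesh, for linear advection u_t + u_x = 0: minmod-limited slopes computed with the true local spacings feed upwind interface fluxes. This illustrates the variable-spacing idea only; it is not the paper's scheme.

```python
import numpy as np

def minmod(a, b):
    """Slope limiter: smaller-magnitude slope when signs agree, else 0."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

rng = np.random.default_rng(2)
xf = np.linspace(0.0, 1.0, 201)
xf[1:-1] += rng.uniform(-0.4, 0.4, 199) / 200.0   # jitter interior faces
xc = 0.5 * (xf[:-1] + xf[1:])                     # cell centres
w = np.diff(xf)                                   # variable cell widths
u = np.exp(-100.0 * (xc - 0.3)**2)
dt = 0.4 * w.min()                                # CFL-limited step

for _ in range(300):
    s = np.zeros_like(u)
    s[1:-1] = minmod((u[1:-1] - u[:-2]) / (xc[1:-1] - xc[:-2]),
                     (u[2:] - u[1:-1]) / (xc[2:] - xc[1:-1]))
    F = u[:-1] + s[:-1] * (xf[1:-1] - xc[:-1])    # upwind interface states
    u[1:-1] -= dt / w[1:-1] * (F[1:] - F[:-1])
print(u.max())   # the limited update creates no new maximum
```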
Abstract:
An algorithm based on flux difference splitting is presented for the solution of two-dimensional, open channel flows. A transformation maps a non-rectangular physical domain into a rectangular one. The governing equations are then the shallow water equations, including slope and friction source terms, in a generalized coordinate system. A regular mesh on a rectangular computational domain can then be employed. The resulting scheme has good jump-capturing properties and the advantage of using boundary/body-fitted meshes. The scheme is applied to a problem of flow in a river whose geometry induces a region of supercritical flow.
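For reference, a schematic statement of the governing equations in Cartesian form, before the transformation to generalized coordinates (the notation is generic, not taken from the paper):

$$
\frac{\partial \mathbf{U}}{\partial t}+\frac{\partial \mathbf{F}}{\partial x}+\frac{\partial \mathbf{G}}{\partial y}=\mathbf{S},\qquad
\mathbf{U}=\begin{pmatrix}h\\ hu\\ hv\end{pmatrix},\quad
\mathbf{F}=\begin{pmatrix}hu\\ hu^{2}+\tfrac{1}{2}gh^{2}\\ huv\end{pmatrix},\quad
\mathbf{G}=\begin{pmatrix}hv\\ huv\\ hv^{2}+\tfrac{1}{2}gh^{2}\end{pmatrix},\quad
\mathbf{S}=\begin{pmatrix}0\\ gh\,(S_{0x}-S_{fx})\\ gh\,(S_{0y}-S_{fy})\end{pmatrix},
$$

where $h$ is the depth, $(u,v)$ the depth-averaged velocities, $S_0$ the bed slopes and $S_f$ the friction slopes; flux difference splitting is applied to the transformed analogue of this system on the rectangular computational domain.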
Abstract:
A solution has been found to the long-standing problem of experimental modelling of the interfacial instability in aluminium reduction cells. The idea is to replace the electrolyte overlying the molten aluminium with a mesh of thin rods supplying current directly down into the liquid metal layer. This eliminates electrolysis altogether, along with all the problems associated with it, such as high temperature, chemical aggressiveness of the media, products of electrolysis, the necessity for electrolyte renewal and high power demands. The result is a versatile, room-temperature laboratory model which simulates the Sele-type rolling-pad interfacial instability. Our new, safe laboratory model enables detailed experimental investigations to test the existing theoretical models for the first time.
Abstract:
This paper introduces PSOPT, an open source optimal control solver written in C++. PSOPT uses pseudospectral and local discretizations, sparse nonlinear programming and automatic differentiation, and incorporates automatic scaling and mesh refinement facilities. The software is able to solve complex optimal control problems including multiple phases, delayed differential equations, nonlinear path constraints, interior point constraints, integral constraints, and free initial and/or final times. The software does not require any non-free platform, not even the operating system, as it is able to run under Linux. Additionally, the software generates plots as well as LaTeX code so that its results can easily be included in publications. An illustrative example is provided.
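Schematically, a single-phase instance of the problem class such solvers target (multi-phase, delayed and interior-point variants add further structure; the notation is generic, not PSOPT's own):

$$
\min_{u(\cdot),\,t_0,\,t_f}\; \varphi\big(x(t_0),x(t_f),t_0,t_f\big)+\int_{t_0}^{t_f} L\big(x(t),u(t),t\big)\,dt
$$
$$
\text{subject to}\quad \dot{x}(t)=f\big(x(t),u(t),t\big),\qquad
g_L \le g\big(x(t),u(t),t\big) \le g_U,\qquad
e_L \le e\big(x(t_0),x(t_f),t_0,t_f\big) \le e_U,
$$

with bounds on the states and controls, and with $t_0$ and $t_f$ possibly free. The pseudospectral or local discretization turns this into a sparse nonlinear program.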
Abstract:
We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
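A small sketch of the graded-mesh construction described above: on a side of length L, mesh points can be placed algebraically graded towards the corner at s = 0, so elements shrink where the solution is most singular. The grading exponent q is a tunable parameter; the value below is hypothetical.

```python
import numpy as np

def graded_mesh(L, n, q=3.0):
    """n+1 points on [0, L], algebraically graded towards s = 0."""
    return L * (np.arange(n + 1) / n) ** q

print(graded_mesh(1.0, 8))   # smallest elements nearest the corner
```

On each element the basis functions are then piecewise polynomials multiplied by the oscillatory plane-wave factors.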
Abstract:
We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes: a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on both illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.
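Schematically, such hybrid spaces represent the boundary unknown (for the sound soft problem, the normal derivative of the total field) as a known leading-order term plus oscillatory factors times slowly varying piecewise polynomials:

$$
\frac{\partial u}{\partial n}(s)\;\approx\;\Psi(s)\;+\;e^{\,\mathrm{i}ks}\,v_{+}(s)\;+\;e^{-\mathrm{i}ks}\,v_{-}(s),
$$

where $k$ is the wavenumber, $\Psi$ is a physical-optics-type term, and $v_{\pm}$ are piecewise polynomials supported on the overlapping uniform and graded meshes; only the slowly varying $v_{\pm}$ need be resolved, which is what keeps the degree-of-freedom count logarithmic in frequency. This is the generic form of such hybrid spaces; the paper's precise ansatz may differ.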
Abstract:
Commissioned by Frieze Art for the Frieze Sculpture Park. The project presents the image of a sculpture as a sculpture, installed in the form of a large-scale digital print on vinyl stretched over a 14 x 28ft (4.2 x 8.4m) stretcher supported by a scaffolding structure. The image itself depicts a futuristic public sculpture, an 'impossible' artwork, referencing Ballard's descriptions in his book 'Vermilion Sands'. The work also draws upon examples of rococo ornamentation and the compositional conventions of 'images of sculpture' (in art magazines, catalogues, publicity photos), including examples sited in Regent's Park in previous years. Technical details: the image is printed on vinyl, stretched over a 14 x 28ft (4.2 x 8.4m) wooden stretcher and fixed to a deep buttressed scaffold 8m long by 6.23m deep, with IBC water tanks on the back edge as kentledge (4 x 1-tonne IBC water containers, 1 per bay). The structure is constructed from clean silver Layher system scaffold and wrapped in dense black mesh netting.