Abstract:
The European research project TIDE (Tidal Inlets Dynamics and Environment) is developing and validating coupled models describing the morphological, biological and ecological evolution of tidal environments. The interactions between the physical and biological processes occurring in these regions require that the system be studied as a whole rather than as separate parts. Extensive use of remote sensing, including LiDAR, is being made to provide validation data for the modelling. This paper describes the different uses of LiDAR within the project and their relevance to the TIDE science objectives. LiDAR data have been acquired from three different environments: the Venice Lagoon in Italy, Morecambe Bay in England, and the Eden estuary in Scotland. LiDAR accuracy at each site has been evaluated using ground reference data acquired with differential GPS. A semi-automatic technique has been developed to extract tidal channel networks from LiDAR data, either used alone or fused with aerial photography. While the resulting networks may require some correction, the procedure does allow network extraction over large areas using objective criteria and reduces fieldwork requirements. The extracted networks may subsequently be used in geomorphological analyses, for example to describe the drainage patterns they induce and to examine their rate of change. Estimation of the heights of the low, sparse vegetation on marshes is being investigated by analysing the statistical distribution of the measured LiDAR heights. Species having different mean heights may be separated using the first-order moments of the height distribution.
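The species separation step can be illustrated with a minimal Python sketch (not project code; the plot heights, ground elevations and the 0.3 m threshold below are hypothetical and would in practice be derived from the differential GPS ground reference data):

```python
import numpy as np

def mean_canopy_height(lidar_z, ground_z):
    """First-order moment (mean) of LiDAR heights above the local ground surface."""
    residuals = np.asarray(lidar_z) - np.asarray(ground_z)
    return residuals.mean()

def classify_plot(lidar_z, ground_z, threshold=0.3):
    """Label a marsh plot by comparing its mean canopy height with a species threshold.

    The 0.3 m threshold and the species labels are hypothetical; in practice they
    would be calibrated against ground reference measurements at each site.
    """
    return "tall species" if mean_canopy_height(lidar_z, ground_z) > threshold else "short species"

# Hypothetical example: LiDAR returns and interpolated ground heights for one plot (metres).
plot_z = [2.41, 2.55, 2.48, 2.62, 2.50]
ground = [2.10, 2.12, 2.11, 2.13, 2.10]
print(classify_plot(plot_z, ground))  # -> "tall species"
```

Comparing only the first-order moment keeps the classifier simple; species whose mean heights are close together would need additional statistics of the height distribution.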
Abstract:
Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than, say, 1 m high, yet typically most of a floodplain may be covered in such vegetation. We have attempted to extend vegetation height measurement to short vegetation using local height texture. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying; this obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear whether the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (Mastermap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed, as will the related problem of how best to merge historic river cross-section data with a LiDAR DTM. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for example, for a 5 m-wide embankment within a raster grid model with a 15 m cell size, the maximum local height of the embankment could be assigned to each cell covering it. But how could a 5 m-wide ditch be represented? This redundancy has also been exploited to improve wetting/drying algorithms, using the sub-grid-scale LiDAR heights within finite elements at the waterline.
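To make the spatially varying friction idea concrete, the following sketch (our illustration, not the project's code; the height classes and Manning's n values are placeholders) converts a LiDAR vegetation height map into a per-cell friction coefficient map:

```python
import numpy as np

def friction_from_vegetation(height_map):
    """Map LiDAR vegetation heights (m) to Manning's n friction coefficients.

    The class boundaries and n values below are purely illustrative; a real model
    would calibrate them or take them from the literature for each vegetation type.
    """
    h = np.asarray(height_map, dtype=float)
    n = np.full(h.shape, 0.03)          # default: short grass
    n[(h >= 0.5) & (h < 2.0)] = 0.06    # scrub / tall herbaceous vegetation
    n[h >= 2.0] = 0.10                  # hedges and trees
    return n

# Hypothetical 3x3 vegetation height map (m), e.g. from a LiDAR DSM-minus-DTM difference.
veg = np.array([[0.1, 0.4, 1.2],
                [0.2, 2.5, 3.0],
                [0.0, 0.6, 0.3]])
print(friction_from_vegetation(veg))
```

In a finite element model the same lookup would be applied at each mesh node rather than at each raster cell.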
Abstract:
Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining this information with environmental models will be discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with dense and accurate floodplain topography together with vegetation heights for parameterisation of model friction. The vegetation height data can be used to specify a friction factor at each node of a model's finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model's finite element mesh to reflect floodplain features such as hedges and trees having different frictional properties to their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data. The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone, and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections. The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, and a high-level processing stage then improves the network using domain knowledge. The low-level approach uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges together to form channels. The higher-level processing includes a channel repair mechanism.
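A simplified sketch of the low-level, multi-scale edge detection stage is given below (an illustration only: gradient-magnitude thresholding at several Gaussian scales stands in for the project's edge detector, the threshold and sigma values are arbitrary, and the anti-parallel edge association and channel repair stages are only indicated in the comments):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiscale_channel_edges(dem, sigmas=(1.0, 2.0, 4.0), grad_threshold=0.1):
    """Detect candidate channel edges in a LiDAR DEM at several smoothing scales.

    At each scale the DEM is Gaussian-smoothed and pixels with a strong height
    gradient are flagged as edges; edges from all scales are combined with a
    logical OR. In the approach described above, adjacent anti-parallel edges
    would then be associated to form channel fragments before the
    knowledge-based repair stage.
    """
    dem = np.asarray(dem, dtype=float)
    edges = np.zeros(dem.shape, dtype=bool)
    for sigma in sigmas:
        smoothed = gaussian_filter(dem, sigma)
        gy, gx = np.gradient(smoothed)
        edges |= np.hypot(gx, gy) > grad_threshold
    return edges

# Hypothetical tiny DEM (heights in metres) with a channel along the middle columns.
dem = np.ones((64, 64))
dem[:, 30:34] -= 0.5
print(int(multiscale_channel_edges(dem).sum()), "candidate edge pixels")
```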
Progress on “Changing coastlines: data assimilation for morphodynamic prediction and predictability”
Abstract:
The task of assessing the likelihood and extent of coastal flooding is hampered by the lack of detailed information on near-shore bathymetry. This is required as an input for coastal inundation models, and in some cases variability in the bathymetry can affect the prediction of which areas are likely to be flooded in a storm. The constant monitoring and data collection that would be required to characterise the near-shore bathymetry over large coastal areas is impractical, leaving the option of running morphodynamic models to predict the likely bathymetry at any given time. However, if the models are inaccurate, the errors may be significant if incorrect bathymetry is used to predict possible flood risks. This project is assessing the use of data assimilation techniques to improve the predictions from a simple model by rigorously incorporating observations of the bathymetry into the model, bringing the model closer to the actual situation. We are currently concentrating on Morecambe Bay as a primary study site, as it has a highly dynamic inter-tidal zone, where changes in the course of channels affect the likely locations of flooding from storms. We are working with SAR images, LiDAR, and swath bathymetry to provide observations over a 2.5-year period running from May 2003 to November 2005. We have a LiDAR image of the entire inter-tidal zone for November 2005 to use as validation data. We have implemented a 3D-Var data assimilation scheme to investigate the improvements in performance compared with the previous scheme, which was based on the optimal interpolation method. We are currently evaluating these different data assimilation techniques using 22 SAR observations. We will also include the LiDAR data and swath bathymetry to improve the observational coverage, and investigate the impact of different types of observation on the predictive ability of the model. We are also assessing the ability of the data assimilation scheme to recover the correct bathymetry after storm events, which can dramatically change the bathymetry in a short period of time.
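For readers unfamiliar with the assimilation step, a toy 3D-Var sketch is shown below (illustrative only: the four-point state, the simple observation operator and the diagonal covariances are hypothetical and bear no relation to the Morecambe Bay configuration):

```python
import numpy as np
from scipy.optimize import minimize

def three_dvar(xb, y, H, B, R):
    """Minimise J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx)."""
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

    def cost(x):
        db = x - xb       # departure from the background (model forecast) state
        do = y - H @ x    # departure from the observations
        return db @ Binv @ db + do @ Rinv @ do

    return minimize(cost, xb).x

# Toy example: 4 bathymetry grid points, 2 of them directly observed.
xb = np.array([1.0, 1.2, 0.8, 1.1])           # background depths (m)
H = np.array([[1, 0, 0, 0], [0, 0, 1, 0.]])   # observation operator
y = np.array([0.7, 1.0])                      # observed depths at those points
B = 0.2 * np.eye(4)                           # background error covariance
R = 0.05 * np.eye(2)                          # observation error covariance
print(three_dvar(xb, y, H, B, R))
```

The analysis moves the background depths towards the two observed values by an amount controlled by the ratio of the background and observation error covariances.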
Abstract:
Orlistat is an effective weight-loss medicine which will soon be available for purchase in pharmacies. Using a factorial experiment, we found that informing people that this medicinal product, previously available only on prescription, could now be purchased resulted in higher ratings of perceived value and effectiveness than for a natural health supplement, even though the same statement about effectiveness was used for both. This positive perception of orlistat was not impaired by the provision of side-effect information. As orlistat will soon be available in pharmacies, health professionals must act to prevent its misuse by those who are not overweight.
Abstract:
Rationale: In UK hospitals, the preparation of all total parenteral nutrition (TPN) products must be carried out in the pharmacy, as TPNs are categorised as high-risk injectables (NPSA/2007/20). The National Aseptic Error Reporting Scheme has been collecting data on pharmacy compounding errors in the UK since August 2003. This study reports on the types of error associated with the preparation of TPNs, including the stage at which these were identified and the potential and actual patient outcomes. Methods: Reports of compounding errors for the period January 2004 to March 2007 were analysed in an Excel spreadsheet. Results: Of a total of 3691 compounding error reports, 674 (18%) related to TPN products: 548 adult vs. 126 paediatric. A significantly higher proportion of adult TPNs (28% vs. 13% paediatric) were associated with labelling errors, and a significantly higher proportion of paediatric TPNs (25% vs. 15% adult) were associated with incorrect transcriptions (Chi-Square Test; p<0.005). Labelling errors were identified in similar proportions by pharmacists (42%) and technicians (48%), with technicians detecting them mainly at first check and pharmacists at final check. Transcription errors were identified mainly by technicians (65% vs. 27% pharmacists) at first check. Incorrect drug selection (13%) and calculation errors (9%) were associated with adult and paediatric TPN preparations in the same ratio. One paediatric TPN error detected at first check was considered potentially catastrophic; 31 (5%) errors were considered of major and 38 (6%) of moderate potential consequence. Five errors (2 moderate, 1 minor) were identified during or after administration. Conclusions: While recent UK patient safety initiatives are aimed at improving the safety of injectable medicines in clinical areas, the current study highlights safety problems that exist within pharmacy production units. This could be used in the creation of an error management tool for TPN compounding processes within hospital pharmacies.
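The adult vs. paediatric labelling-error comparison can be reproduced approximately as follows (assumption: the counts are reconstructed from the rounded percentages reported above, so the test statistic and p-value are only indicative):

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages:
# 28% of 548 adult TPNs and 13% of 126 paediatric TPNs had labelling errors.
adult_label, adult_other = 153, 548 - 153
paed_label, paed_other = 16, 126 - 16

table = [[adult_label, adult_other],
         [paed_label, paed_other]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```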
Abstract:
Previous work has established the value of goal-oriented approaches to requirements engineering. Achieving clarity and agreement about stakeholders’ goals and assumptions is critical for building successful software systems and managing their subsequent evolution. In general, this decision-making process requires stakeholders to understand the implications of decisions outside the domains of their own expertise. Hence it is important to support goal negotiation and decision making with description languages that are both precise and expressive, yet easy to grasp. This paper presents work in progress to develop a pattern language for describing goal refinement graphs. The language has a simple graphical notation, which is supported by a prototype editor tool, and a symbolic notation based on modal logic.
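As a rough illustration of how a goal refinement graph might be represented in code (our own simplification; the class, the AND/OR encoding and the example goals are hypothetical and are not the notation proposed in the paper):

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A node in a goal refinement graph.

    A goal is satisfied if any one of its refinements is satisfied (OR between
    alternative refinements), and a refinement is satisfied only if all of its
    subgoals are (AND within a refinement). Leaf goals carry an explicit status.
    """
    name: str
    refinements: list = field(default_factory=list)  # list of lists of subgoals
    achieved: bool = False                           # only meaningful for leaves

    def satisfied(self) -> bool:
        if not self.refinements:
            return self.achieved
        return any(all(g.satisfied() for g in ref) for ref in self.refinements)

# Hypothetical example: "secure access" refined into two alternative strategies.
password = Goal("password login", achieved=True)
audit = Goal("audit logging", achieved=False)
sso = Goal("single sign-on", achieved=True)
secure = Goal("secure access", refinements=[[password, audit], [sso]])
print(secure.satisfied())  # True, via the single sign-on refinement
```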
Abstract:
In real-world applications, sequential data mining and data exploration algorithms are often unsuitable for datasets of enormous size, high dimensionality and complex structure. Grid computing promises access to large-scale computing and storage resources, and in this context there is a need to develop high-performance distributed data mining algorithms. However, the computational complexity of the problem and the large amount of data to be explored often make the design of large-scale applications particularly challenging. In this paper we present the first distributed formulation of a frequent subgraph mining algorithm for discriminative fragments of molecular compounds. Two distributed approaches have been developed and compared on the well-known National Cancer Institute's HIV-screening dataset. We present experimental results on a small-scale computing environment.
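As a generic illustration of one way fragment support counting can be distributed (this partition-and-merge pattern is our own sketch and not necessarily either of the two approaches developed in the paper; the molecules and fragment labels are hypothetical):

```python
from collections import Counter
from multiprocessing import Pool

def count_partition(partition):
    """Count, within one data partition, how many molecules contain each fragment."""
    counts = Counter()
    for fragments in partition:   # each molecule is represented by its set of fragments
        counts.update(fragments)
    return counts

def distributed_support(molecules, n_workers=2):
    """Partition the molecule set, count fragment support per worker, merge the counts."""
    partitions = [molecules[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partial = pool.map(count_partition, partitions)
    total = Counter()
    for counts in partial:
        total.update(counts)
    return total

if __name__ == "__main__":
    # Hypothetical molecules described by the (pre-extracted) fragments they contain.
    mols = [{"C-C-O", "C=O"}, {"C-C-O"}, {"C=O", "N-H"}, {"C-C-O", "N-H"}]
    print(distributed_support(mols))
```

Discriminative fragments would then be selected by comparing the merged support counts between active and inactive compound classes.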
Abstract:
This paper is an engineer's appreciation of environmental assessment, with particular reference to highway development. While a scheme-related Environmental Assessment for an individual development may identify particular potential impacts, and may avoid or minimise some of the problems, in many cases it comes too late for such actions to be taken. Ideally, Environmental Assessment should commence at the strategic level, covering policies, plans and programmes, and the scheme-related Environmental Assessments for individual projects should supplement those carried out within that strategic framework. The ultimate target is to assess policies for their contribution to achieving sustainable development. Whole Life Environmental Impacts should be considered: the full impacts from planning, design and choice of materials, construction and operation through to final decommissioning. Most Environmental Assessments have not included Whole Life Environmental Impacts. There is only limited monitoring during the operation stage after construction of a scheme is complete; therefore, subsequent Environmental Assessments cannot benefit from feedback on the scheme. No development should cost the Earth, hence Environmental Assessments have to be carried out thoroughly to serve as one of the instruments for meeting the needs of sustainable development.
Abstract:
The Dog-in-a-Doublet Bridge Reconstruction Scheme integrates interdisciplinary design to provide a solution for several different needs: to provide a crossing able to carry the new 40-tonne loading requirement, to improve the visibility of the substandard junction, and to do so within the funding available. The management of the project involved the co-ordination of different authorities, statutory undertakers and other bodies. At certain stages there were negotiations with the RSPB on restricting the construction period from July to November; after re-assessment of the environmental impact of the construction on breeding and wintering birds, the restriction was waived. The bid for the assessment, strengthening and structural maintenance of bridges in the Cambridgeshire County Council Transport Policies and Programme No. 21 (1995/96), which included the Dog-in-a-Doublet Bridge Reconstruction Scheme, was unsuccessful in attracting Transport Supplementary Grant (TSG), so a series of temporary measures had to be undertaken until funding becomes available for its replacement.