980 results for Remote sensing techniques
Abstract:
In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues involved in linking a standard current-generation 2½D GIS with several existing model codes. The focus of the project is the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package implementing a technique that brought together the significant spatially varying parameters. That technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task, and the added functionality provided by the general-purpose GIS (the data capture, manipulation and visualisation facilities) was of great benefit. The bulk of the project is concerned with the issues raised by linking GIS and environmental process models. To this end, a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS, and a crop model was implemented within the GIS. A loose-linked approach was adopted, and secondary and surrogate data were used wherever possible. The issues examined include: the justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; how to control the movement of data between models of environmental subsystems in order to model the total system; the advantages and disadvantages of using a current-generation GIS as a medium for linking environmental process models; the generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.
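In a loose-linked coupling of this kind, data typically move between the GIS and each process model through intermediate files rather than a shared memory space. The following minimal sketch illustrates that file-transfer pattern; the file names, grid format and model executable are hypothetical placeholders, not the package developed in the project.

```python
# Minimal sketch of the loose-linked (file-transfer) coupling pattern:
# the GIS exports a raster as an ASCII grid, a converter reshapes it into
# the process model's input format, the model runs as an external program,
# and its output is converted back for re-import into the GIS. File names
# and the model executable are hypothetical placeholders.
import subprocess
import numpy as np

def read_ascii_grid(path):
    """Read a simple GRASS-style ASCII grid: 'key: value' header lines
    followed by row-major data values."""
    header, values = {}, []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0].endswith(":"):       # header line
                header[parts[0][:-1].lower()] = float(parts[1])
            else:                                      # data line
                values.extend(float(v) for v in parts)
    rows, cols = int(header["rows"]), int(header["cols"])
    return header, np.asarray(values).reshape(rows, cols)

# 1. Take the aquifer-property raster exported from the GIS.
header, transmissivity = read_ascii_grid("transmissivity.asc")

# 2. Reformat it as the groundwater model's input array (free-format text).
np.savetxt("model_input.dat", transmissivity, fmt="%.4e")

# 3. Run the external model code, then pick up its output.
subprocess.run(["groundwater_model", "model_input.dat"], check=True)
heads = np.loadtxt("model_output.dat").reshape(transmissivity.shape)

# 4. Write the result back as an ASCII grid for re-import into the GIS.
with open("heads.asc", "w") as f:
    f.write(f"rows: {int(header['rows'])}\ncols: {int(header['cols'])}\n")
    np.savetxt(f, heads, fmt="%.4f")
```

The design choice this illustrates is the one argued for in the abstract: each code keeps its own data model, and only thin converters need to be written.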
Abstract:
This thesis begins by reviewing techniques for interpreting the thermal response at the earth's surface acquired using remote sensing technology. Historic limitations in the precision with which imagery acquired from airborne platforms can be geometrically corrected and co-registered have meant that relatively little work has been carried out examining the diurnal variation of surface temperature over wide regions. Although emerging remote sensing systems offer the potential to register temporal image data to satisfactory levels of accuracy, this technology is still not widely available and does not address the problem of historic data sets, which cannot be rectified using conventional parametric approaches. To overcome these problems, the second part of this thesis describes the development of an alternative approach for rectifying airborne line-scanned imagery. The underlying assumption that scan lines within the imagery are straight greatly reduces the number of ground control points required to describe the image geometry. Furthermore, the use of pattern-matching procedures to identify geometric disparities between raw line-scanned imagery and corresponding aerial photography enables the correction procedure to be almost fully automated. By reconstructing the raw image data on a truly line-by-line basis, it is possible to register the airborne line-scanned imagery to the aerial photography with an average accuracy of better than one pixel. Provided that corresponding aerial photography is available, this approach can be applied in the absence of platform altitude information, allowing multi-temporal data sets to be corrected and registered.
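Because each scan line is assumed straight, the rectification can be driven by one match per line. The sketch below is a simplification of the thesis procedure: it estimates a per-line shift by normalized cross-correlation against co-registered reference photography. The single-shift-per-line model, array names and search range are illustrative assumptions.

```python
# Minimal sketch of per-scan-line pattern matching, assuming the raw
# line-scanned image and a geometrically correct reference image (scanned
# aerial photography) are available as 2-D numpy arrays at approximately
# the same scale. Array names and the shift model are illustrative.
import numpy as np

def line_offset(scan_line, ref_line, max_shift=20):
    """Estimate the along-line shift that best aligns scan_line with
    ref_line, using normalized cross-correlation and exploiting the
    assumption that each scan line is internally straight."""
    best_shift, best_score = 0, -np.inf
    a = (scan_line - scan_line.mean()) / (scan_line.std() + 1e-9)
    for s in range(-max_shift, max_shift + 1):
        b = np.roll(ref_line, s)
        b = (b - b.mean()) / (b.std() + 1e-9)
        score = float(np.dot(a, b)) / a.size
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

def rectify(raw, reference, max_shift=20):
    """Rebuild the image on a line-by-line basis, shifting each scan line
    to its best match in the reference image."""
    out = np.empty_like(raw)
    for i in range(raw.shape[0]):
        s = line_offset(raw[i], reference[i], max_shift)
        out[i] = np.roll(raw[i], -s)
    return out
```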
Abstract:
Urban regions present some of the most challenging areas for the remote sensing community. Many different types of land cover have similar spectral responses, making them difficult to distinguish from one another. Traditional per-pixel classification techniques suffer particularly badly because they use only these spectral properties to determine a class, ignoring other properties of the image such as context. This project presents the results of the classification of a densely urban area of Dudley, West Midlands, using four methods: Supervised Maximum Likelihood, SMAP, ECHO and Unsupervised Maximum Likelihood. An accuracy assessment method is then developed to allow a fair representation of each procedure and a direct comparison between them. Subsequently, a classification procedure is developed that makes use of the context in the image through a per-polygon classification. The imagery is broken up into a series of polygons extracted with the Marr-Hildreth zero-crossing edge detector. These polygons are then refined using a region-growing algorithm and classified according to the mean class of the refined polygons. The imagery produced by this technique is shown to be of better quality and higher accuracy than that of the other, conventional methods. Further refinements are suggested and examined to improve the aesthetic appearance of the imagery. Finally, a comparison with the results of a previous study of the James Bridge catchment in Darlaston, West Midlands, shows that the polygon-classified ATM imagery performs significantly better than the Maximum Likelihood classified videography used in the initial study, despite the presence of geometric correction errors.
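A minimal sketch of the per-polygon idea, under illustrative assumptions: edges from the Marr-Hildreth (Laplacian-of-Gaussian zero-crossing) operator partition the image into regions, and each region is relabelled with the majority class of a prior per-pixel classification. The region-growing refinement step is omitted and all names are illustrative.

```python
# Minimal sketch of per-polygon classification: Marr-Hildreth edges
# partition the image into polygons, and each polygon takes the majority
# class of an existing per-pixel classification.
import numpy as np
from scipy import ndimage

def marr_hildreth_edges(image, sigma=2.0):
    """Zero crossings of the Laplacian of Gaussian mark edge pixels."""
    log = ndimage.gaussian_laplace(image.astype(float), sigma)
    zc = np.zeros_like(log, dtype=bool)
    zc[:-1, :] |= np.signbit(log[:-1, :]) != np.signbit(log[1:, :])
    zc[:, :-1] |= np.signbit(log[:, :-1]) != np.signbit(log[:, 1:])
    return zc

def per_polygon_classify(image, per_pixel_classes, sigma=2.0):
    """Assign every interior region (polygon) its majority per-pixel class."""
    edges = marr_hildreth_edges(image, sigma)
    regions, n = ndimage.label(~edges)          # polygons between edges
    result = per_pixel_classes.copy()
    for r in range(1, n + 1):
        mask = regions == r
        vals, counts = np.unique(per_pixel_classes[mask], return_counts=True)
        result[mask] = vals[np.argmax(counts)]  # majority vote per polygon
    return result
```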
Abstract:
This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined suffered from severe constraints on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for data from Thailand. Estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image processing work on Mozambique showed that satellite remote sensing was a useful tool for stratifying vegetation cover at the provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.
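The declarative style referred to above expresses land evaluation as rules over land-unit attributes. A minimal sketch of that style follows, rendered in Python rather than Prolog; the rules, attribute names and thresholds are illustrative placeholders, not the knowledge base used in the study.

```python
# Minimal sketch of rule-based land evaluation in a declarative style:
# a land unit is matched against an ordered list of condition/class rules,
# Prolog-fashion. All rules and thresholds below are illustrative.
SUITABILITY_RULES = [
    # (condition over land-unit attributes, suitability class)
    (lambda u: u["slope_pct"] > 30, "N - not suitable (steep slope)"),
    (lambda u: u["drainage"] == "poor" and u["crop"] == "rice",
     "S1 - highly suitable (paddy rice tolerates poor drainage)"),
    (lambda u: u["soil_depth_cm"] < 50, "S3 - marginally suitable (shallow soil)"),
]

def evaluate(unit, rules=SUITABILITY_RULES, default="S2 - moderately suitable"):
    """Return the class of the first matching rule."""
    for condition, suitability in rules:
        if condition(unit):
            return suitability
    return default

unit = {"slope_pct": 4, "drainage": "poor", "crop": "rice", "soil_depth_cm": 90}
print(evaluate(unit))  # S1 - highly suitable (paddy rice tolerates poor drainage)
```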
Abstract:
This thesis reports on the development of a technique to evaluate hydraulic conductivities in a soil (Snowcal) subject to freezing conditions. The technique draws on three distinctly different disciplines (nuclear physics, soil physics and remote sensing) to provide a non-destructive and reliable evaluation of hydraulic conductivity throughout a freezing test. Thermal neutron radiography is used to provide information on local water/ice contents at any time during the test. The experimental test rig is designed so that the soil matrix can be irradiated by a neutron beam from a nuclear reactor to obtain radiographs. The radiographs can then be interpreted, following a process of remote sensing image enhancement, to yield information on relative water/ice contents. Interpretation of the radiographs is carried out using image analysis equipment capable of distinguishing between 256 shades of grey. Remote sensing image-enhancement techniques are then employed to develop false-colour images which show the movement of water and the development of ice lenses in the soil. Instrumentation is incorporated in the soil in the form of psychrometer/thermocouples to record water potential, electrical resistance probes to enable ice and water to be differentiated on the radiographs, and thermocouples to record the temperature gradient. Water content determinations are made from the enhanced images and plotted against potential measurements to provide the moisture characteristic for the soil. With the relevant mathematical theory, pore water distributions are obtained and combined with water content data to give hydraulic conductivities. The values for hydraulic conductivity in the saturated soil and at the frozen fringe are compared with established values for silts and silty sands. The values are in general agreement and, with refinement, this non-destructive technique could afford useful information on a whole range of soils. The technique has value over other methods because ice lenses are actually seen forming in the soil, supporting the accepted theories of frost action. There are economic and experimental constraints to the work associated with the use of a nuclear facility; however, the technique is versatile, has already been applied to the study of moisture transfer in porous building materials, and could be developed further into other research areas.
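The false-colour enhancement described is, in essence, density slicing: the 256 grey levels of the digitized radiograph are partitioned into bands, each mapped to a distinct colour, so that small differences in water/ice content become visible. A minimal sketch, with illustrative band edges and colours:

```python
# Minimal sketch of density-slicing false-colour enhancement for an
# 8-bit grey-level image. Band edges and colours are illustrative.
import numpy as np

SLICES = [  # (upper grey-level bound, RGB colour)
    (64,  (0, 0, 255)),    # darkest band: highest attenuation
    (128, (0, 255, 255)),
    (192, (255, 255, 0)),
    (256, (255, 0, 0)),    # lightest band: lowest attenuation
]

def false_colour(grey):
    """Map an 8-bit grey image (H, W) to an RGB image by density slicing."""
    rgb = np.zeros(grey.shape + (3,), dtype=np.uint8)
    lower = 0
    for upper, colour in SLICES:
        band = (grey >= lower) & (grey < upper)
        rgb[band] = colour
        lower = upper
    return rgb
```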
Abstract:
Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurement of topography over large areas. Airborne LIDAR systems usually return a three-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometric objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted increasing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for the automated extraction of geometric information, because many raster image processing techniques cannot be applied directly to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to extract information about different kinds of geometric objects, such as terrain and buildings, from LIDAR automatically. These are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements: by gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation and buildings are removed, while ground data are preserved. Building measurements are then identified from the non-ground measurements using a region-growing algorithm based on a plane-fitting technique. Raw footprints for the segmented building measurements are derived by connecting boundary points, and are further simplified and adjusted by several proposed operations to remove the noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then adjusted. Since the adjusting operations designed for simple building models do not work well on general 2D topology, a 2D snake algorithm is proposed for topology adjustment; it consists of newly defined energy functions for topology adjusting and a linear algorithm for finding the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial and small residential buildings were employed to test the proposed framework, and the results demonstrate that it achieves very good performance.
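The progressive morphological filter named above is a published algorithm; the following minimal sketch shows its core loop, assuming the point cloud has already been interpolated onto a regular elevation grid and using illustrative parameter values.

```python
# Minimal sketch of a progressive morphological filter for ground/non-ground
# separation, applied to LIDAR points already gridded into an elevation
# array z (metres). Parameter values are illustrative.
import numpy as np
from scipy import ndimage

def progressive_morphological_filter(z, cell=1.0, windows=(3, 5, 9, 17),
                                     slope=0.3, dh0=0.5, dh_max=3.0):
    """Return a boolean grid flagging non-ground cells.

    Each pass opens the surface (erosion then dilation) with a growing
    window; cells whose elevation exceeds the opened surface by more than
    a window-dependent threshold are flagged as objects (vehicles,
    vegetation, buildings) and removed from the ground set."""
    surface = z.astype(float).copy()
    nonground = np.zeros(z.shape, dtype=bool)
    prev_w = 1
    for w in windows:
        opened = ndimage.grey_opening(surface, size=(w, w))
        # Elevation threshold grows with window size, capped for buildings.
        dh = min(dh0 + slope * (w - prev_w) * cell, dh_max)
        nonground |= (surface - opened) > dh
        surface = opened
        prev_w = w
    return nonground
```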
Abstract:
On the shallow continental shelf of northeastern Rio Grande do Norte, Brazil, important underwater geomorphological features can be found 6 km from the coastline: coral reefs, locally known as "parrachos". The present study aims to characterize and analyze the geomorphological and benthic-surface features, and the distribution of biogenic sediments, found in the parrachos at Rio do Fogo and the associated shallow platform, using remote sensing products and in situ data collection. This was made possible by sedimentological, bathymetric and geomorphological maps produced from band composites of images from the ETM+/Landsat-7, OLI/Landsat-8, MS/GeoEye and PAN/WorldView-1 satellite sensors, together with analysis of bottom sediment samples. These maps were analyzed, interpreted in an integrated fashion and validated in fieldwork, permitting the generation of a new geomorphological zoning of the shallow shelf under study and a geoenvironmental map of the parrachos at Rio do Fogo. The images used were subjected to digital image processing techniques. All data and information obtained were stored in a Geographic Information System (GIS) and can be made available to the scientific community. This shallow platform has a carbonate bottom composed mostly of algae. The sediment samples collected and analyzed can be classified as biogenic carbonate sands, being composed of about 75% calcareous algae. The most abundant classes are green algae, red algae, non-biogenic sediments (mineral grains), ancient algae and molluscs. Within the parrachos, the following features were mapped: the Barreta Channel, intertidal reefs, submerged reefs, spurs and grooves, pools, a sandy bank, an algal bank, sea grass, submerged roads and the Rio do Fogo Channel. This work presents new information about the geomorphology and evolution of the study area and can guide future decision-making in the handling and environmental management of the region.
Abstract:
Surveys of habitats critical to the survival of grey nurse sharks in south-east Queensland have mapped those critical habitats, gathered species inventories and developed protocols for the ecological monitoring of critical habitats in southern Queensland. This information has assisted stakeholders with habitat definition and effective management. In 2002, members of UniDive applied successfully for World Wide Fund for Nature Threatened Species Network funds to map the critical grey nurse shark habitats in south-east Queensland. UniDive members used the funding to survey, from the boats of local dive operators, Wolf Rock at Double Island Point; Gotham, Cherub's Cave, Henderson's Rock and China Wall at North Moreton; and Flat Rock at Point Lookout during 2002 and 2003. These sites are situated along the south-east Queensland coast and are known to be key grey nurse shark aggregation sites. During the project, UniDive members were trained in mapping and survey techniques, including the identification of fish, invertebrates and substrate types. Training was conducted by experts from the University of Queensland (Centre for Marine Studies, Biophysical Remote Sensing) and the Queensland Parks and Wildlife Service who are also UniDive members. The monitoring methods are based upon the results of the 2002 UniDive Coastcare project, the internationally established Reef Check program, and research conducted by Biophysical Remote Sensing and the Centre for Marine Studies. Habitats were mapped using a combination of towed-GPS photo transects, aerial photography, bathymetry surveys and expert knowledge. These data provide georeferenced information on the major features of each of the sites mapped, including Wolf Rock.
Abstract:
This dissertation studies coding strategies for computational imaging that overcome the limitations of conventional sensing techniques. The information capacity of conventional sensing is limited by the physical properties of the optics, such as aperture size, detector pixels, quantum efficiency and sampling rate. These parameters determine the spatial, depth, spectral, temporal and polarization sensitivity of each imager, and increasing sensitivity in any one dimension can significantly compromise the others.
This research implements various coding strategies for optical multidimensional imaging and acoustic sensing in order to extend their sensing abilities. The proposed coding strategies combine hardware modification and signal processing to extract additional bandwidth and sensitivity from conventional sensors. We discuss the hardware architecture, compression strategies, sensing process modeling and reconstruction algorithm of each sensing system.
Optical multidimensional imaging measures three or more dimensions of information in the optical signal. Traditional multidimensional imagers acquire the extra dimensions at the cost of degraded temporal or spatial resolution. Compressive multidimensional imaging instead multiplexes the transverse spatial, spectral, temporal and polarization information onto a two-dimensional (2D) detector. The corresponding spectral, temporal and polarization coding strategies adapt optics, electronic devices and designed modulation techniques for multiplexed measurement. This computational imaging technique provides multispectral, temporal super-resolution and polarization imaging abilities with minimal loss in spatial resolution and noise level, while maintaining or even gaining temporal resolution. The experimental results show that appropriate coding strategies can increase sensing capacity by a factor of hundreds.
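The multiplexing principle common to these systems reduces to a coded linear measurement of a sparse signal followed by regularized inversion. Below is a minimal one-dimensional toy sketch using iterative soft thresholding (ISTA), with illustrative sizes; the real systems use physical coding optics rather than a random matrix.

```python
# Minimal sketch of compressive multiplexing: a coded measurement
# y = Phi x of a sparse signal x is inverted by iterative soft
# thresholding (ISTA). Sizes and the random operator are illustrative
# stand-ins for the physical coding elements.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

Phi = rng.normal(size=(m, n)) / np.sqrt(m) # coding/modulation operator
y = Phi @ x                                # multiplexed detector reading

# ISTA: gradient step on ||y - Phi x||^2 plus soft thresholding (l1 prior).
L = np.linalg.norm(Phi, 2) ** 2            # Lipschitz constant of the gradient
lam, x_hat = 0.01, np.zeros(n)
for _ in range(500):
    grad = Phi.T @ (Phi @ x_hat - y)
    z = x_hat - grad / L
    x_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```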
The human auditory system has an astonishing ability to localize, track and filter selected sound sources or information in a noisy environment. Accomplishing the same task by engineering means usually requires multiple detectors, advanced computational algorithms or artificial intelligence systems. Compressive acoustic sensing incorporates acoustic metamaterials into compressive sensing theory to emulate sound localization and selective attention. This research investigates and optimizes the sensing capacity and spatial sensitivity of the acoustic sensor. The well-modeled acoustic sensor can localize multiple speakers in both stationary and dynamic auditory scenes, and can distinguish mixed conversations from independent sources with a high audio recognition rate.
Abstract:
Geomorphic process units have been derived in order to allow quantification via GIS techniques at the catchment scale. Mass-movement rates based on existing field measurements are employed in the budget calculations. In the Kärkevagge catchment, northern Sweden, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Kärkevagge is approximately 680 t/a, whilst about 150 t/a are transported into the fluvial system.
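Read as a simple input-output balance (a hedged reading of the two figures above, not a calculation from the original study), the budget implies a net retention term on the slopes:

```latex
\[
\underbrace{680~\mathrm{t/a}}_{\text{slope budget}}
\;-\;
\underbrace{150~\mathrm{t/a}}_{\text{fluvial export}}
\;\approx\;
530~\mathrm{t/a} \quad \text{(implied net storage on the slopes)}
\]
```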
Abstract:
Big Data Analytics is an emerging field, as massive storage and computing capabilities have been made available by advanced e-infrastructures. The Earth and environmental sciences are likely to benefit from Big Data Analytics techniques supporting the processing of the large number of Earth Observation datasets currently acquired and generated through observations and simulations. However, Earth science data and applications have specific characteristics: the central relevance of geospatial information, wide heterogeneity of data models and formats, and complexity of processing. Big Earth Data Analytics therefore requires specifically tailored techniques and tools. The EarthServer Big Earth Data Analytics engine offers a solution for coverage-type datasets, built around a high-performance array database technology and the adoption and enhancement of standards for service interaction (OGC WCS and WCPS). The EarthServer solution, guided by requirements collected from scientific communities and international initiatives, provides a holistic approach ranging from query languages and scalability to mobile access and visualization. The result is demonstrated and validated through the development of lighthouse applications in the marine, geology, atmospheric, planetary and cryospheric science domains.
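To make the service-interaction layer concrete, the sketch below shows how a client might pose a WCPS (OGC Web Coverage Processing Service) query to an EarthServer-style endpoint over HTTP. The endpoint URL, coverage name and axis labels are hypothetical placeholders, and the exact key-value binding can differ between deployments.

```python
# Minimal sketch of posing a WCPS query over HTTP. The endpoint URL,
# coverage name and axis labels are hypothetical placeholders; the query
# computes the average of a coverage over a spatio-temporal subset.
import requests

ENDPOINT = "https://example.org/rasdaman/ows"  # hypothetical service URL

wcps_query = """
for $c in (OceanColour)
return avg($c[Lat(50:60), Long(-10:0), ansi("2014-01":"2014-12")])
"""

resp = requests.get(ENDPOINT, params={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
})
print(resp.text)  # scalar average returned as plain text
```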
Abstract:
We are in an era of unprecedented data volumes generated from observations and model simulations, particularly from satellite Earth Observation (EO) and global-scale oceanographic models. This presents an opportunity to evaluate the outputs of large-scale oceanographic models using EO data. Previous work on model skill evaluation has led to a plethora of metrics. This paper defines two new model skill evaluation metrics, based on the theory of universal multifractals, whose purpose is to measure the structural similarity between the model predictions and the EO data. The two metrics have the following advantages over standard techniques: (a) they are scale-free, and (b) they carry an important part of the information about how the model represents different oceanographic drivers. The two metrics are then used to evaluate the performance of the FVCOM model in the shelf seas around the south-west coast of the UK.
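The generic machinery behind such scale-free comparisons can be illustrated with structure functions, whose scaling exponents summarize how variability is organized across scales. The sketch below shows only that generic machinery, on toy one-dimensional transects; the exact metric definitions are those given in the paper.

```python
# Minimal sketch of a scale-free, structure-function-based comparison of
# model output against EO data. The inputs are 1-D transects; the metric
# shown here is illustrative, not the paper's exact definition.
import numpy as np

def zeta(field, qs=(1, 2, 3), lags=(1, 2, 4, 8, 16, 32)):
    """Fit the scaling exponents zeta(q) of the structure functions
    S_q(l) = <|f(x+l) - f(x)|^q> ~ l^zeta(q) by log-log regression."""
    out = []
    for q in qs:
        s = [np.mean(np.abs(field[l:] - field[:-l]) ** q) for l in lags]
        slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
        out.append(slope)
    return np.array(out)

def structural_distance(model, eo, qs=(1, 2, 3)):
    """Scale-free skill score: distance between the scaling-exponent curves
    of model output and EO data (smaller = more similar structure)."""
    return np.linalg.norm(zeta(model, qs) - zeta(eo, qs))

rng = np.random.default_rng(1)
model_transect = np.cumsum(rng.normal(size=4096))  # toy stand-ins for
eo_transect = np.cumsum(rng.normal(size=4096))     # model/EO transects
print(structural_distance(model_transect, eo_transect))
```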