952 results for Image processing.
Semantic discriminant mapping for classification and browsing of remote sensing textures and objects
Abstract:
Presented in this paper is an experimental study of the characteristics of the turbulence produced by rising air bubbles in water. Turbulent velocities were measured using a particle-streak visualization technique and computer image processing of the flow field. Examination of the turbulence features shows that the turbulence produced by rising bubbles can be approximately modeled as homogeneous turbulence, as in the case of grid turbulence in air.
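As a rough illustration of the kind of image processing involved, the sketch below converts a single particle-streak image into velocity magnitudes by labelling bright streaks and dividing their lengths by the exposure time. It is a minimal, assumption-laden example (the threshold, the bounding-box length estimate, and the function names are illustrative), not the authors' actual analysis pipeline.

```python
import numpy as np
from scipy import ndimage

def streak_velocities(image, exposure_s, m_per_pixel, threshold=0.5):
    """Estimate velocity magnitudes from one particle-streak image.

    Each bright streak is assumed to be the path traced by a tracer
    particle during the exposure; its length divided by the exposure
    time gives an approximate velocity magnitude.
    """
    # Binarise the image (the threshold factor is an illustrative choice).
    binary = image > threshold * image.max()
    # Label connected streaks.
    labels, _ = ndimage.label(binary)
    velocities = []
    for sl in ndimage.find_objects(labels):
        # Use the bounding-box diagonal as a crude streak-length estimate.
        dy = sl[0].stop - sl[0].start
        dx = sl[1].stop - sl[1].start
        length_m = np.hypot(dy, dx) * m_per_pixel
        velocities.append(length_m / exposure_s)
    return np.array(velocities)
```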
Abstract:
In this paper, real-time deformation fields are observed in two kinds of hole-excavated dog-bone samples loaded by an SHTB: a single-hole sample and a dual-hole sample, both with an aperture size of 0.8 mm. The testing system consists of a high-speed camera, a He-Ne laser, a frame grabber, and a synchronization device with a control accuracy of 1 µs. Both the expansion of the single hole and the interaction of the two holes are recorded at a time interval of 10 µs. The images observed on the sample surface are analyzed by newly developed software based on digital correlation theory and a modified image processing method. The in-plane 2-D displacement fields are obtained with a resolution of 50 µm and an accuracy of 0.5 µm. The experimental results are validated by comparison with FEM numerical simulations.
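The displacement fields in this abstract come from software based on digital correlation theory. The sketch below shows the core idea of subset-based digital image correlation in its simplest form: match a small subset of the reference image against the deformed image by zero-normalised cross-correlation and report the integer-pixel shift that maximises the score. The subset size, search range, and lack of sub-pixel refinement are simplifications relative to the authors' software.

```python
import numpy as np

def subset_displacement(ref, cur, y, x, half=15, search=10):
    """Integer-pixel displacement of one subset centred at (y, x),
    found by zero-normalised cross-correlation against the deformed
    image. The caller must keep the subset and search window inside
    both images; this is a simplified stand-in for full DIC software."""
    f = ref[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    f = (f - f.mean()) / (f.std() + 1e-12)
    best, best_uv = -np.inf, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            g = cur[y + v - half:y + v + half + 1,
                    x + u - half:x + u + half + 1].astype(float)
            g = (g - g.mean()) / (g.std() + 1e-12)
            score = (f * g).sum()          # correlation score for this shift
            if score > best:
                best, best_uv = score, (u, v)
    return best_uv  # (u, v) displacement in pixels
```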
Abstract:
An optical diagnostic system consisting of a Michelson interferometer with an image processor has been developed to study the kinetics of thermocapillary convection and buoyancy convection. The interferometer has been used to observe and measure the surface deformation and surface waves of capillary and buoyancy convection in a rectangular cavity whose sidewalls are held at different temperatures. Fourier transformation is used for the image processing, and quantitative results for the surface deformation and surface waves are calculated from the interference fringe patterns. With increasing temperature gradient, the liquid surface tilts gradually; its deformation, which is directly related to the temperature gradient, has been calculated and is one of the signatures of the onset of convection. Another interesting phenomenon is the direction of the tilt, which differs between thin and thick liquid layers. When the liquid layer is thin, the convection is controlled mainly by the thermocapillary effect; when the liquid layer is thick, it is controlled mainly by buoyancy. The surface deformation in the present experiment becomes progressively more inclined in this process. The experiment shows that surface deformation appears before surface waves in the convecting fluid, that it is related to the temperature gradient and the height of the liquid layer, and that it depends on both capillary and buoyancy convection. It also demonstrates that the amplitude of the surface waves of thermocapillary-buoyancy convection is much smaller than the surface deformation, so the waves are masked by the deformation.
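The abstract states that Fourier transformation is used for the image processing; a standard way to do this for interference fringes is the Fourier-transform (carrier-fringe) method: isolate one carrier lobe in the 2-D spectrum, shift it to the origin, and take the phase of the inverse transform. The sketch below is a generic version of that method assuming a carrier along the x axis; it is not the authors' code, and the carrier offset is a hypothetical input.

```python
import numpy as np

def fringe_phase(fringes, carrier_col):
    """Recover the wrapped phase of a fringe pattern by the
    Fourier-transform method: keep the +1 carrier lobe of the
    spectrum, move it to DC, and take the argument of the inverse FFT."""
    F = np.fft.fftshift(np.fft.fft2(fringes))
    ny, nx = fringes.shape
    # Window around the carrier peak (carrier assumed along x).
    mask = np.zeros_like(F)
    cx = nx // 2 + carrier_col
    half = max(carrier_col // 2, 1)
    mask[:, cx - half:cx + half + 1] = 1.0
    side = F * mask
    # Shift the carrier lobe to the origin, then inverse transform.
    side = np.roll(side, -carrier_col, axis=1)
    phase = np.angle(np.fft.ifft2(np.fft.ifftshift(side)))
    return phase  # wrapped phase, proportional to optical path change
```

The wrapped phase can then be unwrapped (for example with np.unwrap along each row) and, for a double-pass Michelson geometry, converted to surface height via Δh = λ·Δφ/(4π).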
Abstract:
This work investigates molecular interactions between antigen and antibody with the ellipsometric imaging technique and demonstrates some of the features and possibilities offered by the technique. Molecular interaction is of central interest to molecular biologists and immunologists, who have used established methods such as immunofluorescence, radioimmunoassay, and surface plasmon resonance to study it; at the same time, experimentalists hope to use newer techniques that give more direct visual results. Ellipsometric imaging is non-destructive and exhibits a high sensitivity to phase transitions within thin layers. It can image local variations in optical properties, such as thickness, caused by differences in the surface concentration of molecules or by different deposited molecules. If a molecular monolayer with biological activity (such as an antigen) is deposited on a surface to form a sensing surface and then incubated in a solution containing other molecules (such as antibodies), the change in layer thickness when the molecules on the sensing surface react with those in the solution can be observed by ellipsometric imaging. Every point on the surface is measured simultaneously, with a sensitivity high enough to distinguish the monolayer from the molecular complexes. Ellipsometric imaging is based on conventional ellipsometry with a charge-coupled device (CCD) as the detector, and images are captured by computer using image processing techniques. It has the advantages of high sensitivity to thickness variation (resolution of the order of angstroms), a large field of view (of the order of square centimetres), high sampling speed (a picture taken within one second), and high lateral resolution (of the order of micrometres). Only one application, the study of antigen-antibody interaction, is shown here, but it is possible to observe molecular interaction processes with this in-situ technique.
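As a loose illustration of how ellipsometric images taken before and after incubation might be turned into a map of layer-thickness change, the sketch below assumes a locally linear relation between grey level and thickness; the calibration slope and the threshold in the usage comment are hypothetical values, not numbers from the work described.

```python
import numpy as np

def thickness_change_map(before, after, nm_per_grey):
    """Map the local layer-thickness change between two ellipsometric
    images taken before and after incubation, assuming a locally linear
    relation between image grey level and layer thickness (nm_per_grey
    is a hypothetical calibration slope, not a value from the paper)."""
    delta = after.astype(float) - before.astype(float)
    return delta * nm_per_grey  # thickness change in nanometres

# Illustrative usage: regions where antibody has bound to the antigen
# layer would appear as a thickness increase above the noise floor.
# bound = thickness_change_map(img0, img1, 0.05) > 0.3  # nm, illustrative
```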
Abstract:
The DADAISM project brings together researchers from the diverse fields of archaeology, human-computer interaction, image processing, image search and retrieval, and text mining to create a rich interactive system that addresses the problem researchers face in finding images relevant to their work. In the age of digital photography, thousands of images are taken of archaeological artefacts. These images could help archaeologists enormously in their tasks of classification and identification if they could be related to one another effectively, and they would yield many new insights on a range of archaeological problems. However, these images are currently greatly underutilized for two key reasons. Firstly, the current paradigm for interaction with image collections is basic keyword search or, at best, simple faceted search. Secondly, even where such interactions are possible, the metadata for the majority of images of archaeological artefacts contains little information about the content of the image or the nature of the artefact, and such information is time-intensive to enter manually. DADAISM will transform the way in which archaeologists interact with online image collections. It will deploy user-centred design methodologies to create an interactive system that goes well beyond current systems for working with images, and will support archaeologists' tasks of finding, organising, relating and labelling images as well as other relevant sources of information such as grey-literature documents.
Abstract:
EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land-cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g. remote sensor systems, global positioning systems, image processing algorithms) as they become available. Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional (1:500,000) scales for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat, including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes the C-CAP protocols and procedures to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
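One family of change-detection techniques relevant to this kind of land-cover monitoring is post-classification comparison. As a minimal sketch, the snippet below cross-tabulates two co-registered classified rasters into a from-to change matrix; the function name and the assumption that class labels are integers in [0, n_classes) are illustrative, and C-CAP's actual protocols are considerably more involved.

```python
import numpy as np

def change_matrix(class_t1, class_t2, n_classes):
    """Cross-tabulate two co-registered classified rasters (land cover
    at two dates) into an n_classes x n_classes change matrix, the
    basic step of post-classification change detection. Class labels
    are assumed to be integers in the range [0, n_classes)."""
    t1 = class_t1.ravel()
    t2 = class_t2.ravel()
    matrix = np.zeros((n_classes, n_classes), dtype=np.int64)
    # Rows index the date-1 class, columns the date-2 class.
    np.add.at(matrix, (t1, t2), 1)
    return matrix
```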
Abstract:
For 10 years the Institute for Fishing Technology, Hamburg (IFH) has been carrying out experiments in the brown shrimp fishery with beam trawls aimed at reducing unwanted bycatches. When the tests were transferred to commercial fishery conditions, the personnel effort and costs increased markedly. It became necessary, for example, to install a deep-freeze chain so that more samples could be evaluated in the laboratory. This in turn required more technicians for measuring the fish and shrimp samples, but also made it necessary to perform this work in the most rational and time-saving way by applying modern electronic aids. Although all samples still have to be sorted by species, weighed, and measured, with the introduction of electronic aids such as an electronic measuring board and a computer-aided image processing system all weight and length data are recorded digitally immediately after processing. They are transferred via a network to a server PC which stores them in a purpose-designed database. This article describes the application of two electronic systems: the measuring board (FM 100, Fa. SCANTROL), initiated by a project at the Norwegian Institute for Fishing Technology, and a computer-aided image processing system for measuring shrimps in their naturally flexed shape, developed in the Institute for Fishing Technology in close collaboration with the University of Duisburg. These electronic recording systems allow consistent and reproducible recording of data, independent of the changing day-to-day form of the staff operating them. With the help of these systems, the number of measurements the laboratory can perform was raised to 250,000 per year. This made it possible to evaluate, in 1999, 525 catch samples from 75 commercial hauls taken during 15 days at sea. The time gain in measuring the samples is about one third of the time previously needed (i.e. one hour per sample). An additional advantage is the immediate availability of the digitally stored data, which enables rapid analysis of all finished sub-experiments. Both systems are applied today in several institutes of the Federal Research Centre. The image processing system is now the standard measuring method in an international research project.
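The image processing system described above measures shrimps in their naturally flexed shape. A generic way to approximate a curved body length from a binary segmentation mask is to skeletonize the mask and measure its centreline; the sketch below does exactly that and is only an illustration of the idea, not the IFH/University of Duisburg system. The mask source, pixel scale, and function name are assumptions.

```python
import numpy as np
from skimage.morphology import skeletonize

def flexed_length_mm(mask, mm_per_pixel):
    """Approximate the flexed body length of a shrimp from a binary
    segmentation mask: reduce the mask to its one-pixel-wide
    centreline and sum the lengths of the links between neighbouring
    centreline pixels."""
    skel = skeletonize(mask.astype(bool))
    ys, xs = np.nonzero(skel)
    coords = set(zip(ys.tolist(), xs.tolist()))
    length_px = 0.0
    for y, x in coords:
        # Only "forward" offsets, so each undirected link is counted once.
        for dy, dx in ((0, 1), (1, 0), (1, 1), (1, -1)):
            if (y + dy, x + dx) in coords:
                length_px += np.hypot(dy, dx)
    return length_px * mm_per_pixel
```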