733 results for HABITAT CLASSIFICATION SYSTEM (HCS)
Abstract:
The large number of spectral bands in hyperspectral images increases the ability to distinguish between various physical structures, but it also burdens processing with high data dimensionality. Hence, hyperspectral images are processed in two stages: dimensionality reduction followed by unsupervised classification. The dimensionality of the data is reduced using Principal Component Analysis (PCA), and the selected components are classified using the Niche Hierarchical Artificial Immune System (NHAIS). NHAIS combines a splitting method, which searches for optimal cluster centers using a niching procedure, with a merging method that groups data points based on majority voting. Results are presented for two hyperspectral images, the EO-1 Hyperion image and the Indian Pines image. A performance comparison of the proposed hierarchical clustering algorithm with three earlier unsupervised algorithms is presented. The results show that NHAIS is efficient.
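The two-stage pipeline described above (PCA for dimensionality reduction, then unsupervised classification) can be illustrated with a minimal sketch. K-means stands in for NHAIS, which has no public implementation, and the hyperspectral cube below is a random stand-in for a scene such as Indian Pines; all shapes and parameter values are assumptions for illustration only.

```python
# Minimal sketch of the two-stage pipeline: PCA for dimensionality reduction
# followed by an unsupervised clustering step. K-means is a stand-in for the
# NHAIS clusterer described in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical hyperspectral cube: rows x cols x spectral bands.
cube = np.random.rand(145, 145, 200)          # random stand-in for a real scene
pixels = cube.reshape(-1, cube.shape[-1])     # one spectrum per pixel

# Stage 1: reduce the spectral dimension while keeping most variance.
pca = PCA(n_components=10)
reduced = pca.fit_transform(pixels)

# Stage 2: unsupervised classification of the reduced pixels.
labels = KMeans(n_clusters=16, n_init=10, random_state=0).fit_predict(reduced)
cluster_map = labels.reshape(cube.shape[:2])  # per-pixel cluster map
print(cluster_map.shape, np.unique(labels))
```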
Abstract:
Clock synchronization in wireless sensor networks (WSNs) ensures that sensor nodes share the same reference clock time. This is necessary not only for various WSN applications but also for many system-level protocols, such as MAC protocols and protocols for sleep scheduling of sensor nodes. The clock value of a node at a particular instant depends on its initial value and on the frequency of the crystal oscillator used in the node. This frequency varies from node to node and may also change over time depending on factors such as temperature and humidity. As a result, the clock values of different sensor nodes diverge from each other and from the real time clock, and hence clock synchronization is required in WSNs. Consequently, many clock synchronization protocols for WSNs have been proposed in the recent past. These protocols differ considerably from one another, so there is a need to understand them on a common platform. Towards this goal, this survey categorizes the features of clock synchronization protocols for WSNs into three types: structural features, technical features, and global objective features. Each of these categories has different options that further segregate the features for better understanding. The features used in this survey include all features used in existing surveys as well as new ones, such as how the clock value is propagated, when the clock value is propagated, and when the physical clock is updated, which are required for a systematic understanding of clock synchronization protocols in WSNs. The paper also gives a brief description of a few basic clock synchronization protocols for WSNs and shows how they fit into the above classification criteria. In addition, recent clock synchronization protocols for WSNs that build on these basic protocols are presented alongside the corresponding basic protocols. Indeed, the proposed model for characterizing clock synchronization protocols in WSNs can be used not only for analyzing existing protocols but also for designing new clock synchronization protocols. (C) 2014 Elsevier B.V. All rights reserved.
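A minimal sketch of the linear clock model implied above, C(t) = offset + skew * t, together with the classic two-way message exchange that many WSN synchronization protocols use to estimate pairwise offset. All numeric values (skew, offset, delay) are assumed for illustration and are not drawn from any specific protocol in the survey.

```python
# Illustrative sketch (not from the paper): a node's hardware clock as a
# linear function of real time, plus the two-way request/reply exchange used
# by many synchronization protocols to estimate the offset between two nodes.
def local_clock(t, offset, skew):
    """Hardware clock reading of a node at real time t."""
    return offset + skew * t

def two_way_offset(t1, t2, t3, t4):
    """Offset estimate from timestamps of a request/reply exchange
    (t1, t4 read on node A; t2, t3 read on node B)."""
    return ((t2 - t1) + (t3 - t4)) / 2.0

# Assumed scenario: node B runs 50 ppm fast and starts 2 ms ahead of node A.
t = 100.0                      # real time of the exchange, seconds
delay = 0.001                  # symmetric propagation delay, assumed
t1 = local_clock(t,             0.0,   1.0)
t2 = local_clock(t + delay,     0.002, 1.00005)
t3 = local_clock(t + delay,     0.002, 1.00005)
t4 = local_clock(t + 2 * delay, 0.0,   1.0)
print(f"estimated offset of B relative to A: {two_way_offset(t1, t2, t3, t4) * 1000:.3f} ms")
```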
Abstract:
Selection of relevant features is an open problem in brain-computer interface (BCI) research. Features extracted from brain signals are sometimes high dimensional, which in turn affects the accuracy of the classifier. Selecting the most relevant features improves the performance of the classifier and reduces the computational cost of the system. In this study, we have used a combination of Bacterial Foraging Optimization and Learning Automata to determine the best subset of features from a given motor imagery electroencephalography (EEG)-based BCI dataset. We have employed the Discrete Wavelet Transform to obtain a high-dimensional feature set and classified it with the Distance Likelihood Ratio Test. Our proposed feature selector produced an accuracy of 80.291% in 216 seconds.
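As a rough sketch of this kind of pipeline, the snippet below extracts simple DWT sub-band statistics with pywt and applies a greedy sequential selector as a stand-in for the Bacterial Foraging Optimization + Learning Automata search and the Distance Likelihood Ratio Test classifier, neither of which is reproduced here. The EEG trials are synthetic and all shapes are assumptions.

```python
# Sketch of DWT-based feature extraction plus a simple feature-subset search.
# A greedy sequential selector stands in for the BFO + Learning Automata method.
import numpy as np
import pywt
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

def dwt_features(trial, wavelet="db4", level=4):
    """Concatenate simple statistics of the DWT sub-bands of one EEG channel."""
    coeffs = pywt.wavedec(trial, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std)])

# Hypothetical motor-imagery trials: 100 trials x 512 samples, binary labels.
rng = np.random.default_rng(0)
X = np.vstack([dwt_features(rng.standard_normal(512)) for _ in range(100)])
y = rng.integers(0, 2, size=100)

# Stand-in selector: keep the 5 features that best support a simple classifier.
selector = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                     n_features_to_select=5).fit(X, y)
print("selected feature indices:", np.flatnonzero(selector.get_support()))
```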
Abstract:
In this paper, we consider applying a derived knowledge base regarding the sensitivity and specificity of the damage(s) to be detected by an SHM system that is being designed and qualified. These efforts are necessary for developing the capability of an SHM system to reliably classify various probable damage types through a sequence of monitoring steps, i.e., damage precursor identification, damage detection, and monitoring of damage progression. We consider the particular problem of visual and ultrasonic NDE-based SHM system design requirements, where damage detection sensitivity and specificity data definitions are established for a class of structural components. Methodologies for creating SHM system specifications are discussed in detail. Examples illustrate how the physics of a damage detection scheme limits the sensitivity and specificity for particular damage types, and how this information can be used in algorithms that combine different NDE schemes in an SHM system to enhance efficiency and effectiveness. Statistical and data-driven models to determine the sensitivity and probability of damage detection (POD) are demonstrated for a plate with a varying one-sided line crack using optical and ultrasonic inspection techniques.
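The probability-of-detection idea can be made concrete with a standard log-normal POD curve of the kind commonly fitted in NDE reliability studies; the functional form is conventional, but the parameter values below are assumed and are not taken from this paper.

```python
# Hedged illustration: a log-normal POD curve, POD(a) = Phi((ln a - mu) / sigma),
# a widely used model for detection probability versus crack length in NDE.
# The mu and sigma values are assumed, not fitted to the paper's data.
import numpy as np
from scipy.stats import norm

def pod(a, mu=np.log(2.0), sigma=0.5):
    """Probability of detecting a crack of length a (mm)."""
    return norm.cdf((np.log(a) - mu) / sigma)

crack_lengths = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # mm, illustrative
for a, p in zip(crack_lengths, pod(crack_lengths)):
    print(f"a = {a:4.1f} mm -> POD = {p:.2f}")

# a90: crack length detected with 90% probability under this assumed model.
a90 = np.exp(np.log(2.0) + 0.5 * norm.ppf(0.9))
print(f"a90 is about {a90:.2f} mm")
```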
Abstract:
Based on a computer-integrated, flexible laser processing system, an intelligent measuring sub-system was developed. A novel model was built to compensate for deviations of the main frame structure, and a new 3-D laser tracker system is applied to adjust the accuracy of the system. To analyze the characteristics of the various surfaces of automobile outer panel moulds and dies, a classification of the surface, brim, and ridge (or valley) areas to be measured and processed was established, forming one of the main processing functions of the laser processing system. According to the type of surface, a 2-D adaptive measuring method based on Bézier curves was developed; furthermore, a 3-D adaptive measuring method based on spline curves was also developed. According to the characteristics of laser materials processing and of the data, methods were developed to generate processing tracks; they are explained in detail. Measuring experiments and laser processing experiments were carried out to verify the above methods, which have been applied in the computer-integrated, flexible laser processing system developed by the Institute of Mechanics, CAS.
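A simplified sketch of the Bézier machinery underlying the 2-D adaptive measuring method: de Casteljau evaluation of a cubic segment through assumed control points. The adaptive point-spacing logic itself is not described in the abstract and is omitted; the control points and units below are hypothetical.

```python
# De Casteljau evaluation of a Bézier curve, the geometric primitive behind
# the 2-D adaptive measuring method sketched above.
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bézier curve at parameter t in [0, 1]."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Hypothetical control points for one measured surface profile (mm).
ctrl = [(0.0, 0.0), (10.0, 8.0), (25.0, 9.0), (40.0, 2.0)]
samples = np.array([de_casteljau(ctrl, t) for t in np.linspace(0.0, 1.0, 9)])
print(samples.round(2))
```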
Abstract:
Without knowledge of basic seafloor characteristics, the ability to address any number of critical marine and/or coastal management issues is diminished. For example, management and conservation of essential fish habitat (EFH), a requirement mandated by federally guided fishery management plans (FMPs), requires among other things a description of habitats for federally managed species. Although the attributes important to habitat are numerous, the ability to efficiently and effectively describe many of them, especially at the scales required, does not exist with the tools currently available. However, several characteristics of seafloor morphology are readily obtainable at multiple scales and can serve as useful descriptors of habitat. Recent advancements in acoustic technology, such as multibeam echosounding (MBES), can provide remote indication of surficial sediment properties such as texture, hardness, or roughness, and further permit highly detailed renderings of seafloor morphology. With acoustic-based surveys providing a relatively efficient method for data acquisition, there exists a need for efficient and reproducible automated segmentation routines to process the data. Using MBES data collected by the Olympic Coast National Marine Sanctuary (OCNMS), and through a contracted seafloor survey, we expanded on the techniques of Cutter et al. (2003) to describe an objective, repeatable process that uses parameterized local Fourier histogram (LFH) texture features to automate segmentation of surficial sediments from acoustic imagery using a maximum likelihood decision rule. Sonar signatures and classification performance were evaluated using video imagery obtained from a towed camera sled. Segmented raster images were converted to polygon features and attributed using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999) for use in a geographical information system (GIS). (PDF contains 41 pages.)
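The maximum likelihood decision rule can be sketched as a class-conditional Gaussian classifier over texture feature vectors. The feature dimensions, class names, and class statistics below are invented for illustration and do not come from the OCNMS survey data.

```python
# Schematic maximum likelihood decision rule: assign each feature vector to
# the sediment class whose Gaussian likelihood is highest. All statistics
# below are hypothetical stand-ins for values estimated from training pixels.
import numpy as np
from scipy.stats import multivariate_normal

# Per-class (mean, covariance) of assumed LFH-style texture features.
classes = {
    "sand":   (np.array([0.2, 1.1]), np.diag([0.02, 0.05])),
    "gravel": (np.array([0.6, 0.7]), np.diag([0.03, 0.04])),
    "rock":   (np.array([0.9, 0.3]), np.diag([0.04, 0.06])),
}

def ml_classify(feature_vector):
    """Return the class with the highest Gaussian log-likelihood."""
    return max(classes,
               key=lambda c: multivariate_normal.logpdf(feature_vector, *classes[c]))

print(ml_classify(np.array([0.55, 0.75])))   # -> "gravel" under these assumed stats
```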
Abstract:
Functional linkage between reef habitat quality and fish growth and production has remained elusive. Most current research focuses on correlative relationships between a general habitat type and the presence/absence of a species, an index of species abundance, or species diversity. Such descriptive information largely ignores how reef attributes regulate reef fish abundance (density-dependent habitat selection), trophic interactions, and physiological performance (growth and condition). To determine the functional relationship between habitat quality, fish abundance, trophic interactions, and physiological performance, we are using an experimental reef system in the northeastern Gulf of Mexico where we apply advanced sensor and biochemical technologies. Our study site controls for reef attributes (size, cavity space, and reef mosaics) and focuses on the processes that regulate gag grouper (Mycteroperca microlepis) abundance, behavior and performance (growth and condition), and the availability of their pelagic prey. We combine mobile and fixed-active (fisheries) acoustics, passive acoustics, video cameras, and advanced biochemical techniques. Fisheries acoustics quantifies the abundance of pelagic prey fishes associated with the reefs and their behavior. Passive acoustics and video allow direct observation of gag and prey fish behavior and the acoustic environment, and provide a direct visual reference for the interpretation of the fixed fisheries acoustics measurements. New applications of biochemical techniques, such as the Electron Transport System (ETS) assay, allow the in situ measurement of the metabolic expenditure of gag and relate it back to reef attributes, gag behavior, and prey fish availability. Here, we provide an overview of our integrated technological approach for understanding and quantifying the functional relationship between reef habitat quality and one element of production – gag grouper growth on shallow coastal reefs.
Abstract:
EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land-cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g., remote sensor systems, global positioning systems, image processing algorithms) as they become available. Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional scales (1:500,000) for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
Abstract:
Humans are capable of distinguishing more than 5000 visual categories even in complex environments, using a variety of different visual systems all working in tandem. We seem to be capable of distinguishing thousands of different odors as well. In the machine learning community, many commonly used multi-class classifiers do not scale well to such large numbers of categories. This thesis demonstrates a method of automatically creating application-specific taxonomies to aid in scaling classification algorithms to more than 100 categories using both visual and olfactory data. The visual data consists of images collected online and pollen slides scanned under a microscope. The olfactory data was acquired by constructing a small portable sniffing apparatus which draws air over 10 carbon black polymer composite sensors. We investigate performance when classifying 256 visual categories, 8 or more species of pollen, and 130 olfactory categories sampled from common household items and a standardized scratch-and-sniff test. Taxonomies are employed in a divide-and-conquer classification framework which improves classification time while allowing the end user to trade performance for specificity as needed. Before classification can even take place, the pollen counter and electronic nose must filter out a high volume of background “clutter” to detect the categories of interest. In the case of pollen this is done with an efficient cascade of classifiers that rule out most non-pollen before invoking slower multi-class classifiers. In the case of the electronic nose, much of the extraneous noise encountered in outdoor environments can be filtered using a sniffing strategy which preferentially samples the sensor response at frequencies that are relatively immune to background contributions from ambient water vapor. This combination of efficient background rejection with scalable classification algorithms is tested in detail for three separate projects: 1) the Caltech-256 Image Dataset, 2) the Caltech Automated Pollen Identification and Counting System (CAPICS), and 3) a portable electronic nose specially constructed for outdoor use.
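A toy sketch of the divide-and-conquer idea: a coarse classifier first selects a branch of a (here hard-coded) two-level taxonomy, and a per-branch classifier then resolves the leaf category. In the thesis the taxonomies are built automatically and the inputs are images and sensor responses rather than the synthetic features used below.

```python
# Taxonomy-based divide-and-conquer classification: coarse branch prediction
# followed by a per-branch fine classifier. Data and taxonomy are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((400, 16))
fine_labels = rng.integers(0, 8, size=400)            # 8 leaf categories
taxonomy = {leaf: leaf // 4 for leaf in range(8)}      # 2 branches of 4 leaves each
coarse_labels = np.array([taxonomy[l] for l in fine_labels])

# Stage 1: coarse classifier over taxonomy branches.
coarse_clf = LogisticRegression(max_iter=1000).fit(X, coarse_labels)

# Stage 2: one fine classifier per branch, trained only on that branch's data.
fine_clfs = {b: LogisticRegression(max_iter=1000).fit(X[coarse_labels == b],
                                                      fine_labels[coarse_labels == b])
             for b in set(taxonomy.values())}

def predict(x):
    branch = coarse_clf.predict(x.reshape(1, -1))[0]
    return fine_clfs[branch].predict(x.reshape(1, -1))[0]

print(predict(X[0]), fine_labels[0])
```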
Abstract:
Observational studies of our solar system's small-body populations (asteroids and comets) offer insight into the history of our planetary system, as these minor planets represent the left-over building blocks from its formation. The Palomar Transient Factory (PTF) survey began in 2009 as the latest wide-field sky-survey program to be conducted on the 1.2-meter Samuel Oschin telescope at Palomar Observatory. Though its main science program has been the discovery of high-energy extragalactic sources (such as supernovae), during its first five years PTF has collected nearly five million observations of over half a million unique solar system small bodies. This thesis begins to analyze this vast data set to address key population-level science topics, including: the detection rates of rare main-belt comets and small near-Earth asteroids, the spin and shape properties of asteroids as inferred from their lightcurves, the applicability of this visible light data to the interpretation of ultraviolet asteroid observations, and a comparison of the physical properties of main-belt and Jovian Trojan asteroids. Future sky-surveys would benefit from application of the analytical techniques presented herein, which include novel modeling methods and unique applications of machine-learning classification. The PTF asteroid small-body data produced in the course of this thesis work should remain a fertile source of solar system science and discovery for years to come.
Abstract:
In the Ukraine there are several thousand large, medium, and small lakes and lake-like reservoirs, distinguished by origin, salinity, regional position, and productivity, as well as a significant number of constructed water bodies, both large and small, including ponds and industrial reservoirs of various designations. Developing a national system necessitates the creation of specific schemes and classifications. Classifying reservoirs into specific types by means of suitable specifications is required for planning national measures aimed at the rational utilisation of natural resources. It is now necessary to consider the present-day characteristics of Ukrainian lakes. In the case of the Ukraine it is possible to use two approaches - genetic and ecological. This paper uses the genetic system to classify the lake-like water bodies of the Ukraine.
Abstract:
This article describes the progress of the River Communities Project, which commenced in 1977. The project aimed to develop a sensitive and practical system of river site classification using macroinvertebrates as an objective means of appraising the status of British rivers. The relationships between the physical and chemical features of sites and their biological communities were examined. Sampling was undertaken on 41 British rivers. Ordination techniques were used to analyze the data, and the sites were classified into 16 groups using multiple discriminant analysis. The potential for using the environmental data to predict to which group a site belongs, and the fauna likely to be present, was investigated.
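The prediction step described above can be sketched with a discriminant analysis model that assigns a river site to one of the groups from its environmental variables. The variable names, data shapes, and values below are invented and only illustrate the idea; the project's own classification and data are not reproduced.

```python
# Sketch: predict a site's group from environmental variables with
# linear discriminant analysis. All data here are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
env = rng.standard_normal((410, 6))    # hypothetical variables, e.g. altitude, slope, width, depth, alkalinity, distance from source
group = rng.integers(0, 16, size=410)  # 16 site groups from the biological classification

lda = LinearDiscriminantAnalysis().fit(env, group)
new_site = rng.standard_normal((1, 6))
print("predicted site group:", lda.predict(new_site)[0])
```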
Abstract:
Foraging habitat selection of nesting Great Egrets (Ardea alba) and Snowy Egrets (Egretta thula) was investigated within an estuary with extensive impounded salt marsh habitat. Using a geographic information system, available habitat was partitioned into concentric bands at five, ten, and 15 km radius from nesting colonies to assess the relative effects of habitat composition and distance on habitat selection. Snowy Egrets were more likely than Great Egrets to depart colonies and travel to foraging sites in groups, but both species usually arrived at sites that were occupied by other wading birds. Mean flight distances were 6.2 km (SE = 0.4, N = 28, range 1.8-10.7 km) for Great Egrets and 4.7 km (SE = 0.48, N = 31, range 0.7-12.5 km) for Snowy Egrets. At the broadest spatial scale, both species used impounded (mostly salt marsh) and estuarine edge habitat more than expected based on availability while avoiding unimpounded (mostly freshwater wetland) habitat. At more local scales, habitat use matched availability. Interpretation of habitat preference differed with the types of habitat that were included and the maximum distance at which habitat was considered available. These results illustrate that caution is needed when interpreting the results of habitat preference studies when individuals are constrained in their choice of habitats, such as for central place foragers.
Abstract:
The River Douglas has a long industrial heritage, beginning in the early 18th century with its use by boats carrying goods between Wigan and Tarleton. The river and its tributaries have also historically been, and to a certain extent still are, subject to polluting inputs from the urban, agricultural, and industrialised areas located within its catchment. During the early stages of producing the River Douglas Catchment Management Plan, it became apparent that very little data existed on the populations of coarse and salmonid fish species within the River Douglas system. The data that did exist was largely anecdotal, consisting of catch reports from anglers or water bailiffs, or of dead and distressed fish following pollution incidents. This study was initiated to assess the status of coarse and salmonid fish species within the River Douglas system and so address the lack of knowledge. Eighty-two sites were surveyed by electric fishing, including 14 sites using an electric fishing punt and up to four anodes. The data was analysed according to a new National Fisheries Classification Scheme, which classified the sites by the fish stocks present and compared the results with a database containing information from sites around the country that have similar habitat types. A stocking experiment was also undertaken in the River Lostock using chub reared at the Leyland Hatchery. These were marked with an identifiable blue spot in the spring of 1995 and then released into three previously surveyed locations in the river. These sites were then resurveyed during the summer stock assessment. This report also includes Site Reports with details on monitored sites, habitat features, and fishery classification.
Abstract:
The main contribution of this work is to analyze and describe the state-of-the-art performance of answer scoring systems from the SemEval-2013 task, and to continue the development of an answer scoring system (EHU-ALM) developed at the University of the Basque Country. Overall, this master's thesis focuses on finding a configuration that improves the results on the SemEval dataset by using attribute engineering techniques to find optimal feature subsets, and on trying different hierarchical configurations in order to analyze their performance against the traditional one-versus-all approach. Altogether, throughout the work we propose two alternative strategies: on the one hand, to improve the EHU-ALM system without changing the architecture, and, on the other hand, to improve the system by adapting it to a hierarchical configuration. To build such new models we describe and use distinct attribute engineering, data preprocessing, and machine learning techniques.
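As a hedged sketch of the first strategy, the snippet below contrasts a flat one-versus-all classifier with the same base learner preceded by an attribute-subset selection step; the hierarchical variant would follow the same pattern as a coarse-to-fine cascade. The data are synthetic, and the real EHU-ALM attributes and learners are not reproduced here.

```python
# Flat one-versus-all classification with and without a feature-subset step,
# on synthetic stand-in data for the answer-scoring task.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 50))     # candidate attributes (hypothetical)
y = rng.integers(0, 5, size=300)       # 5 answer-score labels (hypothetical)

flat = OneVsRestClassifier(LinearSVC(max_iter=5000)).fit(X, y)
with_selection = make_pipeline(SelectKBest(f_classif, k=20),
                               OneVsRestClassifier(LinearSVC(max_iter=5000))).fit(X, y)
print(flat.score(X, y), with_selection.score(X, y))
```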