954 results for Change detection
Abstract:
This thesis presents a dissertation on the wide range of modern dense matching algorithms, which are spreading across different application and research fields, with particular attention to the innovative “Semi-Global” matching techniques. The choice to develop a semi-global numerical code was justified by the need to gain insight into the variables and strategies that affect the algorithm's performance, with the primary objective of maximizing the method's accuracy and efficiency and the completeness of its results. The dissertation consists of the metrological characterization of our proprietary implementation of the semi-global matching algorithm, evaluating the influence of several matching variables and functions implemented in the process and comparing the accuracy and completeness of different results (digital surface models, disparity maps and 2D displacement fields) obtained using our code and other commercial and open-source matching programs in a wide variety of application fields.
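For context on the technique characterized in this thesis, below is a minimal sketch of semi-global cost aggregation along a single path direction, in the style of Hirschmüller's original formulation. It is not the proprietary code described in the abstract; the cost volume, the penalty values P1 and P2, and the array sizes are placeholder assumptions.

```python
import numpy as np

def aggregate_left_to_right(cost_volume, P1=10, P2=120):
    """Semi-global cost aggregation along one path direction (left to right).

    cost_volume: array of shape (H, W, D) with per-pixel matching costs
    (e.g. census- or mutual-information-based). Returns the aggregated
    costs L_r for this single direction; a full SGM implementation sums
    the aggregated costs of 8 (or 16) directions before the
    winner-takes-all disparity selection.
    """
    H, W, D = cost_volume.shape
    L = np.empty_like(cost_volume, dtype=np.float64)
    L[:, 0, :] = cost_volume[:, 0, :]          # initialise with raw costs
    for x in range(1, W):
        prev = L[:, x - 1, :]                  # (H, D) costs of previous pixel
        prev_min = prev.min(axis=1, keepdims=True)
        # candidate transitions: same disparity, +/-1 disparity (penalty P1),
        # any larger disparity jump (penalty P2)
        same = prev
        plus = np.roll(prev, -1, axis=1) + P1
        minus = np.roll(prev, 1, axis=1) + P1
        plus[:, -1] = np.inf                   # roll wrap-around is invalid
        minus[:, 0] = np.inf
        jump = prev_min + P2
        best = np.minimum(np.minimum(same, plus), np.minimum(minus, jump))
        # subtract prev_min to keep the aggregated costs bounded
        L[:, x, :] = cost_volume[:, x, :] + best - prev_min
    return L

# toy usage with a random cost volume (placeholder data)
costs = np.random.rand(64, 80, 32)
aggregated = aggregate_left_to_right(costs)
disparity = aggregated.argmin(axis=2)          # winner-takes-all per pixel
```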
Abstract:
The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
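As a concrete illustration of the cumulative-sum procedure mentioned above for independently and identically distributed observations before and after the change, here is a minimal sketch of Page's CUSUM test. The Gaussian pre- and post-change densities, the change time in the simulated data, and the alarm threshold are placeholder assumptions, not parameters from the dissertation.

```python
import numpy as np
from scipy.stats import norm

def cusum_change_detector(observations, f0, f1, threshold):
    """Page's CUSUM test: declare a change the first time the running
    maximum of the log-likelihood ratio exceeds the threshold.

    f0, f1: callables returning the pre- and post-change densities.
    Returns the (0-based) index of the alarm, or None if no alarm.
    """
    s = 0.0
    for k, x in enumerate(observations):
        llr = np.log(f1(x)) - np.log(f0(x))
        s = max(0.0, s + llr)          # reflected random walk
        if s >= threshold:
            return k
    return None

# toy usage: N(0,1) observations switching to N(1,1) at sample 100 (simulated data)
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
alarm = cusum_change_detector(data, norm(0, 1).pdf, norm(1, 1).pdf, threshold=5.0)
print("alarm raised at sample", alarm)
```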
Abstract:
The automated extraction of roads from aerial imagery can be of value for tasks including mapping, surveillance and change detection. Unfortunately, there are no public databases or standard evaluation protocols for evaluating these techniques. Many techniques are further hindered by a reliance on manual initialisation, making large scale application of the techniques impractical. In this paper, we present a public database and evaluation protocol for the evaluation of road extraction algorithms, and propose an improved automatic seed finding technique to initialise road extraction, based on a combination of geometric and colour features.
Abstract:
Machine vision is emerging as a viable sensing approach for mid-air collision avoidance (particularly for small to medium aircraft such as unmanned aerial vehicles). In this paper, using relative entropy rate concepts, we propose and investigate a new change detection approach that uses hidden Markov model filters to sequentially detect aircraft manoeuvres from morphologically processed image sequences. Experiments using simulated and airborne image sequences illustrate the performance of our proposed algorithm in comparison to other sequential change detection approaches applied to this application.
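The detector described above builds on hidden Markov model filtering; below is a generic sketch of the normalised HMM filter recursion that such an approach relies on. The two-state transition matrix, the prior, and the emission likelihoods are illustrative assumptions, not the models or morphological pre-processing used in the paper.

```python
import numpy as np

def hmm_filter(observation_likelihoods, A, pi):
    """Normalised forward (HMM filter) recursion.

    observation_likelihoods: (T, N) array, entry [t, i] = p(y_t | state i)
    A: (N, N) transition matrix, A[i, j] = p(x_{t+1} = j | x_t = i)
    pi: (N,) initial state distribution.
    Returns the (T, N) posterior p(x_t | y_1..y_t), e.g. the posterior
    probability of a "manoeuvre" state that a detector can threshold.
    """
    T, N = observation_likelihoods.shape
    post = np.zeros((T, N))
    pred = pi
    for t in range(T):
        unnorm = pred * observation_likelihoods[t]
        post[t] = unnorm / unnorm.sum()
        pred = post[t] @ A                 # one-step prediction
    return post

# toy usage: two hidden states (straight flight vs manoeuvre), fake likelihoods
A = np.array([[0.95, 0.05], [0.10, 0.90]])
pi = np.array([0.99, 0.01])
lik = np.random.rand(50, 2) + np.array([1.0, 0.0])   # biased toward state 0
posterior = hmm_filter(lik, A, pi)
print(posterior[-1])
```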
Abstract:
The Modicon Communication Bus (Modbus) protocol is one of the most commonly used protocols in industrial control systems. Modbus was not designed to provide security. This paper confirms that the Modbus protocol is vulnerable to flooding attacks. These attacks involve injection of commands that result in disrupting the normal operation of the control system. This paper describes a set of experiments that shows that an anomaly-based change detection algorithm and signature-based Snort threshold module are capable of detecting Modbus flooding attacks. In comparing these intrusion detection techniques, we find that the signature-based detection requires a carefully selected threshold value, and that the anomaly-based change detection algorithm may have a short delay before detecting the attacks depending on the parameters used. In addition, we also generate a network traffic dataset of flooding attacks on the Modbus control system protocol.
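The paper's anomaly-based change detection algorithm is not specified in the abstract; purely as an illustration of rate-based detection of flooding traffic, here is a sketch that flags time slots whose Modbus request count deviates sharply from an exponentially weighted moving average of recent traffic. The smoothing factor, deviation multiplier, warm-up period, and traffic values are placeholder assumptions.

```python
def ewma_flood_detector(packet_counts, alpha=0.1, k=4.0, warmup=30):
    """Flag time slots whose request count deviates far from an
    exponentially weighted moving average (EWMA) of recent traffic.

    packet_counts: iterable of per-second Modbus request counts.
    Returns the indices of slots flagged as anomalous (possible flooding).
    """
    mean = None
    var = 0.0
    alarms = []
    for t, c in enumerate(packet_counts):
        if mean is None:
            mean = float(c)        # initialise the running mean
            continue
        if t > warmup and abs(c - mean) > k * (var ** 0.5 + 1.0):
            alarms.append(t)
        # recursive EWMA updates of mean and variance
        diff = c - mean
        mean += alpha * diff
        var = (1 - alpha) * (var + alpha * diff * diff)
    return alarms

# toy usage: normal traffic around 20 req/s, a flood of 500 req/s from t = 60
traffic = [20 + (i % 3) for i in range(60)] + [500] * 10
print(ewma_flood_detector(traffic))
```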
Abstract:
In this paper we present research adapting a state-of-the-art condition-invariant robotic place recognition algorithm to the role of automated inter- and intra-image alignment of sensor observations of environmental and skin change over time. The approach inverts the typical criteria placed upon navigation algorithms in robotics: we exploit, rather than attempt to fix, the limited camera viewpoint invariance of such algorithms, showing that approximate viewpoint repetition is realistic in a wide range of environments and medical applications. We demonstrate the algorithms automatically aligning challenging visual data from a range of real-world applications: ecological monitoring of environmental change; aerial observation of natural disasters including flooding, tsunamis and bushfires; and tracking wound recovery and sun damage over time. We also present a prototype active guidance system for enforcing viewpoint repetition. We hope to provide an interesting case study of how traditional research criteria in robotics can be inverted to provide useful outcomes in applied situations.
Abstract:
The auditory system can detect occasional changes (deviants) in acoustic regularities without the need for subjects to focus their attention on the sound material. Deviant detection is reflected in the elicitation of the mismatch negativity component (MMN) of the event-related potentials. In the studies presented in this thesis, the MMN is used to investigate the auditory abilities for detecting similarities and regularities in sound streams. To investigate the limits of these processes, professional musicians have been tested in some of the studies. The results show that auditory grouping is already more advanced in musicians than in nonmusicians and that the auditory system of musicians can, unlike that of nonmusicians, detect a numerical regularity of always four tones in a series. These results suggest that sensory auditory processing in musicians is not only a fine tuning of universal abilities, but is also qualitatively more advanced than in nonmusicians. In addition, the relationship between the auditory change-detection function and perception is examined. It is shown that, contrary to the generally accepted view, MMN elicitation does not necessarily correlate with perception. The outcome of the auditory change-detection function can be implicit and the implicit knowledge of the sound structure can, after training, be utilized for behaviorally correct intuitive sound detection. These results illustrate the automatic character of the sensory change detection function.
Abstract:
We consider the problem of quickest detection of an intrusion using a sensor network, keeping only a minimal number of sensors active. By using a minimal number of sensor devices, we ensure that the energy expenditure for sensing, computation and communication is minimized (and the lifetime of the network is maximized). We model the intrusion detection (or change detection) problem as a Markov decision process (MDP). Based on the theory of MDP, we develop the following closed loop sleep/wake scheduling algorithms: (1) optimal control of Mk+1, the number of sensors in the wake state in time slot k + 1, (2) optimal control of qk+1, the probability of a sensor in the wake state in time slot k + 1, and an open loop sleep/wake scheduling algorithm which (3) computes q, the optimal probability of a sensor in the wake state (which does not vary with time), based on the sensor observations obtained until time slot k. Our results show that an optimum closed loop control on Mk+1 significantly decreases the cost compared to keeping any number of sensors active all the time. Also, among the three algorithms described, we observe that the total cost is minimum for the optimum control on Mk+1 and is maximum for the optimum open loop control on q.
Abstract:
We consider a small-extent sensor network for event detection, in which nodes periodically take samples and then contend over a random access network to transmit their measurement packets to the fusion center. We consider two procedures at the fusion center for processing the measurements. The Bayesian setting is assumed; that is, the fusion center has a prior distribution on the change time. In the first procedure, the decision algorithm at the fusion center is network-oblivious and makes a decision only when a complete vector of measurements taken at a sampling instant is available. In the second procedure, the decision algorithm at the fusion center is network-aware and processes measurements as they arrive, but in a time-causal order. In this case, the decision statistic depends on the network delays, whereas in the network-oblivious case it does not. This yields a Bayesian change-detection problem with a trade-off between the random network delay and the decision delay: that is, a higher sampling rate reduces the decision delay but increases the random access delay. Under periodic sampling, in the network-oblivious case, the structure of the optimal stopping rule is the same as that without the network, and the optimal change-detection delay decouples into the network delay and the optimal decision delay without the network. In the network-aware case, the optimal stopping problem is analyzed as a partially observable Markov decision process, in which the states of the queues and delays in the network need to be maintained. A sufficient decision statistic is the network state together with the posterior probability that the change has occurred, given the measurements received and the state of the network. The optimal regimes are studied using simulation.
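For reference, the posterior probability of change mentioned above can be computed recursively; below is a minimal sketch of the classical Shiryaev-style update for a geometric change-time prior and i.i.d. pre- and post-change observations. The prior parameter, the observation densities, and the threshold are illustrative assumptions, and the sketch omits the network-delay state that the paper's network-aware procedure must additionally track.

```python
import numpy as np
from scipy.stats import norm

def shiryaev_posterior(observations, rho, f0, f1):
    """Recursively compute p_k = P(change has occurred by time k | y_1..y_k)
    for a geometric(rho) change-time prior and i.i.d. pre/post densities f0, f1.
    A Bayesian quickest-detection rule stops the first time p_k crosses a threshold.
    """
    p = 0.0
    posteriors = []
    for y in observations:
        # one-step prior prediction, then Bayes update with the new sample
        pred = p + (1.0 - p) * rho
        num = pred * f1(y)
        den = num + (1.0 - pred) * f0(y)
        p = num / den
        posteriors.append(p)
    return np.array(posteriors)

# toy usage: mean shift from 0 to 1 at sample 80, alarm when the posterior exceeds 0.95
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 80), rng.normal(1, 1, 60)])
post = shiryaev_posterior(y, rho=0.01, f0=norm(0, 1).pdf, f1=norm(1, 1).pdf)
alarm = int(np.argmax(post > 0.95)) if (post > 0.95).any() else None
print("alarm at sample", alarm)
```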
Abstract:
EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land-cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g., remote sensor systems, global positioning systems, image processing algorithms) as they become available. Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional scales (1:500,000) for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
Abstract:
We present a multimodal detection and tracking algorithm for sensors composed of a camera mounted between two microphones. Target localization is performed on color-based change detection in the video modality and on time difference of arrival (TDOA) estimation between the two microphones in the audio modality. The TDOA is computed by multiband generalized cross correlation (GCC) analysis. The estimated directions of arrival are then postprocessed using a Riccati Kalman filter. The visual and audio estimates are finally integrated, at the likelihood level, into a particle filter (PF) that uses a zero-order motion model, and a weighted probabilistic data association (WPDA) scheme. We demonstrate that the Kalman filtering (KF) improves the accuracy of the audio source localization and that the WPDA helps to enhance the tracking performance of sensor fusion in reverberant scenarios. The combination of multiband GCC, KF, and WPDA within the particle filtering framework improves the performance of the algorithm in noisy scenarios. We also show how the proposed audiovisual tracker summarizes the observed scene by generating metadata that can be transmitted to other network nodes instead of transmitting the raw images and can be used for very low bit rate communication. Moreover, the generated metadata can also be used to detect and monitor events of interest.
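As background for the audio branch described above, here is a compact sketch of generalized cross correlation with phase transform (GCC-PHAT) for estimating the TDOA between two microphone signals. The multiband weighting, Kalman filtering, and particle-filter fusion stages of the paper are not reproduced, and the sampling rate and test signals are placeholder assumptions.

```python
import numpy as np

def gcc_phat_tdoa(sig_a, sig_b, fs, max_tau=None):
    """Estimate the time difference of arrival (in seconds) between two
    microphone signals using the phase-transform (PHAT) weighted
    generalized cross correlation."""
    n = len(sig_a) + len(sig_b)
    A = np.fft.rfft(sig_a, n=n)
    B = np.fft.rfft(sig_b, n=n)
    cross = A * np.conj(B)
    cross /= np.maximum(np.abs(cross), 1e-12)      # PHAT weighting
    cc = np.fft.irfft(cross, n=n)
    max_shift = n // 2 if max_tau is None else min(int(fs * max_tau), n // 2)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# toy usage: a noise burst delayed by 12 samples on the second microphone
fs = 16000
rng = np.random.default_rng(2)
s = rng.normal(size=4096)
delayed = np.concatenate((np.zeros(12), s[:-12]))
print(gcc_phat_tdoa(s, delayed, fs))   # approximately -12/fs seconds
```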
Abstract:
Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, due to its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR due to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow, and estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, for both the case of a single image containing flooding, and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.
Abstract:
World Heritage sites provide a glimpse into the stories and civilizations of the past. There are currently 1007 unique World Heritage properties, with 779 classified as cultural sites, 197 as natural sites, and 31 falling into both the cultural and natural categories (UNESCO & World Heritage Centre, 1992-2015). However, of these 1007 World Heritage sites, at least 46 are categorized as in danger, and this number continues to grow. These unique and irreplaceable sites are exceptional because of their universality. Consequently, since World Heritage sites belong to all the people of the world and provide inspiration and admiration to all who visit them, it is our responsibility to help preserve these sites. The key form of preservation involves the individual monitoring of each site over time. While traditional methods are still extremely valuable, more recent advances in the field of geographic and spatial technologies, including geographic information systems (GIS), laser scanning, and remote sensing, are becoming more beneficial for the monitoring and overall safeguarding of World Heritage sites. Through the employment and analysis of more accurately detailed spatial data, World Heritage sites can be better managed. There is a strong urgency to protect these sites. The purpose of this thesis is to describe the importance of taking care of World Heritage sites and to depict a way in which spatial technologies can be used to monitor and in effect preserve World Heritage sites through the utilization of remote sensing imagery. The research conducted in this thesis centers on the Everglades National Park, a World Heritage site that is continually affected by changes in vegetation. Data used include Landsat satellite imagery dating from 2001-2003, the Everglades' boundaries shapefile, and Google Earth imagery. In order to conduct the in-depth analysis of vegetation change within the selected World Heritage site, three main techniques were applied to study changes found within the imagery. These techniques consist of conducting supervised classification for each image, incorporating a vegetation index known as the Normalized Difference Vegetation Index (NDVI), and utilizing the change detection tool available in the Environment for Visualizing Images (ENVI) software. With the research and analysis conducted throughout this thesis, it has been shown that within the three-year time span (2001-2003), there has been an overall increase in both areas of barren soil (5.760%) and areas of vegetation (1.263%), with a decrease in the percentage of areas classified as sparsely vegetated (-6.987%). These results were gathered through the use of the maximum likelihood classification process available in the ENVI software. The results produced by the change detection tool, which further analyzed vegetation change, correlate with the results produced by the classification method. In addition, by utilizing the NDVI method, one is able to locate changes by selecting a specific area and comparing the vegetation index generated for each date. It has been found that through the utilization of remote sensing technology, it is possible to monitor and observe changes featured within a World Heritage site. Remote sensing is an extraordinary tool that can and should be used by all site managers and organizations whose goal it is to preserve and protect World Heritage sites. Remote sensing can be used not only to observe changes over time, but also to pinpoint threats within a World Heritage site.
World Heritage sites are irreplaceable sources of beauty, culture, and inspiration. It is our responsibility, as citizens of this world, to guard these treasures.
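The vegetation index referred to in the abstract above is computed per pixel from the red and near-infrared bands; below is a minimal sketch of NDVI and a simple NDVI-difference change map. The band assignment follows the Landsat 5 TM convention (band 3 red, band 4 near infrared), while the array shapes and values are placeholder assumptions; this is not the ENVI workflow used in the thesis.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense green vegetation, values near 0 bare
    soil, and negative values water or clouds. Differencing NDVI rasters
    from two dates gives a simple per-pixel change map.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)

# toy usage with fake reflectance arrays standing in for TM band 3 (red)
# and band 4 (near infrared) from two acquisition dates
red_2001, nir_2001 = np.random.rand(100, 100), np.random.rand(100, 100)
red_2003, nir_2003 = np.random.rand(100, 100), np.random.rand(100, 100)
ndvi_change = ndvi(red_2003, nir_2003) - ndvi(red_2001, nir_2001)
print("mean NDVI change:", ndvi_change.mean())
```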
Abstract:
The objective of this study is to gain a quantitative understanding of land use and land cover change (LULCC) that have occurred in a rural Nicaraguan municipality by analyzing Landsat 5 Thematic Mapper (TM) images. By comparing the potential extent of tropical dry forest (TDF) with Landsat 5 TM images, this study analyzes the loss of this forest type on a local level for the municipality of San Juan de Cinco Pinos (63.5 km2) in the Department of Chinandega. Change detection analysis shows where and how land use has changed from 1985 to the present. From 1985 to 2011, nearly 15% of the TDF in San Juan de Cinco Pinos was converted to other land uses. Of the 1434.2 ha of TDF that was present in 1985, 1223.64 ha remained in 2011. The deforestation is primarily a result of agricultural expansion and fuelwood extraction. If current rates of TDF deforestation continue, the municipality faces the prospect of losing its forest cover within the next few decades.
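The reported figures are mutually consistent; using the hectare values quoted in the abstract:

```latex
\frac{1434.2\,\mathrm{ha} - 1223.64\,\mathrm{ha}}{1434.2\,\mathrm{ha}}
  = \frac{210.56}{1434.2} \approx 0.147
```

i.e. a loss of roughly 14.7% of the 1985 tropical dry forest area by 2011, matching the stated figure of nearly 15%.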