13 results for River monitoring network
in Aston University Research Archive
Abstract:
The automatic interpolation of environmental monitoring network data, such as air quality or radiation levels, in a real-time setting poses a number of practical and theoretical questions. Among the problems found are (i) dealing with and communicating the uncertainty of predictions, (ii) automatic (hyper)parameter estimation, (iii) monitoring network heterogeneity, (iv) dealing with outlying extremes, and (v) quality control. In this paper we discuss these issues in light of the spatial interpolation comparison exercise held in 2004.
Abstract:
Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context where observations are collected and reported by a network of sensors, and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. 
We discuss the issue of whether to treat or ignore extreme values, making the distinction between robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study, based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
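The robust likelihood functions named above can be illustrated with a minimal sketch: the Huber function bounds the influence of outlying residuals, which is what keeps covariance estimation stable when a sensor sporadically reports extreme values. The threshold k = 1.345 is a conventional default, not a value taken from the paper.

```python
import numpy as np

def huber(r, k=1.345):
    """Huber loss: quadratic for |r| <= k, linear beyond.
    Extreme residuals therefore contribute linearly rather than
    quadratically, limiting their influence on parameter estimates."""
    a = np.abs(r)
    return np.where(a <= k, 0.5 * a**2, k * (a - 0.5 * k))

residuals = np.array([0.2, -0.8, 1.0, 15.0])   # last value mimics a faulty sensor
robust = huber(residuals)          # outlier contributes ~19.3
squared = 0.5 * residuals**2       # outlier contributes 112.5 and dominates
```

Under squared loss the single faulty reading outweighs all genuine observations; under the Huber loss its contribution grows only linearly.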
Abstract:
In many Environmental Information Systems the actual observations arise from a discrete monitoring network which might be rather heterogeneous in both location and types of measurements made. In this paper we describe the architecture and infrastructure for a system, developed as part of the EU FP6 funded INTAMAP project, to provide a service oriented solution that allows the construction of an interoperable, automatic, interpolation system. This system will be based on the Open Geospatial Consortium’s Web Feature Service (WFS) standard. The essence of our approach is to extend the GML3.1 observation feature to include information about the sensor using SensorML, and to further extend this to incorporate observation error characteristics. Our extended WFS will accept observations, and will store them in a database. The observations will be passed to our R-based interpolation server, which will use a range of methods, including a novel sparse, sequential kriging method (only briefly described here) to produce an internal representation of the interpolated field resulting from the observations currently uploaded to the system. The extended WFS will then accept queries, such as ‘What is the probability distribution of the desired variable at a given point’, ‘What is the mean value over a given region’, or ‘What is the probability of exceeding a certain threshold at a given location’. To support information-rich transfer of complex and uncertain predictions we are developing schema to represent probabilistic results in a GML3.1 (object-property) style. The system will also offer more easily accessible Web Map Service and Web Coverage Service interfaces to allow users to access the system at the level of complexity they require for their specific application. 
Such a system will offer a very valuable contribution to the next generation of Environmental Information Systems in the context of real time mapping for monitoring and security, particularly for systems that employ a service oriented architecture.
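As a sketch of the kind of threshold query the extended WFS would answer: under a Gaussian predictive distribution (the usual kriging assumption), the probability of exceeding a threshold at a point follows directly from the predicted mean and standard deviation. The numbers below are illustrative, not taken from the system.

```python
import math

def exceedance_probability(mean, sd, threshold):
    """P(Z > threshold) when the kriging prediction at a point is
    Gaussian with the given mean and standard deviation."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative: predicted value 95 with sd 10, alarm threshold 110
p = exceedance_probability(95.0, 10.0, 110.0)   # about 0.067
```

This is exactly the quantity that a probabilistic result schema has to carry to the end-user: not just the interpolated value, but the predictive distribution it came from.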
Abstract:
The INTAMAP FP6 project has developed an interoperable framework for real-time automatic mapping of critical environmental variables by extending spatial statistical methods and employing open, web-based, data exchange protocols and visualisation tools. This paper gives an overview of the underlying problem and of the project, and discusses which problems it has solved and which open problems seem most relevant to address next. The interpolation problem that INTAMAP solves is the generic problem of spatial interpolation of environmental variables without user interaction, based on measurements of e.g. PM10, rainfall or gamma dose rate, at arbitrary locations or over a regular grid covering the area of interest. It deals with problems of varying spatial resolution of measurements, the interpolation of averages over larger areas, and with providing information on the interpolation error to the end-user. In addition, monitoring network optimisation is addressed in a non-automatic context.
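A minimal sketch of the underlying interpolation step, assuming simple kriging in one dimension with an exponential covariance and hand-picked parameters (INTAMAP itself estimates parameters automatically and supports richer models):

```python
import numpy as np

def simple_kriging(x_obs, y_obs, x_new, range_=1.0, sill=1.0, nugget=0.01):
    """Simple kriging (known zero mean) with an exponential covariance;
    returns predictions and kriging variances, the latter being the
    'interpolation error' reported to the end-user."""
    def cov(a, b):
        return sill * np.exp(-np.abs(a[:, None] - b[None, :]) / range_)
    K = cov(x_obs, x_obs) + nugget * np.eye(len(x_obs))
    k = cov(x_obs, x_new)
    w = np.linalg.solve(K, k)               # kriging weights
    mean = w.T @ y_obs
    var = sill + nugget - np.sum(k * w, axis=0)
    return mean, var

x = np.array([0.0, 1.0, 2.5])
y = np.array([0.3, 0.9, 0.2])
mean, var = simple_kriging(x, y, np.array([1.2, 10.0]))
# var is small near the observations and approaches sill + nugget far away
```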
Abstract:
Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneities in the error characteristics of different sensors, both in terms of distribution and magnitude, present problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. In model-based geostatistics, a Gaussian process prior is imposed over the (latent) process being studied and the sensor model forms part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated in a projected process kriging framework, which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
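The projected process idea can be sketched as follows. This is the standard subset-of-regressors approximation with a Gaussian likelihood, shown only to illustrate where the computational saving comes from; handling the arbitrary (non-Gaussian) likelihoods of the paper would replace the closed-form solve with a sequential approximate update. The squared-exponential covariance and parameter values are illustrative assumptions.

```python
import numpy as np

def projected_process_mean(X, y, X_active, X_new, noise_var=0.1, length=1.0):
    """Subset-of-regressors (projected process) GP prediction: an
    m-point active set summarises all n observations, so the linear
    solve is m x m rather than n x n, reducing O(n^3) to O(n m^2)."""
    def k(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    Kmn = k(X_active, X)
    A = noise_var * k(X_active, X_active) + Kmn @ Kmn.T   # m x m system
    return k(X_active, X_new).T @ np.linalg.solve(A, Kmn @ y)

X = np.linspace(0.0, 3.0, 200)
y = np.sin(X)                              # noise-free toy observations
X_active = np.linspace(0.0, 3.0, 10)       # 10 active points stand in for 200
pred = projected_process_mean(X, y, X_active, np.array([1.5]))
```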
Abstract:
This collection of papers records a series of studies, carried out over a period of some 50 years, on two aspects of river pollution control - the prevention of pollution by sewage biological filtration and the monitoring of river pollution by biological surveillance. The earlier studies were carried out to develop methods of controlling flies which bred in the filters and caused serious nuisance, and a possible public health hazard, when they dispersed to surrounding villages. Although the application of insecticides proved effective as an alleviative measure, because it resulted in only a temporary disturbance of the ecological balance, it was considered ecologically unsound as a long-term solution. Subsequent investigations showed that the fly populations in filters were largely determined by the amount of food available to the grazing larval stage in the form of filter film. It was also established that the winter deterioration in filter performance was due to the excessive accumulation of film. Subsequent investigations were therefore carried out to determine the factors responsible for the accumulation of film in different types of filter. Methods of filtration which were considered to control film accumulation by increasing the flushing action of the sewage were found to control fungal film by creating nutrient-limiting conditions. In some filters, increasing the hydraulic flushing reduced the grazing fauna population in the surface layers and resulted in an increase in film. The results of these investigations were successfully applied in modifying filters and in the design of a Double Filtration process. These studies on biological filters led to the conclusion that they should be designed and operated as ecological systems and not merely as hydraulic ones. Studies on the effects of sewage effluents on Birmingham streams confirmed the findings of earlier workers, justifying their claim for using biological methods for detecting and assessing river pollution.
Further ecological studies showed the sensitivity of benthic riffle communities to organic pollution. Using experimental channels and laboratory studies, the different environmental conditions associated with organic pollution were investigated. The degree and duration of the oxygen depletion during the dark hours were found to be a critical factor. The relative tolerance of different taxa to other pollutants, such as ammonia, differed. Although colonisation samplers proved of value in sampling difficult sites, the invertebrate data generated were not suitable for processing with any of the commonly used biotic indices. Several of the papers, which were written by request for presentation at conferences etc., presented the biological viewpoint on river pollution and water quality issues at the time and advocated the use of biological methods. The information and experiences gained in these investigations were used as the "domain expert" in the development of artificial intelligence systems for use in the biological surveillance of river water quality.
River basin surveillance using remotely sensed data: a water resources information management system
Abstract:
This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes, and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient, and sufficiently accurate, information to model hydrological processes. Remote sensing, in combination with conventional point-source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and historical records. For the system to have any real potential the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included in this thesis as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.
Abstract:
This thesis presents a thorough and principled investigation into the application of artificial neural networks to the biological monitoring of freshwater. It contains original ideas on the classification and interpretation of benthic macroinvertebrates, and aims to demonstrate their superiority over the biotic systems currently used in the UK to report river water quality. The conceptual basis of a new biological classification system is described, and a full review and analysis of a number of river data sets is presented. The biological classification is compared to the common biotic systems using data from the Upper Trent catchment. These data contained 292 expertly classified invertebrate samples identified to mixed taxonomic levels. The neural network experimental work concentrates on the classification of the invertebrate samples into biological class, where only a subset of the sample is used to form the classification. Other experimentation is conducted into the identification of novel input samples, the classification of samples from different biotopes, and the use of prior information in the neural network models. The biological classification is shown to provide an intuitive interpretation of a graphical representation, generated without reference to the class labels, of the Upper Trent data. The selection of key indicator taxa is considered using three different approaches: one novel, one from information theory, and one from classical statistics. Good indicators of quality class based on these analyses are found to be in good agreement with those chosen by a domain expert. The change in information associated with different levels of identification and enumeration of taxa is quantified. The feasibility of using neural network classifiers and predictors to develop numeric criteria for the biological assessment of sediment contamination in the Great Lakes is also investigated.
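A minimal sketch of the kind of classifier involved, assuming softmax regression (a single-layer neural network) trained on hypothetical taxon-abundance data; the thesis itself uses richer network models and real, expertly classified samples. All data and the labelling rule below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: columns are abundances of three indicator taxa,
# labels are two biological quality classes from a toy stand-in rule.
X = rng.poisson(lam=[4.0, 2.0, 1.0], size=(60, 3)).astype(float)
y = (X[:, 0] > X[:, 2] + 2).astype(int)

# Softmax regression (single-layer network) fitted by gradient descent.
W, b = np.zeros((3, 2)), np.zeros(2)
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1.0      # dL/dlogits for cross-entropy
    W -= 0.01 * X.T @ grad / len(y)
    b -= 0.01 * grad.mean(axis=0)

accuracy = (np.argmax(X @ W + b, axis=1) == y).mean()
```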
Abstract:
In this paper, a multiplexed sensor network capable of monitoring shape changes of the torso for respiratory function monitoring is developed. As a demonstration, LPGs written into a refractive-index-insensitive, progressive three-layered fibre are embedded into a supporting material, which is then placed on a resuscitation training manikin simulating respiration. A derivative spectroscopy interrogation technique is implemented, and the bend sensitivity of the LPGs is used to reconstruct the shape of the manikin's torso. © 2003 IEEE.
Abstract:
This work bridges the gap between the remote interrogation of multiple optical sensors and the advantages of using inherently biocompatible, low-cost polymer optical fiber (POF)-based photonic sensing. A novel hybrid sensor network combining both silica fiber Bragg gratings (FBG) and polymer FBGs (POFBG) is analyzed. The topology is compatible with WDM networks, so multiple remote sensors can be addressed, providing high scalability. A central monitoring unit with virtual data processing is implemented, which could be remotely located up to a few km away. The feasibility of the proposed solution for potential medical environments and biomedical applications is shown.
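As a sketch of the basic measurement principle behind such FBG sensors (a standard textbook relation, not a detail from the paper): the Bragg wavelength shifts linearly with strain, so the interrogator's job reduces to tracking one wavelength per WDM channel. The base wavelength and photo-elastic coefficient below are typical silica-fibre values, assumed for illustration.

```python
def fbg_strain(wl_measured_nm, wl_base_nm=1550.0, p_e=0.22):
    """Convert a measured Bragg wavelength to strain using
    delta_lambda = lambda_B * (1 - p_e) * strain, with p_e the
    effective photo-elastic coefficient (~0.22 for silica fibre)."""
    return (wl_measured_nm - wl_base_nm) / (wl_base_nm * (1.0 - p_e))

# A 1.2 nm shift on a 1550 nm grating is roughly 990 microstrain
strain = fbg_strain(1551.2)
```

Polymer FBGs follow the same relation with different material coefficients, which is one reason a hybrid silica/polymer network needs per-sensor calibration.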
Abstract:
Background: To examine the views and current practice of self-monitoring of blood glucose (SMBG) among Black Caribbean and South Asian individuals with non-insulin-treated Type 2 diabetes mellitus. Methods: Twelve participants completed semi-structured interviews that were guided by the Health Belief Model and analyzed using thematic network analysis. Results: The frequency of monitoring among participants varied from several times a day to once per week. Most participants expressed similar experiences regarding their views and practices of SMBG. Minor differences across gender and culture were observed. All participants understood the benefits, but not all viewed SMBG as beneficial to their personal diabetes management. SMBG can facilitate a better understanding and maintenance of self-care behaviours. However, it can trigger both positive and negative emotional responses, such as a sense of disappointment when high readings are not anticipated, resulting in emotional distress. Health care professionals play a key role in the way SMBG is perceived and used by patients. Conclusion: While the majority of participants value SMBG as a self-management tool, barriers exist that impede its practice, particularly its cost. How individuals cope with these barriers is integral to understanding why some patients adopt SMBG more than others. © 2013 Gucciardi et al.; licensee BioMed Central Ltd.
Abstract:
Since privatisation, the maintenance of DNO LV feeder maximum demand information has gradually lapsed in some Utility Areas, and it is postulated that this lack of knowledge about 11kV and LV electrical networks is resulting in a less economical and less energy-efficient network as a whole. In an attempt to quantify the negative impact, this paper examines ten postulated new connection scenarios for a set of real LV load readings, in order to find the difference in design solutions when LV load readings were and were not known. The load profiles of the substations were examined in order to explore the utilisation profile. It was found that in 70% of the scenarios explored, significant cost differences arose; these differences varied by an average of 1000% between schemes designed with and without load readings. Obviously, over-designing a system, and therefore operating more underutilised transformers, is less financially beneficial and less energy efficient. The paper concludes that new connection design is improved in terms of cost when carried out based on known LV load information, and this enhances the case for regular maximum feeder demand information and/or metering of LV feeders. © 2013 IEEE.