34 results for Satellite Monitoring Systems
in Aston University Research Archive
Abstract:
In this article we envision the factors and trends that shape the next generation of environmental monitoring systems. One key factor in this respect is the combined effect of end-user needs and the general development of IT services and their availability. Currently, an environmental (monitoring) system is assumed to be reactive: it delivers measurement data and computational results only if the user explicitly asks for them, either by query or by subscription. There is a temptation to automate this by simply pushing data to end-users. This, however, easily leads to an "advertisement strategy", where data is pushed to end-users regardless of their needs. Under this strategy, the sheer amount of received data obfuscates the individual messages; any "automatic" service, regardless of its fitness, overruns a system that requires the user's initiative. The foreseeable problem is that, unless there is some overall management, each new environmental service will compete for end-users' attention and thus inadvertently hinder the use of existing services. As the main contribution we investigate the nature of proactive environmental systems and how they should be designed to avoid this problem. We also discuss how semantics, participatory sensing, uncertainty management, and situational awareness link to proactive environmental systems. We illustrate our proposals with some real-life examples.
Abstract:
Self-adaptive systems have the capability to autonomously modify their behavior at run-time in response to changes in their environment. Self-adaptation is particularly necessary for applications that must run continuously, even under adverse conditions and changing requirements; sample domains include automotive systems, telecommunications, and environmental monitoring systems. While a few techniques have been developed to support the monitoring and analysis of requirements for adaptive systems, limited attention has been paid to the actual creation and specification of requirements of self-adaptive systems. As a result, self-adaptivity is often constructed in an ad-hoc manner. In order to support the rigorous specification of adaptive systems requirements, this paper introduces RELAX, a new requirements language for self-adaptive systems that explicitly addresses uncertainty inherent in adaptive systems. We present the formal semantics for RELAX in terms of fuzzy logic, thus enabling a rigorous treatment of requirements that include uncertainty. RELAX enables developers to identify uncertainty in the requirements, thereby facilitating the design of systems that are, by definition, more flexible and amenable to adaptation in a systematic fashion. We illustrate the use of RELAX on smart home applications, including an adaptive assisted living system.
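To make the fuzzy-logic semantics concrete, here is a minimal sketch of how a RELAX-ed requirement such as "the system SHALL sample AS CLOSE AS POSSIBLE TO every 60 seconds" might be evaluated; the triangular membership function and all numeric values are illustrative assumptions, not definitions taken from the paper.

```python
# Sketch: evaluating a RELAX-ed requirement under fuzzy semantics.
# The triangular membership function and the 30 s tolerance are
# illustrative assumptions, not part of the RELAX definition.

def triangular(x: float, target: float, tolerance: float) -> float:
    """Degree in [0, 1] to which x is 'as close as possible to' target."""
    distance = abs(x - target)
    return max(0.0, 1.0 - distance / tolerance)

# Observed sampling intervals (seconds) from a hypothetical monitor.
intervals = [58.0, 61.5, 75.0, 60.2]

# Per-observation satisfaction, plus a conservative aggregate
# (fuzzy AND taken as the minimum) over the whole trace.
degrees = [triangular(t, target=60.0, tolerance=30.0) for t in intervals]
print([round(d, 3) for d in degrees])  # [0.933, 0.95, 0.5, 0.993]
print(min(degrees))                    # overall satisfaction degree
```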
Abstract:
Self-adaptive systems have the capability to autonomously modify their behaviour at run-time in response to changes in their environment. Self-adaptation is particularly necessary for applications that must run continuously, even under adverse conditions and changing requirements; sample domains include automotive systems, telecommunications, and environmental monitoring systems. While a few techniques have been developed to support the monitoring and analysis of requirements for adaptive systems, limited attention has been paid to the actual creation and specification of requirements of self-adaptive systems. As a result, self-adaptivity is often constructed in an ad-hoc manner. In this paper, we argue that a more rigorous treatment of requirements explicitly relating to self-adaptivity is needed and that, in particular, requirements languages for self-adaptive systems should include explicit constructs for specifying and dealing with the uncertainty inherent in self-adaptive systems. We present RELAX, a new requirements language for self-adaptive systems, and illustrate it using examples from the smart home domain. © 2009 IEEE.
Abstract:
This thesis presents a novel high-performance approach to time-division-multiplexing (TDM) fibre Bragg grating (FBG) optical sensors, known as the resonant cavity architecture. A background theory of FBG optical sensing covers several techniques for multiplexing sensors. The limitations of current wavelength-division-multiplexing (WDM) schemes are contrasted against the technological and commercial advantages of TDM. The author's hypothesis that 'it should be possible to achieve TDM FBG sensor interrogation using an electrically switched semiconductor optical amplifier (SOA)' is then explained. Research and development of a commercially viable optical sensor interrogator based on the resonant cavity architecture forms the remainder of this thesis. A fully programmable SOA drive system allows interrogation of sensor arrays 10 km long with a spatial resolution of 8 cm, and a variable gain system provides dynamic compensation for fluctuating system losses. Ratiometric filter- and diffractive-element spectrometer-based wavelength measurement systems are developed and analysed for different commercial applications. The ratiometric design provides a low-cost solution that has picometre resolution and low noise using 4% reflective sensors, but is less tolerant to variation in system loss. The spectrometer design is more expensive, but delivers exceptional performance with picometre resolution, low noise and tolerance to 13 dB of system loss variation. Finally, this thesis details the interrogator's peripheral components, its compliance for operation in harsh industrial environments and several examples of commercial applications where it has been deployed. Applications include laboratory instruments, temperature monitoring systems for oil production, dynamic control for wind energy, and battery-powered, self-contained sub-sea strain monitoring.
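As a back-of-envelope illustration of what those TDM figures imply, the sketch below converts the quoted 10 km array length and 8 cm spatial resolution into a round-trip delay and an SOA gating pulse width; the fibre group index is an assumed typical value, not one stated in the thesis.

```python
# Back-of-envelope TDM timing for an FBG sensor array.
# The group index below is a typical value for standard single-mode
# fibre and is an assumption, not a figure from the thesis.

C = 3.0e8        # speed of light in vacuum, m/s
N_GROUP = 1.468  # assumed fibre group index

def round_trip_delay(distance_m: float) -> float:
    """Two-way travel time to a sensor at the given distance."""
    return 2.0 * N_GROUP * distance_m / C

def pulse_width_for_resolution(resolution_m: float) -> float:
    """Gating pulse width needed to separate sensors resolution_m apart."""
    return 2.0 * N_GROUP * resolution_m / C

print(f"far end of a 10 km array: {round_trip_delay(10_000) * 1e6:.1f} us")
print(f"pulse width for 8 cm:     {pulse_width_for_resolution(0.08) * 1e12:.0f} ps")
```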
Abstract:
The promoters of the large groundwater developments implemented in the 1970s paid little attention to the effects of pumping on soil moisture. A field study, conducted in 1979 in the Tern Area of the Shropshire Groundwater Scheme, revealed that significant quantities of the available moisture could be removed from the root zone of vegetation when drawdown of shallow water tables occurred. Arguments to this effect, supported by the field study evidence, were successfully presented at the Shropshire Groundwater Scheme public inquiry. The aim of this study has been to expand the work undertaken in connection with the Shropshire Groundwater Scheme and to develop a method whereby the effects of groundwater pumping on vegetation can be assessed, and hence the impacts minimised. Two concepts, the critical height and the soil sensitivity depth, formulated during the initial work, are at the core of the Environmental Impact Assessment (EIA) method whose development is described. A programme of laboratory experiments on soil columns is described, as is the derivation of relationships for determining critical heights and field capacity moisture profiles. These relationships are subsequently employed in evaluating the effects of groundwater drawdown. In employing the environmental assessment technique, digitised maps of relevant features of the Tern Area are combined to produce composite maps delineating the extent of the areas which are potentially sensitive to groundwater drawdown. A series of crop yield/moisture loss functions are then employed to estimate the impact of simulated pumping events on the agricultural community of the Tern Area. Finally, guidelines, based on experience gained through evaluation of the Tern Area case study, are presented for use in the design of soil moisture monitoring systems and in the siting of boreholes. In addition, recommendations are made for development of the EIA technique, and further research needs are identified.
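The screening step that the critical height and soil sensitivity depth concepts support can be caricatured in a few lines: a map cell is flagged as potentially sensitive when the drawn-down water table can no longer feed the root zone by capillary rise. All field names and numbers below are hypothetical; the thesis derives the actual relationships from soil-column experiments.

```python
# Hypothetical screening of map cells for sensitivity to groundwater
# drawdown, loosely following the abstract's critical-height idea.

from dataclasses import dataclass

@dataclass
class Cell:
    water_table_depth_m: float  # pre-pumping depth below the surface
    critical_height_m: float    # capillary rise the soil can sustain
    root_zone_depth_m: float    # depth of the vegetation root zone

def is_sensitive(cell: Cell, drawdown_m: float) -> bool:
    """True if pumping would cut the root zone off from capillary supply."""
    new_depth = cell.water_table_depth_m + drawdown_m
    # Moisture reaches the roots while the water table, less the
    # critical height of capillary rise, stays within the root zone.
    return new_depth - cell.critical_height_m > cell.root_zone_depth_m

cell = Cell(water_table_depth_m=1.2, critical_height_m=0.9, root_zone_depth_m=0.6)
print(is_sensitive(cell, drawdown_m=0.2))  # False: fringe still reaches roots
print(is_sensitive(cell, drawdown_m=1.0))  # True: root zone dries out
```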
Abstract:
Self-adaptive systems have the capability to autonomously modify their behaviour at run-time in response to changes in their environment. Such systems are now commonly built in domains as diverse as enterprise computing, automotive control systems, and environmental monitoring systems. To date, however, limited attention has been paid to how to engineer requirements for such systems. As a result, self-adaptivity is often constructed in an ad-hoc manner. In this paper, we argue that a more rigorous treatment of requirements relating to self-adaptivity is needed and that, in particular, requirements languages for self-adaptive systems should include explicit constructs for specifying and dealing with the uncertainty inherent in self-adaptive systems. We present some initial thoughts on a new requirements language for self-adaptive systems and illustrate it using examples from the services domain. © 2008 IEEE.
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services, with prior arrangement.
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system they were intended to measure. In order to explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation, the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include: variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
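The dynamics metric used in the study, the ratio of the standard deviation of upstream orders to that of downstream demand, is simple to compute; the series below are invented purely to show the calculation.

```python
# Bullwhip measurement as described in the abstract: the standard
# deviation of orders placed upstream divided by the standard
# deviation of demand arriving downstream. A ratio above 1 means
# the stage amplifies variability. The series are invented.

import statistics

downstream_demand = [100, 102, 98, 101, 99, 103, 97, 100]
upstream_orders   = [100, 108, 90, 105, 95, 112, 85, 102]

def bullwhip_ratio(orders, demand):
    return statistics.stdev(orders) / statistics.stdev(demand)

print(f"bullwhip ratio: {bullwhip_ratio(upstream_orders, downstream_demand):.2f}")
```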
Abstract:
In this paper, we present experimental results for monitoring long distance WDM communication links using a line monitoring system suitable for legacy optically amplified long-haul undersea systems. This monitoring system is based on setting up a simple, passive, low-cost, high-loss optical loopback circuit at each repeater that provides a connection between the existing anti-directional undersea fibres and can be used to locate faults. Fault location is achieved by transmitting a short-pulse supervisory signal along with the WDM data signals; a portion of the overall signal is attenuated and returned to the transmit terminal by the loopback circuit. A special receiver at the terminal extracts the weakly returned supervisory signal, where each supervisory signal is received at a different time corresponding to a different optical repeater. Therefore, degradation in any repeater appears on its corresponding supervisory signal level. We use a recirculating loop to simulate a 4600 km fibre link, on which a high-loss loopback supervisory system is implemented. Successful monitoring is accomplished through the production of an appropriate supervisory signal at the terminal that is detected and identified in a satisfactory time period after passing through up to 45 dB of attenuation in the loopback circuit. © 2012 Elsevier B.V. All rights reserved.
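Because each repeater's loopback sits at a known distance, the returned supervisory pulses separate naturally in time, which is what lets the terminal attribute a weak return to a specific repeater. A sketch of that timing under assumed values (the group index and 50 km repeater spacing are illustrative; neither is given in the abstract):

```python
# Sketch: mapping supervisory-pulse return delays to repeaters.
# Group index and repeater spacing are assumed illustrative values.

C = 3.0e8           # speed of light in vacuum, m/s
N_GROUP = 1.468     # assumed fibre group index
SPACING_M = 50_000  # assumed repeater spacing

def return_delay(repeater_index: int) -> float:
    """Round-trip delay of a pulse looped back at the k-th repeater."""
    return 2.0 * N_GROUP * repeater_index * SPACING_M / C

for k in range(1, 5):
    print(f"repeater {k}: return expected at {return_delay(k) * 1e3:.3f} ms")

# A low received level in the time slot for repeater k points to
# degradation at, or on the path to, that repeater.
```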
Abstract:
Obtaining wind vectors over the ocean is important for weather forecasting and ocean modelling. Several satellite systems used operationally by meteorological agencies utilise scatterometers to infer wind vectors over the oceans. In this paper we present the results of using novel neural network based techniques to estimate wind vectors from such data. The problem is partitioned into estimating wind speed and wind direction. Wind speed is modelled using a multi-layer perceptron (MLP) and a sum of squares error function. Wind direction is a periodic variable and a multi-valued function for a given set of inputs; a conventional MLP fails at this task, and so we model the full periodic probability density of direction conditioned on the satellite derived inputs using a Mixture Density Network (MDN) with periodic kernel functions. A committee of the resulting MDNs is shown to improve the results.
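A minimal sketch of the periodic-kernel idea: represent the conditional density of wind direction as a mixture of von Mises kernels, the circular analogue of Gaussians. In the paper an MDN predicts the mixture parameters from the scatterometer inputs; here they are fixed, illustrative values.

```python
# Mixture of von Mises kernels as a periodic density over direction.
# In an MDN these parameters would be network outputs conditioned on
# the satellite-derived inputs; here they are fixed for illustration.

import numpy as np

def von_mises_pdf(theta, mu, kappa):
    """von Mises density on the circle (periodic analogue of a Gaussian)."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * np.i0(kappa))

def mixture_pdf(theta, weights, mus, kappas):
    """Weighted sum of von Mises kernels; weights sum to 1."""
    return sum(w * von_mises_pdf(theta, m, k)
               for w, m, k in zip(weights, mus, kappas))

# Two components roughly 180 degrees apart, reflecting the direction
# ambiguity that makes a single-valued regression model fail here.
weights, mus, kappas = [0.6, 0.4], [0.0, np.pi], [4.0, 4.0]

for theta in np.linspace(-np.pi, np.pi, 8, endpoint=False):
    print(f"theta = {theta:+.2f} rad  density = "
          f"{mixture_pdf(theta, weights, mus, kappas):.3f}")
```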
Abstract:
In May 2006, the Ministers of Health of all the countries on the African continent, at a special session of the African Union, undertook to institutionalise efficiency monitoring within their respective national health information management systems. The specific objectives of this study were: (i) to assess the technical efficiency of the National Health Systems (NHSs) of African countries in producing male and female life expectancies, and (ii) to assess changes in health productivity over time, with a view to analysing changes in efficiency and changes in technology. The analysis was based on five-year panel data (1999-2003) from all 53 countries of continental Africa. Data Envelopment Analysis (DEA), a non-parametric linear programming approach, was employed to assess technical efficiency. The Malmquist Total Factor Productivity (MTFP) index was used to analyse efficiency and productivity change over time among the 53 countries' national health systems. The data consisted of two outputs (male and female life expectancies) and two inputs (per capita total health expenditure and adult literacy). The DEA revealed that 49 (92.5%) countries' NHSs were run inefficiently in 1999 and 2000; 50 (94.3%), 48 (90.6%) and 47 (88.7%) operated inefficiently in 2001, 2002 and 2003 respectively. All 53 countries' national health systems registered improvements in total factor productivity, attributable mainly to technical progress. Fifty-two countries did not experience any change in scale efficiency, while thirty (56.6%) countries' national health systems had a Pure Efficiency Change (PEFFCH) index of less than one, signifying that those countries' NHSs' pure efficiency contributed negatively to productivity change. African countries may need to critically evaluate the utility of institutionalising Malmquist TFP-type analyses to monitor changes in health systems' economic efficiency and productivity over time. Keywords: African national health systems; per capita total health expenditure; technical efficiency; scale efficiency; Malmquist indices of productivity change; DEA.
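For readers unfamiliar with DEA, each per-country score comes from a small linear programme. The sketch below implements an output-oriented, constant-returns-to-scale DEA for one decision-making unit with scipy; the two-input/two-output shape mirrors the study, but the numbers are invented and the formulation is the textbook CCR model rather than anything specific to this paper.

```python
# Output-oriented, constant-returns-to-scale (CCR) DEA for one DMU,
# as a linear programme: maximise phi such that a non-negative
# combination of peers uses no more input and produces at least
# phi times the DMU's outputs. Efficiency = 1 / phi*. Toy data.

import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (countries); inputs mirror the study's shape
# (expenditure, literacy), outputs (male, female life expectancy).
X = np.array([[100.0, 60.0], [150.0, 70.0], [120.0, 55.0]])  # inputs
Y = np.array([[50.0, 54.0], [62.0, 66.0], [48.0, 52.0]])     # outputs

def dea_efficiency(o: int) -> float:
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = -1.0  # decision vars [phi, lambda_1..n]; maximise phi
    A_in = np.hstack([np.zeros((X.shape[1], 1)), X.T])  # peer inputs <= own
    A_out = np.hstack([Y[o][:, None], -Y.T])            # phi*own <= peer outputs
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([X[o], np.zeros(Y.shape[1])]),
                  bounds=[(0, None)] * (n + 1))
    return 1.0 / res.x[0]

for o in range(X.shape[0]):
    print(f"DMU {o}: technical efficiency = {dea_efficiency(o):.3f}")
```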
Abstract:
Objective: To assess and explain deviations from recommended practice in National Institute for Clinical Excellence (NICE) guidelines in relation to fetal heart monitoring. Design: Qualitative study. Setting: Large teaching hospital in the UK. Sample: Sixty-six hours of observation of 25 labours and interviews with 20 midwives of varying grades. Methods: Structured observations of labour and semi-structured interviews with midwives. Interviews were undertaken using a prompt guide, audiotaped, and transcribed verbatim. Analysis was based on the constant comparative method, assisted by QSR N5 software. Main outcome measures: Deviations from recommended practice in relation to fetal monitoring and insights into why these occur. Results: All babies involved in the study were safely delivered, but 243 deviations from recommended practice in relation to NICE guidelines on fetal monitoring were identified, the majority (80%) occurring in relation to documentation. Other deviations from recommended practice included indications for the use of electronic fetal heart monitoring and the conduct of fetal heart monitoring. There was evidence of difficulties with the availability and maintenance of equipment, and some deficits in staff knowledge and skill. Midwives reported differing orientations towards fetal monitoring, which were likely to have an impact on practice. The initiation, management, and interpretation of fetal heart monitoring are complex and distributed across time, space, and professional boundaries, and practices in relation to fetal heart monitoring need to be understood within an organisational and social context. Conclusion: Some deviations from best practice guidelines may be rectified through straightforward interventions, including improved systems for managing equipment and training. Other deviations from recommended practice need to be understood as the outcomes of complex processes that are likely to defy easy resolution. © RCOG 2006.
Abstract:
Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful, when analysing data in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood methods are used: although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic datasets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake dataset.
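The flavour of these likelihood approximations can be conveyed with a crude stand-in: split the observations into blocks, evaluate an exact Gaussian log-likelihood per block on separate cores, and sum the results as if the blocks were independent. This is only a caricature of the Vecchia [1988] and Tresp [2000] style methods the paper extends, with a synthetic dataset and an assumed exponential covariance.

```python
# Crude parallel approximation to a Gaussian-process log-likelihood:
# treat blocks of observations as independent and sum their exact
# block log-likelihoods, each evaluated on a separate core. Synthetic
# data; a stand-in for Vecchia/Tresp-style approximations.

import numpy as np
from multiprocessing import Pool

def exp_cov(coords, variance=1.0, length=0.3, nugget=1e-6):
    """Exponential covariance matrix for 1-D coordinates."""
    d = np.abs(coords[:, None] - coords[None, :])
    return variance * np.exp(-d / length) + nugget * np.eye(len(coords))

def block_loglik(args):
    coords, values = args
    K = exp_cov(coords)
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, values)
    return -0.5 * (values @ alpha + logdet + len(values) * np.log(2 * np.pi))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coords = np.sort(rng.uniform(0.0, 10.0, size=2000))
    values = np.sin(coords) + 0.1 * rng.standard_normal(coords.size)

    # Contiguous blocks keep strongly correlated neighbours together.
    blocks = list(zip(np.array_split(coords, 8), np.array_split(values, 8)))
    with Pool(4) as pool:
        approx = sum(pool.map(block_loglik, blocks))
    print(f"approximate log-likelihood: {approx:.1f}")
```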
Abstract:
Two in-fiber Bragg grating (FBG) temperature sensor systems for medical applications are demonstrated: (1) an FBG flow-directed thermodilution catheter based on interferometric detection of wavelength shift that is used for cardiac monitoring; and (2) an FBG sensor system with a tunable Fabry-Perot filter for in vivo temperature profiling in nuclear magnetic resonance (NMR) machines. Preliminary results show that the FBG sensor is in good agreement with electrical sensors that are widely used in practice. A field test shows that the FBG sensor system is suitable for in situ temperature profiling in NMR machines for medical applications.
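The read-out behind such temperature probes reduces to a linear conversion from Bragg wavelength shift to temperature. The sketch below assumes a typical bare-grating sensitivity of about 10 pm/°C near 1550 nm; real medical probes are calibrated individually, and no coefficient is quoted in the abstract.

```python
# Sketch: Bragg wavelength shift to temperature, assuming a linear
# response with a typical ~10 pm/degC sensitivity near 1550 nm.
# Both the sensitivity and the reference point are assumptions.

SENSITIVITY_PM_PER_C = 10.0        # assumed typical FBG sensitivity
REFERENCE_WAVELENGTH_NM = 1550.000
REFERENCE_TEMP_C = 25.0

def temperature_from_wavelength(measured_nm: float) -> float:
    """Convert a measured Bragg wavelength (nm) to temperature (degC)."""
    shift_pm = (measured_nm - REFERENCE_WAVELENGTH_NM) * 1000.0
    return REFERENCE_TEMP_C + shift_pm / SENSITIVITY_PM_PER_C

print(temperature_from_wavelength(1550.120))  # +120 pm -> 37.0 degC
```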