6 results for OPTICAL PERFORMANCE MONITORING
Abstract:
Approximately half of the houses in Northern Ireland were built before any form of minimum thermal specification or energy efficiency standard was enforced. Furthermore, 44% of households are categorised as being in fuel poverty, i.e. spending more than 10% of household income to heat the house to an acceptable level of thermal comfort. To bring the existing housing stock up to an acceptable standard, retrofitting to improve energy efficiency is essential, and it is also necessary to study the effectiveness of such improvements under future climate scenarios. This paper presents the results from a year-long performance monitoring of two houses that have undergone retrofits to improve energy efficiency. Using wireless sensor technology, internal temperature, humidity, external weather, and household gas and electricity usage were monitored for a year. Simulations using IES-VE dynamic building modelling software were calibrated against the monitoring data to ASHRAE Guideline 14 standards. The energy performance and the internal environment of the houses were then assessed for current and future climate scenarios, and the results show that there is a need for a holistic, balanced retrofitting strategy.
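The abstract does not reproduce the calibration criteria, but ASHRAE Guideline 14 calibration is conventionally judged by the NMBE and CV(RMSE) statistics between measured and simulated energy use. A minimal sketch of those two statistics (the thresholds quoted in the comment are the commonly cited monthly/hourly acceptance limits, not figures from this paper):

```python
import numpy as np

def calibration_metrics(measured, simulated):
    """Return (NMBE, CV(RMSE)) in percent for a calibrated building model."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    n = measured.size
    mean = measured.mean()
    # Normalised mean bias error: signed, so over- and under-prediction cancel.
    nmbe = 100.0 * (measured - simulated).sum() / ((n - 1) * mean)
    # Coefficient of variation of the root-mean-square error: always positive.
    cvrmse = 100.0 * np.sqrt(((measured - simulated) ** 2).sum() / (n - 1)) / mean
    return nmbe, cvrmse

# Commonly cited Guideline 14 acceptance limits: |NMBE| <= 5% and
# CV(RMSE) <= 15% for monthly data (10% and 30% for hourly data).
```

A model whose monthly gas and electricity predictions fall within these limits against the year of monitored data would be considered calibrated.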
Abstract:
Motivated by environmental protection concerns, monitoring the flue gas of thermal power plants is now often mandatory to ensure that emission levels stay within safe limits. Optical gas sensing systems are increasingly employed for this purpose, with regression techniques used to relate gas optical absorption spectra to the concentrations of specific gas components of interest (NOx, SO2, etc.). Accurately predicting gas concentrations from absorption spectra remains a challenging problem due to the presence of nonlinearities in the relationships and the high-dimensional and correlated nature of the spectral data. This article proposes a generalized fuzzy linguistic model (GFLM) to address this challenge. The GFLM is made up of a series of “If-Then” fuzzy rules. The absorption spectra are the input variables in the rule antecedents, and each rule consequent is a general nonlinear polynomial function of the absorption spectra. Model parameters are estimated using least squares and gradient descent optimization algorithms. The performance of the GFLM is compared with traditional prediction models, such as partial least squares, support vector machines, multilayer perceptron neural networks and radial basis function networks, on two real flue gas spectral datasets: one from a coal-fired power plant and one from a gas-fired power plant. The experimental results show that the GFLM has good predictive ability and is competitive with alternative approaches, while having the added advantage of providing an interpretable model.
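The abstract does not give the exact rule structure or training procedure, but the description (fuzzy antecedents over the spectra, polynomial consequents, weighted combination) matches Takagi-Sugeno-style rule evaluation. A minimal sketch under those assumptions, with Gaussian antecedent memberships and a quadratic consequent standing in for the general nonlinear polynomial (all names and the membership choice are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_membership(x, centre, width):
    """Degree to which spectrum x fires a rule's antecedent fuzzy set."""
    return np.exp(-np.sum((x - centre) ** 2) / (2.0 * width ** 2))

def predict(x, rules):
    """Takagi-Sugeno-style prediction: weighted average of rule consequents.

    Each rule is (centre, width, (a0, a1, a2)), with consequent
    y = a0 + a1.x + a2.x^2 as a quadratic stand-in for the paper's
    general nonlinear polynomial of the absorption spectra.
    """
    weights, outputs = [], []
    for centre, width, (a0, a1, a2) in rules:
        weights.append(gaussian_membership(x, centre, width))
        outputs.append(a0 + a1 @ x + a2 @ (x ** 2))
    weights = np.array(weights)
    return float(np.dot(weights, outputs) / weights.sum())
```

In the paper the consequent coefficients would be fitted by least squares and the antecedent parameters tuned by gradient descent; here they are simply supplied.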
Abstract:
Dissolved CO2 measurements are usually made using a Severinghaus electrode, which is bulky and can suffer from electrical interference. In contrast, optical sensors for gaseous CO2, whilst not suffering these problems, are mainly used for making gaseous (not dissolved) CO2 measurements, due to dye leaching and protonation, especially at high ionic strengths (>0.01 M) and acidity (<pH 4). This is usually prevented by coating the sensor with a gas-permeable, but ion-impermeable, membrane (GPM). Herein, we introduce a highly sensitive, colourimetric, plastic film sensor for the measurement of both gaseous and dissolved CO2, in which a pH-sensitive dye, thymol blue (TB), is coated onto particles of hydrophilic silica to create a CO2-sensitive, TB-based pigment, which is then extruded into low density polyethylene (LDPE) to create a GPM-free, i.e. naked, TB plastic sensor film for gaseous and dissolved CO2 measurements. When used for making dissolved CO2 measurements, the hydrophobic nature of the LDPE renders the film: (i) indifferent to ionic strength, (ii) highly resistant to acid attack and (iii) stable when stored under ambient (dark) conditions for >8 months, with no loss of colour or function. Here, the performance of the TB plastic film is primarily assessed as a dissolved CO2 sensor in highly saline (3.5 wt%) water. The TB film is blue in the absence of CO2 and yellow in its presence, exhibiting a 50% colour transition at ca. 0.18% CO2. This new type of CO2 sensor has great potential in the monitoring of CO2 levels in the hydrosphere, as well as elsewhere, e.g. food packaging and possibly patient monitoring.
Abstract:
The thermoforming industry has been relatively slow to embrace modern measurement technologies. As a result, researchers have struggled to develop accurate thermoforming simulations, as some of the key aspects of the process remain poorly understood. For the first time, this work reports the development of a prototype multivariable instrumentation system for use in thermoforming. The system contains sensors for plug force, plug displacement, air pressure and temperature, plug temperature, and sheet temperature. Initially, it was developed to fit the tooling on a laboratory thermoforming machine, but its performance was later validated by installing it on a similar industrial tool. Throughout its development, providing access for the various sensors and their cabling was the most challenging task. In testing, all of the sensors performed well and the data collected have given a powerful insight into the operation of the process. In particular, it has shown that both the air and plug temperatures stabilize at more than 80 °C during the continuous thermoforming of amorphous polyethylene terephthalate (aPET) sheet at 110 °C. The work also highlighted significant differences in the timing and magnitude of the cavity pressures reached in the two thermoforming machines. The prototype system has considerable potential for further development.
Abstract:
To maintain the pace of development set by Moore's law, production processes in semiconductor manufacturing are becoming more and more complex. The development of efficient and interpretable anomaly detection systems is fundamental to keeping production costs low. As the dimension of process monitoring data can become extremely high, anomaly detection systems are impacted by the curse of dimensionality; hence, dimensionality reduction plays an important role. Classical dimensionality reduction approaches, such as Principal Component Analysis, generally involve transformations that seek to maximize the explained variance. In datasets with several clusters of correlated variables, the contributions of isolated variables to the explained variance may be insignificant, with the result that they may not be included in the reduced data representation. It is then not possible to detect an anomaly if it is only reflected in such isolated variables. In this paper we present a new dimensionality reduction technique that takes account of such isolated variables and demonstrate how it can be used to build an interpretable and robust anomaly detection system for Optical Emission Spectroscopy data.
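The paper's actual technique is not detailed in the abstract, but the problem it describes can be sketched: keep variables that are weakly correlated with everything else (the "isolated" variables) in the reduced representation unchanged, and apply PCA only to the correlated block, so that anomalies reflected solely in isolated variables remain detectable. A minimal illustration of that idea (the threshold, component count, and function name are illustrative assumptions, not the paper's method):

```python
import numpy as np

def reduce_with_isolated(X, corr_threshold=0.7, n_components=2):
    """Dimensionality reduction that preserves isolated variables.

    A variable is "isolated" if its maximum absolute correlation with
    every other variable is below corr_threshold; such columns are
    passed through untouched, while PCA (via SVD) compresses the
    remaining correlated block.
    """
    Xc = X - X.mean(axis=0)
    corr = np.corrcoef(Xc, rowvar=False)
    np.fill_diagonal(corr, 0.0)               # ignore self-correlation
    isolated = np.abs(corr).max(axis=0) < corr_threshold
    parts = [Xc[:, isolated]]                 # pass-through columns
    block = Xc[:, ~isolated]
    if block.shape[1] > 0:
        # PCA on the correlated block only
        _, _, Vt = np.linalg.svd(block, full_matrices=False)
        k = min(n_components, block.shape[1])
        parts.append(block @ Vt[:k].T)
    return np.hstack(parts), isolated
```

An anomaly detector (e.g. a control chart or reconstruction-error score) built on this representation still sees excursions in the isolated variables, which a variance-maximizing projection of the full data might have discarded.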