12 results for NSM FRP CFRP continuous quality control monitoring bond EMM-ARM
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this paper, various types of fault detection methods for fuel cells are compared, for example those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. The application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT) and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is also part of the classification problem on magnetic field measurements. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior studied. The numerical results demonstrate this ill-posedness through the exponential decay of the singular values for three examples of fault classes.
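The classification idea described above can be illustrated with a minimal, self-contained sketch. The snippet below generates two synthetic classes of measurement vectors (a stand-in for the FIT-simulated magnetic field data, which are not reproduced here), computes Fisher's linear discriminant with a small regularization term, and inspects the decay of the singular values of the data matrix as a rough analogue of the ill-posedness discussed in the abstract. All dimensions, covariances and the fault "signature" are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for vectors of magnetic field measurements:
# two fault classes differing in mean, sharing a common covariance.
n, d = 200, 50
mean_healthy = np.zeros(d)
mean_faulty = 0.5 * np.sin(np.linspace(0, np.pi, d))   # hypothetical fault signature
cov = np.diag(np.exp(-0.2 * np.arange(d)))             # rapidly decaying spectrum

X0 = rng.multivariate_normal(mean_healthy, cov, n)
X1 = rng.multivariate_normal(mean_faulty, cov, n)

# Fisher's linear discriminant: w proportional to Sw^-1 (m1 - m0), with a small
# ridge term added because the within-class scatter is ill-conditioned.
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw + 1e-3 * np.eye(d), m1 - m0)

# Project onto w and classify against the midpoint threshold.
threshold = 0.5 * (X0 @ w).mean() + 0.5 * (X1 @ w).mean()
pred0, pred1 = X0 @ w > threshold, X1 @ w > threshold
accuracy = (np.sum(~pred0) + np.sum(pred1)) / (2 * n)
print(f"classification accuracy: {accuracy:.2f}")

# Exponential decay of the singular values, the abstract's indicator of ill-posedness.
singular_values = np.linalg.svd(np.vstack([X0, X1]), compute_uv=False)
print("leading singular values:", np.round(singular_values[:5], 2))
```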
Abstract:
This article models the interactions between safety and quality control and the stage of distribution in the food marketing complex.
Abstract:
The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early warning system at the pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average, 20–30 alerts per year are sent out to the EFAS partner network, which consists of 24 national hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of hits, misses and false alarms, showing that EFAS achieves hits more than 50% of the time. Furthermore, the skill of both the meteorological and the hydrological forecasts is evaluated and reported here for a 10-year period. Next, end-user needs and feedback are systematically analysed. Suggested improvements, such as real-time river discharge updating, are currently being implemented.
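As a hedged illustration of the verification vocabulary used above, the following sketch computes hits, misses and false alarms, together with the derived hit rate and false alarm ratio, for a generic binary warning system. It is not the EFAS evaluation code, and the flags in the example are made up.

```python
import numpy as np

def contingency_scores(alerts: np.ndarray, observed_floods: np.ndarray) -> dict:
    """Hit/miss/false-alarm verification for a binary warning system.

    alerts, observed_floods: boolean arrays over the same set of events
    (illustrative stand-in for alert verification, not operational code).
    """
    hits = np.sum(alerts & observed_floods)
    misses = np.sum(~alerts & observed_floods)
    false_alarms = np.sum(alerts & ~observed_floods)
    return {
        "hits": int(hits),
        "misses": int(misses),
        "false_alarms": int(false_alarms),
        # fraction of observed floods that were warned for
        "hit_rate": hits / max(hits + misses, 1),
        # fraction of issued alerts that did not verify
        "false_alarm_ratio": false_alarms / max(hits + false_alarms, 1),
    }

# Example with made-up flags: 20 alerts, 12 of which verify.
alerts = np.array([True] * 20 + [False] * 10)
observed = np.array([True] * 12 + [False] * 8 + [True] * 4 + [False] * 6)
print(contingency_scores(alerts, observed))
```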
Abstract:
The real-time quality control (RTQC) methods applied to Argo profiling float data by the United Kingdom (UK) Met Office, the United States (US) Fleet Numerical Meteorology and Oceanography Center, the Australian Bureau of Meteorology and the Coriolis Centre are compared and contrasted. Data are taken from the period 2007 to 2011 inclusive and RTQC performance is assessed with respect to Argo delayed-mode quality control (DMQC). An intercomparison of RTQC techniques is performed using a common data set of profiles from 2010 and 2011. The RTQC systems are found to have similar power in identifying faulty Argo profiles but to vary widely in the number of good profiles incorrectly rejected. The efficacy of individual QC tests is inferred from the results of the intercomparison. Techniques to increase QC performance are discussed.
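For readers unfamiliar with what an individual real-time QC test looks like, the sketch below implements two checks loosely modelled on published Argo real-time tests (a global range test and a spike test) and applies them to a synthetic temperature profile. The thresholds and the profile are illustrative; none of the compared centres' actual implementations are reproduced here.

```python
import numpy as np

def global_range_test(temp_c: np.ndarray, t_min=-2.5, t_max=40.0) -> np.ndarray:
    """Flag levels whose temperature falls outside a plausible ocean range.

    The limits echo commonly cited Argo global range values, but they are
    used here purely for illustration.
    """
    return (temp_c < t_min) | (temp_c > t_max)

def spike_test(temp_c: np.ndarray, threshold=6.0) -> np.ndarray:
    """Flag a level that departs sharply from its neighbours (simplified spike check)."""
    flags = np.zeros_like(temp_c, dtype=bool)
    v1, v2, v3 = temp_c[:-2], temp_c[1:-1], temp_c[2:]
    spike = np.abs(v2 - 0.5 * (v1 + v3)) - 0.5 * np.abs(v3 - v1)
    flags[1:-1] = spike > threshold
    return flags

# A synthetic profile with one obvious spike near the surface.
profile = np.array([18.0, 17.5, 35.0, 16.8, 14.2, 10.1, 6.3, 4.0, 3.1, 2.8])
reject = global_range_test(profile) | spike_test(profile)
print("rejected levels:", np.where(reject)[0])
```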
Abstract:
The objective of this study was to investigate a novel light backscatter sensor, with a large field of view relative to curd size, for continuous on-line monitoring of coagulation and syneresis to improve control of curd moisture content. A three-level central composite design was employed to study the effects of temperature, cutting time, and CaCl2 addition on cheese-making parameters. The sensor signal was recorded and analyzed. The light backscatter ratio followed a sigmoid increase during coagulation and decreased asymptotically after gel cutting. Curd yield and curd moisture content were predicted from the time to the maximum slope of the first derivative of the light backscatter ratio during coagulation and from the decrease in the sensor response during syneresis. Whey fat was affected by coagulation kinetics and cutting time, suggesting that curd rheological properties at cutting are dominant factors determining fat losses. The proposed technology shows potential for on-line monitoring of coagulation and syneresis.
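The profile parameter mentioned above, the time at which the backscatter signal rises most steeply, can be extracted numerically as sketched below. The sigmoid curve and its parameters are synthetic stand-ins for the sensor response, not data from the study.

```python
import numpy as np

# Synthetic light backscatter ratio during coagulation: a sigmoid rise is a
# reasonable stand-in for the signal shape described in the abstract; the
# parameters below are illustrative, not fitted to any real curd data.
t = np.linspace(0, 30, 601)                      # time, min
backscatter_ratio = 1.0 + 0.4 / (1.0 + np.exp(-(t - 12.0) / 2.0))

# Numerical first derivative and the time of its maximum (the point of
# steepest increase), the kind of profile parameter used as a predictor.
slope = np.gradient(backscatter_ratio, t)
t_max = t[np.argmax(slope)]
print(f"time of maximum slope: {t_max:.1f} min")
```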
Abstract:
This paper reviews the current state of development of both near-infrared (NIR) and mid-infrared (MIR) spectroscopic techniques for process monitoring, quality control, and authenticity determination in cheese processing. Infrared spectroscopy has been identified as an ideal process analytical technology tool, and recent publications have demonstrated the potential of both NIR and MIR spectroscopy, coupled with chemometric techniques, for monitoring coagulation, syneresis, and ripening, as well as for determination of authenticity, composition, and sensory and rheological parameters. Recent research is reviewed and compared on the basis of experimental design and the spectroscopic and chemometric methods employed, to assess the potential of infrared spectroscopy as a technology for improving process control and quality in cheese manufacture. Emerging research areas for these technologies, such as cheese authenticity and food chain traceability, are also discussed.
Abstract:
Remote sensing from space-borne platforms is often seen as an appealing method of monitoring components of the hydrological cycle, including river discharge, because of its spatial coverage. However, data from these platforms are often less than ideal because the geophysical properties of interest are rarely measured directly and the measurements that are taken can be subject to significant errors. This study assimilated water levels derived from a TerraSAR-X synthetic aperture radar image and digital aerial photography with simulations from a two-dimensional hydraulic model to estimate discharge, inundation extent, depths and velocities at the confluence of the rivers Severn and Avon, UK. An ensemble Kalman filter (EnKF) was used to assimilate spot water-level heights derived by intersecting shorelines extracted from the imagery with a digital elevation model. Discharge was estimated from the ensemble of simulations using state augmentation and then compared with gauge data. Assimilating the real data reduced the error between analyzed mean water levels and levels from three gauging stations to less than 0.3 m, which is less than the error typically found in post-event water mark data from the field at these scales. Measurement bias was evident, but the method still provided a means of improving estimates of discharge for high flows where gauge data are unavailable or of poor quality. Posterior estimates of discharge had standard deviations between 52.7 m³ s⁻¹ and 63.3 m³ s⁻¹, which were below 15% of the gauged flows along the reach. Therefore, assuming a roughness uncertainty of 0.03-0.05 and no model structural errors, discharge could be estimated by the EnKF with an accuracy similar to that arguably expected from gauging stations during flood events. Quality control prior to assimilation, in which measurements were rejected for being in areas of high topographic slope or close to tall vegetation and trees, was found to be essential. The study demonstrates the potential, but also the significant limitations, of currently available imagery to reduce discharge uncertainty in ungauged or poorly gauged basins when combined with model simulations in a data assimilation framework.
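A minimal sketch of the assimilation idea, an ensemble Kalman filter analysis step with state augmentation so that water-level observations also update discharge, is given below. The linear level-discharge relationship, ensemble size, observation errors and all numerical values are assumptions for illustration; the study's two-dimensional hydraulic model and real imagery-derived levels are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Augmented state: water levels at a few cross-sections plus the inflow
# discharge, so updating levels also updates discharge through their
# ensemble covariance. All numbers are synthetic.
n_ens, n_levels = 50, 5
true_q = 450.0                                        # "true" discharge, m^3/s
prior_q = rng.normal(380.0, 60.0, n_ens)              # uncertain prior discharge
levels = 10.0 + 0.01 * prior_q[:, None] + rng.normal(0.0, 0.1, (n_ens, n_levels))
ensemble = np.hstack([levels, prior_q[:, None]])      # shape (n_ens, n_levels + 1)

# Observations: spot water levels at two of the cross-sections.
H = np.zeros((2, n_levels + 1))
H[0, 1] = H[1, 3] = 1.0
obs_err = 0.25                                        # m, akin to shoreline-derived heights
obs = np.array([10.0 + 0.01 * true_q, 10.0 + 0.01 * true_q]) + rng.normal(0, obs_err, 2)

# Kalman gain from ensemble statistics, with perturbed observations.
X = ensemble - ensemble.mean(axis=0)
P = X.T @ X / (n_ens - 1)
R = obs_err**2 * np.eye(2)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
perturbed_obs = obs + rng.normal(0, obs_err, (n_ens, 2))
analysis = ensemble + (perturbed_obs - ensemble @ H.T) @ K.T

print(f"prior discharge:    {ensemble[:, -1].mean():6.1f} +/- {ensemble[:, -1].std():.1f}")
print(f"analysed discharge: {analysis[:, -1].mean():6.1f} +/- {analysis[:, -1].std():.1f}")
```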
Abstract:
Climate data are used in a number of applications, including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-daily time scale. This involves quality control of the rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as or even better than other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms. There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
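To make the merging step concrete, the sketch below adjusts a satellite rainfall grid towards gauge observations using a simple inverse-distance-weighted correction of the gauge-minus-satellite differences. The grid, gauge values and weighting scheme are illustrative assumptions; the TAMSAT calibration and the operational gauge-satellite combination are considerably more involved.

```python
import numpy as np

def merge_gauge_satellite(sat_grid, grid_xy, gauge_xy, gauge_rain, power=2.0):
    """Adjust a satellite rainfall grid towards rain gauge observations.

    A deliberately simple scheme: compute gauge-minus-satellite differences
    at the gauge locations, spread them over the grid with inverse-distance
    weighting, and add the interpolated correction to the satellite field.
    This only illustrates the idea of combining the two sources.
    """
    # Satellite value at each gauge: nearest grid point.
    d_gauge_grid = np.linalg.norm(gauge_xy[:, None, :] - grid_xy[None, :, :], axis=2)
    nearest = d_gauge_grid.argmin(axis=1)
    diff = gauge_rain - sat_grid[nearest]             # gauge-minus-satellite at gauges

    # Inverse-distance weights from every grid point to every gauge.
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum(axis=1, keepdims=True)
    correction = w @ diff
    return np.clip(sat_grid + correction, 0.0, None)  # rainfall cannot be negative

# Tiny illustrative grid (units: mm per dekad) and three gauges.
grid_xy = np.array([[x, y] for x in range(5) for y in range(5)], dtype=float)
sat_grid = np.full(len(grid_xy), 30.0)
gauge_xy = np.array([[1.0, 1.0], [3.0, 2.0], [4.0, 4.0]])
gauge_rain = np.array([42.0, 25.0, 31.0])
merged = merge_gauge_satellite(sat_grid, grid_xy, gauge_xy, gauge_rain)
print(f"merged field range: {merged.min():.1f} - {merged.max():.1f} mm")
```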
Abstract:
Existing urban meteorological networks have an important role to play as test beds for inexpensive and more sustainable measurement techniques that are now becoming possible in our increasingly smart cities. The Birmingham Urban Climate Laboratory (BUCL) is a near-real-time, high-resolution urban meteorological network (UMN) of automatic weather stations and inexpensive, nonstandard air temperature sensors. The network has recently been implemented with an initial focus on monitoring urban heat, infrastructure, and health applications. A number of UMNs exist worldwide; however, BUCL is novel in its density, the low-cost nature of its sensors, and its use of proprietary Wi-Fi networks. This paper provides an overview of the logistical aspects of implementing a UMN test bed at such a density, including selecting appropriate urban sites; testing and calibrating low-cost, nonstandard equipment; implementing strict quality-assurance/quality-control mechanisms (including metadata); and utilizing preexisting Wi-Fi networks to transmit data. Also included are visualizations of data collected by the network, including data from the July 2013 U.K. heatwave, and potential applications are highlighted. The paper is an open invitation to use the facility as a test bed for evaluating models and/or other nonstandard observation techniques, such as those based on crowdsourcing.
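As an example of the kind of automated quality-control mechanism such a network might apply to low-cost temperature sensors, the sketch below flags out-of-range values, unrealistic jumps and flat-lined readings. The checks and thresholds are generic placeholders, not BUCL's actual QA/QC configuration.

```python
import numpy as np

def qc_air_temperature(temp_c: np.ndarray,
                       valid_range=(-20.0, 45.0),
                       max_step=5.0,
                       max_flatline=6) -> np.ndarray:
    """Basic automated QC flags for a low-cost air temperature sensor series.

    Three illustrative checks of the kind a UMN might apply (range, step and
    persistence); thresholds are placeholders. Returns a boolean array
    marking suspect samples.
    """
    suspect = (temp_c < valid_range[0]) | (temp_c > valid_range[1])   # range check

    step = np.abs(np.diff(temp_c, prepend=temp_c[0]))
    suspect |= step > max_step                                        # unrealistic jump

    # Persistence check: flag runs of identical readings longer than max_flatline.
    run = 1
    for i in range(1, len(temp_c)):
        run = run + 1 if temp_c[i] == temp_c[i - 1] else 1
        if run > max_flatline:
            suspect[i] = True
    return suspect

series = np.array([21.0, 21.2, 35.5, 21.4, 21.4, 21.4, 21.4, 21.4, 21.4, 21.4, 21.5])
print("suspect samples:", np.where(qc_air_temperature(series))[0])
```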