954 results for Continuous monitoring
Abstract:
Condition monitoring of rails and train wheels is vitally important to railway asset management, and rail-wheel interactions provide crucial information about the health state of both rails and wheels. Continuous and remote monitoring is always preferred by operators. With a new generation of strain-sensing devices based on Fibre Bragg Grating (FBG) sensors, this study explores the possibility of continuously monitoring the health state of rails, and investigates the required signal processing techniques and their limitations.
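As a concrete illustration of the strain-sensing principle, the standard FBG relation Δλ/λ₀ ≈ (1 − p_e)·ε (temperature effects ignored) can be coded directly. The photo-elastic coefficient below is a typical value for silica fibre, an assumption rather than a figure from the study:

```python
# Sketch: converting an FBG wavelength shift to strain. Temperature effects
# are ignored; P_E is a typical silica-fibre value (assumption).
P_E = 0.22  # photo-elastic coefficient of silica fibre (typical value)

def fbg_strain(lambda_0_nm, lambda_shift_pm):
    """Return strain in microstrain from a Bragg wavelength shift (pm)."""
    d_lambda_nm = lambda_shift_pm * 1e-3          # pm -> nm
    strain = d_lambda_nm / (lambda_0_nm * (1.0 - P_E))
    return strain * 1e6                            # -> microstrain

# A ~1.2 pm shift on a 1550 nm grating corresponds to roughly 1 microstrain.
eps = fbg_strain(1550.0, 1.2)
```

This illustrates why pm-level wavelength resolution is needed for the microstrain-level signals typical of rail monitoring.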
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices and the risk of stolen bio-templates. Moreover, in existing approaches the authentication process is usually performed only once, at the start of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process: iii-2) how someone behaves. In recent years, various keystroke-dynamics behaviour-based approaches have been published that are able to authenticate humans based on their typing behaviour. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches, which allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions; unsolved problems include scalability, high response times and high error rates. The aim of this work is the development of behaviour-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
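The free-text idea can be sketched as a digraph-timing profile comparison. This is a toy illustration; the actual features and classifiers used in the work above are not specified here:

```python
# Toy free-text keystroke sketch (illustrative, not the paper's method):
# profile = mean inter-key latency per digraph; verification = mean relative
# deviation between a reference profile and a fresh sample.
from collections import defaultdict

def digraph_profile(keys, times_ms):
    """Mean latency (ms) between each consecutive key pair."""
    sums, counts = defaultdict(float), defaultdict(int)
    for i in range(len(keys) - 1):
        dg = keys[i] + keys[i + 1]
        sums[dg] += times_ms[i + 1] - times_ms[i]
        counts[dg] += 1
    return {dg: sums[dg] / counts[dg] for dg in sums}

def profile_distance(ref, sample):
    """Mean relative deviation over shared digraphs (lower = more similar)."""
    shared = set(ref) & set(sample)
    if not shared:
        return float("inf")
    return sum(abs(ref[d] - sample[d]) / ref[d] for d in shared) / len(shared)

ref = digraph_profile(list("thethe"), [0, 90, 180, 300, 390, 480])
same = digraph_profile(list("the"), [0, 92, 181])
dist = profile_distance(ref, same)
```

A small distance suggests the same typist; in practice a threshold would be tuned against the error-rate and response-time requirements the abstract mentions.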
Abstract:
Continuous monitoring of diesel engine performance is critical for early detection of fault developments in an engine before they materialize into a functional failure. Instantaneous crank angular speed (IAS) analysis is one of the few non-intrusive condition monitoring techniques that can be utilized for such a task. Furthermore, the technique is more suitable for mass industry deployment than other non-intrusive methods such as vibration and acoustic emission techniques, due to its low instrumentation cost, smaller data size and robust signal clarity, since IAS is not affected by engine operating noise or noise from the surrounding environment. A combination of IAS and order analysis was employed in this experimental study, and the major order component of the IAS spectrum was used for engine loading estimation and fault diagnosis of a four-stroke four-cylinder diesel engine. It was shown that IAS analysis can provide useful information about engine speed variation caused by changing piston momentum and crankshaft acceleration during the engine combustion process. It was also found that the major order component of the IAS spectrum, directly associated with the engine firing frequency (at twice the mean shaft rotating speed), can be utilized to estimate the engine loading condition regardless of whether the engine is operating in a healthy condition or with faults. The amplitude of this order component follows a distinctive exponential curve as the loading condition changes. A mathematical relationship was then established in the paper to estimate the engine power output based on the amplitude of this order component of the IAS spectrum. It was further illustrated that the IAS technique can be employed for the detection of a simulated exhaust valve fault.
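The order-analysis step can be sketched as a single-bin DFT of an angle-sampled IAS signal. The synthetic signal and amplitudes below are assumptions for illustration, not the paper's measurements:

```python
# Sketch of IAS order analysis with a synthetic signal (values are
# illustrative). For a four-stroke four-cylinder engine the firing frequency
# is 2x shaft speed, so the 2nd-order component tracks combustion/loading.
import math

N_REV = 32            # revolutions of angle-domain data
SAMPLES_PER_REV = 64  # samples per shaft revolution

def ias_order_amplitude(signal, order, n_rev):
    """Single-bin DFT amplitude at the given shaft order of an
    angle-sampled (one sample per fixed crank increment) signal."""
    n = len(signal)
    k = order * n_rev  # DFT bin index corresponding to this order
    re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
    im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
    return 2.0 * math.hypot(re, im) / n

# Synthetic IAS: mean speed 1500 rpm plus a 2nd-order fluctuation of 5 rpm.
n = N_REV * SAMPLES_PER_REV
ias = [1500.0 + 5.0 * math.cos(2 * math.pi * 2 * i / SAMPLES_PER_REV)
       for i in range(n)]
amp2 = ias_order_amplitude(ias, 2, N_REV)
```

Sampling in the angle domain (one sample per crank increment) rather than the time domain is what makes the order bins line up exactly, independent of mean shaft speed.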
Abstract:
This is the Stillwaters Monitoring Programme summary of results for 2000 from the Environment Agency. In May 1997, a Stillwaters meeting was held to discuss the way forward in stillwaters monitoring. It decided upon the establishment of a three-year rolling programme, in which three stillwaters would be monitored three times a year, every third year. During 2000, the stillwaters monitored for the fourth year of the Stillwaters Monitoring Programme were Hatch Mere, Marbury Big Mere, Comber Mere, Tabley Mere, Tatton Mere and Melchett Mere. Algal, zooplankton and water chemistry samples were taken on all meres. Surveys of Tabley Mere and Comber Mere continued from last year, when water quality concerns were highlighted. Continuous monitoring in Oak Mere, including water level data, continued in 2000. Fish surveys were carried out in Tatton Mere and Comber Mere. The Tabley Mere survey was abandoned due to the awkward bathymetry of the mere. No invertebrate samples were taken in 2000 due to lack of resources.
Abstract:
Following the commencement of construction works of a 250 MW hydropower plant at Dumbbell Island in the Upper Victoria Nile in September 2007, BEL requested NaFIRRI to conduct continuous monitoring of fish catches at two transects, i.e. the immediate upstream transect of the project site (Kalange-Makwanzi) and the immediate downstream transect (Buyala-Kikubamutwe). The routine monitoring surveys were designed to be conducted twice a week at each of the two transects. It was anticipated that major immediate impacts would occur during construction, and these needed to be known by BEL as part of a mitigation strategy. For example, the construction of a cofferdam could be accompanied by rapid changes in water quality and quantity downstream of the construction. These changes in turn could affect the fish catch and would probably be missed by the quarterly monitoring already in place. Therefore, a major objective of the more regular and rapid monitoring was to discern immediate impacts of construction activities by focusing on selected water quality parameters (total suspended solids, water conductivity, temperature, dissolved oxygen and pH) and fish catch characteristics (total catch, catch rates and value of the catch).
Abstract:
The validity of load estimates from intermittent, instantaneous grab sampling is dependent on adequate spatial coverage by monitoring networks and a sampling frequency that reflects the variability in the system under study. Catchments with a flashy hydrology due to surface runoff pose a particular challenge, as intense short-duration rainfall events may account for a significant portion of the total diffuse transfer of pollution from soil to water in any hydrological year. This can also be exacerbated by the presence of strong background pollution signals from point sources during low flows. In this paper, a range of sampling methodologies and load estimation techniques are applied to phosphorus data from such a surface-water-dominated river system, instrumented at three sub-catchments (ranging from 3 to 5 km2 in area) with near-continuous monitoring stations. Systematic and Monte Carlo approaches were applied to simulate grab sampling using multiple strategies and to calculate an estimated load, Le, based on established load estimation methods. Comparison with the actual load, Lt, revealed significant average underestimation, of up to 60%, and high variability for all feasible sampling approaches. Further analysis of the time series provides an insight into these observations, revealing peak frequencies and power-law scaling in the distributions of P concentration, discharge and load associated with surface runoff and background transfers. Results indicate that only near-continuous monitoring that reflects the rapid temporal changes in these river systems is adequate for comparative monitoring and evaluation purposes. While the implications of this analysis may be more tenable for small-scale flashy systems, this represents an appropriate scale in terms of evaluating catchment mitigation strategies such as agri-environmental policies for managing diffuse P transfers in complex landscapes.
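The Monte Carlo grab-sampling comparison can be sketched as follows. The flashy series is synthetic and the load estimator is simplified to mean sampled concentration times total flow volume; the established estimation methods used in the paper are not reproduced here:

```python
# Sketch: compare the "true" load from a near-continuous series with loads
# estimated from sparse random grab samples (synthetic data; the estimator
# is a simplification, an assumption rather than the paper's methods).
import random

def true_load(conc, flow):
    """Lt: sum of concentration x discharge over every time step."""
    return sum(c * q for c, q in zip(conc, flow))

def grab_sample_load(conc, flow, n_samples, rng):
    """Le: mean of n random instantaneous concentrations x total flow."""
    idx = rng.sample(range(len(conc)), n_samples)
    mean_c = sum(conc[i] for i in idx) / n_samples
    return mean_c * sum(flow)

rng = random.Random(42)
# Synthetic flashy catchment: mostly low flow, with a few storm steps that
# carry high concentration and high discharge simultaneously.
conc = [0.05] * 950 + [1.5] * 50
flow = [1.0] * 950 + [20.0] * 50
lt = true_load(conc, flow)
estimates = [grab_sample_load(conc, flow, 12, rng) for _ in range(500)]
mean_ratio = sum(estimates) / len(estimates) / lt  # Le / Lt on average
```

Because grab samples rarely land on the short storm events that carry most of the load, the average Le/Lt ratio comes out well below 1, mirroring the underestimation reported above.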
Abstract:
Weathering of stone is one of the major causes of damage to stone masonry structures, and it takes place due to interlinked chemical, physical and biological processes in stones. The key parameters involved in the deterioration processes are temperature, moisture and salt. It is now known that sudden variations in temperature and moisture greatly accelerate the weathering process of the building stone fabric. Therefore, in order to monitor these sudden variations, an effective and continuous monitoring system is needed. Furthermore, it must consist of robust sensors which are accurate and can survive in the harsh environments experienced in and around masonry structures. Although salt penetration is important for the rate of deterioration of stone masonry structures, the processes involved are much slower than the damage associated with temperature and moisture variations. Therefore, in this paper a novel fibre optic combined temperature and relative humidity sensor is described and its applicability to monitoring building stones is demonstrated. The performance of the sensor is assessed in an experiment comprising wetting and drying of limestone blocks. The results indicate that the novel fibre optic relative humidity sensor, which is tailor-made for applications in masonry structures, performed well in wetting and drying tests, whilst commercial capacitance-based sensors failed to recover during the drying regime for a long period after a wetting regime. That is, the fibre optic sensor has the capability to measure both sorption and de-sorption characteristics of stone blocks. This sensor was used in a test wall in Oxford, and the data thus obtained strengthened the laboratory observations.
Abstract:
New techniques based on embedded sensors have been developed for monitoring reinforced concrete structures to assess their durability, and these can be used instead of conventional non-destructive test techniques. The continuous monitoring of concrete durability with various types of sensors allows not only early assessment of the potential durability of structures, but also a prediction of their service life. Effrosyni Tzoura and Muhammed Basheer of the University of Leeds, Sreejith Nanukuttan and Danny McPolin of Queen's University Belfast, John McCarter of Heriot-Watt University, Ken Grattan and Tong Sun of City University London and Sudarshan Srinivasan of Mott MacDonald report.
Abstract:
During the last decade, Mongolia's region was characterized by a rapid increase in both the severity and frequency of drought events, leading to pasture reduction. Drought monitoring and assessment play an important role in the region's early warning systems as a way to mitigate negative impacts in social, economic and environmental sectors. Nowadays it is possible to access information related to the hydrologic cycle through remote sensing, which provides continuous monitoring of variables over very large areas where weather stations are sparse. The present thesis aimed to explore the possibility of using NDVI as a potential drought indicator by studying anomaly patterns and correlations with two other climate variables, LST and precipitation. The study covered the growing season (March to September) of a fifteen-year period, between 2000 and 2014, for Bayankhongor province in southwest Mongolia. The datasets used were MODIS NDVI, MODIS LST and TRMM precipitation, whose processing and analysis were supported by the QGIS software and the Python programming language. Monthly anomaly correlations between NDVI and LST and between NDVI and precipitation were generated, as well as temporal correlations over the growing season for known drought years (2001, 2002 and 2009). The results show that the three variables follow the seasonal pattern expected for a northern hemisphere region, with the rainy season occurring in the summer months. The values of both NDVI and precipitation are remarkably low while LST values are high, which is explained by the region's climate and ecosystems. On average, NDVI reached higher values with high precipitation and low LST. 2001 was the driest year of the time series, while 2003 was the wettest year, with healthier vegetation. Monthly correlations were weak and of low significance, with the exception of the NDVI-LST and NDVI-precipitation correlations for June, July and August of 2002.
The temporal correlations for the growing season were also weak. The overall relationships between the variables' anomalies showed weak correlations of low significance, which suggests that an accurate answer for predicting drought from the relation between NDVI, LST and precipitation cannot be given; additional research is needed to achieve more conclusive results. However, the NDVI anomaly images show that NDVI is a suitable drought index for Bayankhongor province.
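The anomaly-correlation workflow can be sketched in plain Python. The values are hypothetical; the thesis worked on MODIS/TRMM rasters processed with QGIS and Python:

```python
# Sketch of the monthly-anomaly approach: standardise one calendar month's
# values against its long-term mean/std, then correlate two anomaly series.
# Sample values below are illustrative, not the thesis data.
import statistics as st

def monthly_anomalies(series):
    """Standardised anomalies of one calendar month's values across years."""
    mu, sd = st.mean(series), st.pstdev(series)
    return [(v - mu) / sd for v in series]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical July NDVI and precipitation (mm) for six years:
ndvi = [0.32, 0.28, 0.35, 0.22, 0.30, 0.26]
precip = [55.0, 40.0, 62.0, 20.0, 50.0, 38.0]
r = pearson(monthly_anomalies(ndvi), monthly_anomalies(precip))
```

Since standardisation is a linear transform, the correlation of anomalies equals that of the raw values; working in anomaly space mainly removes the seasonal cycle when months are pooled.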
Abstract:
Wednesday 23rd April 2014 Speaker(s): Willi Hasselbring Organiser: Leslie Carr Time: 23/04/2014 14:00-15:00 Location: B32/3077 File size: 802Mb Abstract The internal behavior of large-scale software systems cannot be determined on the basis of static (e.g., source code) analysis alone. Kieker provides complementary dynamic analysis capabilities, i.e., monitoring/profiling and analyzing a software system's runtime behavior. Application Performance Monitoring is concerned with continuously observing a software system's performance-specific runtime behavior, including analyses like assessing service level compliance or detecting and diagnosing performance problems. Architecture Discovery is concerned with extracting architectural information from an existing software system, including both structural and behavioral aspects like identifying architectural entities (e.g., components and classes) and their interactions (e.g., local or remote procedure calls). In addition to the Architecture Discovery of Java systems, Kieker supports Architecture Discovery for other platforms, including legacy systems, for instance, implemented in C#, C++, Visual Basic 6, COBOL or Perl. Thanks to Kieker's extensible architecture it is easy to implement and use custom extensions and plugins. Kieker was designed for continuous monitoring in production systems, inducing only a very low overhead, which has been evaluated in extensive benchmark experiments. Please refer to http://kieker-monitoring.net/ for more information.
Abstract:
This paper describes an automatic device for in situ and continuous monitoring of the ageing process occurring in natural and synthetic resins widely used in art and in the conservation and restoration of cultural artefacts. The results of tests carried out under accelerated ageing conditions are also presented. This easy-to-assemble palm-top device essentially consists of oscillators based on quartz crystal resonators coated with films of the organic materials whose response to environmental stress is to be assessed. The device contains a microcontroller which, at pre-defined time intervals, selects the oscillators and records and stores their oscillation frequency. The ageing of the coatings, caused by the environmental stress and resulting in a shift in the oscillation frequency of the modified crystals, can be straightforwardly monitored in this way. The kinetics of this process reflects the level of damage risk associated with a specific microenvironment. In this case, natural and artificial resins broadly employed in art and in the restoration of artistic and archaeological artefacts (dammar and Paraloid B72) were applied onto the crystals. The environmental stress was represented by visible and UV radiation, since the chosen materials are known to be photochemically active, to different extents. In the case of dammar, the results obtained are consistent with previous data obtained using bench-top equipment by impedance analysis through discrete measurements, and confirm that the ageing of this material is reflected in the gravimetric response of the modified quartz crystals. As for Paraloid B72, the outcome of the assays indicates that the resin is resistant to visible light, but is very sensitive to UV irradiation. The use of a continuous monitoring system, apart from being obviously more practical, is essential to identify short-term (i.e. reversible) events, like water vapour adsorption/desorption processes, and to highlight ageing trends or sudden changes in such trends. (C) 2007 Elsevier B.V. All rights reserved.
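The link between adsorbed mass and oscillation-frequency shift that such quartz crystal devices exploit is usually modelled with the Sauerbrey relation; below is a sketch with typical AT-cut quartz constants (assumptions, not the paper's device parameters):

```python
# Sauerbrey relation sketch: delta_f = -2*f0^2*dm / (A*sqrt(rho_q*mu_q)).
# Constants are typical AT-cut quartz values (assumptions).
import math

RHO_Q = 2648.0    # quartz density, kg/m^3
MU_Q = 2.947e10   # quartz shear modulus, kg/(m*s^2)

def sauerbrey_shift_hz(f0_hz, delta_m_kg, area_m2):
    """Frequency shift (Hz) for a thin rigid mass added to a QCM electrode.
    Negative shift = mass gain."""
    return -2.0 * f0_hz ** 2 * delta_m_kg / (area_m2 * math.sqrt(RHO_Q * MU_Q))

# A 10 MHz crystal with a 0.2 cm^2 electrode gaining 10 ng (e.g. adsorbed
# water vapour) shifts by roughly -11 Hz:
df = sauerbrey_shift_hz(10e6, 10e-12, 0.2e-4)  # 10 ng = 10e-12 kg
```

The nanogram-per-hertz sensitivity is why both slow photochemical ageing and short-term, reversible vapour adsorption show up in the frequency record.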
Abstract:
In the last decade the near-surface mounted (NSM) strengthening technique using carbon fibre reinforced polymers (CFRP) has been increasingly used to improve the load carrying capacity of concrete members. Compared to externally bonded reinforcement (EBR), the NSM system presents considerable advantages. The technique consists of inserting carbon fibre reinforced polymer laminate strips into pre-cut slits opened in the concrete cover of the elements to be strengthened. The CFRP reinforcement is bonded to the concrete with an appropriate groove filler, typically epoxy adhesive or cement grout. Up to now, research efforts have been mainly focused on several structural aspects, such as bond behaviour, flexural and/or shear strengthening effectiveness, and the energy dissipation capacity of beam-column joints. In such research works, as well as in field applications, the most widespread adhesives used to bond reinforcement to concrete are epoxy resins. It is largely accepted that the performance of the whole NSM application strongly depends on the mechanical properties of the epoxy resins, for which proper curing conditions must be assured. Therefore, non-destructive methods that allow monitoring of the curing process of epoxy resins in NSM CFRP systems are desirable, in order to obtain continuous information about the effectiveness of curing and the expected bond behaviour of CFRP/adhesive/concrete systems. The experimental research was developed at the Laboratory of the Structural Division of the Civil Engineering Department of the University of Minho in Guimarães, Portugal (LEST). The main objective was to develop and propose a new method for continuous quality control of the curing of epoxy resins applied in NSM CFRP strengthening systems.
This objective is pursued through the adaptation of an existing technique, termed EMM-ARM (Elasticity Modulus Monitoring through Ambient Response Method), which was developed for monitoring the early stiffness evolution of cement-based materials. The experimental program was composed of two parts: (i) direct pull-out tests on concrete specimens strengthened with NSM CFRP laminate strips, conducted to assess the evolution of bond behaviour between CFRP and concrete from early ages; and (ii) EMM-ARM tests, carried out to monitor the progressive stiffness development of the structural adhesive used in CFRP applications. In order to verify the capability of the proposed method for evaluating the elastic modulus of the epoxy, the static E-modulus was determined through tension tests. The results of the two series of tests were then combined and compared to evaluate the possibility of implementing a new method for the continuous monitoring and quality control of NSM CFRP applications.
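The EMM-ARM idea rests on inverting a beam's resonant-frequency formula for stiffness: as the adhesive cures inside a mould-beam, its first natural frequency rises, and E can be back-calculated. The sketch below uses a clamped-free (cantilever) idealisation with illustrative geometry and mass values, which are assumptions rather than the actual LEST test setup:

```python
# Sketch: back-calculating E from a beam's first resonant frequency,
# f1 = (lambda1^2 / (2*pi*L^2)) * sqrt(E*I/m). Geometry/mass illustrative.
import math

LAMBDA1 = 1.8751  # first-mode eigenvalue of a clamped-free (cantilever) beam

def e_modulus_from_freq(f1_hz, length_m, inertia_m4, mass_per_len):
    """Invert the cantilever frequency formula for the elastic modulus E."""
    ei = (2.0 * math.pi * f1_hz * length_m ** 2 / LAMBDA1 ** 2) ** 2 \
         * mass_per_len
    return ei / inertia_m4  # Pa

# Hypothetical composite beam: 0.45 m long, I = 2e-9 m^4, 0.35 kg/m.
e_early = e_modulus_from_freq(18.0, 0.45, 2e-9, 0.35)  # soft, early age
e_late = e_modulus_from_freq(40.0, 0.45, 2e-9, 0.35)   # stiffer, cured
```

Because E scales with the square of the measured frequency, even modest frequency growth during curing maps to a large, continuously trackable stiffness gain.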
Abstract:
The growing share of electricity from renewable sources requires a dynamic concept to balance peak-load periods and supply gaps in wind and solar power. Owing to their high energetic availability and the storability of biogas, biogas plants can provide a flexible energy supply and, via a power-to-gas process, prevent overloading of the electricity grid during short-term power surpluses. Demand-driven operation of biogas plants, however, places high demands on the microbiology in the reactor, which must adapt to frequently changing process conditions such as the organic loading rate. Real-time monitoring of the fermentation process is therefore indispensable for detecting disturbances in the microbial fermentation pathways early and counteracting them adequately. Previous microbial population analyses have been limited to laborious molecular-biological investigations of the fermentation substrate, whose results are therefore available to the operator only with a delay. In this work, a laser absorption spectrometer for continuous measurement of the carbon isotope ratios of methane was tested for the first time at a research biogas plant. Isotope ratios varying with the organic loading rate and process conditions were measured. Using isolates from the investigated reactor, it was first shown that each methanogenesis pathway (hydrogenotrophic, acetoclastic and methylotrophic) leaves a characteristic natural isotope signature in the biogas, so that the currently dominant methanogenic reactions can be identified from the isotope ratios in the biogas.
Through the use of 13C- and 2H-labelled substrates in pure and mixed cultures and batch reactors, together with HPLC and GC analyses of the metabolic products, several previously unknown carbon fluxes in bioreactors were identified, which in turn can affect the measured isotope ratios in the biogas. The formation of methanol and its microbial degradation products up to final CH4 formation was thus reconstructed for the first time in an agricultural biogas plant using five isolates, and the occurrence of methylotrophic methanogenesis pathways was demonstrated. Using molecular-biological methods, methane-oxidising bacteria of numerous unknown species were also detected in the reactor, whose presence had not previously been expected given the low O2 content of biogas plants. By constructing a synthetic DNA strand containing the binding sequences for eleven specific primer pairs, a new method was established with which a variety of microbial target organisms can be quantified in real-time PCR using a single copy-number standard. A weekly qPCR analysis of fermenter samples conducted over 70 days showed that the isotope ratios in the biogas are significantly influenced by the composition of the reactor microbiota. In addition to the currently dominant methanogenesis pathways, it was also possible to identify several bacterial reactions, such as syntrophic acetate oxidation, acetogenesis and sulphate reduction, from the δ13C (CH4) values, demonstrating the high potential of continuous isotope measurement for process analytics in biogas plants.
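The δ13C (CH4) values follow the standard per-mil delta notation relative to the VPDB reference; a minimal sketch (the VPDB ratio is a standard constant; the sample ratios are illustrative, not measurements from this work):

```python
# Sketch of delta-13C notation: delta = (R_sample/R_standard - 1) * 1000.
# R_VPDB is the standard 13C/12C reference ratio; samples are illustrative.
R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB standard

def delta13c_permil(r_sample):
    """delta-13C in per mil (per thousand) relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Hydrogenotrophic methane is typically more 13C-depleted (more negative
# delta) than acetoclastic methane; illustrative sample ratios:
d_hydro = delta13c_permil(0.0105)
d_aceto = delta13c_permil(0.0108)
```

It is this systematic offset between pathways that lets a continuous δ13C (CH4) record indicate which methanogenic route currently dominates in the reactor.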
Abstract:
As the amount of space debris in the geostationary ring increases, it becomes mandatory for any satellite operator to avoid collisions. Space debris in geosynchronous orbits may be observed with optical telescopes. Unlike radar, which requires very large dishes and transmission powers to sense high-altitude objects, optical observation does not depend on active illumination from the ground and may be performed with notably smaller apertures. The detectable size of an object depends on the aperture of the telescope, the sky background and the exposure time. With a telescope of 50 cm aperture, objects down to approximately 50 cm may be observed. This size is regarded as a threshold for the identification of hazardous objects and the prevention of potentially catastrophic collisions in geostationary orbits. In collaboration with the Astronomical Institute of the University of Bern (AIUB), the German Space Operations Center (GSOC) is building a small-aperture telescope to demonstrate the feasibility of optical surveillance of the geostationary ring. The telescope will be located in the southern hemisphere and complement an existing telescope in the northern hemisphere already operated by AIUB. These two telescopes provide optimum coverage of European GEO satellites and enable continuous monitoring independent of seasonal limitations. The telescope will be operated completely automatically. The automated operations will be demonstrated across the full range of activities, including scheduling of observations, telescope and camera control, and data processing.
Abstract:
Standard methods for testing safety data are needed to ensure the safe conduct of clinical trials. In particular, objective rules for reliably identifying unsafe treatments need to be put into place to help protect patients from unnecessary harm. DMCs are uniquely qualified to evaluate accumulating unblinded data and make recommendations about the continuing safe conduct of a trial. However, it is the trial leadership who must make the tough ethical decision about stopping a trial, and they could benefit from objective statistical rules that help them judge the strength of evidence contained in the blinded data. We design early stopping rules for harm that act as continuous safety screens for randomized controlled clinical trials with blinded treatment information, which could be used by anyone, including trial investigators (and trial leadership). A Bayesian framework, with emphasis on the likelihood function, is used to allow for continuous monitoring without adjusting for multiple comparisons. Close collaboration between the statistician and the clinical investigators will be needed in order to design safety screens with good operating characteristics. Though the math underlying this procedure may be computationally intensive, implementation of the statistical rules will be easy, and the continuous screening provided will give suitably early warning should real problems emerge. Trial investigators and trial leadership need these safety screens to help them effectively monitor the ongoing safe conduct of clinical trials with blinded data.
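A continuous safety screen of this general kind can be sketched with an assumed Beta-Binomial model: after each observed event, update the posterior for the pooled adverse-event rate and flag the trial if the posterior probability that the rate exceeds a preset reference crosses a bound. This is an illustrative construction, not the authors' exact rule:

```python
# Sketch of a likelihood-based continuous safety screen (assumed
# Beta-Binomial model; not the paper's specific procedure).
import math

def beta_cdf(x, a, b, steps=20000):
    """Regularised incomplete beta function via midpoint integration."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    h = x / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += math.exp(log_norm + (a - 1.0) * math.log(t)
                          + (b - 1.0) * math.log(1.0 - t)) * h
    return min(total, 1.0)

def flag_harm(events, n, rate_0=0.10, prior=(1.0, 1.0), bound=0.95):
    """True if the posterior P(rate > rate_0 | events out of n) exceeds
    the bound, given a Beta(prior) on the pooled adverse-event rate."""
    a = prior[0] + events
    b = prior[1] + n - events
    return (1.0 - beta_cdf(rate_0, a, b)) > bound

safe = flag_harm(events=2, n=50)    # ~4% observed rate: no signal
alarm = flag_harm(events=14, n=50)  # ~28% observed rate: flag for review
```

Because the rule is a function of the posterior (driven by the likelihood), it can be evaluated after every event without a multiplicity penalty, which is what makes truly continuous screening of blinded, pooled data workable.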