954 results for Continuous Monitoring
Abstract:
Structural Health Monitoring (SHM) is an integral part of infrastructure maintenance and management systems for socio-economic, safety and security reasons. The behaviour of a structure under vibration depends on its structural characteristics, and a change in those characteristics may indicate a change in system behaviour due to the presence of damage within. Consistent, dependable markers derived from the output signal would therefore be a convenient tool for online monitoring, maintenance and rehabilitation strategies, and optimized decision-making, as required by engineers, owners, managers and users from both safety and serviceability perspectives. SHM has a very significant advantage over traditional investigations, where tangible and intangible costs of a very high degree are often incurred due to the disruption of service. Additionally, SHM through bridge-vehicle interaction opens up opportunities for continuous tracking of the condition of the structure. Research in this area is still at an initial stage and is extremely promising. This PhD focuses on using the bridge-vehicle interaction response for SHM of damaged or deteriorating bridges, to monitor or assess them under operating conditions. In the present study, a number of damage detection markers have been investigated and proposed in order to identify the existence, location and extent of an open crack in the structure. Theoretical and experimental investigations have been conducted on single-degree-of-freedom linear systems and simply supported beams. The novel Delay Vector Variance (DVV) methodology has been employed for characterization of structural behaviour through time-domain response analysis. Also, analysis of the responses of actual bridges using the DVV method has been employed for the first time in this kind of investigation.
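The abstract names Delay Vector Variance (DVV) but does not describe the algorithm. The following is a minimal, hypothetical Python sketch of the standard DVV target-variance curve (after Gautama, Mandic and Van Hulle), not the thesis implementation; the embedding dimension, span grid and neighbourhood threshold are chosen purely for illustration.

import numpy as np
from scipy.spatial.distance import pdist, squareform

def dvv_curve(x, m=3, n_std=3.0, n_spans=50, min_neighbours=30):
    """Return (standardized span, normalised target variance) for time series x."""
    x = np.asarray(x, dtype=float)
    # Delay vectors X[k] = [x[k-m], ..., x[k-1]] with targets x[k].
    X = np.column_stack([x[i:len(x) - m + i] for i in range(m)])
    targets = x[m:]
    d = squareform(pdist(X))                       # pairwise distances between delay vectors
    upper = d[np.triu_indices_from(d, k=1)]
    mu, sigma = upper.mean(), upper.std()
    spans = np.linspace(max(mu - n_std * sigma, 0.0), mu + n_std * sigma, n_spans)
    overall_var = targets.var()
    curve = []
    for r in spans:
        local_vars = []
        for k in range(len(targets)):
            idx = np.flatnonzero(d[k] <= r)
            idx = idx[idx != k]                    # exclude the delay vector itself
            if idx.size >= min_neighbours:         # only well-populated neighbourhoods
                local_vars.append(targets[idx].var())
        curve.append(np.mean(local_vars) / overall_var if local_vars else np.nan)
    return (spans - mu) / sigma, np.array(curve)

In DVV analysis, a linear stochastic response keeps the normalised target variance close to 1 across spans, whereas a strongly deterministic or nonlinear (e.g. damage-affected) response dips towards 0 at small spans; in practice the curve is compared against curves computed from surrogate data before drawing conclusions.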
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
Abstract:
This paper investigates the use of the acoustic emission (AE) monitoring technique to identify the damage mechanisms present in paper associated with its production process. The microscopic structure of paper consists of a random mesh of paper fibres connected by hydrogen bonds. This implies the existence of two damage mechanisms: the failure of a fibre-fibre bond and the failure of a fibre. This paper describes a hybrid mathematical model which couples the mechanics of a mass-spring model to an acoustic wave propagation model in order to generate the acoustic signal emitted by complex structures of paper fibres under strain. The derivation of the mass-spring model can be found in [1,2], with details of the acoustic wave equation in [3,4]. The numerical implementation of the vibro-acoustic model is discussed in detail, with particular emphasis on the damping present in the numerical model. The hybrid model uses an implicit solver which intrinsically introduces artificial damping into the solution. This artificial damping is shown to affect the frequency response of the mass-spring model, and therefore certain restrictions on the simulation time step must be enforced so that the model produces physically accurate results. The hybrid mathematical model is used to simulate small fibre networks to provide information on the acoustic response of each damage mechanism. The simulated AEs are then analysed using a continuous wavelet transform (CWT), described in [5], which provides a two-dimensional time-frequency representation of the signal. The AEs from the two damage mechanisms show different characteristics in the CWT, so that a fibre-fibre bond failure can be defined by the following criteria: (i) the dominant frequency components of the AE must be at approximately 250 kHz or 750 kHz; (ii) the strongest frequency component may be at either approximately 250 kHz or 750 kHz; and (iii) the duration of the frequency component at approximately 250 kHz is longer than that of the frequency component at approximately 750 kHz. Similarly, the criteria for identifying a fibre failure are: (i) the dominant frequency component of the AE must be greater than 800 kHz; (ii) the duration of the dominant frequency component must be less than 5.00E-06 seconds; and (iii) the dominant frequency component must be present at the front of the AE. Essentially, the failure of a fibre-fibre bond produces a low-frequency wave and the failure of a fibre produces a high-frequency pulse. Using these theoretical criteria, it is now possible to train an intelligent classifier such as the Self-Organising Map (SOM) [6] using the experimental data. First, certain features must be extracted from the CWTs of the AEs for use in training the SOM. For this work, each CWT is divided into 200 windows, each 5E-06 s in duration and covering a 100 kHz frequency range. The power ratio for each window is then calculated and used as a feature. Having extracted the features from the AEs, the SOM can be trained, but care is required so that both damage mechanisms are adequately represented in the training set. This is an issue with paper, as failure of the fibre-fibre bonds is the prevalent damage mechanism. Once a suitable training set is found, the SOM can be trained and its performance analysed. For the SOM described in this work, there is a good likelihood that it will correctly classify the experimental AEs.
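As an illustration of the windowed power-ratio features described above, the following is a hypothetical Python sketch using PyWavelets' Morlet CWT as a stand-in for the transform described in [5]; the sampling rate, frequency range and window grid are assumptions for illustration and not the authors' exact implementation.

import numpy as np
import pywt

def cwt_power_ratio_features(signal, fs=5e6, t_win=5e-6, f_win=100e3,
                             f_min=100e3, f_max=1e6):
    """Split the CWT power map of an AE burst into time-frequency windows and
    return the fraction of total power falling in each window."""
    freqs = np.arange(f_min, f_max, 1e3)                  # 1 kHz bins across the band
    scales = pywt.central_frequency('morl') * fs / freqs  # Morlet scales for those bins
    coeffs, _ = pywt.cwt(signal, scales, 'morl', sampling_period=1.0 / fs)
    power = np.abs(coeffs) ** 2                           # time-frequency power map
    n_t = int(round(t_win * fs))                          # samples per 5e-6 s time window
    n_f = int(round(f_win / 1e3))                         # rows per 100 kHz frequency window
    total = power.sum()
    features = []
    for f0 in range(0, power.shape[0], n_f):              # loop over frequency bands
        for t0 in range(0, power.shape[1], n_t):          # loop over time windows
            features.append(power[f0:f0 + n_f, t0:t0 + n_t].sum() / total)
    return np.array(features)                             # one power ratio per window

Feature vectors of this kind, computed for many recorded bursts, could then be passed to an off-the-shelf SOM implementation (e.g. the MiniSom Python package) for clustering into bond-failure and fibre-failure classes.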
Abstract:
The Continuous Plankton Recorder (CPR) survey was conceived from the outset as a programme of applied research designed to assist the fishing industry. Its survival and continuing vigour after 70 years is a testament to its utility, which has been achieved in spite of great changes in our understanding of the marine environment and in our concerns over how to manage it. The CPR has been superseded in several respects by other technologies, such as acoustics and remote sensing, but it continues to provide unrivalled seasonal and geographic information about a wide range of zooplankton and phytoplankton taxa. The value of this coverage increases with time and provides the basis for placing recent observations into the context of long-term, large-scale variability, and thus for suggesting what the causes are likely to be. Information from the CPR is used extensively in judging environmental impacts and producing quality status reports (QSR); it has shown the distributions of fish stocks that had not previously been exploited; and it has pointed to the extent of ungrazed phytoplankton production in the North Atlantic, which was a vital element in establishing the importance of carbon sequestration by phytoplankton. The CPR continues to be the principal source of large-scale, long-term information about the plankton ecosystem of the North Atlantic. It has recently provided extensive information about the biodiversity of the plankton and about the distribution of introduced species. It serves as a valuable example for the design of future monitoring of the marine environment and it has been essential to the design and implementation of most North Atlantic plankton research.
Abstract:
The continuous plankton recorder (CPR) survey is the largest multi-decadal plankton monitoring programme in the world. It was initiated in 1931 and by the end of 2004 had counted 207,619 samples and identified 437 phyto- and zooplankton taxa throughout the North Atlantic. CPR data are used extensively by the research community and in recent years have been used increasingly to underpin marine management. Here, we take a critical look at how best to use CPR data. We first describe the CPR itself, CPR sampling, and plankton counting procedures. We discuss the spatial and temporal biases in the Survey, summarise environmental data that have not previously been available, and describe the new data access policy. We supply information essential to using CPR data, including descriptions of each CPR taxonomic entity, the idiosyncrasies associated with counting many of the taxa, the logic behind taxonomic changes in the Survey, the semi-quantitative nature of CPR sampling, and recommendations on choosing the spatial and temporal scale of study. This forms the basis for a broader discussion on how to use CPR data for deriving ecologically meaningful indices based on size, functional groups and biomass that can be used to support research and management. This contribution should be useful for plankton ecologists, modellers and policy makers who actively use CPR data.
Abstract:
The Continuous Plankton Recorder (CPR) survey, operated by the Sir Alister Hardy Foundation for Ocean Science (SAHFOS), is the largest plankton monitoring programme in the world and has spanned >70 yr. The dataset contains information from ~200 000 samples, with over 2.3 million records of individual taxa. Here we outline the evolution of the CPR database through changes in technology, and how this has increased data access. Recent high-impact publications and the expanded role of CPR data in marine management demonstrate the usefulness of the dataset. We argue that solely supplying data to the research community is not sufficient in the current research climate; to promote wider use, additional tools need to be developed to provide visual representation and summary statistics. We outline 2 software visualisation tools, SAHFOS WinCPR and the digital CPR Atlas, which provide access to CPR data for both researchers and non-plankton specialists. We also describe future directions of the database, data policy and the development of visualisation tools. We believe that the approach at SAHFOS to increase data accessibility and provide new visualisation tools has enhanced awareness of the data and led to the financial security of the organisation; it also provides a good model of how long-term monitoring programmes can evolve to help secure their future.
Abstract:
Preserved and archived organic material offers huge potential for retrospective and long-term historical ecosystem reconstructions using stable isotope analyses, but because of isotopic exchange with preservatives the obtained values require validation. The Continuous Plankton Recorder (CPR) Survey is the most extensive long-term monitoring programme for plankton communities worldwide and has utilised ships of opportunity to collect samples since 1931. To keep the samples intact for subsequent analysis, they are collected and preserved in formalin; however, previous studies have found that this may alter stable carbon and nitrogen isotope ratios in zooplankton. A maximum ~0.9‰ increase in δ15N and a time-dependent maximum ~1.0‰ decrease in δ13C were observed when the copepod Calanus helgolandicus was experimentally exposed to two formalin preservatives for 12 months. Applying specific correction factors to δ15N and δ13C values for similarly preserved calanoid species collected by the CPR Survey within 12 months of analysis may be appropriate to enable their use in stable isotope studies. The isotope values of samples stored frozen did not differ significantly from those of controls. Although the impact of formalin preservation was relatively small in this and other studies of marine zooplankton, changes in isotope signatures are not consistent across taxa, especially for δ15N, indicating that species-specific studies may be required. Copyright © 2011 John Wiley & Sons, Ltd.
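As a purely illustrative reading of the correction factors mentioned above, the sketch below applies fixed offsets equal to the maxima quoted in this abstract (+0.9‰ for δ15N, −1.0‰ for δ13C); the corrections reported in the paper are time-dependent and species-specific, so these numbers are placeholders rather than recommended values.

def correct_isotopes(d15n_measured, d13c_measured,
                     d15n_shift=0.9, d13c_shift=-1.0):
    """Remove an assumed formalin-induced shift from measured isotope ratios (per mil)."""
    d15n_corrected = d15n_measured - d15n_shift   # formalin raised d15N, so subtract the shift
    d13c_corrected = d13c_measured - d13c_shift   # formalin lowered d13C, so add it back
    return d15n_corrected, d13c_corrected

# Example: a preserved Calanus measurement of d15N = 8.4 and d13C = -21.5 per mil
# would be corrected to roughly 7.5 and -20.5 per mil under these assumptions.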
Abstract:
We review current knowledge and understanding of the biology and ecology of Centropages typicus in the European shelf seas (e.g. North Sea, English Channel and Bay of Biscay). Our study is based on observations at seven coastal time-series stations as well as on the Continuous Plankton Recorder dataset. This paper focuses on the influence of environmental parameters (e.g. temperature and Chl a) on the life cycle and distribution of C. typicus and provides a comparison with its congeneric species C. hamatus and C. chierchiae in the study area. Data on abundance, seasonality and egg production have been used to define the temperature and chlorophyll optima for occurrence and reproduction of Centropages spp. within this region of the European shelf seas. © 2007 Elsevier Ltd. All rights reserved.
Abstract:
We examined how marine plankton interaction networks, as inferred by multivariate autoregressive (MAR) analysis of time-series, differ based on data collected at a fixed sampling location (L4 station in the Western English Channel) and four similar time-series prepared by averaging Continuous Plankton Recorder (CPR) datapoints in the region surrounding the fixed station. None of the plankton community structures suggested by the MAR models generated from the CPR datasets were well correlated with the MAR model for L4, but of the four CPR models, the one most closely resembling the L4 model was that for the CPR region nearest to L4. We infer that observation error and spatial variation in plankton community dynamics influenced the model performance for the CPR datasets. A modified MAR framework in which observation error and spatial variation are explicitly incorporated could allow the analysis to better handle the diverse time-series data collected in marine environments.
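To make the MAR approach referenced above concrete, here is a minimal Python sketch of fitting a first-order multivariate autoregressive (MAR(1)) model by ordinary least squares to a multi-taxon abundance matrix; actual MAR analyses of plankton time-series typically add covariates, sparsity constraints and observation-error terms, all omitted here, and the simulated data are purely illustrative.

import numpy as np

def fit_mar1(x):
    """x: array of shape (n_times, n_taxa); returns interaction matrix A and intercept c
    for the model x_t = A @ x_{t-1} + c + e_t."""
    x_past, x_now = x[:-1], x[1:]
    # Augment lagged states with a column of ones so the intercept is estimated too.
    design = np.hstack([x_past, np.ones((x_past.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(design, x_now, rcond=None)
    A = coeffs[:-1].T            # (n_taxa, n_taxa) community interaction matrix
    c = coeffs[-1]               # per-taxon intercept
    return A, c

# Example with simulated data: two weakly interacting taxa.
rng = np.random.default_rng(0)
true_A = np.array([[0.8, -0.1], [0.2, 0.6]])
x = np.zeros((300, 2))
for t in range(1, 300):
    x[t] = true_A @ x[t - 1] + rng.normal(scale=0.1, size=2)
A_hat, c_hat = fit_mar1(x)       # A_hat should lie close to true_A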
Abstract:
Going Global: planning the next 80 years of the Continuous Plankton Recorder Survey. Operated by the Sir Alister Hardy Foundation for Ocean Science (SAHFOS), the Continuous Plankton Recorder (CPR) survey is the world's largest (sampling 4 ocean basins) and longest-running (since 1931) plankton biodiversity monitoring programme. Having sampled enough miles to circumnavigate the globe over 200 times, the CPR database houses over 2.5 million entries describing the distribution of 500 phytoplankton and zooplankton taxa. Routinely sampling in the Arctic, Atlantic, Pacific and Southern Oceans, the survey analyses 4000 samples yearly. Data collected from these samples are made freely available for bona fide scientific purposes. CPR survey data are used to generate a better understanding of changes in the plankton, and to date some 1000 papers have been published on plankton biodiversity. This year sees the 80th anniversary of the CPR survey and, to celebrate and build upon this unique monitoring programme, SAHFOS intends to further develop its global plankton perspective. Work will be extended into the South Atlantic and Indian Ocean, and an international partnership with complementary surveys in Australia, Canada, America, Japan and South Africa will be implemented. The Digital Object will describe the CPR survey using compilations made by Plymouth Art College and BBC film footage.
Abstract:
Thermoforming processes generally employ sheet temperature monitoring as the primary means of process control. In this paper the development of an alternative system that monitors plug force is described. Tests using a prototype device have shown that the force record over a forming cycle creates a unique map of the process operation. Key process features such as the sheet modulus, sheet sag and the timing of the process stages may be readily observed, and the effects of changes in all of the major processing parameters are easily distinguished. Continuous, cycle-to-cycle tests show that the output is consistent and repeatable over a longer time frame, providing the opportunity for development of an on-line process control system. Further testing of the system is proposed.
Abstract:
In situ ellipsometry and Kerr polarimetry have been used to follow the continuous evolution of the optical and magneto-optical properties of multiple layers of Co and Pd during their growth. Films were sputter deposited onto a Pd buffer layer on glass substrates, up to a maximum of N = 10 bi-layer periods, according to the scheme glass/Pd(10)/N x (0.3Co/3Pd) (nm). Magnetic hysteresis measurements taken during the deposition consistently showed strong perpendicular anisotropy at all stages of film growth following the deposition of a single monolayer of Co. Magneto-optic signals associated with the normal-incidence polar Kerr effect indicated strong polarization of Pd atoms at both Co-Pd and Pd-Co interfaces, and that the magnitude of the complex magneto-optic Voigt parameter and the magnetic moment of the Pd decrease exponentially with distance from the interface with a decay constant of 1.1 nm^-1. Theoretical simulations have provided an understanding of the observations and allow the determination of the ultrathin-film values of the elements of the skew-symmetric permittivity tensor that describe the optical and magneto-optical properties of both Co and Pd. Detailed structure in the observed Kerr ellipticity shows distinct Pd-thickness-dependent oscillations with a spatial period of about 1.6 nm that are believed to be associated with quantum well levels in the growing Pd layer.
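As an illustrative reading of the quoted decay constant (assuming, consistent with the phrase "decrease exponentially", a dependence of the form m(z) = m0 exp(-k z) with k = 1.1 nm^-1): roughly 33% of the interface value remains at z = 1 nm and only about 11% at z = 2 nm, so the induced Pd polarization is essentially confined to the first couple of nanometres of Pd adjacent to each Co interface.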
Abstract:
The performance of the surface zone of concrete is acknowledged as a major factor governing the rate of deterioration of reinforced concrete structures, as it provides the only barrier to the ingress of water containing dissolved ionic species such as chlorides which, ultimately, initiate corrosion of the reinforcement. In-situ monitoring of cover-zone concrete is therefore critical in attempting to make realistic predictions as to the in-service performance of the structure. To this end, this paper presents developments in a remote interrogation system that allows continuous, real-time monitoring of the cover-zone concrete from an office setting. Use is made of a multi-electrode array embedded within cover-zone concrete to acquire discretized electrical resistivity and temperature measurements, with both parameters monitored spatially and temporally. On-site instrumentation, which allows remote interrogation of concrete samples placed at a marine exposure site, is detailed, together with data handling and processing procedures. Site measurements highlight the influence of temperature on electrical resistivity, and an Arrhenius-based temperature correction protocol is developed using on-site measurements to standardize resistivity data to a reference temperature; this is an advance over the use of laboratory-based procedures. The testing methodology and interrogation system represent a robust, low-cost and high-value technique which could be deployed for intelligent monitoring of reinforced concrete structures.
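A minimal sketch of the kind of Arrhenius-type standardization described above, assuming resistivity varies as exp(Ea/RT); the activation energy and the example numbers are placeholders, not the values derived from the site data in the paper.

import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def resistivity_at_reference(rho_measured, temp_c, ref_temp_c=20.0, ea=20e3):
    """Standardize a resistivity measured at temp_c (deg C) to ref_temp_c, assuming
    rho ~ exp(Ea / (R*T)) with an assumed activation energy ea in J/mol."""
    t = temp_c + 273.15
    t_ref = ref_temp_c + 273.15
    return rho_measured * np.exp((ea / R) * (1.0 / t_ref - 1.0 / t))

# Example: 65 ohm.m measured at 30 deg C corresponds to roughly 85 ohm.m at the
# 20 deg C reference temperature when Ea = 20 kJ/mol.
print(resistivity_at_reference(65.0, 30.0))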
Abstract:
The monitoring of temperature and moisture changes in response to the different micro-environments of building stones is essential to understand the material behaviour and the degradation mechanisms. From a practical point of view, a continuous and detailed understanding of micro-environmental changes in building stones helps to inform their maintenance and repair strategies. Temperature within the stone is usually monitored by means of thermistors, whereas a wide range of techniques is available for monitoring moisture. In the case of concrete, an electrical resistance method has previously been used as an inexpensive tool for monitoring moisture changes. This paper describes the adaptation of this technique and its further development for monitoring moisture movement in building stones.
In this study a block of limestone was subjected to intermittent infrared radiation with programmed cycles of ambient temperature, rainfall and wind conditions in an automated climatic chamber. The temperature and moisture changes at different depths within the stone were monitored by means of bead thermistors and electrical resistance sensors. This experiment has helped in understanding the thermal conductivity and moisture transport from the surface into deeper parts of the stone under different simulated extreme climatic conditions. Results indicated that variations in external ambient conditions could substantially affect the moisture transport and temperature profile within the micro-environment of building stones and hence could have a significant impact on stone decay.
Abstract:
OBJECTIVE - To evaluate an algorithm guiding responses of continuous subcutaneous insulin infusion (CSII)-treated type 1 diabetic patients using real-time continuous glucose monitoring (RT-CGM). RESEARCH DESIGN AND METHODS - Sixty CSII-treated type 1 diabetic participants (aged 13-70 years, including adult and adolescent subgroups, with A1C ≤9.5%) were randomized in age-, sex-, and A1C-matched pairs. Phase 1 was an open 16-week multicenter randomized controlled trial. Group A was treated with CSII/RT-CGM with the algorithm, and group B was treated with CSII/RT-CGM without the algorithm. The primary outcome was the difference in time in target (4-10 mmol/l) glucose range on 6-day masked CGM. Secondary outcomes were differences in A1C, low (≤3.9 mmol/l) glucose CGM time, and glycemic variability. Phase 2 was the week 16-32 follow-up. Group A was returned to usual care, and group B was provided with the algorithm. Glycemia parameters were as above. Comparisons were made between baseline and 16 weeks and 32 weeks. RESULTS - In phase 1, after withdrawals 29 of 30 subjects were left in group A and 28 of 30 subjects were left in group B. The change in target glucose time did not differ between groups. A1C fell (mean 7.9% [95% CI 7.7-8.2] to 7.6% [7.2-8.0]; P < 0.03) in group A but not in group B (7.8% [7.5-8.1] to 7.7% [7.3-8.0]; NS), with no difference between groups. More subjects in group A achieved A1C ≤7% than in group B (2 of 29 to 14 of 29 vs. 4 of 28 to 7 of 28; P = 0.015). In phase 2, one participant was lost from each group. In group A, A1C returned to baseline with RT-CGM discontinuation but did not change in group B, who continued RT-CGM with addition of the algorithm. CONCLUSIONS - Early but not late algorithm provision to type 1 diabetic patients using CSII/RT-CGM did not increase the target glucose time but increased achievement of A1C ≤7%. Upon RT-CGM cessation, A1C returned to baseline. © 2010 by the American Diabetes Association.