948 results for Monitoring tool
Abstract:
As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique with the potential to predict failure. The phenomenon of rapid release of energy within a material by crack initiation or growth, in the form of stress waves, is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequently analysing the recorded signals, which then convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and bolted joints, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active cracks, thus helping to prioritise maintenance work by focusing attention on active rather than dormant cracks. Despite being a promising tool, some challenges still stand in the way of successful application of the AE technique. One is the large amount of data generated during testing; hence effective data analysis and management are necessary, especially for long-term monitoring. Complications also arise because a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of damage level by appropriate analysis of the data. Intensity analysis using severity and historic indices, as well as b-value analysis, are among the important methods and will be discussed and applied to the analysis of laboratory experimental data in this paper.
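The b-value analysis mentioned in this abstract fits the slope of the cumulative amplitude distribution of AE hits. A minimal sketch, assuming synthetic exponentially distributed amplitudes and an arbitrary bin count (not the authors' implementation):

```python
import numpy as np

def b_value(amplitudes_dB, bins=10):
    """Estimate the AE b-value from a set of hit amplitudes in dB.

    Uses the Gutenberg-Richter-style relation adapted for AE,
        log10(N) = a - b * (A_dB / 20),
    where N is the number of hits with amplitude >= A_dB.
    """
    amps = np.asarray(amplitudes_dB, dtype=float)
    levels = np.linspace(amps.min(), amps.max(), bins)
    # cumulative count of hits at or above each amplitude level
    counts = np.array([(amps >= lv).sum() for lv in levels])
    mask = counts > 0
    # least-squares fit of log10(N) against A/20; b is minus the slope
    slope, _ = np.polyfit(levels[mask] / 20.0, np.log10(counts[mask]), 1)
    return -slope

# synthetic hit amplitudes drawn from an exponential distribution
rng = np.random.default_rng(0)
amps = 40.0 + rng.exponential(scale=10.0, size=2000)
b = b_value(amps)
```

A falling b-value over successive windows of hits is commonly read as a shift from distributed micro-cracking to localized macro-crack growth, which is how the index supports damage quantification.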
Abstract:
Previous research on damage detection based on the response of a structure to a moving load has reported decay in accuracy with increasing load speed. Using a 3D vehicle-bridge interaction model, this paper shows that the area under the filtered acceleration response of the bridge increases with increasing damage, even at highway load speeds. Once a datum reading is established, the area under subsequent readings can be monitored and compared with the baseline reading; if an increase is observed, it may indicate the presence of damage. The sensitivity of the proposed approach to road roughness and noise is tested in several damage scenarios. The possibility of identifying damage in the bridge by analysing the acceleration response of the vehicle traversing it is also investigated. While vehicle acceleration is shown to be more sensitive to road roughness and noise, and therefore less reliable than direct bridge measurements, damage is successfully identified in favourable scenarios.
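The area-based indicator described in this abstract can be sketched in a few lines; here a simple moving-average filter and a 30% amplified synthetic response stand in for the paper's filter design and vehicle-bridge model (illustrative assumptions only):

```python
import numpy as np

def damage_indicator(acceleration, dt, window=50):
    """Area under the rectified, low-pass-filtered acceleration signal.

    A simple moving-average filter stands in for the filtering used in
    the paper (an illustrative choice, not the authors' filter design).
    """
    kernel = np.ones(window) / window
    filtered = np.convolve(np.abs(acceleration), kernel, mode="same")
    return filtered.sum() * dt    # discrete approximation of the area

# synthetic example: damage modelled as a 30% larger bridge response
t = np.linspace(0.0, 2.0, 2000)
dt = t[1] - t[0]
healthy = np.sin(2.0 * np.pi * 4.0 * t)   # baseline (datum) reading
damaged = 1.3 * healthy                    # subsequent reading
ratio = damage_indicator(damaged, dt) / damage_indicator(healthy, dt)
```

A ratio above 1 relative to the datum reading would flag possible damage; here it is 1.3 by construction.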
Abstract:
We used differential GPS measurements from a 13-station GPS network spanning the Santa Ana Volcano and Coatepeque Caldera to characterize the inter-eruptive activity and tectonic movements near these two active and potentially hazardous features. Caldera-forming events occurred from 70-40 ka, and at the Santa Ana/Izalco volcanoes eruptive activity occurred as recently as 2005. Twelve differential stations were surveyed for 1 to 2 hours on a monthly basis from February through September 2009 and tied to a centrally located continuous GPS station, which serves as the reference site for this volcanic network. Repeatabilities of the averages from 20-minute sessions taken over 20 hours or longer range from 2-11 mm in the horizontal (north and east) components of the inter-station baselines, suggesting a lower detection limit for the horizontal components of any short-term tectonic or volcanic deformation. Repeatabilities of the vertical baseline component range from 12-34 mm. Analysis of the precipitable water vapor in the troposphere suggests that tropospheric decorrelation as a function of baseline length and variable site elevation is the most likely source of vertical error. Differential motions of the 12 sites relative to the continuous reference site reveal inflation from February through July at several sites surrounding the caldera, with vertical displacements that range from 61 mm to 139 mm, followed by a lower-magnitude deflation event on 1.8-7.4 km-long baselines. Uplift rates for the inflationary period reach 300 mm/yr with 1σ uncertainties of ±26-119 mm. Only one other station outside the caldera exhibits a similar deformation trend, suggesting a localized source. The results suggest that the use of differential GPS measurements from short-duration occupations over short baselines can be a useful monitoring tool at sub-tropical volcanoes and calderas.
Abstract:
There is now an emerging need for an efficient modeling strategy to develop a new generation of monitoring systems. One way to approach the modeling of complex processes is to obtain a global model that captures the basic or general behavior of the system, by means of a linear or quadratic regression, and then to superimpose on it a local model that captures the localized nonlinearities of the system. In this paper, a novel method based on a hybrid incremental modeling approach is designed and applied to tool wear detection in turning processes. It involves a two-step iterative process that combines a global model with a local model to take advantage of their underlying, complementary capacities. Thus, the first step constructs a global model using a least squares regression. A local model using the fuzzy k-nearest-neighbors smoothing algorithm is obtained in the second step. A comparative study then demonstrates that the hybrid incremental model provides better error-based performance indices for detecting tool wear than a transductive neurofuzzy model and an inductive neurofuzzy model.
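A minimal sketch of the two-step idea, a global least-squares fit plus a local correction built from nearby residuals, with a plain (unweighted) k-nearest-neighbour average standing in for the paper's fuzzy k-NN smoothing and synthetic data standing in for real tool-wear measurements:

```python
import numpy as np

def fit_hybrid(X, y, k=5):
    """Two-step hybrid fit: a global least-squares model plus a local
    correction from the k nearest residuals (a plain, unweighted k-NN
    average stands in here for the fuzzy k-NN smoothing of the paper).
    """
    Xb = np.column_stack([np.ones(len(X)), X])       # add intercept column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)    # step 1: global model
    residuals = y - Xb @ coef

    def predict(x_new):
        xb = np.concatenate([[1.0], np.atleast_1d(x_new)])
        global_part = xb @ coef
        # step 2: local model -- mean residual of the k nearest points
        d = np.linalg.norm(X - np.atleast_1d(x_new), axis=1)
        local_part = residuals[np.argsort(d)[:k]].mean()
        return global_part + local_part

    return predict

# synthetic tool-wear-like data: linear trend plus a localized bump
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=(200, 1))
y = 2.0 * X[:, 0] + 5.0 * np.exp(-((X[:, 0] - 5.0) ** 2))
model = fit_hybrid(X, y, k=7)
pred = model(5.0)                 # true value is 2*5 + 5 = 15
```

The global model alone would miss the bump at x = 5; the local residual correction recovers it, which is the complementarity the abstract describes.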
Abstract:
In this research project, the deposition of amorphous carbon thin films (commonly known as DLC, for Diamond-Like Carbon) by plasma-enhanced chemical vapor deposition (PECVD) was studied using Optical Emission Spectroscopy (OES) and partial least squares regression (PLSR) analysis. The objective of this thesis is to establish a statistical model to predict the properties of DLC coatings from the deposition process parameters or from data acquired by OES. Two series of PLSR analyses were carried out. The first examines the correlation between the process parameters and the plasma characteristics in order to gain a better understanding of the deposition process. The second series demonstrates the potential of OES as a tool for monitoring the process and predicting the properties of the deposited film. The results show that the prediction of DLC coating properties, until now possible only from the process parameters (pressure, power, and plasma mode), would henceforth be feasible using information obtained by OES of the plasma (particularly indices related to the concentrations of species in the plasma). Indeed, OES data can be used to monitor the deposition process directly, rather than carrying out a complete study of the effect of the process parameters, which are strictly tied to the plasma reactor and vary from one laboratory to another. The prospect of applying a PLSR model incorporating OES data is also demonstrated in this research, in order to develop and monitor a deposit with a graded structure.
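The PLSR modelling at the heart of this thesis can be illustrated with a bare-bones PLS1 (NIPALS) fit on synthetic data; the component count, data, and seed are arbitrary assumptions, and this is not the software used in the work:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Bare-bones PLS1 via NIPALS deflation; returns fitted values.

    Each component extracts the X-direction most covariant with y,
    then deflates both X and y before extracting the next one.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    yhat = np.zeros_like(yc)
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # X scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q = (yc @ t) / tt               # y loading
        Xc -= np.outer(t, p)            # deflate X
        yc -= q * t                     # deflate y
        yhat += q * t
    return yhat + y.mean()

# synthetic process data: a response depending linearly on 4 predictors
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 4))
y = X @ np.array([1.0, 2.0, -1.0, 0.5])
yhat = pls1_fit(X, y, n_components=4)
r2 = 1.0 - ((y - yhat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

With as many components as predictors, PLS1 reproduces the ordinary least-squares fit; in practice the component count is chosen by cross-validation, trading variance explained in X against predictive power for y.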
Abstract:
Directors of nonprofits in most countries have legal responsibility for monitoring organisational performance (Brody, 2010), although there is typically little guidance on how this should occur. The balanced scorecard (BSC) (Kaplan & Norton, 1996, 2001) potentially provides boards with a monitoring tool (Kaplan & Norton, 2006; Lorsch, 2002). The BSC is intended to help integrate performance measurement, performance management and strategy implementation (Kaplan, 2009). The scorecard is balanced in that it should incorporate both financial and non-financial measures, external and internal perspectives, short- and long-term objectives, and both lagging and leading indicators. It is a relatively simple tool, but with potentially profound implications for directing board attention and subsequent action (Ocasio, 1997; Salterio, 2012).
Abstract:
As there is a myriad of micro-organic pollutants that can affect the well-being of humans and other organisms in the environment, the need for an effective monitoring tool is evident. Passive sampling techniques, which have been developed over the last decades, can provide several advantages over conventional sampling methods, including simpler sampling devices, more cost-effective sampling campaigns, and the provision of time-integrated loads as well as representative averages of pollutant concentrations in the environment. These techniques have been applied to monitor many pollutants arising from agricultural activities, e.g. residues of pesticides, veterinary drugs and so on. Several types of passive samplers are commercially available and their use is widely accepted. However, few applications of these techniques have been reported in Japan, especially in the field of the agricultural environment. This paper aims to introduce the field of passive sampling and then to describe some applications of passive sampling techniques in environmental monitoring studies related to the agriculture industry.
Abstract:
Realization of cloud computing has been made possible by the availability of virtualization technologies on commodity platforms. Measuring resource usage on virtualized servers is difficult because the performance counters used for resource accounting are not virtualized. Hence, many of the prevalent virtualization technologies such as Xen, VMware and KVM use host-specific CPU usage monitoring, which is coarse-grained. In this paper, we present a performance monitoring tool for KVM-based virtualized machines, which measures the CPU overhead incurred by the hypervisor on behalf of the virtual machine along with the CPU usage of the virtual machine itself. This fine-grained resource usage information, provided by the above tool, can be used in diverse situations such as resource provisioning to support performance-associated QoS requirements, identification of bottlenecks during VM placement, resource profiling of applications in cloud environments, etc. We demonstrate a use case of this tool by measuring the performance of web servers hosted on a KVM-based virtualized server.
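The kind of accounting such a tool performs can be illustrated by differencing a process's cumulative user/system CPU counters (as exposed in `/proc/<pid>/stat` on Linux, per proc(5)) between two samples; the sampling loop and the attribution of hypervisor overhead are omitted, and the counter values below are made up:

```python
def cpu_usage_percent(utime1, stime1, utime2, stime2, elapsed_jiffies):
    """CPU share of a VM process between two counter samples.

    utime/stime are cumulative user/system jiffies for the process;
    elapsed_jiffies is the wall-clock interval in the same units.
    """
    busy = (utime2 - utime1) + (stime2 - stime1)
    return 100.0 * busy / elapsed_jiffies

# e.g. the VM's qemu process consumed 150 jiffies of a 400-jiffy interval
share = cpu_usage_percent(1000, 200, 1120, 230, 400)  # 37.5%
```

Fine-grained tools additionally have to separate the guest's own cycles from cycles the host kernel or hypervisor spends on the guest's behalf, which plain per-process counters do not distinguish.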
Abstract:
We conducted a pilot study on 10 patients undergoing general surgery to test the feasibility of diffuse reflectance spectroscopy in the visible wavelength range as a noninvasive monitoring tool for blood loss during surgery. Ratios of raw diffuse reflectance at wavelength pairs were tested as a first pass for estimating hemoglobin concentration. Ratios can be calculated easily and rapidly with limited post-processing, so this can be considered a near real-time monitoring device. We found the best hemoglobin correlations were obtained when ratios at isosbestic points of oxy- and deoxyhemoglobin were used, specifically 529/500 nm. Baseline subtraction improved correlations, specifically at 520/509 nm. These results demonstrate proof-of-concept for the ability of this noninvasive device to monitor hemoglobin concentration changes due to surgical blood loss. The 529/500 nm ratio also appears to account for variations in probe pressure, as determined from measurements on two volunteers.
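Computing the reported 529/500 nm ratio from a sampled spectrum is straightforward, which is what makes it attractive for near real-time use; the spectrum below is an arbitrary synthetic stand-in for a measured one:

```python
import numpy as np

def isosbestic_ratio(wavelengths_nm, reflectance, num_nm=529, den_nm=500):
    """Ratio of diffuse reflectance at two wavelengths (default 529/500 nm,
    the isosbestic pair reported above); nearest sampled points are used.
    """
    wl = np.asarray(wavelengths_nm, dtype=float)
    r = np.asarray(reflectance, dtype=float)
    i_num = np.argmin(np.abs(wl - num_nm))   # index nearest the numerator
    i_den = np.argmin(np.abs(wl - den_nm))   # index nearest the denominator
    return r[i_num] / r[i_den]

# toy spectrum sampled every 1 nm over 450-600 nm (synthetic values)
wl = np.arange(450, 601)
spectrum = 1.0 + 0.002 * (wl - 450)   # arbitrary rising baseline
ratio = isosbestic_ratio(wl, spectrum)
```

Because both wavelengths sit at isosbestic points, the ratio tracks total hemoglobin while being insensitive to the oxy/deoxy balance.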
Abstract:
This paper discusses the monitoring of complex nonlinear and time-varying processes. Kernel principal component analysis (KPCA) has gained significant attention as a monitoring tool for nonlinear systems in recent years but relies on a fixed model that cannot be employed for time-varying systems. The contribution of this article is the development of a numerically efficient and memory-saving moving window KPCA (MWKPCA) monitoring approach. The proposed technique incorporates an up- and downdating procedure to (i) adapt the data mean and covariance matrix in the feature space and (ii) approximate the eigenvalues and eigenvectors of the Gram matrix. The article shows that the proposed MWKPCA algorithm has a computational complexity of O(N²), whilst batch techniques, e.g. the Lanczos method, are of O(N³). Including the adaptation of the number of retained components and an l-step-ahead application of the MWKPCA monitoring model, the paper finally demonstrates the utility of the proposed technique using a simulated nonlinear time-varying system and recorded data from an industrial distillation column.
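For contrast with the paper's efficient updating scheme, the naive batch alternative it improves on, recomputing and eigendecomposing the centred Gram matrix for every window at cubic cost, can be sketched as follows (the RBF kernel width, window length, and data are arbitrary assumptions):

```python
import numpy as np

def rbf_gram(X, gamma=0.5):
    """RBF kernel Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def window_kpca_eigvals(X_window, gamma=0.5):
    """Eigenvalues of the centred Gram matrix for a single data window,
    recomputed from scratch (cubic cost in the window length).
    """
    N = len(X_window)
    K = rbf_gram(X_window, gamma)
    H = np.eye(N) - np.ones((N, N)) / N    # centring matrix
    Kc = H @ K @ H                         # centre K in feature space
    return np.linalg.eigvalsh(Kc)[::-1]    # eigenvalues, descending

# slide over a time-varying signal; only the first window is shown
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 300)
data = np.column_stack([np.sin(t), np.cos(0.5 * t)])
data += 0.05 * rng.standard_normal(data.shape)
eigs = window_kpca_eigvals(data[:50])
```

Repeating this for every shifted window is exactly the redundant work the paper's up- and downdating procedure avoids by adapting the previous window's decomposition instead.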
Abstract:
Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty with determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, where the number of nodes, the location of centres, and the weights between the hidden layer and the output layer can be identified simultaneously for the radial basis function (RBF) networks. The topology problem for the nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its applications in process monitoring since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.
Abstract:
Coastal systems, such as rocky shores, are among the most heavily anthropogenically impacted marine ecosystems and are also among the most productive in terms of ecosystem functioning. One of the greatest impacts on coastal ecosystems is nutrient enrichment from human activities such as agricultural run-off and discharge of sewage. The aim of this study was to identify and characterise potential effects of sewage discharges on the biotic diversity of rocky shores and to test current tools for assessing the ecological status of rocky shores in line with the EU Water Framework Directive (WFD). A sampling strategy was designed to test for effects of sewage outfalls on rocky shore assemblages on the east coast of Ireland and to identify the scale of the putative impact. In addition, a separate sampling programme based on the Reduced algal Species List (RSL), the current WFD monitoring tool for rocky shores in Ireland and the UK, was also completed by identifying algae and measuring percent cover in replicate samples on rocky shores during summer. There was no detectable effect of sewage outfalls on benthic taxon diversity or assemblage structure. However, spatial variability of assemblages was greater at sites proximal or adjacent to sewage outfalls compared to shores without sewage outfalls present. Results based on the RSL show that algal assemblages were not affected by the presence of sewage outfalls, except when classed into functional groups, in which case variability was greater at the sites with sewage outfalls. A key finding of both surveys was the prevalence of spatial and temporal variation of assemblages. It is recommended that future metrics of ecological status be based on quantified sampling designs, incorporate changes in variability of assemblages (indicative of community stability), consider shifts in assemblage structure and include both benthic fauna and flora to assess the status of rocky shores.
Abstract:
Master's dissertation, Informatics Engineering, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015
Abstract:
The pulmonary artery catheter (PAC) is a powerful tool that has been used extensively in the assessment and monitoring of cardiovascular physiology. Gross misinterpretation of data gathered by the PAC is common, and its routine use without any specific interventions has not been shown to influence outcome. However, there currently is no evidence from randomized, controlled trials that any diagnostic or monitoring tool used in intensive care patients improves outcome. Studies evaluating the use of the PAC have included numerous potential confounding factors, and should be interpreted with caution. The information obtained with the PAC should be used to find better treatment strategies, and these strategies, instead of the tool itself, should be tested in clinical trials.
Abstract:
Previous research has shown dietary intake self-monitoring and culturally tailored weight loss interventions to be effective tools for weight loss. Technology can be used to tailor weight loss interventions to better suit adolescents. There is a lack of research to date on the use of personal digital assistants (PDAs) to self-monitor dietary intake among adolescents. The objective of this study was to determine the difference in dietary intake self-monitoring frequency between using a PDA and paper logs as a diet diary in obese adolescent females, and to describe differences in diet adherence, as well as changes in body size and self-efficacy to resist eating. We hypothesized dietary intake self-monitoring frequency would be greater during PDA use than during paper log use. This study was a randomized crossover trial. Participants recorded their diet for 4 weeks: 2 weeks on a PDA and 2 weeks on paper logs. Thirty-four obese females ages 12-20 were recruited for participation. Thirty were included in analyses. Participants recorded more entries/day while using the paper logs (4.10 entries/day ± 0.63) than while using the PDA (3.01 entries/day ± 0.75) (p<0.001). Significantly more meals and snacks were skipped during paper log use (0.81/day ± 0.65) than during PDA use (0.23/day ± 0.22) (p=0.011). Changes in body size (BMI, weight, and waist circumference) and self-efficacy to resist eating did not differ significantly between PDA and paper log use. When compared to paper logs, participants felt the PDA was more convenient (p=0.020), looked forward to using the PDA more (p=0.008), and would rather continue using the PDA than the paper logs (p=0.020). The findings of this study indicate use of a PDA as a dietary intake self-monitoring tool among adolescents would not result in increased dietary intake self-monitoring to aid in weight loss.
Use of paper logs would result in more data returned to clinicians, though use of PDAs would likely make adolescents more enthusiastic about adhering to recommendations to record their diet. Future research should look at updated communication devices, such as cell phones and other PDAs with additional features, and the role they can play in increasing dietary intake self-monitoring among adolescents.