26 results for Monitoring tool
Abstract:
Previous research on damage detection based on the response of a structure to a moving load has reported a decay in accuracy with increasing load speed. Using a 3D vehicle–bridge interaction model, this paper shows that the area under the filtered acceleration response of the bridge increases with increasing damage, even at highway load speeds. Once a datum reading is established, the area under subsequent readings can be monitored and compared with the baseline reading; if an increase is observed, it may indicate the presence of damage. The sensitivity of the proposed approach to road roughness and noise is tested in several damage scenarios. The possibility of identifying damage in the bridge by analysing the acceleration response of the vehicle traversing it is also investigated. While vehicle acceleration is shown to be more sensitive to road roughness and noise, and therefore less reliable than direct bridge measurements, damage is successfully identified in favourable scenarios.
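The monitoring idea in this abstract — compare the area under the filtered acceleration response against a healthy-state baseline — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the filter type, cutoff, and exceedance threshold below are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def damage_indicator(accel, fs, cutoff=20.0):
    """Area under the absolute filtered acceleration response.

    A 4th-order low-pass Butterworth filter (cutoff in Hz) is an assumed
    choice; the integral is a simple rectangle-rule sum over the record.
    """
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    filtered = filtfilt(b, a, accel)
    return np.sum(np.abs(filtered)) / fs

def exceeds_baseline(reading, baseline, fs, threshold=1.05):
    """Flag a reading whose area exceeds the datum by a chosen margin."""
    return damage_indicator(reading, fs) > threshold * damage_indicator(baseline, fs)
```

In use, the baseline record would come from the bridge in a known healthy state, and each subsequent crossing would be checked with `exceeds_baseline`; the 5% margin is illustrative and would in practice be set from the noise and roughness sensitivity the paper studies.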
Abstract:
This paper discusses the monitoring of complex nonlinear and time-varying processes. Kernel principal component analysis (KPCA) has gained significant attention as a monitoring tool for nonlinear systems in recent years but relies on a fixed model that cannot be employed for time-varying systems. The contribution of this article is the development of a numerically efficient and memory-saving moving window KPCA (MWKPCA) monitoring approach. The proposed technique incorporates an up- and downdating procedure to (i) adapt the data mean and covariance matrix in the feature space and (ii) approximate the eigenvalues and eigenvectors of the Gram matrix. The article shows that the proposed MWKPCA algorithm has a computational complexity of O(N²), whilst batch techniques, e.g. the Lanczos method, are of O(N³). Including the adaptation of the number of retained components and an l-step-ahead application of the MWKPCA monitoring model, the paper finally demonstrates the utility of the proposed technique using a simulated nonlinear time-varying system and recorded data from an industrial distillation column.
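The batch step that MWKPCA accelerates can be sketched as follows: form the kernel Gram matrix on the current data window, centre it in feature space, and take its leading eigenpairs. This is the naive O(N³) recomputation per window; the paper's contribution is replacing it with O(N²) up- and downdating, which is not reproduced here. Kernel choice and width are assumptions.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """RBF kernel Gram matrix for the N samples in window X (N x d)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def centre_gram(K):
    """Centre the Gram matrix in feature space: K - 1K - K1 + 1K1."""
    n = K.shape[0]
    one = np.ones((n, n)) / n
    return K - one @ K - K @ one + one @ K @ one

def window_kpca(X, n_components=2, gamma=1.0):
    """Batch KPCA on the current window; returns the leading eigenpairs.

    A moving-window monitor would redo this (or apply the paper's
    up/downdating equivalent) each time a new sample enters the window
    and the oldest sample leaves.
    """
    K = centre_gram(rbf_gram(X, gamma))
    w, v = np.linalg.eigh(K)            # ascending eigenvalues
    idx = np.argsort(w)[::-1][:n_components]
    return w[idx], v[:, idx]
```

Monitoring statistics (e.g. T² or SPE) would then be computed from the scores of each new sample against this window model.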
Abstract:
Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty with determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, where the number of nodes, the location of centres, and the weights between the hidden layer and the output layer can be identified simultaneously for the radial basis function (RBF) networks. The topology problem for the nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its applications in process monitoring since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.
Abstract:
Coastal systems, such as rocky shores, are among the most heavily anthropogenically impacted marine ecosystems and are also among the most productive in terms of ecosystem functioning. One of the greatest impacts on coastal ecosystems is nutrient enrichment from human activities such as agricultural run-off and discharge of sewage. The aim of this study was to identify and characterise potential effects of sewage discharges on the biotic diversity of rocky shores and to test current tools for assessing the ecological status of rocky shores in line with the EU Water Framework Directive (WFD). A sampling strategy was designed to test for effects of sewage outfalls on rocky shore assemblages on the east coast of Ireland and to identify the scale of the putative impact. In addition, a separate sampling programme based on the Reduced algal Species List (RSL), the current WFD monitoring tool for rocky shores in Ireland and the UK, was also completed by identifying algae and measuring percent cover in replicate samples on rocky shores during summer. There was no detectable effect of sewage outfalls on benthic taxon diversity or assemblage structure. However, spatial variability of assemblages was greater at sites proximal or adjacent to sewage outfalls compared to shores without sewage outfalls present. Results based on the RSL show that algal assemblages were not affected by the presence of sewage outfalls, except when classed into functional groups, when variability was greater at the sites with sewage outfalls. A key finding of both surveys was the prevalence of spatial and temporal variation of assemblages. It is recommended that future metrics of ecological status be based on quantified sampling designs, incorporate changes in variability of assemblages (indicative of community stability), consider shifts in assemblage structure and include both benthic fauna and flora to assess the status of rocky shores.
Abstract:
Subspace monitoring has recently been proposed as a condition monitoring tool that requires considerably fewer variables to be analysed compared to dynamic principal component analysis (PCA). This paper analyses subspace monitoring in identifying and isolating fault conditions, which reveals that the existing work suffers from inherent limitations if complex fault scenarios arise. Based on the assumption that the fault signature is deterministic while the monitored variables are stochastic, the paper introduces a regression-based reconstruction technique to overcome these limitations. The utility of the proposed fault identification and isolation method is shown using a simulation example and the analysis of experimental data from an industrial reactive distillation unit.
Abstract:
This study concerns the spatial allocation of material flows, with emphasis on construction material in the Irish housing sector. It addresses some of the key issues concerning anthropogenic impact on the environment through spatio-temporal visualisation of the flow of materials, wastes and emissions at different spatial levels. This is presented in the form of a spatial model, Spatial Allocation of Material Flow Analysis (SAMFA), which enables the simulation of construction material flows and associated energy use. SAMFA parallels the Island Limits project (EPA funded under 2004-SD-MS-22-M2), which aimed to create a material flow analysis of the Irish economy classified by industrial sector. SAMFA further develops this by attempting to establish the material flows at the subnational geographical scale that could be used in the development of local authority (LA) sustainability strategies and spatial planning frameworks by highlighting the cumulative environmental impacts of the development of the built environment. By drawing on the idea of planning support systems, SAMFA also aims to provide a cross-disciplinary, integrative medium for involving stakeholders in strategies for a sustainable built environment and, as such, would help illustrate the sustainability consequences of alternative scenarios. The pilot run of the model in Kildare has shown that the model can be successfully calibrated and applied to develop alternative material flows and energy-use scenarios at the ED level. This has been demonstrated through the development of an integrated and a business-as-usual scenario, with the former integrating a range of potential material efficiency and energy-saving policy options and the latter replicating conditions that best describe the current trend. Their comparison shows that the former is better than the latter in terms of both material and energy use. This report also identifies a number of potential areas of future research and areas of broader application.
This includes improving the accuracy of the SAMFA model (e.g. by establishing the actual life expectancy of buildings in the Irish context through field surveys) and the extension of the model to other Irish counties. This would establish SAMFA as a valuable predictive and monitoring tool that is capable of integrating national and local spatial planning objectives with actual environmental impacts. Furthermore, should the model prove successful at this level, it then has the potential to transfer the modelling approach to other areas of the built environment, such as commercial development and other key contributors of greenhouse gas emissions. The ultimate aim is to develop a meta-model for predicting the consequences of consumption patterns at the local scale. This therefore offers the possibility of creating critical links between socio-technical systems and the most important challenge of all: the limitations of the biophysical environment.
Abstract:
There has always been a question mark over how best to integrate developing countries into the world trading system, and traditionally the WTO has used special and differential treatment (S&D) to do so. However, since 1996 the WTO has been involved with the Aid for Trade (AfT) initiative, typically co-ordinated by the OECD and UN. This article first outlines the background to AfT since 1996 under the numerous agencies working in the area, highlighting how importance has always been placed on the monitoring and effectiveness of the process. It then assesses the various methods currently used and the proposal of the WTO’s Trade Policy Review Mechanism (TPRM) as a potential monitoring tool for AfT.
Abstract:
Beta diversity describes how local communities within an area or region differ in species composition/abundance. There have been attempts to use changes in beta diversity as a biotic indicator of disturbance, but lack of theory and methodological caveats have hampered progress. We here propose that the neutral theory of biodiversity, plus the definition of beta diversity as the total variance of a community matrix, provides a suitable, novel starting point for ecological applications. Observed levels of beta diversity (BD) can be compared to neutral predictions with three possible outcomes: observed BD equals the neutral prediction, is larger than it (divergence), or is smaller than it (convergence). Disturbance might lead to either divergence or convergence, depending on type and strength. We here apply these ideas to datasets collected on oribatid mites (a key, very diverse soil taxon) under several regimes of disturbance. When disturbance is expected to increase the heterogeneity of soil spatial properties, or the sampling strategy encompassed a range of diverging environmental conditions, we observed diverging assemblages. On the contrary, we observed patterns consistent with neutrality when disturbance could determine homogenisation of soil properties in space or the sampling strategy encompassed fairly homogeneous areas. With our method, spatial and temporal changes in beta diversity can be directly and easily monitored to detect significant changes in community dynamics, although the method itself cannot inform on underlying mechanisms. However, human-driven disturbances and the spatial scales at which they operate are usually known. In this case, our approach allows the formulation of testable predictions in terms of expected changes in beta diversity, thereby offering a promising monitoring tool.
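The definition the abstract builds on — beta diversity as the total variance of a community matrix — is directly computable. A minimal sketch, assuming a sites-by-species abundance matrix and no pre-transformation (whether to Hellinger-transform the data first is a modelling choice not specified here):

```python
import numpy as np

def beta_diversity_total_variance(Y):
    """Beta diversity as the total variance of community matrix Y.

    Y is sites x species. BD = SS_total / (n - 1), where SS_total is
    the sum of squared deviations of each entry from its species mean
    across sites. BD is zero when all sites hold identical communities
    and grows as sites diverge in composition/abundance.
    """
    n = Y.shape[0]
    centred = Y - Y.mean(axis=0)
    return np.sum(centred**2) / (n - 1)
```

Observed BD from field samples would then be compared against the distribution of BD values produced by a neutral model of the same communities, with divergence or convergence read off from which tail the observation falls in.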
Abstract:
One of the most cost-effective methods of pollution remediation is natural attenuation, where the resident microorganisms are responsible for the breakdown of pollutants (Dou et al. 2008). Other forms of bioremediation - such as analogue enrichment, composting and bio-venting - also use the microbes already present in a contaminated site to enhance the remediation process. In order for these approaches to be successful in an industrial setting, some form of monitoring needs to take place, enabling conclusions to be drawn about the degradation processes occurring. In this review we look at some key molecular biology techniques that have the potential to act as a monitoring tool for industries dealing with contaminated land.
Abstract:
Chemical Imaging (CI) is an emerging platform technology that integrates conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Vibrational spectroscopic methods, such as Near Infrared (NIR) and Raman spectroscopy, combined with imaging are particularly useful for analysis of biological/pharmaceutical forms. The rapid, non-destructive and non-invasive features of CI mark its potential suitability as a process analytical tool for the pharmaceutical industry, for both process monitoring and quality control in the many stages of drug production. This paper provides an overview of CI principles, instrumentation and analysis. Recent applications of Raman and NIR-CI to pharmaceutical quality and process control are presented; challenges facing CI implementation and likely future developments in the technology are also discussed. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
INTRODUCTION: Delirium in the intensive care unit (ICU) is associated with increased morbidity and mortality. Using an assessment tool has been shown to improve the ability of clinicians in the ICU to detect delirium. The confusion assessment method for the ICU (CAM-ICU) is a validated delirium-screening tool for critically ill intubated patients. The aim of this project was to establish the feasibility of routine delirium screening using the CAM-ICU and to identify the incidence of delirium in a UK critical care unit.
Abstract:
OBJECTIVES: To determine the extent to which the use of a clinical informatics tool that implements prospective monitoring plans reduces the incidence of potential delirium, falls, hospitalizations potentially due to adverse drug events, and mortality.
DESIGN: Randomized cluster trial.
SETTING: Twenty-five nursing homes serviced by two long-term care pharmacies.
PARTICIPANTS: Residents living in nursing homes during 2003 (1,711 in 12 intervention; 1,491 in 13 usual care) and 2004 (1,769 in 12 intervention; 1,552 in 13 usual care).
INTERVENTION: The pharmacy automatically generated Geriatric Risk Assessment MedGuide (GRAM) reports and automated monitoring plans for falls and delirium within 24 hours of admission or as part of the normal time frame of federally mandated drug regimen review.
MEASUREMENTS: Incidence of potential delirium, falls, hospitalizations potentially due to adverse drug events, and mortality.
RESULTS: GRAM triggered monitoring plans for 491 residents. Newly admitted residents in the intervention homes experienced a lower rate of potential delirium onset than those in usual care homes (adjusted hazard ratio (HR)=0.42, 95% confidence interval (CI)=0.35–0.52), overall hospitalization (adjusted HR=0.89, 95% CI=0.72–1.09), and mortality (adjusted HR=0.88, 95% CI=0.66–1.16). In longer stay residents, the effects of the intervention were attenuated, and all estimates included unity.
CONCLUSION: Using health information technology in long-term care pharmacies to identify residents who might benefit from the implementation of prospective medication monitoring care plans when complex medication regimens carry potential risks for falls and delirium may reduce adverse effects associated with appropriate medication use.
Abstract:
As a clinically complex neurodegenerative disease, Parkinson's disease (PD) requires regular assessment and close monitoring. In our current study, we have developed a home-based tool designed to monitor and assess peripheral motor symptoms. An evaluation of the tool was carried out over a period of ten weeks on ten people with idiopathic PD. Participants were asked to use the tool twice daily over four days, once when their medication was working at its best (