69 results for Monitoring Systems
Abstract:
This paper presents a preliminary study of a novel distributed adaptive real-time learning framework for wide area monitoring of power systems integrated with distributed generation using synchrophasor technology. The framework comprises distributed agents (synchrophasors) for autonomous local condition monitoring and fault detection, and a central unit for generating a global view for situation awareness and decision making. Key technologies that can be integrated into this hierarchical distributed learning scheme are discussed to enable real-time information extraction and knowledge discovery for decision making, without explicitly accumulating and storing all raw data at the central unit. Based on this, the configuration of a wide area monitoring system of power systems using synchrophasor technology, and the functionalities for locally installed open-phasor-measurement-units (OpenPMUs) and a central unit, are presented. Initial results on anti-islanding protection using the proposed approach are given to illustrate its effectiveness.
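As a hedged illustration of the hierarchical idea described above (not the paper's actual framework), the following Python sketch has local agents apply a simple rate-of-change-of-frequency (ROCOF) check to their own measurements and forward only detected events to a central unit, so no raw data accumulates centrally; all class names, thresholds and signals are invented for illustration.

```python
import numpy as np

class LocalAgent:
    """Stands in for an OpenPMU-style agent doing local condition monitoring."""
    def __init__(self, name, rocof_limit=2.0, dt=0.02):
        self.name, self.rocof_limit, self.dt = name, rocof_limit, dt
        self.prev_f = None

    def observe(self, freq_hz):
        """Return an event report if the rate of change of frequency exceeds the limit."""
        event = None
        if self.prev_f is not None:
            rocof = (freq_hz - self.prev_f) / self.dt              # Hz/s
            if abs(rocof) > self.rocof_limit:
                event = {"agent": self.name, "rocof": round(rocof, 2), "freq": freq_hz}
        self.prev_f = freq_hz
        return event

class CentralUnit:
    """Builds the global view from event reports only, never from raw samples."""
    def __init__(self):
        self.events = []
    def receive(self, event):
        if event is not None:
            self.events.append(event)

rng = np.random.default_rng(0)
f_a = 50 + 0.002 * rng.standard_normal(100)        # healthy feeder
f_b = f_a.copy()
f_b[60:] -= 0.5                                    # sudden frequency step, e.g. islanding

agents, central = [LocalAgent("A"), LocalAgent("B")], CentralUnit()
for fa, fb in zip(f_a, f_b):
    central.receive(agents[0].observe(fa))
    central.receive(agents[1].observe(fb))
print(central.events)                              # only agent B reports an event
```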
Abstract:
Background: Developing complex interventions for testing in randomised controlled trials is of increasing importance in healthcare planning. There is a need for careful design of interventions for secondary prevention of coronary heart disease (CHD). It has been suggested that integrating qualitative research in the development of a complex intervention may contribute to optimising its design but there is limited evidence of this in practice. This study aims to examine the contribution of qualitative research in developing a complex intervention to improve the provision and uptake of secondary prevention of CHD within primary care in two different healthcare systems.
Methods: In four general practices, one rural and one urban, in Northern Ireland and the Republic of Ireland, patients with CHD were purposively selected. Four focus groups with patients (N = 23) and four with staff (N = 29) informed the development of the intervention by exploring how it could be tailored and integrated with current secondary prevention activities for CHD in the two healthcare settings. Following an exploratory trial the acceptability and feasibility of the intervention were discussed in four focus groups (17 patients) and 10 interviews (staff). The data were analysed using thematic analysis.
Results: Integrating qualitative research into the development of the intervention provided in-depth information about how the different funding and administrative arrangements of the two healthcare systems affected their provision of secondary prevention, and identified similar barriers of time constraints, training needs and poor patient motivation. The findings also highlighted the importance to patients of stress management, the need for which had been underestimated by the researchers. The qualitative evaluation provided a depth of detail not found in evaluation questionnaires. It highlighted how the intervention needed to be more practical by minimising administration, integrating role plays into behaviour change training, providing more practical information about stress management and removing self-monitoring of lifestyle change.
Conclusion: Qualitative research is integral to developing the design detail of a complex intervention and tailoring its components to address individuals' needs in different healthcare systems. The findings highlight how qualitative research may be a valuable component of the preparation for complex interventions and their evaluation.
Abstract:
This is the first paper to show and theoretically analyse that the presence of auto-correlation can produce considerable alterations in the Type I and Type II errors of univariate and multivariate statistical control charts. To remove this undesired effect, linear inverse ARMA filters are employed, and the application studies in this paper show that false alarms (increased Type I errors) and insensitive monitoring statistics (increased Type II errors) were eliminated.
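A minimal Python sketch of the general mechanism (not the paper's exact filters or charts): positively autocorrelated AR(1) data inflates the false-alarm rate of an individuals chart whose limits assume independent observations, and applying the linear inverse filter to recover the white innovations restores the nominal rate. The AR coefficient and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.9, 20000
e = rng.standard_normal(n)

# AR(1) process: x_t = phi * x_{t-1} + e_t
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

def individuals_chart_false_alarm_rate(series):
    """Fraction of points outside 3-sigma limits of an individuals chart whose
    sigma is estimated from the average moving range (valid only for i.i.d. data)."""
    sigma = np.abs(np.diff(series)).mean() / 1.128
    mu = series.mean()
    return np.mean(np.abs(series - mu) > 3 * sigma)

# Linear inverse filter for the AR(1) model: e_hat_t = x_t - phi * x_{t-1}
e_hat = x[1:] - phi * x[:-1]

print("Type I error, raw autocorrelated data :", individuals_chart_false_alarm_rate(x))
print("Type I error, inverse-filtered data   :", individuals_chart_false_alarm_rate(e_hat))
print("nominal Type I error for 3-sigma limits: ~0.0027")
```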
Abstract:
This paper theoretically analyses the recently proposed "Extended Partial Least Squares" (EPLS) algorithm. After pointing out some conceptual deficiencies, a revised algorithm is introduced that covers the middle ground between Partial Least Squares and Principal Component Analysis: it maximises a covariance criterion between a cause and an effect variable set (partial least squares) and allows a complete reconstruction of the recorded data (principal component analysis). The new and conceptually simpler EPLS algorithm has been applied successfully to detect and diagnose various fault conditions where the original EPLS algorithm only offered fault detection.
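The sketch below does not implement EPLS; using off-the-shelf PLS and PCA from scikit-learn on synthetic data, it only illustrates the two criteria the revised algorithm is said to balance: the PLS score maximises covariance with the effect variables, while the PCA score reconstructs more of the recorded data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n = 500
t1 = rng.standard_normal(n)              # high-variance direction, unrelated to Y
t2 = 0.3 * rng.standard_normal(n)        # low-variance direction, drives Y
X = np.column_stack([t1, t2, t1 + 0.1 * rng.standard_normal(n)])   # cause variables
Y = (2.0 * t2 + 0.05 * rng.standard_normal(n)).reshape(-1, 1)      # effect variable

pca = PCA(n_components=1).fit(X)
pls = PLSRegression(n_components=1).fit(X, Y)
t_pca, t_pls = pca.transform(X).ravel(), pls.transform(X).ravel()

def x_variance_captured(score, X):
    """Fraction of the variance of X explained by regressing each column on the score."""
    Xc, s = X - X.mean(0), score - score.mean()
    Xhat = np.outer(s, Xc.T @ s / (s @ s))
    return 1 - ((Xc - Xhat) ** 2).sum() / (Xc ** 2).sum()

for name, score in [("PCA", t_pca), ("PLS", t_pls)]:
    print(f"{name}: |cov(score, Y)| = {abs(np.cov(score, Y.ravel())[0, 1]):.3f}, "
          f"X variance captured = {x_variance_captured(score, X):.2f}")
```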
Abstract:
Aim. This paper is a report of a study to describe how treatment fidelity is being enhanced and monitored, using a model from the National Institutes of Health Behavior Change Consortium. Background. The objective of treatment fidelity is to minimize errors in interpreting research trial outcomes, and to ascribe those outcomes directly to the intervention at hand. Treatment fidelity procedures are included in trials of complex interventions to account for inferences made from study outcomes. Monitoring treatment fidelity can help improve study design, maximize reliability of results, increase statistical power, determine whether theory-based interventions are responsible for observed changes, and inform the research dissemination process. Methods. Treatment fidelity recommendations from the Behavior Change Consortium were applied to the SPHERE study (Secondary Prevention of Heart DiseasE in GeneRal PracticE), a randomized controlled trial of a complex intervention. Procedures to enhance and monitor intervention implementation included standardizing training sessions, observing intervention consultations, structuring patient recall systems, and using written practice and patient care plans. The research nurse plays an important role in monitoring intervention implementation. Findings. Several methods of applying treatment fidelity procedures to monitoring interventions are possible. The procedure used may be determined by availability of appropriate personnel, fiscal constraints, or time limits. Complex interventions are not straightforward and necessitate a monitoring process at trial stage. Conclusion. The Behavior Change Consortium’s model of treatment fidelity is useful for structuring a system to monitor the implementation of a complex intervention, and helps to increase the reliability and validity of evaluation findings.
Abstract:
This paper presents two new approaches for use in complete process monitoring. The first concerns the identification of nonlinear principal component models. This involves the application of linear principal component analysis (PCA), prior to the identification of a modified autoassociative neural network (AAN) as the required nonlinear PCA (NLPCA) model. The benefits are that (i) the number of the reduced set of linear principal components (PCs) is smaller than the number of recorded process variables, and (ii) the set of PCs is better conditioned as redundant information is removed. The result is a new set of input data for a modified neural representation, referred to as a T2T network. The T2T NLPCA model is then used for complete process monitoring, involving fault detection, identification and isolation. The second approach introduces a new variable reconstruction algorithm, developed from the T2T NLPCA model. Variable reconstruction can enhance the findings of the contribution charts still widely used in industry by reconstructing the outputs from faulty sensors to produce more accurate fault isolation. These ideas are illustrated using recorded industrial data relating to developing cracks in an industrial glass melter process. A comparison of linear and nonlinear models, together with the combined use of contribution charts and variable reconstruction, is presented.
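A rough Python sketch of the two-stage structure described above, assuming generic components in place of the authors' T2T network: linear PCA first reduces and conditions the variables, then a small autoassociative (bottleneck) network is fitted to the scores and its squared reconstruction error (SPE) serves as the monitoring statistic. The data, network size and control limit are synthetic and illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
n = 1000
u = rng.uniform(-1, 1, n)                                   # single nonlinear driver
X = np.column_stack([u, u ** 2, np.sin(np.pi * u), u + 0.5 * u ** 2])
X += 0.02 * rng.standard_normal(X.shape)

# Step 1: linear PCA keeps fewer, better-conditioned inputs than the raw variables.
pca = PCA(n_components=3).fit(X)
T = pca.transform(X)

# Step 2: autoassociative network with a 1-unit bottleneck, trained scores -> scores.
aan = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=0).fit(T, T)

spe = np.sum((T - aan.predict(T)) ** 2, axis=1)             # squared prediction error
limit = np.quantile(spe, 0.99)                              # empirical 99% control limit

# A faulty observation: sensor 1 drifts well away from the nonlinear relationship.
x_fault = X[0].copy()
x_fault[1] += 2.0
t_fault = pca.transform(x_fault.reshape(1, -1))
spe_fault = float(np.sum((t_fault - aan.predict(t_fault)) ** 2))
print(f"99% SPE limit: {limit:.4f}  faulty SPE: {spe_fault:.3f}  alarm: {spe_fault > limit}")
```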
Abstract:
This paper provides an overview of the current field in wireless networks for monitoring and control. Alternative wireless technologies are introduced, together with current typical industrial applications. The focus then shifts to wireless Ethernet and the specialised requirements for wireless networked control systems (WNCS) are discussed. This is followed by a brief look at some current WNCS research, including reduced communication control.
Abstract:
Treasure et al. (2004) recently proposed a new subspace-monitoring technique, based on the N4SID algorithm, within the multivariate statistical process control framework. This dynamic monitoring method requires considerably fewer variables to be analysed when compared with dynamic principal component analysis (PCA). The contribution charts and variable reconstruction traditionally employed for static PCA are analysed in a dynamic context. Both may be affected by the ratio of the number of retained components to the total number of analysed variables. Particular problems arise if this ratio is large, and a new reconstruction chart is introduced to overcome them. The utility of such a dynamic contribution chart and variable reconstruction is shown in a simulation and by application to industrial data from a distillation unit.
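For readers unfamiliar with the static-PCA tools this abstract builds on, here is a minimal sketch (not the dynamic subspace method itself) of an SPE contribution chart and of variable reconstruction for the sensor it points to; the data and fault are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
t = rng.standard_normal(500)
a = np.array([1.0, 0.8, -0.6, 0.5])                  # one latent factor drives 4 sensors
X = np.outer(t, a) + 0.05 * rng.standard_normal((500, 4))

pca = PCA(n_components=1).fit(X)
P = pca.components_.T                                # loadings, shape (4, 1)
C = np.eye(4) - P @ P.T                              # projection onto the residual space

def spe_contributions(x):
    """Per-variable contribution to the squared prediction error (SPE)."""
    r = C @ (x - pca.mean_)
    return r ** 2

x_fault = X[0].copy()
x_fault[2] += 2.0                                    # sensor 2 reads 2.0 units high
contrib = spe_contributions(x_fault)
j = int(np.argmax(contrib))
print("SPE contributions:", np.round(contrib, 3), "-> suspected sensor:", j)

# Variable reconstruction: replace sensor j by the value that minimises the SPE.
xc = x_fault - pca.mean_
reconstructed = x_fault[j] - (C[j] @ xc) / C[j, j]
print(f"reconstructed sensor {j}: {reconstructed:.3f}  (true value {X[0, 2]:.3f})")
```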
Abstract:
This paper introduces a fast algorithm for moving window principal component analysis (MWPCA) which adapts a principal component model. It incorporates the concept of recursive adaptation within a moving window to (i) adapt the mean and variance of the process variables, (ii) adapt the correlation matrix, and (iii) adjust the PCA model by recomputing the decomposition. The paper shows that the new algorithm is computationally faster than conventional moving window techniques if the window size exceeds three times the number of variables, and that its computational cost is not affected by the window size. A further contribution is the introduction of an N-step-ahead horizon into the process monitoring, whereby the PCA model identified N steps earlier is used to analyse the current observation. For monitoring complex chemical systems, this work shows that the use of the horizon improves the ability to detect slowly developing drifts.
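The sketch below is conceptual only: it recomputes a PCA in every moving window rather than using the paper's fast recursive update, but it illustrates why scoring the current sample against the model identified N steps earlier makes a slowly developing drift easier to see. Window size, horizons and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n, n_vars, window = 600, 4, 100
latent = rng.standard_normal((n, 2))
drift = 0.01 * np.maximum(0, np.arange(n) - 300)         # slow drift starting at t = 300
X = np.column_stack([latent[:, 0],
                     latent[:, 0] + latent[:, 1],
                     latent[:, 1] + drift,               # drifting sensor
                     0.5 * latent[:, 0]])
X += 0.05 * rng.standard_normal((n, n_vars))

def spe_series(X, window, horizon, n_comp=2):
    """Score each sample with the PCA model identified `horizon` steps earlier."""
    spe = np.full(len(X), np.nan)
    for t in range(window + horizon, len(X)):
        W = X[t - horizon - window:t - horizon]           # window ending N steps ago
        mean = W.mean(axis=0)
        _, _, Vt = np.linalg.svd(W - mean, full_matrices=False)
        P = Vt[:n_comp].T
        r = (X[t] - mean) - P @ (P.T @ (X[t] - mean))
        spe[t] = r @ r
    return spe

for horizon in (1, 50):
    spe = spe_series(X, window, horizon)
    print(f"horizon N={horizon:2d}: mean SPE before drift = {np.nanmean(spe[150:300]):.4f}, "
          f"during drift = {np.nanmean(spe[400:]):.4f}")
```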
Abstract:
Subspace monitoring has recently been proposed as a condition monitoring tool that requires considerably fewer variables to be analysed compared to dynamic principal component analysis (PCA). This paper analyses subspace monitoring in identifying and isolating fault conditions and reveals that the existing work suffers from inherent limitations if complex fault scenarios arise. Based on the assumption that the fault signature is deterministic while the monitored variables are stochastic, the paper introduces a regression-based reconstruction technique to overcome these limitations. The utility of the proposed fault identification and isolation method is shown using a simulation example and the analysis of experimental data from an industrial reactive distillation unit.
Abstract:
This paper details the implementation and operational performance of a minimum-power 2.45-GHz pulse receiver and a companion on-off keyed transmitter for use in a semi-active duplex RF biomedical transponder. A 50-Ohm microstrip stub-matched zero-bias diode detector forms the heart of a body-worn receiver that has a CMOS baseband amplifier consuming 20 microamps from +3 V and achieves a tangential sensitivity of -53 dBm. The base transmitter generates 0.5 W of peak RF output power into 50 Ohms. Both linear and right-hand circularly polarized Tx-Rx antenna sets were employed in system reliability trials carried out in a hospital Coronary Care Unit. For transmitting antenna heights between 0.3 and 2.2 m above floor level, transponder interrogations were 95% reliable within the 67 m² area of the ward, falling to an average of 46% in the surrounding rooms and corridors. Overall, the circular antenna set gave the higher reliability and lower propagation power decay index.
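A back-of-envelope reading of the quoted figures, assuming 0 dB antenna gains and no other losses (neither of which the abstract specifies), gives the link loss the system could in principle tolerate:

```python
import math

p_tx_w = 0.5                                   # peak transmitter output power (from abstract)
sens_dbm = -53.0                               # receiver tangential sensitivity (from abstract)

p_tx_dbm = 10 * math.log10(p_tx_w * 1000)      # 0.5 W -> about 27 dBm
max_path_loss_db = p_tx_dbm - sens_dbm         # tolerable loss with 0 dB gains and margins
print(f"Tx power: {p_tx_dbm:.1f} dBm, allowable path loss: {max_path_loss_db:.1f} dB")
```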
Abstract:
This study addresses the direct calibration, carried out in situ, of optical fiber strain sensors used for structural monitoring. The behavior of fiber-Bragg-grating-based sensor systems when attached to metal bars, in a manner representative of their use as reinforcement bars in structures, was examined and their response calibrated. To ensure the validity of the measurements, this was done using an extensometer, with a further calibration against the response of electrical resistance strain gauges, often conventionally used, for comparison. The results show a repeatable calibration generating a suitable geometric factor of extension to strain for these sensors, to enable accurate strain data to be obtained when the fiber-optic sensor system is in use in structural monitoring applications.
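As an illustration of the calibration step described, with entirely synthetic numbers rather than the paper's data, a least-squares fit of the fibre sensor response against the reference extensometer strain yields a single calibration factor:

```python
import numpy as np

rng = np.random.default_rng(6)
strain_ref = np.linspace(0, 2000e-6, 20)                      # reference (extensometer) strain
# Assumed FBG response of roughly 1.2 pm per microstrain plus measurement noise.
wavelength_shift_nm = (1.2e-3 * strain_ref * 1e6
                       + 0.002 * rng.standard_normal(strain_ref.size))

# Least-squares slope = calibration factor (nm per microstrain here).
slope, intercept = np.polyfit(strain_ref * 1e6, wavelength_shift_nm, 1)
print(f"calibration factor: {slope * 1e3:.3f} pm/microstrain, intercept: {intercept:.4f} nm")

strain_from_fbg = (wavelength_shift_nm - intercept) / slope    # microstrain
print("max residual vs reference:",
      f"{np.max(np.abs(strain_from_fbg - strain_ref * 1e6)):.2f} microstrain")
```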