193 results for detection systems


Relevance: 30.00%

Abstract:

RATIONALE: Polymer-based surface coatings in outdoor applications experience accelerated degradation due to exposure to solar radiation, oxygen and atmospheric pollutants. These deleterious agents cause undesirable changes to the aesthetic and mechanical properties of the polymer, reducing its lifetime. The use of antioxidants such as hindered amine light stabilisers (HALS) retards these degradative processes; however, mechanisms for HALS action and polymer degradation are poorly understood. METHODS: Detection of the HALS TINUVIN® 123 (bis(1-octyloxy-2,2,6,6-tetramethyl-4-piperidyl) sebacate) and the polymer degradation products directly from a polyester-based coil coating was achieved by liquid extraction surface analysis (LESA) coupled to a triple quadrupole QTRAP® 5500 mass spectrometer. The detection of TINUVIN® 123 and melamine was confirmed by the characteristic fragmentation pattern observed in LESA-MS/MS spectra that was identical to that reported for authentic samples. RESULTS: Analysis of an unstabilised coil coating by LESA-MS after exposure to 4 years of outdoor field testing revealed the presence of melamine (1,3,5-triazine-2,4,6-triamine) as a polymer degradation product at elevated levels. Changes to the physical appearance of the coil coating, including powder-like deposits on the coating's surface, were observed to coincide with melamine deposits and are indicative of the phenomenon known as polymer 'blooming'. CONCLUSIONS: For the first time, in situ detection of analytes from a thermoset polymer coating was accomplished without any sample preparation, providing advantages over traditional extraction-analysis approaches and some contemporary ambient MS methods. Detection of HALS and polymer degradation products such as melamine provides insight into the mechanisms by which degradation occurs and suggests LESA-MS is a powerful new tool for polymer analysis. Copyright © 2012 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

As high-throughput genetic marker screening systems are essential for a range of genetics studies and plant breeding applications, the International RosBREED SNP Consortium (IRSC) has utilized the Illumina Infinium® II system to develop a medium- to high-throughput SNP screening tool for genome-wide evaluation of allelic variation in apple (Malus × domestica) breeding germplasm. For genome-wide SNP discovery, 27 apple cultivars were chosen to represent worldwide breeding germplasm and re-sequenced at low coverage with the Illumina Genome Analyzer II. Following alignment of these sequences to the whole genome sequence of 'Golden Delicious', SNPs were identified using SoapSNP. A total of 2,113,120 SNPs were detected, corresponding to one SNP every 288 bp of the genome. The Illumina GoldenGate® assay was then used to validate a subset of 144 SNPs with a range of characteristics, using a set of 160 apple accessions. This validation assay enabled fine-tuning of the final subset of SNPs for the Illumina Infinium® II system. The set of stringent filtering criteria developed allowed choice of a set of SNPs that not only exhibited an even distribution across the apple genome and a range of minor allele frequencies to ensure utility across germplasm, but also were located in putative exonic regions to maximize genotyping success rate. A total of 7,867 apple SNPs were established for the IRSC apple 8K SNP array v1, of which 5,554 were polymorphic after evaluation in segregating families and a germplasm collection. This publicly available genomics resource will provide an unprecedented resolution of SNP haplotypes, which will enable marker-locus-trait association discovery, description of the genetic architecture of quantitative traits, investigation of genetic variation (neutral and functional), and genomic selection in apple.
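
As an illustration of the kind of filtering criteria described above, the sketch below selects candidate SNPs by minor allele frequency, putative exonic location and a minimum spacing per chromosome. The field names, thresholds and toy candidates are assumptions for demonstration only, not the IRSC pipeline or its actual cut-offs.

```python
# Illustrative SNP filtering sketch (not the IRSC pipeline); thresholds and
# field names (chrom, pos, maf, exonic) are assumptions for demonstration.

def filter_snps(candidates, min_maf=0.05, max_maf=0.5, min_spacing=50_000):
    """Keep exonic SNPs with a usable minor allele frequency, enforcing a
    minimum spacing per chromosome so the retained set spreads evenly."""
    kept, last_pos = [], {}
    for snp in sorted(candidates, key=lambda s: (s["chrom"], s["pos"])):
        if not snp["exonic"]:
            continue
        if not (min_maf <= snp["maf"] <= max_maf):
            continue
        prev = last_pos.get(snp["chrom"], -min_spacing)
        if snp["pos"] - prev < min_spacing:
            continue  # too close to the previously kept SNP on this chromosome
        kept.append(snp)
        last_pos[snp["chrom"]] = snp["pos"]
    return kept

candidates = [
    {"chrom": "chr1", "pos": 120_000, "maf": 0.21, "exonic": True},
    {"chrom": "chr1", "pos": 130_000, "maf": 0.35, "exonic": True},   # dropped: spacing
    {"chrom": "chr1", "pos": 400_000, "maf": 0.02, "exonic": True},   # dropped: too rare
    {"chrom": "chr2", "pos": 80_000,  "maf": 0.40, "exonic": False},  # dropped: not exonic
]
print(len(filter_snps(candidates)))  # -> 1
```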

Relevance: 30.00%

Abstract:

This paper presents a practical recursive fault detection and diagnosis (FDD) scheme for online identification of actuator faults in unmanned aerial systems (UASs), based on the unscented Kalman filter (UKF). The proposed FDD algorithm aims to monitor the health status of actuators and provide reliable indication of actuator faults, offering the information necessary for the design of fault-tolerant flight control systems that compensate for side-effects and improve fail-safe capability when actuator faults occur. Fault detection is conducted by designing separate UKFs to detect aileron and elevator faults using a nonlinear six degree-of-freedom (DOF) UAS model. Fault diagnosis is achieved by isolating true faults using a Bayesian classifier (BC) together with a decision criterion to avoid false alarms. High-fidelity simulations with and without measurement noise are conducted with practical constraints considered for typical actuator fault scenarios, and the proposed FDD scheme exhibits consistent effectiveness in identifying the occurrence of actuator faults, verifying its suitability for integration into the design of fault-tolerant flight control systems for emergency landing of UASs.
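
The sketch below illustrates the residual (innovation) test that underlies this kind of filter-based fault detection: a state estimator predicts the next measurement, and a large normalised innovation flags a fault. A simple linear Kalman filter on a toy constant-velocity model stands in for the paper's UKF and 6-DOF UAS model; the noise levels, threshold and injected fault are illustrative assumptions.

```python
# Residual-based fault detection sketch. A plain linear Kalman filter stands in
# for the UKF, and the constant-velocity model, noise levels and threshold are
# illustrative assumptions, not the paper's 6-DOF UAS model.
import numpy as np

class ToyKalmanFilter:
    """Constant-velocity filter: state [position, velocity], measured position."""
    def __init__(self):
        self.x = np.zeros(2)
        self.P = np.eye(2)
        self.F = np.array([[1.0, 0.1], [0.0, 1.0]])   # dt = 0.1 s
        self.H = np.array([[1.0, 0.0]])
        self.Q = 0.01 * np.eye(2)
        self.R = np.array([[0.05]])

    def step(self, z):
        # Predict, then compute the innovation and its covariance.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        innovation = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(2) - K @ self.H) @ self.P
        # Normalised innovation squared: large values indicate a fault.
        return float(innovation.T @ np.linalg.inv(S) @ innovation)

rng = np.random.default_rng(0)
kf, threshold = ToyKalmanFilter(), 9.0       # ~3-sigma bound for 1 dof (assumed)
for t in range(100):
    truth = 0.1 * (0.1 * t)                  # constant-velocity target: v = 0.1
    z = np.array([truth + 0.05 * rng.standard_normal()])
    if t > 60:
        z += 1.5                             # injected fault: sudden measurement bias
    nis = kf.step(z)
    if nis > threshold:
        print(f"t={t}: fault flagged (NIS={nis:.1f})")
```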

Relevance: 30.00%

Abstract:

Interior permanent-magnet synchronous motors (IPMSMs) have become attractive candidates for modern hybrid electric vehicles and industrial applications. To obtain good control performance, the electric drives of this kind of motor usually require one position sensor, one dc-link sensor, and at least two current sensors. Failure of any of these sensors can lead to degraded system performance or even instability. As such, sensor-fault-resilient control has become a very important issue in modern drive systems. This paper proposes a novel sensor fault detection and isolation algorithm based on an extended Kalman filter. It is robust to random system noise and efficient in real-time implementation. Moreover, the proposed algorithm is compact and can detect and isolate all of the sensor faults for IPMSM drives. A thorough theoretical analysis is provided, and the effectiveness of the proposed approach is demonstrated by extensive experimental results.
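
A minimal sketch of the isolation step follows: each sensor channel's measurement is compared against a model or observer prediction, and a channel whose residual stays above its threshold for several consecutive samples is declared faulty. The channel names, thresholds and persistence count are assumptions, not the paper's EKF design.

```python
# Sensor fault isolation sketch: per-channel residuals between measurements and
# model/observer predictions, with a persistence counter so a single noisy sample
# does not trigger an alarm. Channel names and thresholds are illustrative.

def isolate_sensor_faults(measured, predicted, thresholds, counters, persistence=5):
    """measured/predicted: dicts of channel -> value; returns channels declared faulty."""
    faulty = []
    for channel, z in measured.items():
        residual = abs(z - predicted[channel])
        if residual > thresholds[channel]:
            counters[channel] = counters.get(channel, 0) + 1
        else:
            counters[channel] = 0
        if counters[channel] >= persistence:
            faulty.append(channel)
    return faulty

counters = {}
thresholds = {"i_a": 2.0, "i_b": 2.0, "v_dc": 10.0, "theta": 0.1}
for step in range(8):
    measured = {"i_a": 5.0, "i_b": -5.0, "v_dc": 300.0, "theta": 1.2}
    predicted = {"i_a": 5.1, "i_b": -4.9, "v_dc": 300.5, "theta": 1.21}
    if step >= 2:
        measured["i_a"] = 0.0          # stuck current-sensor fault injected
    faults = isolate_sensor_faults(measured, predicted, thresholds, counters)
    if faults:
        print(f"step {step}: faulty sensor(s): {faults}")
```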

Relevance: 30.00%

Abstract:

Background: Detection of outbreaks is an important part of disease surveillance. Although many algorithms have been designed for detecting outbreaks, few have been specifically assessed against diseases that have distinct seasonal incidence patterns, such as those caused by vector-borne pathogens. Methods: We applied five previously reported outbreak detection algorithms to Ross River virus (RRV) disease data (1991-2007) for the four local government areas (LGAs) of Brisbane, Emerald, Redland and Townsville in Queensland, Australia. The methods used were the Early Aberration Reporting System (EARS) C1, C2 and C3 methods, the negative binomial CUSUM (NBC), the historical limits method (HLM), the Poisson outbreak detection (POD) method and the purely temporal SaTScan analysis. Seasonally adjusted variants of the NBC and SaTScan methods were developed. Some of the algorithms were applied using a range of parameter values, resulting in 17 variants of the five algorithms. Results: The 9,188 RRV disease notifications that occurred in the four selected regions over the study period showed marked seasonality, which adversely affected the performance of some of the outbreak detection algorithms. Most of the methods examined were able to detect the same major events. The exceptions were the seasonally adjusted NBC methods, which detected an excess of short signals. The NBC, POD and temporal SaTScan algorithms were the only methods that consistently had high true positive rates and low false positive and false negative rates across the four study areas. The timeliness of the outbreak signals generated by each method was also compared, but there was no consistency across outbreaks and LGAs. Conclusions: This study has highlighted several issues associated with applying outbreak detection algorithms to seasonal disease data. In the absence of a true gold standard, a quantitative comparison is difficult and caution should be taken when interpreting true positives, false positives, sensitivity and specificity.
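
For reference, the sketch below implements the EARS C1 statistic as it is commonly described: the current count is compared against the mean and standard deviation of the seven preceding counts, with an alarm when the standardised value exceeds three. The synthetic counts, the three-SD threshold and the zero-variance guard are assumptions, not the study's exact configuration.

```python
# Sketch of the EARS C1 statistic as commonly described: today's count is
# standardised against the mean and SD of the previous 7 counts. The alarm
# threshold (3 SDs) and the flat-baseline guard are assumptions.
import statistics

def ears_c1_alarms(counts, baseline=7, threshold=3.0):
    alarms = []
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        mu = statistics.mean(window)
        sigma = statistics.pstdev(window) or 1.0   # guard against a flat baseline
        c1 = (counts[t] - mu) / sigma
        if c1 > threshold:
            alarms.append((t, round(c1, 2)))
    return alarms

daily_counts = [2, 3, 1, 2, 4, 2, 3, 2, 3, 18, 25, 4, 2, 3]   # synthetic notifications
print(ears_c1_alarms(daily_counts))   # flags the spike at indices 9 and 10
```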

Relevance: 30.00%

Abstract:

Novel computer vision techniques have been developed to automatically detect unusual events in crowded scenes from the video feeds of surveillance cameras. The research is useful for the design of the next generation of intelligent video surveillance systems. The two major contributions are the construction of a novel machine learning model for multiple instance learning through compressive sensing, and the design of novel feature descriptors in the compressed video domain.

Relevance: 30.00%

Abstract:

This research contributes a fully-operational approach for managing business process risk in near real-time. The approach consists of a language for defining risks on top of process models, a technique to detect such risks as they eventuate during the execution of business processes, a recommender system for making risk-informed decisions, and a technique to automatically mitigate the detected risks when they are no longer tolerable. Through the incorporation of risk management elements in all stages of the lifecycle of business processes, this work contributes to the effective integration of the fields of Business Process Management and Risk Management.

Relevance: 30.00%

Abstract:

Due to the popularity of security cameras in public places, it is of interest to design an intelligent system that can efficiently detect events automatically. This paper proposes a novel algorithm for multi-person event detection. To ensure greater than real-time performance, features are extracted directly from compressed MPEG video. A novel histogram-based feature descriptor that captures the angles between extracted particle trajectories is proposed, which allows us to capture the motion patterns of multi-person events in the video. To alleviate the need for fine-grained annotation, we propose the use of Labelled Latent Dirichlet Allocation, a “weakly supervised” method that allows the use of coarse temporal annotations, which are much simpler to obtain. This novel system is able to run at approximately ten times real-time, while preserving state-of-the-art detection performance for multi-person events on a 100-hour real-world surveillance dataset (TRECVid SED).
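
The sketch below illustrates an angle-histogram descriptor in the spirit of the one described: each trajectory is summarised by a displacement vector, the angle between every pair of trajectories is computed, and the angles are binned into a normalised histogram. The displacement-vector summary and the bin count are assumptions, not the paper's exact feature.

```python
# Sketch of an angle-histogram descriptor: for each pair of particle trajectories
# (summarised here as net displacement vectors), compute the angle between them
# and accumulate a histogram. Bin count and the displacement summary are assumed.
import itertools
import numpy as np

def angle_histogram(displacements, n_bins=8):
    """displacements: (N, 2) array of per-trajectory displacement vectors."""
    hist = np.zeros(n_bins)
    for u, v in itertools.combinations(displacements, 2):
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
        angle = np.arccos(np.clip(cos, -1.0, 1.0))          # in [0, pi]
        bin_index = min(int(angle / np.pi * n_bins), n_bins - 1)
        hist[bin_index] += 1
    return hist / max(hist.sum(), 1.0)                      # normalised histogram

# Two people walking together plus one walking against the flow.
displacements = np.array([[1.0, 0.1], [1.0, -0.1], [-1.0, 0.0]])
print(angle_histogram(displacements))
```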

Relevance: 30.00%

Abstract:

Abnormal event detection has attracted a lot of attention in the computer vision research community in recent years, due to the increased focus on automated surveillance systems to improve security in public places. Owing to the scarcity of training data and the context-dependent definition of an abnormality, abnormal event detection is generally formulated as a data-driven approach in which activities are modeled in an unsupervised fashion during the training phase. In this work, we use a Gaussian mixture model (GMM) to cluster the activities during the training phase, and propose a Gaussian mixture model based Markov random field (GMM-MRF) to estimate the likelihood scores of new videos in the testing phase. Furthermore, we propose two new features, optical acceleration and the histogram of optical flow gradients, to detect the presence of any abnormal objects and speed violations in the scene. We show that our proposed method outperforms other state-of-the-art abnormal event detection algorithms on the publicly available UCSD dataset.
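
The sketch below shows only the GMM likelihood-scoring stage (without the MRF): a Gaussian mixture is fitted to features from normal training data, and test samples with unusually low log-likelihood are flagged. The synthetic features, the number of components and the percentile threshold are illustrative assumptions.

```python
# Minimal sketch of the GMM likelihood stage only (no MRF smoothing): fit a
# Gaussian mixture to features from normal training clips and flag test samples
# whose log-likelihood falls below a percentile threshold. The synthetic features
# and the 1st-percentile cut-off are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
normal_train = rng.normal(loc=0.0, scale=1.0, size=(500, 4))   # stand-in features
gmm = GaussianMixture(n_components=3, random_state=0).fit(normal_train)

threshold = np.percentile(gmm.score_samples(normal_train), 1)  # tolerate ~1% outliers

test = np.vstack([rng.normal(0.0, 1.0, size=(5, 4)),
                  rng.normal(6.0, 1.0, size=(2, 4))])           # last two are "abnormal"
scores = gmm.score_samples(test)
print(scores < threshold)   # the two abnormal rows should be flagged as True
```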

Relevance: 30.00%

Abstract:

We propose the use of optical flow information as a method for detecting and describing changes in the environment, from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used for the detection of depth discontinuities and appearance changes at key locations. To successfully achieve this task, a full discussion on camera positioning, distortion compensation, noise filtering, and parameter estimation is presented. We then extract statistical attributes from the flow signal to describe the location of the scene changes. We also employ clustering and dominant shape of vectors to increase the descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, such that topological localization can be achieved. We retrieve the most likely node according to the Mahalanobis and Chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes in diverse lighting conditions, considering indoor and outdoor environments and different robot platforms.
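
A minimal sketch of the retrieval step follows: each node stores the mean and covariance of its flow-derived features, and the current frame is matched to the node with the smallest Mahalanobis distance. The feature dimensionality and the node statistics below are made-up values for illustration.

```python
# Sketch of node retrieval by Mahalanobis distance: each stored node keeps the
# mean and covariance of its flow-derived feature vectors, and the current frame
# is assigned to the closest node. Feature dimensionality and node statistics
# are illustrative assumptions.
import numpy as np

def closest_node(feature, nodes):
    """nodes: dict of name -> (mean vector, covariance matrix)."""
    best_name, best_dist = None, np.inf
    for name, (mean, cov) in nodes.items():
        diff = feature - mean
        dist = float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name, best_dist

nodes = {
    "doorway":  (np.array([0.8, 0.1, 0.3]), np.diag([0.05, 0.02, 0.04])),
    "corridor": (np.array([0.2, 0.6, 0.1]), np.diag([0.03, 0.05, 0.02])),
}
current = np.array([0.75, 0.15, 0.28])
print(closest_node(current, nodes))    # expected: the "doorway" node
```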

Relevance: 30.00%

Abstract:

Viewer interest, evoked by video content, can potentially identify the highlights of a video. This paper explores the use of the facial expressions (FE) and heart rate (HR) of viewers, captured using a camera and a non-strapped sensor, for identifying interesting video segments. Data from ten subjects watching three videos showed that these signals are viewer-dependent and not synchronized with the video content. To address this issue, new algorithms are proposed to effectively combine the FE and HR signals for identifying the times when viewer interest is potentially high. The results show that, compared with subjective annotation and match-report highlights, ‘non-neutral’ FE and ‘relatively higher and faster’ HR are able to capture 60%-80% of goal, foul, and shot-on-goal soccer video events. FE is found to be more indicative of viewer interest than HR, but the fusion of these two modalities outperforms each of them.
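
The sketch below shows one simple way such a fusion rule could look: a segment is flagged when the facial expression is non-neutral or the heart rate is above a per-viewer baseline and rising. The thresholds, the baseline definition and the toy signals are assumptions, not the paper's calibrated algorithms.

```python
# Illustrative fusion sketch: a segment is marked as potentially interesting when
# the viewer's facial expression is non-neutral or the heart rate is both above
# its per-viewer baseline and rising. Thresholds, the baseline definition and the
# toy signals are assumptions.
import statistics

def interesting_segments(expressions, heart_rate, hr_margin=5.0):
    baseline = statistics.mean(heart_rate)
    flagged = []
    for t in range(1, len(heart_rate)):
        fe_active = expressions[t] != "neutral"
        hr_active = heart_rate[t] > baseline + hr_margin and heart_rate[t] > heart_rate[t - 1]
        if fe_active or hr_active:
            flagged.append(t)
    return flagged

expressions = ["neutral", "neutral", "surprise", "neutral", "neutral", "neutral"]
heart_rate  = [72, 71, 80, 86, 74, 73]
print(interesting_segments(expressions, heart_rate))   # -> [2, 3]
```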

Relevance: 30.00%

Abstract:

This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the directly-follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events that do not fit the automaton, which are identified as outliers. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
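
The sketch below captures the core idea in simplified form: count directly-follows pairs across the log, prune pairs with low relative frequency, and drop events that no longer fit a retained pair. The 5% threshold, the list-of-traces log format and the greedy single-pass filtering are assumptions; the actual technique is more involved.

```python
# Sketch of directly-follows filtering in the spirit of the described technique:
# count directly-follows pairs over all traces, prune pairs with low relative
# frequency, then drop events that do not fit a retained pair. The 5% threshold
# and the list-of-traces log format are illustrative assumptions.
from collections import Counter

def filter_log(log, min_rel_freq=0.05):
    follows = Counter(pair for trace in log for pair in zip(trace, trace[1:]))
    total = sum(follows.values())
    kept_pairs = {pair for pair, n in follows.items() if n / total >= min_rel_freq}

    filtered = []
    for trace in log:
        if not trace:
            filtered.append([])
            continue
        new_trace = [trace[0]]
        for event in trace[1:]:
            if (new_trace[-1], event) in kept_pairs:
                new_trace.append(event)      # event fits a frequent arc: keep it
            # otherwise the event is treated as an outlier and dropped
        filtered.append(new_trace)
    return filtered

log = [["a", "b", "c"]] * 20 + [["a", "x", "b", "c"]]   # "x" is a logging error
print(filter_log(log)[-1])                               # -> ['a', 'b', 'c']
```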

Relevance: 30.00%

Abstract:

Business processes are prone to continuous and unexpected changes. Process workers may start executing a process differently in order to adjust to changes in workload, season, guidelines or regulations, for example. Early detection of business process changes based on their event logs – also known as business process drift detection – enables analysts to identify and act upon changes that may otherwise affect process performance. Previous methods for business process drift detection are based on an exploration of a potentially large feature space, and in some cases they require users to manually identify the specific features that characterize the drift. Depending on the explored feature set, these methods may miss certain types of changes. This paper proposes a fully automated and statistically grounded method for detecting process drift. The core idea is to perform statistical tests over the distributions of runs observed in two consecutive time windows. By adaptively sizing the window, the method strikes a trade-off between classification accuracy and drift detection delay. A validation on synthetic and real-life logs shows that the method accurately detects typical change patterns and scales up to the extent that it is applicable for online drift detection.
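
The sketch below illustrates the kind of windowed statistical test described, simplified to trace variants and fixed windows: the variant frequencies in two adjacent windows are compared with a chi-square test of independence, and a small p-value signals a drift. The window contents, the variant simplification and the 0.05 significance level are assumptions; the method itself uses runs and adaptively sized windows.

```python
# Sketch of the windowed statistical test at the heart of drift detection:
# compare how often each trace variant occurs in two adjacent windows using a
# chi-square test of independence. Fixed windows, the trace-variant
# simplification and the 0.05 significance level are illustrative assumptions.
from collections import Counter
from scipy.stats import chi2_contingency

def drift_detected(window_a, window_b, alpha=0.05):
    variants = sorted({tuple(t) for t in window_a + window_b})
    count_a = Counter(tuple(t) for t in window_a)
    count_b = Counter(tuple(t) for t in window_b)
    table = [[count_a[v] for v in variants], [count_b[v] for v in variants]]
    _, p_value, _, _ = chi2_contingency(table)
    return p_value < alpha, p_value

before = [["a", "b", "c"]] * 40 + [["a", "c", "b"]] * 10
after  = [["a", "b", "c"]] * 10 + [["a", "c", "b"]] * 40   # behaviour has shifted
print(drift_detected(before, after))    # expected: (True, very small p-value)
```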

Relevance: 30.00%

Abstract:

This paper reviews a variety of advanced signal processing algorithms that have been developed at the University of Southampton as part of the Prometheus (PROgraMme for European Traffic flow with Highest Efficiency and Unprecedented Safety) research programme to achieve an intelligent driver warning system (IDWS). The IDWS includes visual detection of both generic obstacles and other vehicles, together with their tracking and identification, estimates of time to collision, and behavioural modelling of drivers for a variety of scenarios. These application areas are used to show the applicability of neurofuzzy techniques to the wide range of problems required to support an IDWS and, in the future, fully autonomous vehicles.