884 results for detection performance
Abstract:
In the last twenty years the aerospace and automotive industries have started working widely with composite materials, which are not easy to test using classic Non-Destructive Inspection (NDI) techniques. In parallel, the development of safety regulations sets ever higher standards for the qualification and certification of those materials. In this thesis a new concept for a non-destructive defect detection technique is proposed, based on Ultrawide-Band (UWB) Synthetic Aperture Radar (SAR) imaging. Similar SAR methods have already been applied both in minefield [22] and in head-stroke [14] detection. Moreover, feasibility studies have already demonstrated the validity of defect detection by means of UWB radars [12, 13]. The system was designed using an inexpensive commercial off-the-shelf radar device by Novelda, and several tests of the developed system were performed both on a metallic specimen (aluminum plate) and on a composite coupon (carbon fiber). The obtained results confirm the feasibility of the method and highlight the good performance of the developed system considering the radar resolution. In particular, the system is capable of discerning healthy coupons from damaged ones and of correctly reconstructing the reflectivity image of the tested defects, namely an 8 x 8 mm square bulge and 5 mm drilled holes on the metal specimen, and a 5 mm drilled hole on the composite coupon.
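The SAR imaging principle behind the abstract above can be illustrated with a minimal delay-and-sum back-projection sketch. All numbers here (geometry, pulse width, sampling rate) are assumed for illustration and are not the thesis parameters:

```python
import numpy as np

# Minimal delay-and-sum SAR back-projection sketch (assumed geometry).
c = 3e8                                    # propagation speed, m/s
aperture = np.linspace(-0.05, 0.05, 11)    # 11 radar positions along x, m
target = np.array([0.01, 0.10])            # point scatterer at (x, z), m

fs = 20e9                                  # fast-time sampling rate, Hz
t = np.arange(0, 3e-9, 1 / fs)             # fast-time axis

def pulse(tau):
    # Simulated received echo: a short Gaussian pulse at delay tau.
    return np.exp(-((t - tau) / 50e-12) ** 2)

echoes = []
for x in aperture:
    r = np.hypot(target[0] - x, target[1])
    echoes.append(pulse(2 * r / c))        # round-trip delay 2r/c
echoes = np.array(echoes)

# Back-project onto a pixel grid: sum each echo at the expected delay.
xs = np.linspace(-0.03, 0.03, 61)
zs = np.linspace(0.05, 0.15, 51)
image = np.zeros((len(zs), len(xs)))
for i, z in enumerate(zs):
    for j, xpix in enumerate(xs):
        for k, xa in enumerate(aperture):
            tau = 2 * np.hypot(xpix - xa, z) / c
            idx = int(round(tau * fs))
            if idx < len(t):
                image[i, j] += echoes[k, idx]

# The brightest pixel should fall near the true scatterer position.
zi, xj = np.unravel_index(np.argmax(image), image.shape)
peak = (xs[xj], zs[zi])
```

Only at the true scatterer position do all echoes add coherently, which is why the reflectivity image peaks there; a defect such as a bulge or drilled hole acts as such a scatterer.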
Abstract:
In cardiovascular disease, the definition and detection of the ECG parameters related to repolarization dynamics in post-MI patients remain a crucial unmet need. In addition, a 3D sensor in implantable medical devices would be a crucial means of assessing or predicting Heart Failure status, but the inclusion of such a feature is limited by hardware and firmware constraints. The aim of this thesis is the definition of a reliable surrogate of the 500 Hz ECG signal to reach the aforementioned objective. To evaluate the loss of reliability that sampling-frequency reduction causes in delineation performance, the signals were consecutively downsampled by factors of 2, 4 and 8, thus obtaining ECG signals sampled at 250, 125 and 62.5 Hz, respectively. The final goal is a feasibility assessment of the detection of the fiducial points, in order to translate them into meaningful clinical parameters for Heart Failure prediction, such as T-wave interval heterogeneity and the variability of the areas under T waves. An experimental setting for data collection on healthy volunteers was set up at the Bakken Research Center in Maastricht. A 16-channel ambulatory system, provided by TMSI, recorded the standard 12-lead ECG, two 3D accelerometers and a respiration sensor. The collection platform was built with TMSI's proprietary software Polybench; the data analysis of these signals was performed with Matlab. The main results of this study show that the 125 Hz sampling rate is a good candidate for reliable detection of fiducial points. T-wave intervals proved to be consistently stable, even at 62.5 Hz. Further studies would be needed to provide a better comparison between sampling at 250 Hz and 125 Hz for the areas under the T waves.
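The downsampling scheme described above (500 Hz divided by 2, 4 and 8) can be sketched as follows; the signal is a synthetic surrogate, and a production pipeline would apply an anti-aliasing low-pass filter before discarding samples:

```python
import numpy as np

# Decimate a 500 Hz ECG-like signal by factors 2, 4 and 8 to obtain
# the 250, 125 and 62.5 Hz versions discussed in the abstract.
# (Synthetic sine surrogate; real use needs anti-alias filtering.)
fs = 500.0
t = np.arange(0, 10, 1 / fs)             # 10 s of signal at 500 Hz
ecg = np.sin(2 * np.pi * 1.2 * t)        # crude 72 bpm surrogate

versions = {fs / q: ecg[::q] for q in (2, 4, 8)}

rates = sorted(versions)                 # [62.5, 125.0, 250.0]
lengths = {r: len(versions[r]) for r in rates}
```

Each halving of the sampling rate halves the number of samples available around a fiducial point, which is what degrades delineation precision at 62.5 Hz.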
Abstract:
Performance-based design (PBD), in a deterministic approach, characterizes performance objectives with respect to desired performance levels. The performance objectives are then associated with the damage state and the established level of seismic risk. Despite this rational approach, its application is still difficult. Reliable tools for capturing the evolution, distribution and quantification of damage are therefore needed. In addition, all phenomena related to non-linearity (materials and deformations) must also be taken into consideration. This research shows how damage mechanics can contribute to solving this problem, through an adaptation of the modified compression field theory and other complementary theories. The proposed formulation, adapted for monotonic, cyclic and pushover loading, makes it possible to consider the non-linear shear effects coupled with the flexural and axial-load mechanisms. This formulation is specifically applied to the non-linear analysis of concrete structural elements subjected to non-negligible shear effects. This new approach implemented in EfiCoS (a finite-element program based on damage mechanics), including the modelling criteria, is also presented here. Calibrations of this new approach, comparing its predictions with experimental data, were carried out for reinforced concrete shear walls as well as for bridge beams and piers where shear effects must be taken into consideration.
This new, improved version of the EfiCoS software proved capable of accurately evaluating the parameters associated with global performance, such as displacements, system strength, effects related to the cyclic response, and the quantification, evolution and distribution of damage. Remarkable results were also obtained with regard to the proper detection of engineering limit states such as cracking, unit strains, cover spalling, core crushing, local yielding of reinforcing bars and system degradation, among others. As a practical tool for applying PBD, relationships between the predicted damage indices and the performance levels were obtained and expressed in the form of graphs and tables. These graphs were developed as a function of the relative displacement and the displacement ductility. A particular table was developed to relate the engineering limit states, damage, relative displacement and traditional performance levels. The results showed excellent agreement with the experimental data, making the proposed formulation and the new version of EfiCoS powerful tools for applying the PBD methodology in a deterministic approach.
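The core idea of damage mechanics referred to above can be illustrated in one dimension: a scalar damage variable d in [0, 1] degrades the effective stiffness. The linear softening law and the material constants below are illustrative assumptions, not the EfiCoS constitutive model:

```python
# 1D damage-mechanics sketch: sigma = (1 - d) * E * eps, with a
# scalar damage variable d growing linearly between a threshold
# strain and a failure strain (illustrative law, assumed constants).
E = 30e9          # Young's modulus, Pa (typical concrete, assumed)
eps_0 = 1e-4      # damage threshold strain
eps_u = 1e-3      # strain at complete damage

def damage(eps):
    # Piecewise-linear damage evolution: 0 below threshold, 1 at failure.
    if eps <= eps_0:
        return 0.0
    if eps >= eps_u:
        return 1.0
    return (eps - eps_0) / (eps_u - eps_0)

def stress(eps):
    # Effective-stiffness degradation: secant stiffness (1 - d) * E.
    return (1.0 - damage(eps)) * E * eps

elastic = stress(5e-5)    # below threshold: purely elastic response
softened = stress(5e-4)   # d = 4/9 here, so stiffness drops to 5/9 E
```

Tracking d across the elements of a finite-element mesh is what lets a program of this kind quantify the evolution and spatial distribution of damage, and map damage indices to performance levels.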
Abstract:
This paper provides an overview of IDS types and how they work, as well as configuration considerations and issues that affect them. Advanced methods of increasing the performance of an IDS are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge work, a reference is provided for others interested in learning about and developing IDS solutions. Intrusion detection is an area requiring much further study in order to provide solutions that satisfy evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for other researchers and developers interested in the field of intrusion detection.
Abstract:
Cognitive radio (CR) was developed to utilize spectrum bands efficiently. Spectrum sensing and awareness represent the main tasks of a CR, providing the possibility of exploiting unused bands. In this thesis, we investigate the detection and classification of Long Term Evolution (LTE) single carrier-frequency division multiple access (SC-FDMA) signals, which are used in the LTE uplink, with applications to cognitive radio. We explore the second-order cyclostationarity of LTE SC-FDMA signals, and apply results obtained for the cyclic autocorrelation function to signal detection and classification (in other words, to spectrum sensing and awareness). The proposed detection and classification algorithms provide very good performance under various channel conditions, with a short observation time, at low signal-to-noise ratios, and with reduced complexity. The validity of the proposed algorithms is verified using signals generated and acquired by laboratory instrumentation, and the experimental results show a good match with computer simulation results.
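The cyclic autocorrelation function exploited above can be estimated directly from samples. The sketch below is generic, not the thesis algorithm: it uses a noise signal whose power is periodically modulated, which is second-order cyclostationary at the modulation frequency:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4096
alpha0 = 0.05                     # true cycle frequency (normalized)

# Noise with periodically modulated power -> second-order
# cyclostationarity at cycle frequency alpha0.
x = (1 + 0.8 * np.cos(2 * np.pi * alpha0 * np.arange(N))) \
    * rng.standard_normal(N)

def caf(x, alpha, tau=0):
    """Estimate the cyclic autocorrelation R_x(alpha; tau)."""
    n = np.arange(len(x) - tau)
    return np.mean(x[n + tau] * np.conj(x[n])
                   * np.exp(-2j * np.pi * alpha * n))

alphas = np.arange(0.0, 0.25, 0.005)
spectrum = np.abs(np.array([caf(x, a) for a in alphas]))

# Skip alpha = 0 (the ordinary autocorrelation); the remaining peak
# should sit at the cycle frequency alpha0.
best = alphas[1:][np.argmax(spectrum[1:])]
```

A detector of this type declares a signal present when |R_x(alpha; tau)| at a candidate cycle frequency exceeds a threshold, which is why it works at low SNR: stationary noise contributes no cyclic component.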
Abstract:
In recent years computer systems have become increasingly complex, and consequently the challenge of protecting them has become increasingly difficult. Various techniques have been implemented to counteract the misuse of computer systems, in the form of firewalls, antivirus software and intrusion detection systems. The complexity of networks and the dynamic nature of computer systems leave current methods with significant room for improvement. Computer scientists have recently drawn inspiration from mechanisms found in biological systems and, in the context of computer security, have focused on the human immune system (HIS). The human immune system provides an example of a robust, distributed system that offers a high level of protection against constant attacks. By examining the precise mechanisms of the human immune system, it is hoped the paradigm will improve the performance of real intrusion detection systems. This paper presents an introduction to recent developments in the field of immunology. It discusses the incorporation of a novel immunological paradigm, Danger Theory, and how this concept is inspiring artificial immune systems (AIS). Applications within the context of computer security are outlined, drawing direct reference to the underlying principles of Danger Theory; finally, the current state of intrusion detection systems is discussed and improvements are suggested.
Abstract:
For a robot to be autonomous and mobile, it must be equipped with a set of sensors that give it a better perception of the surrounding world, so that it can localize itself and the surrounding objects. CAMBADA is the robotic soccer team of the IRIS research group, from IEETA, University of Aveiro, which competes in the Middle-Size League of RoboCup. In competition, the main objective is to score more goals than are conceded; conceding no goals while scoring as much as possible is therefore desirable, and this thesis focuses on equipping an agent with a better localization capacity in defensive and offensive moments. A laser range finder was added to the CAMBADA robots, making them capable of detecting their own and the opponent's goal, and of detecting opponents in specific game situations. With the new information, and by adapting the Goalie and Penalty behaviors, the CAMBADA goalkeeper is now able to detect and track its own goal, and the CAMBADA striker performs better in penalty situations. The developed work was incorporated into the competition software of the robots, which allows the presentation, in this thesis, of experimental results obtained with physical robots on the laboratory field.
Abstract:
A micro gas sensor has been developed by our group for the detection of organo-phosphate vapors using an aqueous oxime solution. The analyte diffuses from the high-flow-rate gas stream through a porous membrane to the low-flow-rate aqueous phase. It reacts with the oxime PBO (1-Phenyl-1,2,3-butanetrione 2-oxime) to produce cyanide ions, which are then detected electrochemically from the change in solution potential. Previous work on this oxime-based electrochemistry indicated that the optimal buffer pH for the aqueous solution was approximately 10. A basic environment is needed for the oxime anion to form and the detection reaction to take place. At this specific pH, the potential response of the sensor to an analyte (such as acetic anhydride) is maximized. However, sensor response slowly decreases as the aqueous oxime solution ages, by as much as 80% in the first 24 hours. The decrease in sensor response is due to cyanide produced during the oxime degradation process, as evidenced by a cyanide-selective electrode. Solid-phase micro-extraction carried out on the oxime solution revealed several other possible degradation products, including acetic acid, N-hydroxy benzamide, benzoic acid, benzoyl cyanide, 1-Phenyl 1,3-butadione, 2-isonitrosoacetophenone and an imine derived from the oxime. It was concluded that degradation occurred through nucleophilic attack by a hydroxide or oxime anion to produce cyanide, as well as through a nitrogen-atom rearrangement similar to the Beckmann rearrangement. The stability of the oxime in organic solvents is most likely due to the lack of water, and specifically of hydroxide ions. The reaction between oxime and organo-phosphate to produce cyanide ions requires hydroxide ions, and therefore pure organic solvents are not compatible with the current micro-sensor electrochemistry.
By combining a concentrated organic oxime solution with the basic aqueous buffer just prior to its use in the detection process, oxime degradation can be avoided while preserving the original electrochemical detection scheme. Based on beaker-cell experiments with cyanide-selective electrodes, ethanol was chosen as the best organic solvent due to its stabilizing effect on the oxime, minimal interference with the aqueous electrochemistry, and compatibility with the current microsensor material (PMMA). Further studies showed that ethanol had a small effect on micro-sensor performance, reducing the rate of cyanide production and decreasing the overall response time. To avoid incomplete mixing of the aqueous and organic solutions, they were pre-mixed externally at a 10:1 aqueous-to-organic ratio. To adapt the microsensor design so that mixing could take place within the device, a small serpentine-channel component was fabricated with the same dimensions and material as the original sensor. This allowed seamless integration of the microsensor with the serpentine mixing channel. Mixing in the serpentine microchannel takes place via diffusion. Both detector potential response and diffusional mixing improve with increased liquid residence time, and thus decreased liquid flow rate. Micromixer performance was studied at a 10:1 aqueous buffer to organic solution flow-rate ratio, for a total flow rate of 5.5 μL/min. It was found that the sensor response utilizing the integrated micromixer was nearly identical to the response when the solutions were premixed and fed at the same rate.
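The requirement that diffusional mixing complete within the serpentine channel can be checked with back-of-the-envelope arithmetic: the residence time must exceed the transverse diffusion time t_mix ~ w^2 / (2D). The channel dimensions and diffusivity below are assumed values for illustration, not the actual device geometry; only the 10:1 ratio and 5.5 μL/min total flow come from the text:

```python
# Diffusive-mixing feasibility check for a serpentine micromixer at
# a 10:1 buffer-to-organic ratio and 5.5 uL/min total flow.
# Channel width, depth, length and diffusivity are assumed values.
total_flow = 5.5 / 60 * 1e-9          # 5.5 uL/min in m^3/s
buffer_flow = total_flow * 10 / 11    # aqueous buffer share
organic_flow = total_flow * 1 / 11    # organic oxime share

w = 100e-6                            # channel width, m (assumed)
h = 50e-6                             # channel depth, m (assumed)
L = 0.15                              # unrolled channel length, m (assumed)
D = 1e-9                              # small-molecule diffusivity, m^2/s

velocity = total_flow / (w * h)       # mean flow speed in the channel
residence_time = L / velocity         # time the liquid spends mixing
t_mix = w ** 2 / (2 * D)              # transverse diffusion time scale

mixed = residence_time > t_mix        # True: diffusion has time to act
```

This scaling is why lowering the flow rate improves mixing: residence time grows inversely with flow while t_mix is fixed by geometry.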
Abstract:
We consider an LTE network where a secondary user acts as a relay, transmitting data to the primary user using a decode-and-forward mechanism, transparent to the base station (eNodeB). Clearly, the relay can decode symbols more reliably if the employed precoder matrix indicators (PMIs) are known. However, for the closed-loop spatial multiplexing (CLSM) transmit mode, this information is not always embedded in the downlink signal, leading to a need for effective methods to determine the PMI. In this thesis, we consider 2x2 MIMO and 4x4 MIMO downlink channels corresponding to CLSM and formulate two techniques to estimate the PMI at the relay using a hypothesis-testing framework. We evaluate their performance via simulations for various ITU channel models over a range of SNR and for different channel quality indicators (CQIs). We compare them to the case when the true PMI is known at the relay and show that the performance of the proposed schemes is within 2 dB at 10% block error rate (BLER) in almost all scenarios. Furthermore, the techniques add minimal computational overhead over the existing receiver structure. Finally, we also identify scenarios in which using the proposed precoder detection algorithms in conjunction with the cooperative decode-and-forward relaying mechanism benefits the primary user equipment (PUE) and improves its BLER performance. We therefore conclude that the proposed algorithms, as well as the cooperative relaying mechanism at the CMR, can be gainfully employed in a variety of real-life scenarios in LTE networks.
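The hypothesis-testing idea above can be sketched for a 2x2 case: given the channel estimate and known reference symbols, test each codebook entry and keep the one with the smallest residual. The codebook below is illustrative, not the actual LTE (TS 36.211) tables, and the noise level and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2x2 precoder codebook (illustrative entries only).
codebook = [
    np.array([[1, 0], [0, 1]]) / np.sqrt(2),
    np.array([[1, 1], [1, -1]]) / 2,
    np.array([[1, 1], [1j, -1j]]) / 2,
]

# Random complex channel and known reference symbols.
H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
s = np.array([1 + 1j, 1 - 1j]) / np.sqrt(2)
true_pmi = 2

# Received signal: true precoder applied, plus additive noise.
y = H @ codebook[true_pmi] @ s
y = y + 0.05 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

# Hypothesis test: pick the codebook entry minimizing residual power.
residuals = [np.linalg.norm(y - H @ W @ s) for W in codebook]
detected_pmi = int(np.argmin(residuals))
```

The residual under the true hypothesis contains only noise, while wrong hypotheses leave a signal-dependent mismatch, so the detector's reliability improves with SNR, consistent with the 2 dB gap reported above.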
Abstract:
The immune system provides a rich metaphor for computer security: anomaly detection that works in nature should work for machines. However, early artificial immune system approaches to computer security had only limited success. Arguably, this was because those artificial systems were based on too simplistic a view of the immune system. We present here a second-generation artificial immune system for process anomaly detection. It improves on earlier systems by having different artificial cell types that process information. After detailing how to build such second-generation systems, we find that communication between cell types is key to performance. Through realistic testing and validation we show that second-generation artificial immune systems are capable of anomaly detection beyond generic system policies. The paper concludes with a discussion and an outline of the next steps in this exciting area of computer security.
Abstract:
Automatic analysis of human behaviour in large collections of videos is gaining interest, even more so with the advent of file-sharing sites such as YouTube. However, challenges remain owing to several factors, such as inter- and intra-class variations, cluttered backgrounds, occlusion, camera motion, and scale, view and illumination changes. This research focuses on modelling human behaviour for action recognition in videos. The developed techniques are validated on large-scale benchmark datasets and applied to real-world scenarios such as soccer videos. Three major contributions are made. The first concerns the proper choice of a feature representation for videos. It involved a study of state-of-the-art techniques for action recognition, feature extraction and dimensionality reduction, so as to yield the best performance with optimal computational requirements. Secondly, temporal modelling of human behaviour is performed. This involved frequency analysis and temporal integration of local information in the video frames to yield a temporal feature vector; current practice mostly averages the frame information over an entire video and neglects the temporal order. Lastly, the proposed framework is applied and further adapted to a real-world scenario, soccer videos. To this end, a dataset of video sequences depicting players falling was created from actual match data and used to experimentally evaluate the proposed framework.
Abstract:
In this study, the Schwarz Information Criterion (SIC) is applied in order to detect change-points in time series of surface water quality variables. The change-point analysis detected change-points in both the mean and the variance of the series under study. Time variations in environmental data are complex, and they can hinder the identification of the so-called change-points when traditional models are applied to this type of problem. The assumptions of normality and uncorrelatedness do not hold in some time series, and so a simulation study is carried out in order to evaluate the methodology's performance when applied to non-normal and/or time-correlated data.
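SIC-based change-point detection of the kind described above compares the criterion under "no change" against the best "change at position k" model; a change is declared when the penalized fit improves. The sketch below uses a synthetic series with a known mean shift and a Chen-Gupta-style SIC for a mean change; it is an illustration, not the study's code:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic series: mean shifts from 0 to 2 at index 60.
x = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
n = len(x)

def sic_no_change(x):
    # SIC under H0 (single mean, single variance).
    n = len(x)
    s2 = np.mean((x - x.mean()) ** 2)
    return n * np.log(2 * np.pi * s2) + n + 2 * np.log(n)

def sic_change(x, k):
    # SIC under H1 (mean shift after position k); one extra
    # parameter, hence the larger penalty 3 * log(n).
    n = len(x)
    a, b = x[:k], x[k:]
    s2 = (np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2)) / n
    return n * np.log(2 * np.pi * s2) + n + 3 * np.log(n)

ks = list(range(2, n - 1))
sics = [sic_change(x, k) for k in ks]
k_hat = ks[int(np.argmin(sics))]                  # estimated change-point
change_detected = min(sics) < sic_no_change(x)    # model comparison
```

The same scheme extends to variance changes by letting the two segments carry separate variance estimates; the simulation study mentioned above probes how such detectors behave when normality or independence fails.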
Abstract:
Plants frequently suffer contamination by toxigenic fungi, whose mycotoxins can be produced throughout the growth, harvest, drying and storage periods. The objective of this work was to validate a fast and highly sensitive method for the detection of toxins in medicinal and aromatic plants, optimizing the joint co-extraction of aflatoxins (AF: AFB1, AFB2, AFG1 and AFG2) and ochratoxin A (OTA), using Aloysia citrodora P. (lemon verbena) as a case study. For optimization purposes, samples were spiked (n=3) with standard solutions of a mix of the four AFs and OTA, at 10 ng/g for AFB1, AFG1 and OTA, and at 6 ng/g for AFB2 and AFG2. Several extraction procedures were tested: i) ultrasound-assisted extraction in sodium chloride and methanol/water (80:20, v/v) [(OTA+AFs)1]; ii) maceration in methanol/1% NaHCO3 (70:30, v/v) [(OTA+AFs)2]; iii) maceration in methanol/1% NaHCO3 (70:30, v/v) (OTA1); and iv) maceration in sodium chloride and methanol/water (80:20, v/v) (AF1). AF and OTA were purified using the mycotoxin-specific immunoaffinity columns AflaTest WB and OchraTest WB (VICAM), respectively. Separation was performed with a Merck Chromolith Performance C18 column (100 x 4.6 mm) by reverse-phase HPLC coupled to a fluorescence detector (FLD) and a photochemical derivatization system (for AF). The recoveries obtained from the spiked samples showed that the single-extraction methods (OTA1 and AF1) performed better than the co-extraction methods. For in-house validation of the selected methods OTA1 and AF1, recovery and precision were determined (n=6). The recovery of OTA for method OTA1 was 81%, and the intermediate precision (RSDint) was 1.1%. The recoveries of AFB1, AFB2, AFG1 and AFG2 ranged from 64% to 110% for method AF1, with RSDint lower than 5%. Methods OTA1 and AF1 showed precision and recoveries within the legislated values and were found to be suitable for the extraction of OTA and AF from the matrix under study.
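The recovery and precision figures used in validations like the one above follow from standard formulas: recovery is the mean measured concentration over the spiking level, and RSD is the standard deviation relative to the mean. The replicate values below are hypothetical numbers chosen only to illustrate the arithmetic:

```python
import statistics

# Recovery (%) and relative standard deviation (RSD, %) for n = 6
# replicate determinations of a spiked sample. The measured values
# here are hypothetical, for illustration only.
spiked_ng_g = 10.0                            # spiking level, ng/g
measured = [8.2, 8.0, 8.1, 8.1, 8.2, 8.0]     # replicate results, ng/g

mean = statistics.mean(measured)
recovery_pct = 100 * mean / spiked_ng_g       # mean found / spiked
rsd_pct = 100 * statistics.stdev(measured) / mean
```

Regulatory acceptance criteria are stated in exactly these terms, which is why the abstract reports recovery ranges and RSDint values for each analyte.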
Abstract:
PURPOSE We aimed to evaluate the added value of diffusion-weighted imaging (DWI) over standard magnetic resonance imaging (MRI) for detecting post-treatment cervical cancer recurrence. The detection accuracy of T2-weighted (T2W) images was compared with that of T2W MRI combined with either dynamic contrast-enhanced (DCE) MRI or DWI. METHODS Thirty-eight women with clinically suspected uterine cervical cancer recurrence more than six months after treatment completion were examined with 1.5 Tesla MRI, including T2W, DCE, and DWI sequences. Disease was confirmed histologically and correlated with the MRI findings. The diagnostic performance of T2W imaging and of its combination with either DCE or DWI was analyzed. Sensitivity, positive predictive value, and accuracy were calculated. RESULTS Thirty-six women had histologically proven recurrence. The accuracy for recurrence detection was 80% with T2W/DCE MRI and 92.1% with T2W/DWI. The addition of DCE sequences did not significantly improve the diagnostic ability of T2W imaging, and this sequence combination misclassified two patients as false positives and seven as false negatives. The T2W/DWI combination yielded a positive predictive value of 100% and only three false negatives. CONCLUSION The addition of DWI to T2W sequences considerably improved the diagnostic ability of MRI. Our results support the inclusion of DWI in the initial MRI protocol for the detection of cervical cancer recurrence, leaving DCE sequences as an option for uncertain cases.
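The T2W/DWI figures above are mutually consistent under standard confusion-matrix arithmetic: 38 patients, 36 with proven recurrence, three false negatives and no false positives (PPV 100%) imply 33 true positives and 2 true negatives. A quick check:

```python
# Confusion-matrix arithmetic for the T2W/DWI results quoted above:
# 36 recurrences among 38 patients, 3 false negatives, 0 false positives.
tp, fn, fp, tn = 33, 3, 0, 2

sensitivity = tp / (tp + fn)                  # true positive rate
ppv = tp / (tp + fp)                          # positive predictive value
accuracy = (tp + tn) / (tp + fn + fp + tn)    # overall agreement
```

The computation reproduces the reported accuracy (35/38 = 92.1%) and PPV (100%), with a sensitivity of 33/36 ≈ 91.7%.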