892 results for Detection system


Relevance:

30.00%

Publisher:

Abstract:

This report describes the development of a technique for automatic wheezing recognition in digitally recorded lung sounds. The method is based on extracting and processing spectral information from the respiratory cycle and using these data for user feedback and automatic recognition. The respiratory cycle is first pre-processed to normalize its spectral information, and its spectrogram is then computed. The spectrogram image is then processed by a two-dimensional convolution filter and a half-threshold in order to increase the contrast and isolate its highest-amplitude components, respectively. To generate more compact data for automatic recognition, the spectral projection of the processed spectrogram is computed and stored as an array. The highest-magnitude values of the array and their respective spectral values are then located and used as inputs to a multi-layer perceptron artificial neural network, which yields an automatic indication of the presence of wheezes. For validation of the methodology, lung sounds recorded from three different repositories were used. The results show that the proposed technique achieves 84.82% accuracy in the detection of wheezing for an isolated respiratory cycle and 92.86% accuracy when detection is carried out on groups of respiratory cycles obtained from the same person. The system also presents the original recorded sound and the post-processed spectrogram image so that the user can draw their own conclusions from the data.
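
The processing chain described above can be sketched in a few lines of Python. The block below is a minimal illustration, not the authors' implementation: the window length, smoothing kernel, number of retained peaks, and network size are assumptions chosen only to make the pipeline concrete.

```python
import numpy as np
from scipy.signal import spectrogram, convolve2d
from sklearn.neural_network import MLPClassifier

def wheeze_features(cycle, fs, n_peaks=8):
    """Spectral-peak features for one respiratory cycle (parameters are illustrative)."""
    cycle = cycle / (np.max(np.abs(cycle)) + 1e-12)        # amplitude normalisation
    f, t, S = spectrogram(cycle, fs=fs, nperseg=512, noverlap=384)
    S = convolve2d(S, np.ones((3, 3)) / 9.0, mode="same")  # 2-D smoothing filter
    S[S < 0.5 * S.max()] = 0.0                             # half-threshold: keep strong components
    proj = S.sum(axis=1)                                   # spectral projection over time
    idx = np.argsort(proj)[-n_peaks:]                      # highest-magnitude frequency bins
    return np.concatenate([proj[idx], f[idx]])             # magnitudes plus their frequencies

def train_detector(cycles, fs, labels):
    """Fit an MLP on labelled cycles (labels: 1 = wheeze present, 0 = normal)."""
    X = np.vstack([wheeze_features(c, fs) for c in cycles])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    return clf.fit(X, labels)
```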

Relevance:

30.00%

Publisher:

Abstract:

Acute cerebral hemorrhage (ACH) is an important clinical problem that is often monitored and studied with expensive devices such as computed tomography, magnetic resonance imaging, and positron emission tomography. These devices are not readily available in economically underdeveloped regions of the world, emergency departments, and emergency zones. We have developed a less expensive tool for non-contact monitoring of ACH. The system measures the magnetic induction phase shift (MIPS) between the electromagnetic signals on two coils. ACH was induced in 6 experimental rabbits and edema in 4 control rabbits by stereotactic methods, and their intracranial pressure and heart rate were monitored for 1 h. Signals were continuously monitored for up to 1 h at an excitation frequency of 10.7 MHz. Autologous blood was administered to the experimental group, and saline to the control group (1 to 3 mL), by injection of 1 mL every 5 min. The results showed a significant increase in MIPS as a function of the injection volume, whereas the heart rate remained stable. In the experimental (ACH) group, there was a statistically significant positive correlation between intracranial pressure and MIPS. The change in MIPS was greater in the ACH group than in the control group. This high-sensitivity system could detect a 1-mL change in blood volume, and the MIPS was significantly related to the intracranial pressure. This observation suggests that the method could be valuable for detecting early warning signs in emergency medicine and critical care units.
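
The quantity being tracked is the phase difference between the excitation-coil and detection-coil signals. As a rough illustration (not the authors' hardware processing), the phase of a sampled signal relative to a reference can be estimated by complex demodulation at the excitation frequency; the sampling rate and signal model below are assumptions.

```python
import numpy as np

def phase_shift_deg(ref, meas, f0, fs):
    """Phase of `meas` relative to `ref` at frequency f0 (Hz), both sampled at fs (Hz)."""
    t = np.arange(len(ref)) / fs
    lo = np.exp(-2j * np.pi * f0 * t)              # complex demodulation at f0
    return np.degrees(np.angle(np.sum(meas * lo) / np.sum(ref * lo)))

# Simulated example: the detection-coil signal lags the excitation by 0.5 degrees
fs, f0 = 100e6, 10.7e6
t = np.arange(1000) / fs                           # 1000 samples = 107 full cycles at 10.7 MHz
ref = np.cos(2 * np.pi * f0 * t)
meas = np.cos(2 * np.pi * f0 * t - np.radians(0.5))
print(phase_shift_deg(ref, meas, f0, fs))          # approximately -0.5
```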

Relevance:

30.00%

Publisher:

Abstract:

The difficulty of identification, the lack of segregation systems, and the absence of suitable standards for the coexistence of non-transgenic and transgenic soybean all contribute to the contamination that occurs during the production system. The objective of this study was to evaluate the efficiency of two methods for detecting mixtures of genetically modified (GM) seeds in samples of non-GM soybean, so that seed lots can be assessed within the standards established by seed legislation. Two soybean sample sizes (200 and 400 seeds), cv. BRSMG 810C (non-GM) and BRSMG 850GRR (GM), were assessed with four contamination levels (addition of GM seeds to obtain 0.0%, 0.5%, 1.0%, and 1.5% contamination) and two detection methods: lateral flow immunoassay (ILF) and bioassay (pre-imbibition in a 0.6% herbicide solution; 25 ºC; 16 h). The bioassay is efficient in detecting the presence of GM seeds in seed samples of non-GM soybean, even for contamination lower than 1.0%, provided that the seeds have high physiological quality. The ILF was positive, detecting the presence of the target protein in contaminated samples, which indicates the effectiveness of the test. There was a significant correlation between the two detection methods (r = 0.82; p < 0.0001). Sample size did not influence the efficiency of the two methods in detecting the presence of GM seeds.

Relevance:

30.00%

Publisher:

Abstract:

This bachelor's thesis examines the benefits of applying a current distortion detection device in customer-premises low-voltage networks. The purpose of the study was to find out whether there are benefits to measuring current distortion in low-voltage residential networks and to conclude who can benefit from measuring this aspect of power quality. The research focuses on benefits in light of standardization in Europe and the United States of America. The thesis also gives examples of appliances in which a current distortion detection device could be used, along with a possible illustration of a user interface for the device. The research was conducted as an analysis of the benefits of a current distortion detection device in residential low-voltage networks and was based on a literature review. The study is divided into three sections. The first explains the reasons why the device would be beneficial, the second portrays a theoretical low-cost device that could detect single-phase current distortion, and the last discusses the benefits of using a current distortion detection device while focusing on the beneficiaries. Based on the results of this research, there are benefits to using a current distortion detection device. The main benefiting party was found to be manufacturers, as they are held responsible for limiting current distortion on behalf of consumers. Manufacturers could adjust their equipment to respond better to distortion by having access to the ongoing current distortion in the network. The other benefiting party is system operators, who could better locate distortion issues in low-voltage residential networks and start preventing the long-term problems caused by current distortion early on.
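
One standard way to quantify the current distortion such a device would report is the total harmonic distortion (THD) of the load current, which is the figure the relevant standards limit. The sketch below is a generic illustration under assumed sampling parameters, not the device design proposed in the thesis.

```python
import numpy as np

def current_thd(i_samples, fs, f_fund=50.0, n_harmonics=40):
    """Total harmonic distortion of a sampled current waveform.
    f_fund is the mains fundamental (50 Hz assumed here; 60 Hz in North America)."""
    spectrum = np.abs(np.fft.rfft(i_samples)) / len(i_samples)
    freqs = np.fft.rfftfreq(len(i_samples), d=1.0 / fs)

    def mag(f):                                   # magnitude of the bin closest to f
        return spectrum[np.argmin(np.abs(freqs - f))]

    fund = mag(f_fund)
    harm = np.sqrt(sum(mag(k * f_fund) ** 2 for k in range(2, n_harmonics + 1)))
    return harm / fund

# Example: a fundamental plus a 20 % third harmonic gives THD of about 0.2
fs = 10_000
t = np.arange(fs) / fs                            # one second of data
i = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 150 * t)
print(current_thd(i, fs))                         # ~ 0.20
```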

Relevance:

30.00%

Publisher:

Abstract:

Epilepsy is a chronic brain disorder characterized by recurring seizures. An automatic seizure detector, incorporated into a mobile closed-loop system, can improve the quality of life of people with epilepsy. Commercial EEG headbands, such as the Emotiv Epoc, have the potential to be used as the data acquisition devices for such a system. In order to estimate that potential, epileptic EEG signals from the commercial devices were emulated in this work based on EEG data from a clinical dataset. The emulated characteristics include the referencing scheme, the set of electrodes used, the sampling rate, the sample resolution, and the noise level. The performance of an existing algorithm for the detection of epileptic seizures, developed in the context of clinical data, was evaluated on the emulated commercial data. The results show that after the transformation of the data towards the characteristics of the Emotiv Epoc, the detection capabilities of the algorithm are mostly preserved. The ranges of acceptable changes in the signal parameters are also estimated.
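
A rough sketch of how clinical EEG could be degraded towards the listed headset characteristics is given below. The target values (128 Hz sampling, 14-bit resolution, a common-average reference over the kept channels, and additive Gaussian sensor noise) are assumptions made for illustration and are not necessarily the exact parameters used in the thesis.

```python
import numpy as np
from fractions import Fraction
from scipy.signal import resample_poly

def emulate_headset(eeg, fs_in, keep_channels, fs_out=128, n_bits=14,
                    lsb_uv=0.51, noise_uv=1.0, seed=0):
    """Degrade clinical EEG (channels x samples, in microvolts) towards assumed
    consumer-headset characteristics: electrode subset, re-referencing,
    lower sampling rate, coarser quantization, and added sensor noise."""
    rng = np.random.default_rng(seed)
    x = eeg[keep_channels] - eeg[keep_channels].mean(axis=0)        # common-average reference
    r = Fraction(int(fs_out), int(fs_in))
    x = resample_poly(x, r.numerator, r.denominator, axis=1)        # resample to target rate
    x = x + rng.normal(0.0, noise_uv, size=x.shape)                 # broadband sensor noise
    q = np.clip(np.round(x / lsb_uv),
                -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)        # quantize to n_bits
    return q * lsb_uv
```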

Relevance:

30.00%

Publisher:

Abstract:

The addition of L-glutamate (L-GLU) and L-methionine sulfoximine (L-MSO) to mechanically isolated, photosynthetically competent Asparagus sprengeri mesophyll cells suspended in 1 mM CaSO4 caused an immediate transient alkalinization of the cell suspension medium in both the light and the dark. The alkalinization response was specific and stereospecific, as none of the L-isomers of the other 19 protein amino acids tested, nor D-GLU, gave this response. Uptake of 14C-L-GLU was stimulated by light. The addition of non-radioactive L-GLU or L-GLU analogs together with 14C-L-GLU showed that only L-GLU and L-MSO stimulated alkalinization while inhibiting the uptake of 14C-L-GLU. Both the L-GLU-dependent alkalinization and the uptake of 14C-L-GLU were stimulated when the external pH was decreased from 6.5 to 5.5. Increasing external K+ concentrations inhibited the uptake of 14C-L-GLU, while fusicoccin (FC) stimulated uptake. The L-GLU-dependent alkalinization response exhibited monophasic saturation kinetics, whereas the uptake of 14C-L-GLU exhibited biphasic saturation kinetics; in addition to a saturable component, the uptake kinetics also showed a linear component. Addition of L-GLU and L-MSO caused internal acidification of the cell, as measured by a change in the distribution of 14C-DMO. There was no change in K+ efflux when L-GLU was added. An H+ to L-GLU influx stoichiometry of 3:1 was measured at an external L-GLU concentration of 0.5 mM, and it increased with increasing external L-GLU concentration. Metabolism of L-GLU was detected manometrically by observing an increase in CO2 evolution upon the addition of L-GLU and by detection of 14CO2 evolution upon the addition of 14C-L-GLU; 14CO2 evolution was higher in the dark than in the light. The data are consistent with the operation of an H+/L-GLU cotransport system. The data also show that attempts to quantify the stoichiometry of the process were complicated by the metabolism of L-GLU.

Relevance:

30.00%

Publisher:

Abstract:

This work includes two major parts. The first part concentrated on studies of the application of the high-performance liquid chromatography-particle beam interface-mass spectrometry (HPLC-PB-MS) system to some pesticides. Factors that affect the detection sensitivity were studied, and the linearity ranges and detection limits of ten pesticides are also given. The second part concentrated on studies of the reduction phenomena of nitro compounds in the HPLC-PB-MS system. Direct-probe mass spectrometry and gas chromatography-mass spectrometry techniques were also used in the work. Factors that affect the reduction of the nitro compounds were studied, and a possible explanation is proposed. The final part of this work included studies of the reduction behavior of some other compounds in the HPLC-PB-MS system, among them quinones, sulfoxides, and sulfones.

Relevance:

30.00%

Publisher:

Abstract:

On average, approximately 13% of the water withdrawn by Canadian municipal water suppliers is lost before it reaches final users. This is an important topic for several reasons: water losses cost money, losses force water agencies to draw more water from lakes and streams and thereby put more stress on aquatic ecosystems, leaks reduce system reliability, leaks may contribute to future pipe failures, and leaks may allow contaminants to enter water systems, reducing water quality and threatening the health of water users. Some benefits of leak detection fall outside water agencies' accounting purview (e.g. reduced health risks to households connected to public water supply systems) and, as a result, may not be considered adequately in water agency decision-making. Because of the regulatory environment in which Canadian water agencies operate, some of these benefits, especially those external to the agency or those that may accrue to the agency in future time periods, may not be fully counted when agencies decide on leak detection efforts. Our analysis suggests potential reforms to promote increased efforts for leak detection: adoption of a Canada-wide goal of universal water metering; development of full-cost accounting and pricing for water supplies; and co-operation amongst the provinces to promulgate standards for leak detection efforts and to provide incentives that promote improved efficiency and rational investment decision-making.

Relevance:

30.00%

Publisher:

Abstract:

Changes are made continuously to the source code of software systems to accommodate client needs and to correct faults. Continuous change can lead to code and design defects. Design defects are poor solutions to recurring design or implementation problems, typically in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, lack of understanding, and their experience, developers cannot always follow design standards and coding techniques such as design patterns. Consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without design defects. Yet only a few of these authors have carried out empirical studies on the impact of design defects on comprehension, and none of them has studied the impact of design defects on the effort developers need to correct faults. In this thesis, we make three main contributions. The first contribution is an empirical study that provides evidence of the impact of design defects on comprehension and change. We design and conduct two experiments with 59 subjects to evaluate the impact of the composition of two occurrences of Blob or two occurrences of spaghetti code on the performance of developers carrying out comprehension and change tasks. We measure the developers' performance using: (1) the NASA task load index for their effort, (2) the time they spent completing their tasks, and (3) their percentages of correct answers. The results of the two experiments show that two occurrences of Blob or spaghetti code are a significant obstacle to developer performance during comprehension and change tasks. These results justify previous research on the specification and detection of design defects. Software development teams should warn developers against high numbers of design-defect occurrences and recommend refactorings at each step of the development process to remove these defects whenever possible. In the second contribution, we study the relationship between design defects and faults, and in particular the impact of the presence of design defects on the effort needed to correct faults. We measure the fault-correction effort using three indicators: (1) the duration of the correction period, (2) the number of fields and methods touched by the fault correction, and (3) the entropy of the fault corrections in the source code. We conduct an empirical study with 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results show that the correction period is longer for faults involving classes with design defects. Moreover, correcting faults in classes with design defects changes more files, fields, and methods.
We also observe that, after a fault is corrected, the number of design-defect occurrences in the classes involved in the correction decreases. Understanding the impact of design defects on the effort developers need to correct faults is important to help development teams better assess and anticipate the impact of their design decisions, and thus to focus their efforts on improving the quality of their systems. Development teams should monitor their systems for design defects and remove them, because these defects are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool able to detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce costs, effort, and resources by allowing practitioners to identify and take into account design-defect occurrences as they find them during comprehension and change. Researchers have proposed approaches to detect design-defect occurrences, but these approaches currently have four limitations: (1) they require extensive knowledge of design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach to detect design defects based on a machine-learning technique, support vector machines, that takes practitioners' feedback into account. Through an empirical study of three systems and four design defects, we show that the precision and recall of SMURF are higher than those of DETEX and BDTEX when detecting design-defect occurrences. We also show that SMURF can be applied in both intra-system and inter-system configurations. Finally, we show that the precision and recall of SMURF improve when practitioners' feedback is taken into account.
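
At its core, the described approach trains a support vector machine on per-class code metrics and folds practitioner feedback back into the training set. The sketch below illustrates that loop with scikit-learn; the feature set, the feedback mechanism, and the retraining policy are simplified assumptions, not the actual SMURF design.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

class DefectDetector:
    """SVM-based anti-pattern detector that is retrained as practitioners
    confirm or reject reported occurrences (simplified illustration)."""

    def __init__(self):
        self.model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        self.X, self.y = [], []              # accumulated labelled class metrics

    def fit(self, metrics, labels):
        """metrics: rows of per-class measures (e.g. size, coupling); labels: 1 = defect."""
        self.X.extend(metrics)
        self.y.extend(labels)
        self.model.fit(np.array(self.X), np.array(self.y))
        return self

    def rank_candidates(self, metrics):
        """Return candidate classes ordered by predicted probability of being a defect."""
        p = self.model.predict_proba(np.array(metrics))[:, 1]
        return np.argsort(p)[::-1], p

    def feedback(self, metrics_row, is_defect):
        """A practitioner confirms or rejects a reported occurrence; retrain with it."""
        return self.fit([metrics_row], [int(is_defect)])
```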

Relevance:

30.00%

Publisher:

Abstract:

While magnetic resonance imaging (MRI) provides a wide range of anatomical and functional data, clinical scanners are generally restricted to the proton for their imaging and spectroscopy applications. Because phosphorus plays a central role in energy metabolism, using this nucleus in MR spectroscopy offers an enormous advantage for observing the human body. It also raises a number of technical challenges due to the low concentration of phosphorus and its different resonance frequency. The objective of this project was to develop the capability to perform phosphorus spectroscopy experiments on a 3 Tesla clinical MRI scanner. We present the different steps required to design and validate an MRI coil tuned to the phosphorus frequency. We also present information on the construction of the phantoms used for validation tests and calibration. Finally, we present preliminary results of spectroscopic acquisitions on human muscle that allow the identification of the different high-energy phosphorylated metabolites. These results are part of a larger project in which the impact of changes in energy metabolism is studied in relation to age and pathology.
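
For context, the frequency the coil must be tuned to follows from the Larmor relation f = (gamma/2*pi) * B0. The snippet below uses commonly cited approximate gyromagnetic-ratio values (assumed here, not taken from the thesis) to show why a dedicated phosphorus coil is needed on a 3 T scanner.

```python
# Larmor frequencies at 3 T (gamma/2pi values are approximate literature figures)
GAMMA_OVER_2PI_MHZ_PER_T = {"1H": 42.577, "31P": 17.235}

B0 = 3.0  # tesla
for nucleus, gamma in GAMMA_OVER_2PI_MHZ_PER_T.items():
    print(f"{nucleus}: {gamma * B0:.1f} MHz")
# 1H:  127.7 MHz  (the frequency clinical proton coils are built for)
# 31P:  51.7 MHz  (requires a separately tuned coil and RF chain)
```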

Relevance:

30.00%

Publisher:

Abstract:

In machine learning, classification is the process of assigning a new observation to a category. Classifiers that implement classification algorithms have been widely studied over recent decades. Traditional classifiers are based on algorithms such as SVMs and neural networks and are generally executed in software on CPUs, which leaves the system lacking in performance and consuming a great deal of energy. Although GPUs can be used to accelerate the computation of some classifiers, their high power consumption prevents the technology from being deployed on portable devices such as embedded systems. To make the classification system lighter, classifiers should be able to run on more compact hardware instead of a group of CPUs or GPUs, and the classifiers themselves should be optimized for that hardware. In this thesis, we explore the implementation of a novel classifier on an FPGA-based hardware platform. The classifier, designed by Alain Tapp (Université de Montréal), is based on a large number of lookup tables that form tree-shaped circuits performing the classification tasks. The FPGA seems tailor-made to implement this classifier, with its rich lookup-table resources and highly parallel architecture. Our work shows that FPGAs can implement several such classifiers and perform classification on high-definition images at very high speed.
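
The abstract does not detail Tapp's classifier, but the general idea of lookup-table trees can be sketched in software: small tables indexed by a few input bits produce intermediate bits that feed the next layer of tables, which is exactly the structure FPGA logic elements implement natively. The layer layout, table widths, and random contents below are purely hypothetical and serve only to show the evaluation path.

```python
import numpy as np

class LUTLayer:
    """One layer of k-input lookup tables; each table maps k input bits to 1 output bit."""
    def __init__(self, tables, wiring):
        self.tables = tables    # shape (n_tables, 2**k): stored truth tables
        self.wiring = wiring    # shape (n_tables, k): which input bits feed each table

    def __call__(self, bits):
        k = self.wiring.shape[1]
        addr = (bits[self.wiring] * (1 << np.arange(k))).sum(axis=1)   # k bits -> table address
        return self.tables[np.arange(len(self.tables)), addr]

def classify(bits, layers):
    """Feed binary features through successive LUT layers; the final bit is the class."""
    for layer in layers:
        bits = layer(bits)
    return int(bits[0])        # assume the last layer contains a single table

# Tiny example: 8 input bits -> a layer of four 4-input LUTs -> one 4-input LUT
rng = np.random.default_rng(0)
l1 = LUTLayer(rng.integers(0, 2, (4, 16)), rng.integers(0, 8, (4, 4)))
l2 = LUTLayer(rng.integers(0, 2, (1, 16)), np.arange(4).reshape(1, 4))
print(classify(rng.integers(0, 2, 8), [l1, l2]))
```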

Relevance:

30.00%

Publisher:

Abstract:

Accurate sensing of vehicle position and attitude is still a very challenging problem in many mobile robot applications. Mobile robot applications must have some means of estimating where the vehicle is and in which direction it is heading. Many existing indoor positioning systems are limited in workspace and robustness because they require clear lines-of-sight or do not provide absolute, drift-free measurements. The research work presented in this dissertation provides a new approach to a position and attitude sensing system designed specifically to meet the challenges of operation in a realistic, cluttered indoor environment, such as that of an office building, hospital, industrial facility, or warehouse. This is accomplished by an innovative assembly of infrared LED sources that restricts the spread of the light intensity distribution to a sheet of light encoded with localization and traffic information. This Digital Infrared Sheet of Light Beacon (DISLiB), developed for mobile robots, is a high-resolution absolute localization system which is simple, fast, accurate and robust, without much computational burden or significant processing. The performance of most available beacons in corridors and narrow passages is not satisfactory, whereas the performance of DISLiB is very encouraging in such situations. This research overcomes most of the inherent limitations of existing systems. The work further examines the odometric localization errors caused by over-count readings of an optical-encoder-based odometric system in a mobile robot due to wheel slippage and terrain irregularities. A simple and efficient method for reducing the errors is investigated and realized using an FPGA. The detection and correction are based on redundant encoder measurements; the method relies on the fact that wheel slippage or terrain irregularities cause more count readings from the encoder than correspond to the actual distance travelled by the vehicle. The application of the encoded Digital Infrared Sheet of Light Beacon (DISLiB) system can be extended to intelligent control of the public transportation system. The system is capable of receiving traffic status input through a GSM (Global System for Mobile communications) modem. The vehicles have infrared receivers and processors capable of decoding the information and generating audio and video messages to assist the driver. The thesis further examines the usefulness of the technique to assist the movement of differently-abled (blind) persons in the indoor or outdoor premises of their residence. The work addressed in this thesis suggests a new way forward in the development of autonomous robotics and guidance systems; however, it can also be easily extended to many other challenging domains.
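
The odometry correction rests on comparing redundant encoder readings: slip or terrain irregularities inflate the count of the affected measurement relative to the redundant one. A minimal software sketch of that comparison is given below; the disagreement threshold and the policy of taking the smaller (non-inflated) reading are assumptions for illustration, since the thesis realizes the check in FPGA hardware.

```python
def corrected_distance(counts_a, counts_b, metres_per_count, rel_threshold=0.05):
    """Fuse two redundant encoder readings for the same wheel over the same interval.
    Slip or terrain irregularities produce extra counts, so a large disagreement is
    resolved in favour of the smaller reading (illustrative policy only)."""
    slip_detected = abs(counts_a - counts_b) > rel_threshold * max(counts_a, counts_b, 1)
    counts = min(counts_a, counts_b) if slip_detected else 0.5 * (counts_a + counts_b)
    return counts * metres_per_count, slip_detected

# Example: encoder A over-counts by ~20 % during a slip event
print(corrected_distance(1200, 1000, metres_per_count=0.0005))   # (0.5, True)
```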

Relevance:

30.00%

Publisher:

Abstract:

The design and development of a cost-effective, simple, sensitive and portable LED-based fiber optic evanescent wave sensor for simultaneously detecting trace amounts of chromium and nitrite in water are presented. In order to obtain the desired performance, the middle portions of two multimode plastic-clad silica fibers are unclad and used as the sensing elements in the two arms of the sensor. Each of the sensor arms is sourced by a separate super-bright green LED; the LEDs are modulated in a time-sharing manner and a single photodetector is employed for detecting the light signals. The performance and characteristics of this system clearly establish the usefulness of the technique for detecting very low concentrations of the dissolved contaminants.
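
With both LEDs modulated in a time-sharing manner onto one photodetector, the two measurement channels have to be separated in software according to the drive schedule. The snippet below is an illustrative demultiplexing step on sampled data, assuming a simple repeating schedule with a dark interval for background subtraction; it is not the instrument's actual firmware.

```python
import numpy as np

def demux_two_channels(detector, period):
    """Split a single photodetector record into the two LED channels.
    Assumed schedule per period: [LED1 on | LED2 on | both off (dark)]."""
    detector = np.asarray(detector, dtype=float)
    n = (len(detector) // period) * period
    frames = detector[:n].reshape(-1, period)
    third = period // 3
    led1 = frames[:, :third].mean(axis=1)
    led2 = frames[:, third:2 * third].mean(axis=1)
    dark = frames[:, 2 * third:].mean(axis=1)
    return led1 - dark, led2 - dark          # background-corrected channel signals
```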

Relevance:

30.00%

Publisher:

Abstract:

Machine tool chatter is an unfavorable phenomenon during metal cutting, which results in heavy vibration of the cutting tool. With an increase in depth of cut, the cutting regime changes from chatter-free cutting to cutting with chatter. In this paper, we propose the use of permutation entropy (PE), a conceptually simple and computationally fast measure, to detect the onset of chatter from the time series of a sound signal recorded with a unidirectional microphone. PE can efficiently distinguish the regular and complex nature of any signal and extract information about the dynamics of the process by indicating a sudden change in its value. Under situations where the data sets are huge and there is no time for preprocessing and fine-tuning, PE can effectively detect dynamical changes of the system. This makes PE an ideal choice for online detection of chatter, which is not possible with other conventional nonlinear methods. In the present study, the variation of PE under two cutting conditions is analyzed. An abrupt variation in the value of PE with increasing depth of cut indicates the onset of chatter vibrations. The results are verified using frequency spectra of the signals and the nonlinear measure normalized coarse-grained information rate (NCIR).
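
Permutation entropy itself is straightforward to compute: embed the signal, map each embedding vector to its ordinal pattern, and take the Shannon entropy of the pattern distribution. The sketch below follows the usual Bandt-Pompe definition; the order and delay values are typical choices, not necessarily those used in the paper.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=5, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = np.empty(n, dtype=np.int64)
    for i in range(n):
        window = x[i:i + order * delay:delay]
        ranks = np.argsort(window)             # ordinal pattern of the embedding vector
        code = 0
        for r in ranks:
            code = code * order + r            # unique integer code per permutation
        patterns[i] = code
    _, counts = np.unique(patterns, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(order)) if normalize else h

# Two extremes: a periodic signal gives low PE, broadband noise gives PE close to 1.
# A sudden jump in PE between consecutive windows flags a change in the cutting dynamics.
t = np.linspace(0, 1, 4000)
print(permutation_entropy(np.sin(2 * np.pi * 50 * t)))                  # low
print(permutation_entropy(np.random.default_rng(0).normal(size=4000)))  # close to 1
```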

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, applications of recurrence quantification analysis (RQA) to metal cutting operations on a lathe, with the specific objective of detecting tool wear and chatter, are presented. This study is based on the discovery that the process dynamics in a lathe is low-dimensional chaotic, which implies that the machine dynamics is controllable using principles of chaos theory. This understanding is set to revolutionize the feature extraction methodologies used in condition monitoring systems, as conventional linear methods or models are incapable of capturing the critical and strange behaviors associated with the metal cutting process. As sensor-based approaches provide an automated and cost-effective way to monitor and control, an efficient feature extraction methodology based on nonlinear time series analysis is in demand. The task is more complex when the information has to be deduced solely from sensor signals, since traditional methods do not address how to treat the noise present in real-world processes or its non-stationarity. In an effort to overcome these two issues as far as possible, this thesis adopts the recurrence quantification analysis methodology, since this feature extraction technique is found to be robust against noise and non-stationarity in the signals. The work consists of two different sets of experiments on a lathe: set-1 and set-2. The set-1 experiments study the influence of tool wear on the RQA variables, whereas set-2 is carried out to identify the RQA variables sensitive to machine tool chatter, followed by validation in actual cutting. To obtain the bounds of the spectrum of significant RQA variable values in set-1, a fresh tool and a worn tool are used for cutting. The first part of the set-2 experiments uses a stepped shaft in order to create chatter at a known location; the second part uses a conical section with a uniform taper along the axis so that chatter onsets at some distance from the smaller end as the depth of cut is gradually increased while keeping the spindle speed and feed rate constant. The study concludes by unambiguously revealing the dependence of certain RQA variables, percent determinism, percent recurrence and entropy, on tool wear and chatter. The results establish this methodology as viable for the detection of tool wear and chatter in metal cutting operations on a lathe. The key reason is that the dynamics of the system under study is nonlinear, and recurrence quantification analysis can characterize it adequately. This work establishes that the principles and practice of machining can be considerably benefited and advanced by using nonlinear dynamics and chaos theory.
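
The RQA variables used in the study (percent recurrence, percent determinism, and the entropy of diagonal-line lengths) can be computed directly from a thresholded recurrence plot. The sketch below is a compact illustrative implementation with assumed embedding parameters and threshold; it is not tied to the cutting data of the thesis.

```python
import numpy as np

def rqa_measures(x, dim=3, delay=1, radius=0.2, l_min=2):
    """Recurrence rate, determinism and diagonal-line entropy of a 1-D signal
    (all returned as fractions / nats; embedding parameters are illustrative)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])  # time-delay embedding
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rp = d < radius * d.std()                       # recurrence plot (thresholded distances)
    rec = rp.sum() / rp.size                        # recurrence rate

    lengths = []                                    # diagonal line lengths (upper triangle)
    for k in range(1, n):
        diag = np.diagonal(rp, offset=k)
        run = 0
        for v in np.append(diag, False):            # trailing False closes any open run
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
    lengths = np.array([l for l in lengths if l >= l_min])
    det = 2 * lengths.sum() / max(rp.sum() - np.trace(rp), 1)   # determinism (excl. main diagonal)
    if lengths.size:
        _, counts = np.unique(lengths, return_counts=True)
        p = counts / counts.sum()
        ent = float(-(p * np.log(p)).sum())         # entropy of diagonal-line lengths
    else:
        ent = 0.0
    return rec, det, ent
```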