933 results for planets and satellites: detection


Relevance: 100.00%

Abstract:

Due to the overlapping distribution of Trypanosoma rangeli and T. cruzi in Central and South America, where they share several reservoirs and triatomine vectors, we describe a simple method for collecting triatomine feces and hemolymph on filter paper for subsequent detection and specific characterization of these two trypanosomes. Feces and hemolymph from experimentally infected triatomines were collected on filter paper, and specific detection of T. rangeli or T. cruzi DNA was achieved by polymerase chain reaction. This simple DNA collection method allows samples to be collected in the field and the trypanosomes to be specifically detected and characterized later in the laboratory.

Relevance: 100.00%

Abstract:

In this report, we examine the adaptability of commercially available serological kits to detect antibody markers for viral hepatitis in oral fluid samples. We also assessed the prevalence of hepatitis A, B, and C virus-specific antibodies and the risk factors related to these infections, using the sensitivity of the tests in saliva to evaluate whether oral fluid can substitute for serum in the diagnosis of acute viral hepatitis and in epidemiological studies. One hundred and ten paired serum and saliva specimens from patients suspected of having acute hepatitis were collected to detect antibodies to hepatitis A (total and IgM), hepatitis B (anti-HBs, total anti-HBc, and IgM anti-HBc), and hepatitis C (anti-HCV) using commercially available enzyme-linked immunosorbent assays (EIA). Relative to serum samples, the sensitivity and specificity of the oral fluid assays were as follows: 87% and 100% for total anti-HAV, 79% and 100% for anti-HAV IgM, 6% and 95% for anti-HBs, 13% and 100% for total anti-HBc, 100% and 100% for anti-HBc IgM, and 75% and 100% for anti-HCV. The consistency observed between antibody tests in saliva and the expected risk factors for hepatitis A and C suggests that the saliva method could replace serum in epidemiological studies of hepatitis A and C.
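The sensitivity/specificity figures above come from comparing each saliva result against the paired serum result taken as the reference. A minimal sketch of that computation, with invented counts rather than the study's raw data:

```python
# Hypothetical sketch: assay sensitivity and specificity of an oral-fluid
# test against paired serum results (serum taken as the reference standard).
# The sample pairs below are illustrative, not the study's data.

def sensitivity_specificity(pairs):
    """pairs: list of (serum_positive, saliva_positive) booleans."""
    tp = sum(1 for s, o in pairs if s and o)
    fn = sum(1 for s, o in pairs if s and not o)
    tn = sum(1 for s, o in pairs if not s and not o)
    fp = sum(1 for s, o in pairs if not s and o)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

# Illustrative: 8 serum-positives (7 detected in saliva), 12 serum-negatives
pairs = [(True, True)] * 7 + [(True, False)] + [(False, False)] * 12
sens, spec = sensitivity_specificity(pairs)  # 7/8 = 0.875, 12/12 = 1.0
```

With such a table, each antibody marker in the study gets its own sensitivity/specificity pair relative to serum.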

Relevance: 100.00%

Abstract:

In high hyperdiploid acute lymphoblastic leukemia (ALL), the concurrence of specific trisomies confers a more favorable outcome than hyperdiploidy alone. Interphase fluorescence in situ hybridization (I-FISH) complements conventional cytogenetics (CC) through its sensitivity and its ability to detect chromosome aberrations in nondividing cells. To overcome the limits of manual I-FISH, we developed an automated four-color I-FISH approach and assessed its ability to detect concurrent aneuploidies in ALL. I-FISH was performed using centromeric probes for chromosomes 4, 6, 10, and 17. Parameters established for nucleus selection and signal detection were evaluated, and cutoff values were determined. Combinations of aneuploidies were considered relevant when each aneuploidy was individually significant. Results obtained in 10 patient samples were compared with those obtained by CC. Various combinations of aneuploidies were identified. All clones detected by CC were also observed by I-FISH, and I-FISH revealed numerous additional abnormal clones in all patients, ranging from ≤1% to 31.6% of cells analyzed. We conclude that four-color automated I-FISH permits the identification of concurrent aneuploidies of potential prognostic significance. Large numbers of cells can be analyzed rapidly, and the large number of nuclei scored revealed a high level of chromosome variability at both diagnosis and relapse, the prognostic significance of which is of considerable clinical interest and merits further evaluation.

Relevance: 100.00%

Abstract:

In hyperdiploid acute lymphoblastic leukaemia (ALL), the simultaneous occurrence of specific aneuploidies confers a more favourable outcome than hyperdiploidy alone. Interphase (I-) FISH complements conventional cytogenetics (CC) through its sensitivity and its ability to detect chromosome aberrations in non-dividing cells. To overcome the limits of manual I-FISH, we developed an automated four-colour I-FISH approach and assessed its ability to detect concurrent aneuploidies in ALL. I-FISH was performed using centromeric probes for chromosomes 4, 6, 10 and 17. Parameters established for automatic nucleus selection and signal detection were evaluated (3 controls), and cut-off values were determined (10 controls, 1000 nuclei/case). Combinations of aneuploidies were considered relevant when each aneuploidy was individually significant. Results obtained in 10 ALL patients (1500 nuclei/patient) were compared with those obtained by CC. Various combinations of aneuploidies were identified. All clones detected by CC were observed by I-FISH, and I-FISH revealed numerous additional abnormal clones, ranging between 0.1% and 31.6%, thanks to the large number of nuclei evaluated. Four-colour automated I-FISH permits the identification of concurrent aneuploidies of prognostic significance in hyperdiploid ALL, and large numbers of cells can be analysed rapidly by this method. Owing to its high sensitivity, the method provides a powerful tool for the detection of small abnormal clones at diagnosis and during follow-up. Compared with CC, it generates a more detailed cytogenetic picture, the biological and clinical significance of which merits further evaluation. Once optimised for a given set of probes, the system can easily be adapted to other probe combinations.
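The abstract mentions deriving cut-off values from control samples without giving the formula. A common convention for I-FISH cut-offs, shown here as a hedged sketch with invented control rates, is the mean aberrant-signal rate in controls plus three standard deviations:

```python
# Hedged sketch of one common way to derive an I-FISH cut-off from control
# samples: mean control false-positive rate plus k standard deviations.
# The control rates and k=3 are illustrative assumptions, not taken from
# the paper.
import statistics

def cutoff(control_rates, k=3):
    """Per-probe cut-off = mean + k*SD of aberrant-signal rates in controls."""
    return statistics.mean(control_rates) + k * statistics.stdev(control_rates)

# Fraction of control nuclei showing an aberrant signal pattern for one
# probe, across 10 hypothetical control cases (1000 nuclei each)
rates = [0.010, 0.012, 0.008, 0.011, 0.009, 0.013, 0.010, 0.012, 0.011, 0.009]
cut = cutoff(rates)  # a clone is called only above this fraction
```

A sample is then scored as carrying the aneuploidy only when its fraction of aberrant nuclei exceeds the probe's cut-off.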

Relevance: 100.00%

Abstract:

This work is divided into three volumes: Volume I: Strain-Based Damage Detection; Volume II: Acceleration-Based Damage Detection; Volume III: Wireless Bridge Monitoring Hardware.

Volume I: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. The statistical damage-detection tools, control-chart-based damage-detection methodologies, were further investigated and advanced. To validate the damage-detection approaches, strain data were obtained from a sacrificial specimen with simulated damage attached to the previously utilized US 30 Bridge over the South Skunk River (in Ames, Iowa). To enhance the ability to detect changes in the behavior of the structural system, various control chart rules were evaluated. False and true indications were studied to compare the damage-detection ability of each methodology and each control chart rule. An autonomous software program called Bridge Engineering Center Assessment Software (BECAS) was developed to control all aspects of the damage-detection processes; BECAS requires no user intervention after initial configuration and training.

Volume II: The objective of this part of the project was to validate and integrate a vibration-based damage-detection algorithm with the strain-based methodology formulated by the Iowa State University Bridge Engineering Center. This volume presents vibration-based damage-detection approaches as local methods to quantify damage at critical areas of structures. Acceleration data were collected and analyzed to evaluate the relationships between sensors and with changes in environmental conditions. A sacrificial specimen was investigated to verify the damage-detection capabilities, and this volume presents a transmissibility concept and damage-detection algorithm that show potential to sense local changes in the dynamic stiffness between points across a joint of a real structure. The validation and integration of the vibration-based and strain-based damage-detection methodologies will add significant value to Iowa's current and future bridge maintenance, planning, and management.

Volume III: This volume summarizes the energy harvesting techniques and prototype development for a bridge monitoring system that uses wireless sensors. The wireless sensor nodes collect strain measurements at critical locations on a bridge. The monitoring hardware consists of a base station and multiple self-powered wireless sensor nodes. The base station is responsible for synchronizing data sampling across all nodes and for data aggregation. Each wireless sensor node includes a sensing element, a processing and wireless communication module, and an energy harvesting module. The hardware prototype was developed and tested on the US 30 Bridge over the South Skunk River in Ames, Iowa. The functions and performance of the developed system, including strain data, energy harvesting capacity, and wireless transmission quality, were studied and are covered in this volume.
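The control-chart idea behind the strain-based detection in Volume I can be sketched generically: establish limits from baseline (undamaged) strain data, then flag observations that fall outside them. The 3-sigma rule and all data below are standard control-chart practice and invented numbers, not BECAS internals:

```python
# Illustrative control-chart sketch for strain-based damage detection:
# limits come from baseline strain readings, later readings outside the
# limits raise a damage indication. Generic Shewhart-style rule; the
# actual BECAS rules are more elaborate.
import statistics

def control_limits(baseline, k=3):
    """Mean +/- k standard deviations of the baseline readings."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def out_of_control(samples, limits):
    """Indices of readings outside the control limits."""
    lo, hi = limits
    return [i for i, x in enumerate(samples) if x < lo or x > hi]

baseline = [100.0, 101.0, 99.0, 100.5, 99.5, 100.2, 99.8, 100.1]  # microstrain
limits = control_limits(baseline)
flags = out_of_control([100.3, 99.9, 106.0, 100.1], limits)  # index 2 simulates damage
```

More sensitive rules (runs above the centerline, trends) trade a higher true-indication rate against more false indications, which is exactly the comparison the report describes.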

Relevance: 100.00%

Abstract:

The Iowa Department of Transportation initiated this research to evaluate the reliability, benefits, and application of the corrosion detection device. Through field testing prior to repair projects and inspection at the time of repair, the device was shown to be reliable. With its reliability established, twelve additional devices were purchased so that this evaluation procedure could be used routinely on all repair projects. The corrosion detection device was established as a means of determining the extent of concrete removal for repair: removal of the concrete down to the top reinforcing steel is required in all areas exhibiting electrical potentials greater than 0.45 volts. It was determined that the corrosion detection device is not applicable to membrane testing. The device has also been used to evaluate corrosion of reinforcing steel in continuously reinforced concrete pavement.
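The removal rule above is a simple threshold over a survey grid. A minimal sketch, with an invented grid and the 0.45 V threshold from the text (potential magnitudes compared, since half-cell readings are conventionally negative):

```python
# Minimal sketch of the repair-area rule: any survey point whose measured
# potential magnitude exceeds 0.45 V is marked for concrete removal down
# to the top reinforcing steel. Grid labels and values are illustrative.

THRESHOLD_V = 0.45

def removal_areas(potentials):
    """potentials: dict mapping grid point -> measured potential (volts)."""
    return [pt for pt, v in potentials.items() if abs(v) > THRESHOLD_V]

survey = {"A1": 0.30, "A2": 0.48, "B1": 0.44, "B2": 0.52}
marked = removal_areas(survey)  # ["A2", "B2"]
```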

Relevance: 100.00%

Abstract:

Raman spectroscopy combined with chemometrics has recently become a widespread technique for the analysis of pharmaceutical solid forms. The application presented in this paper is the investigation of counterfeit medicines. This increasingly serious issue involves networks that are an integral part of industrialized organized crime, so efficient analytical tools are required to fight it: quick and reliable authentication means are needed to allow the deployment of measures by the company and the authorities. For this purpose, a two-step method has been implemented here. The first step enables the identification of pharmaceutical tablets and capsules and the detection of their counterfeits. A nonlinear classification method, support vector machines (SVM), is computed together with a correlation with the database and the detection of active pharmaceutical ingredient (API) peaks in the suspect product. If a counterfeit is detected, the second step allows its chemical profiling among former counterfeits in a forensic intelligence perspective. For this second step, a classification based on principal component analysis (PCA) and correlation distance measurements is applied to the Raman spectra of the counterfeits.
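One ingredient of the screening step is correlation with the database. As a hedged illustration (the spectra and the 0.95 threshold are invented, and the paper's full method also uses an SVM and API-peak detection), a suspect spectrum can be flagged when its Pearson correlation with the reference spectrum falls below a threshold:

```python
# Hedged illustration of database correlation for counterfeit screening:
# flag a suspect Raman spectrum whose correlation with the genuine
# reference spectrum is too low. Spectra and threshold are invented.

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def is_counterfeit(spectrum, reference, threshold=0.95):
    return pearson(spectrum, reference) < threshold

reference = [0.1, 0.9, 0.2, 0.8, 0.3]    # genuine product spectrum (toy)
genuine   = [0.11, 0.88, 0.21, 0.79, 0.31]
fake      = [0.5, 0.2, 0.9, 0.1, 0.7]
```

A real pipeline would operate on full preprocessed spectra (baseline-corrected, normalized), not five-point toys.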

Relevance: 100.00%

Abstract:

Polysialic acid is a carbohydrate polymer consisting of N-acetylneuraminic acid units joined by alpha-2,8-linkages. It is developmentally regulated and has an important role during normal neuronal development. In adults, it participates in complex processes such as memory, neural plasticity, tumor cell growth, and metastasis. Polysialic acid also constitutes the capsule of some meningitis- and sepsis-causing bacteria, such as Escherichia coli K1, group B meningococci, Mannheimia haemolytica A2, and Moraxella nonliquefaciens. Polysialic acid is poorly immunogenic, so high-affinity antibodies against it are difficult to prepare, and specific, fast detection methods are therefore needed. Endosialidase is an enzyme derived from an E. coli K1 bacteriophage that specifically recognizes and degrades polysialic acid. In this study, a novel detection method for polysialic acid was developed based on a fusion protein of inactive endosialidase and the green fluorescent protein. It exploits the ability of the mutant, inactive endosialidase to bind but not cleave polysialic acid. Sequencing of the endosialidase gene revealed that amino acid substitutions near the active site differentiate the active and inactive forms of the enzyme. The fusion protein was applied to the detection of polysialic acid in bacteria and neuroblastoma. The results indicate that the fusion protein is a fast, sensitive, and specific reagent for the detection of polysialic acid. The use of an inactive enzyme as a specific molecular tool for the detection of its substrate represents an approach that could find wide applicability in the specific detection of diverse macromolecules.

Relevance: 100.00%

Abstract:

This paper reports the development and validation of a loop-mediated isothermal amplification (LAMP) assay for the rapid and specific detection of Actinobacillus pleuropneumoniae. A set of six primers was designed based on the dsbE-like gene of A. pleuropneumoniae, and the assay was validated using 9 A. pleuropneumoniae reference/field strains, 132 clinical isolates, and 9 other pathogens. Positive reactions were confirmed by LAMP at 63°C for 60 min for all A. pleuropneumoniae strains and specimens, and no cross-reactivity was observed with the non-A. pleuropneumoniae organisms, including Haemophilus parasuis, Escherichia coli, Pasteurella multocida, Bordetella bronchiseptica, Streptococcus suis, Salmonella enterica, Staphylococcus, porcine reproductive and respiratory syndrome virus (PRRSV), and pseudorabies virus. The detection limit of conventional PCR was 10² CFU per PCR test tube, while that of LAMP was 5 copies per tube; the sensitivity of LAMP was therefore higher than that of PCR. Moreover, the LAMP assay provides a rapid yet simple test for A. pleuropneumoniae suitable for laboratory diagnosis and pen-side detection, owing to its ease of operation and the requirement of only a regular water bath or heat block for the reaction.

Relevance: 100.00%

Abstract:

The Billings and Guarapiranga Reservoirs have been deeply affected by environmental disturbances, whose most evident consequence is cyanobacterial blooms. Microcystins are the most common cyanotoxins in fresh waters, and more than 70 variants are known. Different methods can be used to analyze microcystins in water, among which ELISA and HPLC are the most frequently employed, but less sophisticated and more economical methods are also available. One of these is planar (thin-layer) chromatography, previously used in cyanotoxin purification but gradually replaced by other techniques. After optimization of the chromatographic conditions for microcystins, and because of its simplicity, rapidity, efficiency, and low cost, this method is again considered an option for the analysis of microcystins and nodularins. Considering the importance of the Billings and Guarapiranga Reservoirs for drinking water supplies and the scarcity of scientific data about cyanobacteria and cyanotoxins in these water bodies, the aims of this work were to analyze the biodiversity of cyanobacteria in the two reservoirs and to detect dissolved microcystins in their waters. It was possible to identify 17 species of cyanobacteria, 9 of them potentially toxic. In Billings Reservoir, Microcystis aeruginosa (Kützing) Kützing and Cylindrospermopsis raciborskii (Woloszynska) Seenayya & Subba Raju were the most common species, while in Guarapiranga Reservoir only M. aeruginosa was common. Microcystins were detected in all Billings Reservoir samples and in only one sample from Guarapiranga Reservoir.

Relevance: 100.00%

Abstract:

Changes are continuously made to the source code of software systems to accommodate client needs and to correct faults. These continuous changes can introduce code and design defects. Design defects are poor solutions to recurring design or implementation problems, typically in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, lack of understanding, and their level of experience, developers cannot always follow design standards and coding techniques such as design patterns. Consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without such defects. Yet only a few of these authors have conducted empirical studies of the impact of design defects on comprehension, and none has studied their impact on the effort developers need to correct faults. In this dissertation, we make three main contributions. The first is an empirical study providing evidence of the impact of design defects on comprehension and change. We designed and conducted two experiments with 59 subjects to assess the impact of the co-occurrence of two Blob occurrences or two spaghetti-code occurrences on the performance of developers carrying out comprehension and change tasks. We measured developer performance using (1) the NASA task load index for effort, (2) the time spent completing the tasks, and (3) the percentage of correct answers.

The results of the two experiments showed that two occurrences of Blob or spaghetti code are a significant obstacle to developer performance during comprehension and change tasks. These results justify earlier research on the specification and detection of design defects. Software development teams should warn developers against high numbers of design-defect occurrences and recommend refactorings at each step of the development process to remove them when possible. In the second contribution, we study the relationship between design defects and faults, examining the impact of the presence of design defects on the effort needed to correct faults. We measure this effort with three indicators: (1) the duration of the correction period, (2) the number of fields and methods affected by the fault correction, and (3) the entropy of fault corrections in the source code. We conducted an empirical study of 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results showed that the correction period is longer for faults involving classes with design defects. Moreover, correcting faults in classes with design defects changes more files, fields, and methods. We also observed that, after a fault is corrected, the number of design-defect occurrences in the classes involved in the correction decreases.

Understanding the impact of design defects on the effort developers need to correct faults is important to help development teams better assess and predict the impact of their design decisions, and thus channel their efforts toward improving the quality of their systems. Development teams should monitor and remove design defects from their systems, as these defects are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool able to detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce costs, effort, and resources by allowing practitioners to identify and address design-defect occurrences as they encounter them during comprehension and change. Researchers have proposed approaches to detect design-defect occurrences, but these approaches currently have four limitations: (1) they require extensive knowledge of design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach to detect design defects based on a machine learning technique, support vector machines, that takes practitioner feedback into account. Through an empirical study of three systems and four design defects, we showed that the precision and recall of SMURF are higher than those of DETEX and BDTEX when detecting design-defect occurrences. We also showed that SMURF can be applied in both intra-system and inter-system configurations. Finally, we showed that the precision and recall of SMURF improve when practitioner feedback is taken into account.
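SMURF itself trains support vector machines on practitioner feedback, which is too involved for a snippet. As a rough, hedged illustration of the kind of per-class metrics such detectors consume, here is a toy rule-based check for a Blob-like smell (very large class with low cohesion); the metric names and thresholds are invented:

```python
# Toy illustration only: a rule-based Blob-like check over class metrics.
# SMURF replaces hand-written rules like this with an SVM trained on
# labeled examples and practitioner feedback; thresholds here are invented.

def looks_like_blob(metrics, max_methods=40, min_lcom=0.8):
    """metrics: dict with 'n_methods' and 'lcom' (lack of cohesion, 0..1)."""
    return metrics["n_methods"] > max_methods and metrics["lcom"] >= min_lcom

god_class = {"n_methods": 75, "lcom": 0.93}   # hypothetical Blob candidate
small_class = {"n_methods": 6, "lcom": 0.40}  # hypothetical healthy class
```

The limitations listed above (deep defect knowledge encoded in thresholds, brittle precision/recall) are exactly what a learned detector avoids.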

Relevance: 100.00%

Abstract:

This master's thesis presents a new unsupervised approach to detect and segment urban regions in hyperspectral images. The proposed method requires three steps. First, to reduce the computational cost of our algorithm, a color image of the spectral content is estimated. To this end, a nonlinear dimensionality-reduction step, based on two complementary but conflicting criteria of good visualization, namely accuracy and contrast, is carried out to produce a color display of each hyperspectral image. Next, to discriminate urban from non-urban regions, the second step extracts a few discriminant (and complementary) features from this color hyperspectral image. We extracted a series of discriminant parameters describing the characteristics of an urban area, which is mainly composed of manufactured objects with simple, regular geometric shapes. We used textural features based on gray levels, gradient magnitude, or parameters derived from the co-occurrence matrix, combined with structural features based on the local orientation of the image gradient and the local detection of line segments. To further reduce the computational complexity of our approach and avoid the "curse of dimensionality" that arises when clustering high-dimensional data, the last step classifies each textural or structural feature individually with a simple K-means procedure and then combines these coarse, cheaply obtained segmentations with an efficient model for fusing segmentation maps.

The experiments reported here show that this strategy is visually effective and compares favorably with other methods for detecting and segmenting urban areas from hyperspectral images.
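The last stage described above can be sketched schematically: cluster each feature map independently with K-means (K=2 here), then fuse the coarse segmentations by majority vote. The data, the tiny 1-D K-means, and the "higher mean = urban" labeling convention are illustrative simplifications of the thesis's fusion model:

```python
# Schematic sketch: per-feature K-means (K=2) followed by majority-vote
# fusion of the coarse label maps. All values are illustrative.

def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means; returns 0/1 labels, 1 = higher-mean cluster."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        labels = [0 if abs(v - c0) <= abs(v - c1) else 1 for v in values]
        g0 = [v for v, l in zip(values, labels) if l == 0]
        g1 = [v for v, l in zip(values, labels) if l == 1]
        if g0: c0 = sum(g0) / len(g0)
        if g1: c1 = sum(g1) / len(g1)
    if c0 > c1:  # enforce the convention: label 1 = higher-mean cluster
        labels = [1 - l for l in labels]
    return labels

def fuse(segmentations):
    """Majority vote across per-feature label maps (ties fall to 0)."""
    return [1 if sum(px) * 2 > len(px) else 0 for px in zip(*segmentations)]

feat_a = [0.1, 0.2, 0.9, 0.8, 0.15]   # e.g. gradient magnitude per pixel
feat_b = [0.05, 0.1, 0.7, 0.95, 0.2]  # e.g. line-segment density per pixel
fused = fuse([kmeans_1d(feat_a), kmeans_1d(feat_b)])
```

The thesis's fusion model is more elaborate than a majority vote, but the cost structure is the same: many cheap 1-D clusterings instead of one high-dimensional one.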

Relevance: 100.00%

Abstract:

The presence of pathogenic microorganisms in food is one of the essential problems in public health, and the diseases they produce are among the most important causes of illness. The application of microbiological controls within quality assurance programs is therefore a prerequisite for minimizing consumers' risk of infection. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media, and subsequent confirmation using tests based on the morphology, biochemistry, and serology of each microorganism under study. These methods are consequently laborious, require a long process to obtain definitive results, and cannot always be performed. To overcome these drawbacks, several alternative methodologies have been developed for the detection, identification, and quantification of foodborne pathogens, most notably immunological and molecular methods. In the latter category, the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology, and recently the introduction of an improvement, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continuous appearance of new commercial kits. Real-time PCR is a highly sensitive technique, able to detect down to a single molecule, that allows the exact quantification of DNA sequences specific to foodborne pathogens.

In addition, other advantages favoring its potential adoption in food analysis laboratories are its speed, simplicity, and closed-tube format, which can prevent post-PCR contamination and favors automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification, and quantification of foodborne pathogenic bacteria (Listeria spp., Mycobacterium avium subsp. paratuberculosis, and Salmonella spp.). Specifically, real-time PCR methods were designed and optimized for each of these agents: L. monocytogenes, L. innocua, Listeria spp., and M. avium subsp. paratuberculosis; a previously developed method for Salmonella spp. was also optimized and evaluated in different centers. In addition, a NASBA-based method was designed and optimized for the specific detection of M. avium subsp. paratuberculosis, and the potential of the NASBA technique for the specific detection of viable forms of this microorganism was also evaluated. All methods showed 100% specificity, with sensitivity suitable for potential application to real food samples. Sample-preparation procedures were also developed and evaluated for meat products, fishery products, milk, and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples. The latter method was also applied to evaluate the presence of this microorganism in the intestine of patients with Crohn's disease, using colonoscopy biopsies from affected volunteers.

In conclusion, this study presents selective and sensitive molecular assays for the detection of pathogens in food (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent compared with the reference microbiological methods, and they can be used for quantification of both genomic DNA and cell suspensions. Furthermore, combining them with pre-amplification treatments proved highly efficient for the analysis of the target bacteria. They can therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
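The quantitative determination these real-time PCR assays enable rests on a standard curve: Ct values from a dilution series are fit to a line against log10 copy number, and an unknown's Ct is converted back to copies. A worked sketch with idealized, invented Ct values (an ideal 100%-efficient assay has a slope near -3.32):

```python
# Worked example of absolute quantification with real-time PCR: fit a
# standard curve Ct = slope*log10(copies) + intercept from a dilution
# series, then invert it for an unknown sample. Ct values are illustrative.

def fit_standard_curve(log10_copies, cts):
    """Least-squares line Ct = slope*log10(copies) + intercept."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# Ten-fold dilution series, 10^6 down to 10^2 copies, ideal ~ -3.32 slope
logs = [6, 5, 4, 3, 2]
cts = [15.0, 18.32, 21.64, 24.96, 28.28]
slope, intercept = fit_standard_curve(logs, cts)
unknown = copies_from_ct(21.64, slope, intercept)  # ~10^4 copies
```

Amplification efficiency follows from the slope as 10^(-1/slope) - 1, which is how assay performance is usually reported.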

Relevance: 100.00%

Abstract:

We present a novel approach to intrusion detection using an intelligent data classifier based on a self-organizing map (SOM). We surveyed other unsupervised intrusion detection methods, alternative SOM-based techniques, and the KDD-winning IDS methods. This paper provides a robustly designed and implemented intelligent data classifier based on a single large (30x30) self-organizing map capable of detecting all attack types in the DARPA 1999 archive, achieving a false positive rate as low as 0.04% and a detection rate as high as 99.73% when tested on the full KDD data sets, and a comparable detection rate of 89.54% with a false positive rate of 0.18% when tested on the corrected data sets.
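A self-organizing map learns a grid of prototype vectors so that similar inputs activate nearby units. This toy sketch shows only the core training rule on 2-D points (the actual system used a 30x30 map on KDD'99 features); map size, learning schedule, and data are illustrative:

```python
# Minimal self-organizing map sketch: 2x2 map, 2-D inputs, Gaussian
# neighborhood, decaying learning rate. Toy stand-in for the paper's
# 30x30 SOM over network-connection features.
import math
import random

def train_som(data, rows=2, cols=2, epochs=60, lr0=0.5, sigma0=1.0):
    random.seed(0)  # deterministic toy initialization
    w = {(r, c): [random.random(), random.random()]
         for r in range(rows) for c in range(cols)}
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.1     # shrinking neighborhood
        for x in data:
            b = bmu(w, x)
            for n, wn in w.items():
                d2 = (n[0] - b[0]) ** 2 + (n[1] - b[1]) ** 2
                h = math.exp(-d2 / (2 * sigma ** 2))  # neighborhood weight
                for i in range(2):
                    wn[i] += lr * h * (x[i] - wn[i])
    return w

def bmu(w, x):
    """Best-matching unit: grid node whose weight vector is closest to x."""
    return min(w, key=lambda n: sum((a - b) ** 2 for a, b in zip(x, w[n])))

normal = [[0.1, 0.1], [0.15, 0.05], [0.05, 0.15]]   # toy "normal" traffic
attack = [[0.9, 0.9], [0.85, 0.95], [0.95, 0.85]]   # toy "attack" traffic
som = train_som(normal + attack)
```

After training, units are labeled from the classes of the training records they win, and an unseen connection is classified by its best-matching unit.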

Relevance: 100.00%

Abstract:

Parkinson's disease is a neurodegenerative disease in which tremor is the main symptom. This paper investigates the use of different classification methods to identify tremors experienced by Parkinsonian patients. Some previous research has focused tremor analysis on external body signals (e.g., electromyography, accelerometer signals). Our advantage is access to sub-cortical data, which facilitates transferring the obtained results into real medical devices, since we are dealing with brain signals directly. Local field potentials (LFP) were recorded in the subthalamic nucleus of 7 Parkinsonian patients through the implanted electrodes of a deep brain stimulation (DBS) device prior to its internalization. The measured LFP signals were preprocessed by means of splinting, downsampling, filtering, normalization, and rectification. Feature extraction was then conducted through multi-level decomposition via a wavelet transform. Finally, artificial intelligence techniques were applied to feature selection, clustering of tremor types, and tremor detection. The key contribution of this paper is to present initial results indicating, to a high degree of certainty, that there appear to be two distinct subgroups of patients within group 1 of patients according to the Consensus Statement of the Movement Disorder Society on Tremor. Such results may well lead to different treatments for the patients involved, depending on how their tremor has been classified. Moreover, we propose a new approach to demand-driven stimulation, in which tremor detection is also based on the subtype of tremor the patient has. Applying this knowledge to the tremor detection problem, the results improve when patient clustering is applied prior to detection.
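The feature-extraction step above (multi-level wavelet decomposition of a preprocessed LFP window) can be sketched with a simple Haar decomposition, summarizing each level by its detail energy. The wavelet choice, the synthetic signal, and the energy features are illustrative assumptions, not the paper's exact pipeline:

```python
# Hedged sketch of wavelet-based feature extraction from an LFP window:
# a 3-level Haar decomposition whose per-level detail energies (plus the
# final approximation energy) form the feature vector. Signal is synthetic.
import math

def haar_step(x):
    """One Haar analysis step: pairwise averages and pairwise differences."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def wavelet_energies(x, levels=3):
    """Feature vector: detail energy at each level + final approx energy."""
    feats = []
    for _ in range(levels):
        x, d = haar_step(x)
        feats.append(sum(v * v for v in d))
    feats.append(sum(v * v for v in x))
    return feats

# Synthetic stand-in for a rectified LFP window (64 samples)
signal = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
feats = wavelet_energies(signal)  # 4 features for the classifier/clustering
```

In the study, vectors like this feed the feature selection, patient clustering, and tremor detection stages.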