925 results for incompleteness and inconsistency detection
Abstract:
Prostate Specific Antigen (PSA) is the biomarker of choice for screening prostate cancer throughout the population, with PSA values above 10 ng/mL indicating a high probability of associated cancer1. According to the most recent World Health Organization (WHO) data, prostate cancer is the commonest form of cancer in men in Europe2. Early detection of prostate cancer is thus very important and is currently carried out by screening PSA in men over 45 years old, combined with other alterations in serum and urine parameters. PSA is a glycoprotein with a molecular mass of approximately 32 kDa consisting of one polypeptide chain, produced by the secretory epithelium of the human prostate. Currently, the standard methods available for PSA screening are immunoassays such as the Enzyme-Linked Immunosorbent Assay (ELISA). These methods are highly sensitive and specific for the detection of PSA, but they require expensive laboratory facilities and highly qualified personnel. Other highly sensitive and specific methods for the detection of PSA have also become available; most are immunobiosensors1,3-5 relying on antibodies. Less expensive methods producing quicker responses are thus needed, which may be achieved by synthesizing artificial antibodies by means of molecular imprinting techniques. These should also be coupled to simple and low-cost devices, such as potentiometric devices, an approach that has already proven successful6. Potentiometric sensors offer the advantages of selectivity and portability for point-of-care use and have been widely recognized as potential analytical tools in this field. The underlying method is simple, precise, accurate, and inexpensive in terms of reagent consumption and equipment. This work therefore proposes a new plastic antibody for PSA, designed on the surface of graphene layers extracted from graphite. Charged monomers were used to enable oriented tailoring of the PSA rebinding sites; uncharged monomers were used as a control. These materials were used as ionophores in conventional solid-contact graphite electrodes. The results showed that the imprinted materials displayed a selective response to PSA. The electrodes with charged monomers showed a more stable and sensitive response, with an average slope of -44.2 mV/decade and a detection limit of 5.8×10⁻¹¹ mol/L (2 ng/mL). The corresponding non-imprinted sensors showed lower sensitivity, with average slopes of -24.8 mV/decade. The best sensors were successfully applied to the analysis of serum samples, with percentage recoveries of 106.5% and relative errors of 6.5%.
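To make the reported figures concrete, the sketch below converts a measured electrode potential into a PSA concentration, assuming a linear response of the form E = E0 + S·log10(C) with the reported slope of -44.2 mV/decade; the intercept E0 and the example reading are hypothetical values for illustration only, not taken from the abstract.

```python
# Illustrative only: converts a measured potential into a PSA concentration,
# assuming the linear response E = E0 + S*log10(C) implied by the reported
# slope (S = -44.2 mV/decade for the charged-monomer electrodes). The
# intercept E0 and the example reading are hypothetical; in practice E0
# comes from calibration with standard solutions.

SLOPE_MV_PER_DECADE = -44.2
E0_MV = -850.0                      # hypothetical calibration intercept
PSA_MOLAR_MASS_G_PER_MOL = 32_000   # PSA is approximately 32 kDa

def concentration_from_potential(e_mv: float) -> float:
    """Return the PSA concentration (mol/L) for a measured potential (mV)."""
    return 10 ** ((e_mv - E0_MV) / SLOPE_MV_PER_DECADE)

c_mol_per_l = concentration_from_potential(-395.0)            # example reading
c_ng_per_ml = c_mol_per_l * PSA_MOLAR_MASS_G_PER_MOL * 1e6    # g/L -> ng/mL
print(f"{c_mol_per_l:.2e} mol/L ~ {c_ng_per_ml:.1f} ng/mL")
```

With these made-up numbers the result lands near the reported 2 ng/mL detection limit, which is simply a consistency check on the unit conversion (5.8×10⁻¹¹ mol/L × 32 kDa ≈ 2 ng/mL).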
Simultaneous detection of cyclopiazonic acid and aflatoxin B1 by HPLC in methanol/water mobile phase
Abstract:
A simple procedure for the simultaneous detection of cyclopiazonic acid (CPA) and aflatoxin B1 in fungal extracts is presented, using a methanol/water mobile phase and fluorescence detection. The methodology has been tested with standard solutions of both mycotoxins, CPA and aflatoxin B1, and with methanolic extracts of Aspergillus section Flavi strains previously characterized for their mycotoxin production profile. Previously available methodology required two different chromatographic runs for these mycotoxins, with distinct columns and detectors (fluorescence detection with post-column photochemical derivatization (PHRED) for aflatoxin B1 and UV detection for CPA). The proposed method detects both mycotoxins in a single run. Data from these assays will be presented and discussed.
Abstract:
In hyperdiploid acute lymphoblastic leukaemia (ALL), the simultaneous occurrence of specific aneuploidies confers a more favourable outcome than hyperdiploidy alone. Interphase (I) FISH complements conventional cytogenetics (CC) through its sensitivity and ability to detect chromosome aberrations in non-dividing cells. To overcome the limits of manual I-FISH, we developed an automated four-colour I-FISH approach and assessed its ability to detect concurrent aneuploidies in ALL. I-FISH was performed using centromeric probes for chromosomes 4, 6, 10 and 17. Parameters established for automatic nucleus selection and signal detection were evaluated (3 controls). Cut-off values were determined (10 controls, 1000 nuclei/case). Combinations of aneuploidies were considered relevant when each aneuploidy was individually significant. Results obtained in 10 ALL patients (1500 nuclei/patient) were compared with those obtained by CC. Various combinations of aneuploidies were identified. All clones detected by CC were also observed by I-FISH. I-FISH revealed numerous additional abnormal clones, ranging between 0.1% and 31.6%, owing to the large number of nuclei evaluated. Four-colour automated I-FISH permits the identification of concurrent aneuploidies of prognostic significance in hyperdiploid ALL. Large numbers of cells can be analysed rapidly by this method. Owing to its high sensitivity, the method provides a powerful tool for the detection of small abnormal clones at diagnosis and during follow-up. Compared to CC, it generates a more detailed cytogenetic picture, the biological and clinical significance of which merits further evaluation. Once optimised for a given set of probes, the system can be easily adapted to other probe combinations.
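The abstract does not state how its cut-off values were derived from the 10 controls; a common convention for I-FISH, assumed here purely for illustration, is to set the cut-off per probe at the mean false-positive rate observed in control nuclei plus three standard deviations, as in the sketch below (the control rates are made up).

```python
import statistics

# Hypothetical illustration of a common I-FISH cut-off convention
# (mean false-positive rate in controls + 3 SD); the abstract does not
# specify how its actual cut-off values were obtained.

# Fraction of control nuclei showing an aberrant signal pattern for one probe,
# one value per control case (10 controls, 1000 nuclei each) -- made-up numbers.
control_rates = [0.004, 0.006, 0.003, 0.005, 0.007, 0.002, 0.004, 0.006, 0.005, 0.003]

cutoff = statistics.mean(control_rates) + 3 * statistics.stdev(control_rates)
print(f"cut-off = {cutoff:.4f} ({cutoff * 100:.2f}% of nuclei)")

# A patient clone would then be called significant only if its frequency among
# the 1500 scored nuclei exceeds this cut-off.
```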
Abstract:
Due to the overlapping distribution of Trypanosoma rangeli and T. cruzi in Central and South America, where they share several reservoirs and triatomine vectors, we describe here a simple method to collect triatomine feces and hemolymph on filter paper for further detection and specific characterization of these two trypanosomes. Feces and hemolymph from experimentally infected triatomines were collected on filter paper, and specific detection of T. rangeli or T. cruzi DNA by polymerase chain reaction was achieved. This simple DNA collection method allows sample collection in the field and further specific trypanosome detection and characterization in the laboratory.
Abstract:
In this report, we examine the adaptability of commercially available serological kits to detect antibody markers of viral hepatitis in oral fluid samples. We also assessed the prevalence of hepatitis A, B, and C virus-specific antibodies and related risk factors for these infections, and evaluated the sensitivity of the tests in saliva to determine whether oral fluid can substitute for serum in the diagnosis of acute viral hepatitis and in epidemiological studies. One hundred and ten paired serum and saliva specimens from patients suspected of having acute hepatitis were collected to detect antibodies to hepatitis A (total and IgM), hepatitis B (anti-HBs, total anti-HBc and IgM anti-HBc), and hepatitis C (anti-HCV) using commercially available enzyme-linked immunosorbent assays (EIA). Relative to serum samples, oral fluid assay sensitivity and specificity were as follows: 87 and 100% for total anti-HAV, 79 and 100% for anti-HAV IgM, 6 and 95% for anti-HBs, 13 and 100% for total anti-HBc, 100 and 100% for anti-HBc IgM, and 75 and 100% for anti-HCV. The consistency observed between antibody tests in saliva and expected risk factors for hepatitis A and C suggests that the saliva method could replace serum in epidemiological studies of hepatitis A and C.
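For readers unfamiliar with how such figures are obtained, the sketch below shows how sensitivity and specificity are computed when saliva results are scored against the paired serum results taken as the reference standard; the paired results listed are hypothetical, not the study's data.

```python
# Hypothetical illustration: sensitivity and specificity of a saliva assay
# computed against paired serum results used as the reference standard.
# Each tuple is (serum_positive, saliva_positive) for one patient -- made-up data.

paired_results = [(True, True), (True, False), (False, False), (True, True), (False, False)]

tp = sum(1 for serum, saliva in paired_results if serum and saliva)
fn = sum(1 for serum, saliva in paired_results if serum and not saliva)
tn = sum(1 for serum, saliva in paired_results if not serum and not saliva)
fp = sum(1 for serum, saliva in paired_results if not serum and saliva)

sensitivity = tp / (tp + fn)   # fraction of serum-positives also positive in saliva
specificity = tn / (tn + fp)   # fraction of serum-negatives also negative in saliva
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```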
Abstract:
In high hyperdiploid acute lymphoblastic leukemia (ALL), the concurrence of specific trisomies confers a more favorable outcome than hyperdiploidy alone. Interphase fluorescence in situ hybridization (I-FISH) complements conventional cytogenetics (CC) through its sensitivity and ability to detect chromosome aberrations in nondividing cells. To overcome the limits of manual I-FISH, we developed an automated four-color I-FISH approach and assessed its ability to detect concurrent aneuploidies in ALL. I-FISH was performed using centromeric probes for chromosomes 4, 6, 10, and 17. Parameters established for nucleus selection and signal detection were evaluated. Cutoff values were determined. Combinations of aneuploidies were considered relevant when each aneuploidy was individually significant. Results obtained in 10 patient samples were compared with those obtained with CC. Various combinations of aneuploidies were identified. All clones detected by CC were also observed by I-FISH, and I-FISH revealed numerous additional abnormal clones in all patients, ranging from ≤1% to 31.6% of cells analyzed. We conclude that four-color automated I-FISH permits the identification of concurrent aneuploidies of potential prognostic significance. Large numbers of cells can be analyzed rapidly. The large number of nuclei scored revealed a high level of chromosome variability both at diagnosis and relapse, the prognostic significance of which is of considerable clinical interest and merits further evaluation.
Abstract:
This work is divided into three volumes: Volume I: Strain-Based Damage Detection; Volume II: Acceleration-Based Damage Detection; Volume III: Wireless Bridge Monitoring Hardware.

Volume I: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. The statistical damage-detection tools, control-chart-based damage-detection methodologies, were further investigated and advanced. To validate the damage-detection approaches, strain data were obtained from a sacrificial specimen with simulated damage attached to the previously utilized US 30 Bridge over the South Skunk River in Ames, Iowa. To provide an enhanced ability to detect changes in the behavior of the structural system, various control chart rules were evaluated. False indications and true indications were studied to compare the damage-detection ability of each methodology and each control chart rule. An autonomous software program called Bridge Engineering Center Assessment Software (BECAS) was developed to control all aspects of the damage-detection process. BECAS requires no user intervention after initial configuration and training.

Volume II: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. The objective of this part of the project was to validate and integrate a vibration-based damage-detection algorithm with the strain-based methodology formulated by the Iowa State University Bridge Engineering Center. This report volume (Volume II) presents the use of vibration-based damage-detection approaches as local methods to quantify damage at critical areas in structures. Acceleration data were collected and analyzed to evaluate the relationships between sensors and with changes in environmental conditions. A sacrificial specimen was investigated to verify the damage-detection capabilities, and this volume presents a transmissibility concept and damage-detection algorithm that show potential to sense local changes in the dynamic stiffness between points across a joint of a real structure. The validation and integration of the vibration-based and strain-based damage-detection methodologies will add significant value to Iowa's current and future bridge maintenance, planning, and management.

Volume III: In this work, a previously developed structural health monitoring (SHM) system was advanced toward a ready-for-implementation system. Improvements were made with respect to automated data reduction/analysis, data acquisition hardware, sensor types, and communication network architecture. This report volume (Volume III) summarizes the energy harvesting techniques and prototype development for a bridge monitoring system that uses wireless sensors. The wireless sensor nodes are used to collect strain measurements at critical locations on a bridge. The bridge monitoring hardware system consists of a base station and multiple self-powered wireless sensor nodes. The base station is responsible for the synchronization of data sampling on all nodes and for data aggregation. Each wireless sensor node includes a sensing element, a processing and wireless communication module, and an energy harvesting module. The hardware prototype for a wireless bridge monitoring system was developed and tested on the US 30 Bridge over the South Skunk River in Ames, Iowa. The functions and performance of the developed system, including strain data, energy harvesting capacity, and wireless transmission quality, were studied and are covered in this volume.
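The control-chart idea from Volume I can be illustrated with a minimal sketch; the assumed details below (a Shewhart-style chart on strain readings with 3-sigma limits and entirely made-up values) stand in for BECAS and the report's specific chart rules, which are not reproduced here.

```python
import statistics

# Minimal illustration of a control-chart-style damage indicator on strain data.
# Assumed approach (not BECAS itself): establish 3-sigma limits from a healthy
# baseline period, then flag later readings that fall outside those limits.

baseline_strain = [102.1, 99.8, 101.4, 100.2, 98.9, 100.7, 101.1, 99.5]  # microstrain, baseline
center = statistics.mean(baseline_strain)
sigma = statistics.stdev(baseline_strain)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # upper/lower control limits

new_readings = [100.3, 101.0, 99.2, 106.8, 107.4]    # monitoring period (made-up values)
for i, value in enumerate(new_readings):
    if value > ucl or value < lcl:
        print(f"reading {i}: {value} outside [{lcl:.1f}, {ucl:.1f}] -> possible damage indication")
```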
Abstract:
The Iowa Department of Transportation initiated this research to evaluate the reliability, benefit and application of the corrosion detection device. Through field testing prior to repair projects and inspection at the time of repair, the device was shown to be reliable. With the reliability established, twelve additional devices were purchased so that this evaluation procedure could be used routinely on all repair projects. The corrosion detection device was established as a means for determining concrete removal for repair. Removal of the concrete down to the top reinforcing steel is required for all areas exhibiting electrical potentials greater than 0.45 Volt. It was determined that the corrosion detection device was not applicable to membrane testing. The corrosion detection device has been used to evaluate corrosion of reinforcing steel in continuously reinforced concrete pavement.
Abstract:
Raman spectroscopy combined with chemometrics has recently become a widespread technique for the analysis of pharmaceutical solid forms. The application presented in this paper is the investigation of counterfeit medicines. This increasingly serious issue involves networks that are an integral part of industrialized organized crime. Efficient analytical tools are consequently required to fight against it. Quick and reliable means of authentication are needed so that the company and the authorities can deploy countermeasures. For this purpose, a two-step method has been implemented here. The first step enables the identification of pharmaceutical tablets and capsules and the detection of their counterfeits. A nonlinear classification method, Support Vector Machines (SVM), is applied together with a correlation against the database and the detection of Active Pharmaceutical Ingredient (API) peaks in the suspect product. If a counterfeit is detected, the second step allows its chemical profiling against former counterfeits from a forensic intelligence perspective. For this second step, a classification based on Principal Component Analysis (PCA) and correlation distance measurements is applied to the Raman spectra of the counterfeits.
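A minimal sketch of the two-step idea is given below using scikit-learn; the spectra, labels, preprocessing, and SVM kernel are all hypothetical stand-ins, since the paper's actual spectral database and model settings are not described in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: rows are Raman spectra (intensity per wavenumber bin),
# labels distinguish genuine products from counterfeits. Real work would add
# baseline correction, normalization, and a curated spectral database.
rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 500))          # 60 spectra x 500 wavenumber bins
labels = np.array([0] * 30 + [1] * 30)        # 0 = genuine, 1 = counterfeit

# Step 1: nonlinear SVM classification of genuine vs counterfeit spectra.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(spectra, labels)
suspect = rng.normal(size=(1, 500))
print("counterfeit suspected" if clf.predict(suspect)[0] == 1 else "consistent with genuine")

# Step 2: chemical profiling of counterfeits with PCA + correlation distance,
# grouping a new counterfeit with previously seen ones.
counterfeits = spectra[labels == 1]
pca = PCA(n_components=5).fit(counterfeits)
scores = pca.transform(counterfeits)
new_scores = pca.transform(suspect)
corr_dist = [1 - np.corrcoef(new_scores[0], s)[0, 1] for s in scores]
print("closest previous counterfeit:", int(np.argmin(corr_dist)))
```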
Abstract:
Polysialic acid is a carbohydrate polymer which consists of N-acetylneuraminic acid units joined by alpha-2,8-linkages. It is developmentally regulated and has an important role during normal neuronal development. In adults, it participates in complex neurological processes, such as memory, neural plasticity, tumor cell growth and metastasis. Polysialic acid also constitutes the capsule of some meningitis- and sepsis-causing bacteria, such as Escherichia coli K1, group B meningococci, Mannheimia haemolytica A2 and Moraxella nonliquefaciens. Polysialic acid is poorly immunogenic; high-affinity antibodies against it are therefore difficult to prepare, and specific, fast detection methods are needed. Endosialidase is an enzyme derived from the E. coli K1 bacteriophage, which specifically recognizes and degrades polysialic acid. In this study, a novel detection method for polysialic acid was developed based on a fusion protein of inactive endosialidase and green fluorescent protein. It exploits the ability of the mutant, inactive endosialidase to bind, but not cleave, polysialic acid. Sequencing of the endosialidase gene revealed that amino acid substitutions near the active site of the enzyme differentiate the active and inactive forms of the enzyme. The fusion protein was applied to the detection of polysialic acid in bacteria and neuroblastoma. The results indicate that the fusion protein is a fast, sensitive and specific reagent for the detection of polysialic acid. The use of an inactive enzyme as a specific molecular tool for the detection of its substrate represents an approach which could potentially find wide applicability in the specific detection of diverse macromolecules.
Abstract:
This paper reports on the development and validation of a loop-mediated isothermal amplification (LAMP) assay for the rapid and specific detection of Actinobacillus pleuropneumoniae (A. pleuropneumoniae). A set of six primers was designed from the dsbE-like gene of A. pleuropneumoniae, and the assay was validated using 9 A. pleuropneumoniae reference/field strains, 132 clinical isolates and 9 other pathogens. Positive reactions were confirmed by LAMP at 63°C for 60 min for all A. pleuropneumoniae strains and specimens, and no cross-reactivity was observed with the non-A. pleuropneumoniae pathogens, including Haemophilus parasuis, Escherichia coli, Pasteurella multocida, Bordetella bronchiseptica, Streptococcus suis, Salmonella enterica, Staphylococcus, porcine reproductive and respiratory syndrome virus (PRRSV), and pseudorabies virus. The detection limit of conventional PCR was 10² CFU per PCR test tube, while that of the LAMP assay was 5 copies per tube; the sensitivity of LAMP was therefore higher than that of PCR. Moreover, the LAMP assay provides a rapid yet simple test for A. pleuropneumoniae suitable for laboratory diagnosis and pen-side detection, owing to its ease of operation and the requirement of only a regular water bath or heat block for the reaction.
Abstract:
The Billings and Guarapiranga Reservoirs have been deeply affected by environmental disturbances, whose most evident consequence is cyanobacterial blooms. Microcystins are the most common cyanotoxins in freshwaters, and more than 70 variants are known. Different methods for microcystin analysis in water can be used, among which ELISA and HPLC are the most frequently employed. However, less sophisticated and more economical methods can also be used. This is the case of planar chromatography (thin-layer chromatography), a method previously used in cyanotoxin purification but gradually replaced by others. After optimization of the chromatographic conditions for microcystins, and because of its simplicity, rapidity, efficiency and low cost, this method is again considered an option for the analysis of microcystins and nodularins. Considering the importance of the Billings and Guarapiranga Reservoirs for drinking water supplies and the scarcity of scientific data about cyanobacteria and cyanotoxins in these water bodies, the aims of this work were to analyze the biodiversity of cyanobacteria in the Billings and Guarapiranga Reservoirs and to detect dissolved microcystins in the water. It was possible to identify 17 species of cyanobacteria, 9 of them potentially toxic. In the Billings Reservoir, Microcystis aeruginosa (Kützing) Kützing and Cylindrospermopsis raciborskii (Woloszynska) Seenayya & Subba Raju were the most common species, while in the Guarapiranga Reservoir only M. aeruginosa was considered a common species. Microcystins were detected in all Billings Reservoir samples and in only one sample from the Guarapiranga Reservoir.
Abstract:
Changes are continuously made to the source code of software systems to meet client needs and to correct faults. These continuous changes can lead to code and design defects. Design defects are poor solutions to recurring design or implementation problems, typically in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, lack of understanding, and their level of experience, developers cannot always follow design standards and coding techniques such as design patterns. Consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without design defects. Yet only a few of these authors have conducted an empirical study of the impact of design defects on comprehension, and none has studied their impact on the effort developers spend fixing faults. In this thesis, we make three main contributions. The first contribution is an empirical study providing evidence of the impact of design defects on comprehension and change. We design and carry out two experiments with 59 subjects to assess the impact of combining two occurrences of Blob or two occurrences of Spaghetti Code on the performance of developers carrying out comprehension and change tasks. We measure developer performance using: (1) the NASA task load index for their effort, (2) the time they spent completing their tasks, and (3) their percentage of correct answers. The results of the two experiments show that two occurrences of Blob or Spaghetti Code are a significant obstacle to developer performance during comprehension and change tasks. These results justify previous research on the specification and detection of design defects. Software development teams should warn developers against high numbers of design-defect occurrences and recommend refactorings at each step of the development process to remove these defects when possible. In the second contribution, we study the relation between design defects and faults, investigating the impact of the presence of design defects on the effort needed to fix faults. We measure fault-fixing effort with three indicators: (1) the duration of the fixing period, (2) the number of fields and methods touched by the fix, and (3) the entropy of the fixes in the source code. We conduct an empirical study with 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results show that the fixing period is longer for faults involving classes with design defects, and that fixing faults in classes with design defects changes more files, fields, and methods. We also observed that, after a fault is fixed, the number of design-defect occurrences in the classes involved in the fix decreases. Understanding the impact of design defects on the effort developers spend fixing faults helps development teams better assess and predict the impact of their design decisions and thus channel their efforts toward improving the quality of their systems. Development teams should monitor and remove design defects from their systems because they are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool able to detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce costs, effort, and resources by letting practitioners identify and take into account design-defect occurrences as they find them during comprehension and change. Researchers have proposed approaches to detect design-defect occurrences, but these approaches currently have four limitations: (1) they require extensive knowledge of design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach to detect design defects based on a machine-learning technique, support vector machines, that takes practitioners' feedback into account. Through an empirical study of three systems and four design defects, we show that the precision and recall of SMURF are higher than those of DETEX and BDTEX when detecting design-defect occurrences. We also show that SMURF can be applied in both intra-system and inter-system configurations. Finally, we show that SMURF's precision and recall improve when practitioners' feedback is taken into account.
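The abstract gives no implementation details for SMURF; the sketch below only illustrates the underlying idea it names, a support-vector-machine classifier over class-level code metrics that is retrained as practitioner feedback arrives. The metrics, values, and labels are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

# Minimal sketch of the idea named in the abstract: an SVM that labels classes
# as design-defect occurrences (e.g., Blob) or not, retrained when practitioners
# confirm or reject its suggestions. Metric choice and values are hypothetical.

# Features per class: [lines of code, number of methods, number of attributes, coupling]
X_train = np.array([
    [2500, 80, 45, 30],   # labeled Blob
    [3100, 95, 60, 42],   # labeled Blob
    [ 300, 12,  5,  4],   # labeled not a Blob
    [ 450, 18,  8,  6],   # labeled not a Blob
])
y_train = np.array([1, 1, 0, 0])

model = SVC(kernel="rbf").fit(X_train, y_train)

candidate = np.array([[2700, 70, 50, 25]])   # a class from the system under analysis
print("suspected Blob" if model.predict(candidate)[0] == 1 else "no defect suspected")

# Practitioner feedback loop: the reviewed candidate is added to the training
# set with its confirmed label and the model is retrained.
X_train = np.vstack([X_train, candidate])
y_train = np.append(y_train, 1)              # practitioner confirms it is a Blob
model = SVC(kernel="rbf").fit(X_train, y_train)
```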
Abstract:
This master's thesis presents a new unsupervised approach for detecting and segmenting urban areas in hyperspectral images. The proposed method consists of three steps. First, to reduce the computational cost of our algorithm, a color image of the spectral content is estimated. To this end, a non-linear dimensionality reduction step, based on two complementary but conflicting criteria of good visualization, namely accuracy and contrast, is performed to produce a color rendering of each hyperspectral image. Then, to discriminate urban from non-urban areas, the second step extracts a few discriminant (and complementary) features from this color hyperspectral image. We extracted a set of discriminant parameters describing the characteristics of an urban area, which is mainly composed of man-made objects with simple, geometric, regular shapes. We used textural features based on gray levels, gradient magnitude, or parameters derived from the co-occurrence matrix, combined with structural features based on the local orientation of the image gradient and on local line-segment detection. To further reduce the computational complexity of our approach and to avoid the "curse of dimensionality" that arises when clustering high-dimensional data, the last step clusters each textural or structural feature individually with a simple K-means procedure and then combines these coarse segmentations, obtained at low cost, with an efficient segmentation-map fusion model. The experiments reported here show that this strategy is visually effective and compares favorably with other methods for detecting and segmenting urban areas in hyperspectral images.
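A minimal sketch of the final step as described above is given below: each feature map is clustered separately with K-means and the coarse segmentations are then fused. The thesis uses a dedicated fusion model that is not specified in the abstract, so a simple per-pixel majority vote stands in for it, and the feature maps are random placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch of per-feature K-means clustering followed by fusion of the coarse
# segmentations. The feature maps are hypothetical; the actual fusion model
# from the thesis is replaced here by a per-pixel majority vote.

rng = np.random.default_rng(1)
h, w, n_features = 64, 64, 4
feature_maps = rng.random((n_features, h, w))   # stand-ins for texture/structure maps

segmentations = []
for fmap in feature_maps:
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(fmap.reshape(-1, 1))
    # Assume (for illustration) that the cluster with the stronger feature response is "urban".
    urban_cluster = int(np.argmax(km.cluster_centers_.ravel()))
    segmentations.append((km.labels_ == urban_cluster).astype(int).reshape(h, w))

# Fusion: a pixel is called "urban" if a majority of the coarse maps agree.
votes = np.sum(segmentations, axis=0)
fused = (votes > n_features / 2).astype(int)
print("fraction labeled urban:", fused.mean())
```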