990 results for Step detection
Abstract:
Introduction: Approximately one fifth of stage I and II colon cancer patients will suffer from recurrent disease. This is partly due to the presence of small nodal tumour infiltrates, which go undetected by standard histopathology using Haematoxylin & Eosin (H&E) staining on one slice, so that these patients may not receive beneficial adjuvant therapy. A new semi-automatic diagnostic system, called one-step nucleic acid amplification (OSNA), was recently designed for the detection of cytokeratin 19 (CK19) mRNA as a surrogate for lymph node metastases. The objective of the present investigation was to compare the performance of OSNA with both standard H&E and intensive histopathologic analyses in the detection of colon cancer lymph node micro- and macro-metastases. Methods: In this prospective study, 313 lymph nodes from 22 consecutive stage I-III colon cancer patients were assessed. Half of each lymph node was analysed initially on one H&E slice, followed by an intensive histologic work-up (5 levels of H&E and immunohistochemistry staining for each slice); the other half was analysed using OSNA. Results: All OSNA results were available in less than 40 minutes. Fifty-one lymph nodes were positive and 246 lymph nodes negative with both OSNA and standard H&E. OSNA was more sensitive than H&E in detecting small nodal tumour infiltrates (11 OSNA-positive/H&E-negative). Compared with intensive histopathologic analyses, OSNA had a sensitivity of 94.5% and a specificity of 97.6% for the detection of lymph node micro- and macro-metastases, with a concordance rate of 97.1%. Upstaging due to OSNA occurred in 2/13 (15.3%) initially node-negative colon cancer patients. Conclusion: OSNA appears to be a powerful and promising molecular tool for the detection of lymph node macro- and micro-metastases in colon cancer patients. OSNA performs similarly to intensive histopathologic investigation in the detection of micro- and macro-metastases and appears to be superior to standard histology with H&E. Since OSNA allows analysis of the whole lymph node, the problem of sampling bias and of tumour deposits left undetected in uninvestigated material will be overcome in the future, and OSNA may thus improve staging in colon cancer patients. It is hoped that this improved staging will lead to better patient selection for adjuvant therapy and, consequently, to improved local and distant control as well as better overall survival.
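The accuracy figures quoted above follow from a 2x2 cross-tabulation of OSNA against the reference standard (intensive histopathology). As a minimal illustration, the Python sketch below computes sensitivity, specificity and concordance from such a table; the counts used are hypothetical placeholders, not the study's actual data.

```python
# Illustrative sketch only: sensitivity, specificity and concordance derived
# from a 2x2 table of a new test (OSNA) against a reference standard.
# The counts below are hypothetical placeholders, not the study's data.

def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return sensitivity, specificity and concordance (overall agreement)."""
    sensitivity = tp / (tp + fn)          # true positives among reference-positive nodes
    specificity = tn / (tn + fp)          # true negatives among reference-negative nodes
    concordance = (tp + tn) / (tp + fp + fn + tn)
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "concordance": concordance}

# Hypothetical example: 52 concordant positives, 6 false positives,
# 3 false negatives, 252 concordant negatives.
print(diagnostic_accuracy(tp=52, fp=6, fn=3, tn=252))
```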
Abstract:
Agricultural practices, such as spreading liquid manure or using land as animal pasture, can result in faecal contamination of water resources. Rhodococcus coprophilus is used in microbial source tracking to indicate animal faecal contamination in water. Methods previously described for detecting R. coprophilus in water were neither sensitive nor specific. The aim of this study was therefore to design and validate a new quantitative polymerase chain reaction (qPCR) assay to improve the detection of R. coprophilus in water. The new PCR assay was based on the R. coprophilus 16S rRNA gene. Validation showed that the new approach was specific and sensitive for deoxyribonucleic acid from the target host species. Compared with the other PCR assays tested in this study, the detection limit of the new qPCR was 1 to 3 log units lower. The method, which includes a filtration step, was further validated and successfully used in a field investigation in Switzerland. Our work demonstrated that the new method is sensitive and robust for detecting R. coprophilus in surface and spring water. Compared with PCR assays available in the literature or with the culture-dependent method, the new molecular approach improves the detection of R. coprophilus.
Abstract:
Raman spectroscopy combined with chemometrics has recently become a widespread technique for the analysis of pharmaceutical solid forms. The application presented in this paper is the investigation of counterfeit medicines. This increasingly serious issue involves networks that are an integral part of industrialized organized crime, and efficient analytical tools are consequently required to fight it. Quick and reliable authentication methods are needed to allow companies and authorities to deploy countermeasures. For this purpose, a two-step method has been implemented here. The first step enables the identification of pharmaceutical tablets and capsules and the detection of their counterfeits. A nonlinear classification method, Support Vector Machines (SVM), is applied together with a correlation against the database and the detection of Active Pharmaceutical Ingredient (API) peaks in the suspect product. If a counterfeit is detected, the second step allows its chemical profiling against former counterfeits from a forensic intelligence perspective. For this second step, a classification based on Principal Component Analysis (PCA) and correlation distance measurements is applied to the Raman spectra of the counterfeits.
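As a rough illustration of this two-step workflow, the following Python sketch combines an SVM identification step with PCA-plus-correlation-distance profiling; the spectra files, preprocessing and model parameters are hypothetical assumptions, not the authors' implementation.

```python
# Minimal sketch of the two-step workflow described above, using scikit-learn.
# Spectra, labels and file names are hypothetical placeholders; the paper's
# actual preprocessing (baseline correction, normalisation) is omitted.
import numpy as np
from sklearn.svm import SVC
from sklearn.decomposition import PCA
from scipy.spatial.distance import correlation  # correlation distance = 1 - Pearson r

# X_ref: reference Raman spectra of genuine products, y_ref: product identity labels
X_ref = np.load("reference_spectra.npy")        # shape (n_spectra, n_wavenumbers)
y_ref = np.load("reference_labels.npy")

# Step 1 -- identify the product / flag counterfeits with a nonlinear SVM
clf = SVC(kernel="rbf", probability=True).fit(X_ref, y_ref)
suspect = np.load("suspect_spectrum.npy").reshape(1, -1)
predicted_product = clf.predict(suspect)[0]

# Step 2 -- if flagged as counterfeit, profile it against previous seizures
X_cft = np.load("counterfeit_spectra.npy")      # spectra of former counterfeits
pca = PCA(n_components=5).fit(X_cft)
scores_cft = pca.transform(X_cft)
scores_new = pca.transform(suspect)
dists = [correlation(scores_new[0], s) for s in scores_cft]
closest = int(np.argmin(dists))                 # most similar earlier counterfeit
```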
Abstract:
PURPOSE: Prostate cancer is the most commonly diagnosed cancer in the United States. The diagnosis or follow-up of prostate cancer in men older than 50 years is based on digital rectal examination, measurement of the free-to-total prostate specific antigen ratio and transrectal ultrasound-assisted needle biopsy of the prostate. We developed and evaluated a noninvasive method for diagnosing prostate cancer based on the measurement of telomerase activity after prostatic massage in fresh voided urine or after urethral washing. MATERIALS AND METHODS: We obtained 36 specimens of cells after prostatic massage from the fresh voided urine of 16 patients who subsequently underwent radical prostatectomy and from urethral washings in 20 who underwent prostate needle biopsies. Ethylenediaminetetraacetic acid was immediately added to the collected urine or washing to a final concentration of 20 mM. After protein extraction with CHAPS buffer, each specimen was tested for telomerase activity in a 2-step modified telomeric repeat amplification protocol assay. The 2 prostate cancer cell lines PC-3 and LNCaP, which have high telomerase activity, were used as positive controls. RESULTS: Telomerase activity was detected in 14 of 24 samples with known prostate cancer (sensitivity 58%). In contrast, no telomerase activity was found in the 12 cases without histological evidence of prostate tumor (specificity 100%). Eight of 9 poorly differentiated cancers expressed telomerase activity (89%), while only 6 of 15 well and moderately differentiated cancers showed telomerase activity (40%). CONCLUSIONS: Our data illustrate that telomerase activity may be detected in voided urine or washings after prostatic massage in patients with prostate cancer. Sensitivity was higher for poorly differentiated tumors. This approach is not currently available for detecting prostate cancer in clinical practice. However, these results are promising and further studies are ongoing.
Abstract:
Water delivered by dental units during routine dental practice is densely contaminated by bacteria. The aim of this study was to determine the actual recovery of microorganisms sprayed from Dental Unit Water Lines (DUWLs) when enrichment cultures are performed, and to compare these frequencies with those obtained without enrichment cultures. The antimicrobial susceptibilities of the isolated microorganisms were also studied. Water samples were collected from one hundred dental units in use at the Dental Hospital of our University in order to evaluate the presence or absence of microorganisms and to perform their presumptive identification. Aliquots from all of the samples were inoculated into eight different media, including both enrichment and selective media. Minimal inhibitory concentrations (MIC) were determined by the broth dilution method. The results reported here demonstrate that most of the DUWLs were colonized by bacteria from the human oral cavity; when enrichment procedures were applied, the percentage of DUWLs with detectable human bacteria was one hundred percent. The results showed that, in order to evaluate the actual risk of infections spread by DUWLs, a pre-enrichment step should be included. Devices preventing bacterial contamination of DUWLs are a goal to be achieved in the near future and would contribute to maintaining safety in dental care.
Abstract:
To develop systems for detecting Alzheimer's disease, we want to use EEG signals. The available database is raw, so the first step must be to clean the signals properly. We propose a new ICA-based cleaning procedure for a database recorded from patients with Alzheimer's disease (mild AD, early stage). Two researchers visually inspected all the signals (EEG channels), and each recording's least corrupted (artefact-clean) continuous 20 s interval was chosen for the analysis. Each trial was then decomposed using ICA. Sources were ordered using a kurtosis measure, and the researchers removed up to seven sources per trial corresponding to artefacts (eye movements, EMG corruption, EKG, etc.), using three criteria: (i) isolated source on the scalp (only a few electrodes contribute to the source), (ii) abnormal wave shape (drifts, eye blinks, sharp waves, etc.), (iii) source of abnormally high amplitude (above 100 µV). We then evaluated the outcome of this cleaning through the classification of patients using multilayer perceptron neural networks. The results are very satisfactory: the ICA cleaning procedure increased the percentage of correctly classified data from 50.9% to 73.1%.
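A minimal Python sketch of the decomposition and kurtosis-ordering step is given below, using FastICA from scikit-learn; the data file, number of components and indices of rejected sources are hypothetical assumptions, and the visual selection criteria described above are not automated here.

```python
# Minimal sketch of the ICA step described above, using FastICA from
# scikit-learn; the EEG array and the artefact-source indices chosen by the
# reviewers are hypothetical placeholders.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

eeg = np.load("eeg_20s_segment.npy")       # shape (n_samples, n_channels)

ica = FastICA(n_components=eeg.shape[1], random_state=0)
sources = ica.fit_transform(eeg)           # independent components, (n_samples, n_components)

# Order components by kurtosis: artefacts (blinks, EMG bursts) tend to be
# highly non-Gaussian and therefore appear first in this ordering.
order = np.argsort(-np.abs(kurtosis(sources, axis=0)))

# After visual inspection, zero out the components judged to be artefactual
# (here: the first two in the kurtosis ordering, purely as an example).
rejected = order[:2]
sources_clean = sources.copy()
sources_clean[:, rejected] = 0.0
eeg_clean = ica.inverse_transform(sources_clean)   # back-project to channel space
```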
Abstract:
The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
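One common way to exploit such motion-sensor channels is channel-wise regression of the reference signals out of the EEG. The sketch below illustrates that idea under stated assumptions; it is not the authors' exact correction pipeline, which also includes AAS pulse-artifact correction and ICA denoising.

```python
# Minimal sketch of regressing motion-sensor reference signals out of the EEG
# channels, applied after pulse-artifact (AAS) correction and before ICA.
# Arrays and file names are hypothetical placeholders.
import numpy as np

eeg = np.load("eeg_after_aas.npy")        # shape (n_samples, n_eeg_channels)
motion = np.load("motion_sensors.npy")    # shape (n_samples, 4) -- the four motion-sensor electrodes

# Least-squares fit of each EEG channel on the motion-sensor channels,
# then subtraction of the fitted (motion-induced) part.
design = np.column_stack([motion, np.ones(len(motion))])   # add intercept
beta, *_ = np.linalg.lstsq(design, eeg, rcond=None)
eeg_motion_corrected = eeg - design @ beta
```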
Abstract:
Objective: To develop procedures to ensure consistent printing quality of digital images, by means of quantitative hardcopy analysis based on a standard image. Materials and Methods: Characteristics of mammography DI-ML and general-purpose DI-HL films were studied with the QC-Test, using different processing techniques on a FujiFilm® DryPix 4000 printer. Software was developed for sensitometric evaluation, generating a digital image that includes a gray scale and a bar pattern to evaluate contrast and spatial resolution. Results: Mammography films showed a maximum optical density of 4.11 and general-purpose films, 3.22. The digital image was developed with a 33-step wedge scale and a high-contrast bar pattern (1 to 30 lp/cm) for spatial resolution evaluation. Conclusion: Mammographic films presented higher maximum optical density and contrast resolution than general-purpose films. The digital processing technique used could only change the image pixel-matrix values and did not affect the printing standard. The proposed standard digital image allows greater control of the relationship between pixel values and the optical density obtained in the quality analysis of films and printing systems.
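For illustration only, the following Python sketch generates a synthetic test image containing a 33-step gray wedge and a simple high-contrast bar pattern; the dimensions, bit depth and layout are assumptions and do not reproduce the standard image described above.

```python
# Illustrative sketch of a synthetic sensitometric test image: a 33-step gray
# wedge plus a bar pattern. Dimensions and bit depth are hypothetical.
import numpy as np

steps, step_w, height = 33, 30, 200
wedge = np.repeat(np.linspace(0, 4095, steps).astype(np.uint16), step_w)   # 12-bit gray levels
wedge_img = np.tile(wedge, (height, 1))                                    # step-wedge strip

# High-contrast bar pattern: alternating black/white bars of decreasing width
bars = np.concatenate([np.full(w, 4095 * (i % 2), dtype=np.uint16)
                       for i, w in enumerate(range(20, 1, -1))])
bar_img = np.tile(bars, (height, 1))

test_image = np.hstack([wedge_img, bar_img])   # combined test image
```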
Abstract:
In the world of transport management, the term 'anticipation' is gradually replacing 'reaction'. Indeed, the ability to forecast traffic evolution in a network should ideally form the basis for many traffic management strategies and multiple ITS applications. Real-time prediction capabilities are therefore becoming a concrete need for network management, in both urban and interurban environments, and today's road operator has increasingly complex and exacting requirements. Recognising temporal patterns in traffic, or the manner in which sequential traffic events evolve over time, has been an important consideration in short-term traffic forecasting. However, little work has been conducted on identifying or associating traffic pattern occurrence with prevailing traffic conditions. This paper presents a framework for traffic pattern identification based on finite mixture models, using the EM algorithm for parameter estimation. The computations were carried out on traffic data available for an urban network.
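As a minimal sketch of this idea, the snippet below fits a Gaussian finite mixture model by EM to a matrix of traffic observations using scikit-learn; the feature file, the number of components and the use of BIC for model selection are assumptions, not details taken from the paper.

```python
# Minimal sketch: fitting a finite mixture model with the EM algorithm to
# traffic observations (e.g. flow and speed per detector interval).
# Data file and number of components are hypothetical placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

X = np.load("urban_traffic_features.npy")   # shape (n_intervals, n_features), e.g. [flow, speed]

# GaussianMixture fits the mixture by EM; the number of traffic regimes
# (components) would in practice be chosen with a criterion such as BIC.
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0).fit(X)
pattern_labels = gmm.predict(X)              # most likely traffic pattern per interval
pattern_probs = gmm.predict_proba(X)         # soft assignment to each pattern
print(gmm.bic(X))
```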
Abstract:
A two-step experiment is proposed for a third year class in experimental organic chemistry. Over a period of five weeks, the students synthesized calix[4]pyrrole, a receptor that is highly selective for fluoride, and a pyridinium N-phenolate dye. Subsequently, the students used the synthesized compounds to investigate a displacement assay on the basis of the competition in acetonitrile between fluoride and the dye for calix[4]pyrrole. The experiment increased the students' skills in organic synthesis and in the characterization of organic compounds, provided a very attractive and accessible illustration of important supramolecular phenomena, and allowed the study of a chromogenic chemosensor.
Abstract:
A method to detect Apple stem grooving virus (ASGV) based on reverse transcription polymerase chain reaction (RT-PCR) was developed using primers ASGV4F-ASGV4R targeting the viral replicase gene, followed by sandwich hybridisation in microtiter plates for colorimetric detection of the PCR products. The RT-PCR was performed with the Titan™ RT-PCR system, using AMV reverse transcriptase and diluted crude extracts of apple (Malus domestica) leaf or bark for first-strand synthesis and a mixture of Taq and Pwo DNA polymerases for the PCR step. The RT-PCR product was hybridised with both a biotin-labelled capture probe linked to a streptavidin-coated microtiter plate and a digoxigenin (DIG)-labelled detection probe. The complex was detected with an anti-DIG conjugate labelled with alkaline phosphatase. When purified ASGV was added to extracts of plant tissue, as little as 400 fg of the virus was detected with this method. The assay with ASGV4F-ASGV4R primers specifically detected the virus in ASGV-infected apple trees from different origins, whereas no signal was observed with amplification products obtained with primers targeting the coat protein region of the ASGV genome or with primers specific for Apple chlorotic leaf spot virus (ACLSV) and Apple stem pitting virus (ASPV). The technique combines the power of PCR to increase the number of copies of the targeted gene, the specificity of DNA hybridisation, and the ease of colorimetric detection and sample handling in microplates.
Abstract:
In order to develop a molecular method for detection and identification of Xanthomonas campestris pv. viticola (Xcv), the causal agent of grapevine bacterial canker, primers were designed based on the partial sequence of the hrpB gene. Primer pairs Xcv1F/Xcv3R and RST2/Xcv3R, which amplified 243- and 340-bp fragments, respectively, were tested for specificity and sensitivity in detecting Xcv DNA. Amplification was positive with DNA from 44 Xcv strains as well as with DNA from four strains of X. campestris pv. mangiferaeindicae and five strains of X. axonopodis pv. passiflorae, with both primer pairs; however, enzymatic digestion of the PCR products could differentiate Xcv strains from the others. Neither primer pair amplified DNA from grapevine, from 20 strains of nonpathogenic bacteria from grape leaves, or from 10 strains of six representative genera of plant-pathogenic bacteria. The sensitivity of primers Xcv1F/Xcv3R and RST2/Xcv3R was 10 pg and 1 pg of purified Xcv DNA, respectively. The detection limit of primers RST2/Xcv3R was 10⁴ CFU/ml, but this limit could be lowered to 10² CFU/ml with a second round of amplification using the internal primer Xcv1F. The presence of Xcv in tissues of grapevine petioles previously inoculated with Xcv could not be detected by PCR when macerated extract was added directly to the reaction; however, amplification was positive when an agar-plating step was introduced prior to PCR. Xcv could be detected in 1 µl of the plate wash and in a cell suspension obtained from a single colony. The identity of the bacterium was confirmed by RFLP analysis of the RST2/Xcv3R amplification products digested with HaeIII.
Abstract:
The use of the flow vs. time relationship obtained with the nasal prongs of the AutoSet™ (AS) system (diagnosis mode) has been proposed to detect apneas and hypopneas in patients with reasonable nasal patency. Our aim was to compare the accuracy of AS with that of a computerized polysomnographic (PSG) system. The study was conducted on 56 individuals (45 men) with clinical characteristics of obstructive sleep apnea (OSA). Their mean (± SD) age was 44.6 ± 12 years and their body mass index was 31.3 ± 7 kg/m². Data were submitted to parametric analysis to determine the agreement between methods, and the intraclass correlation coefficient was calculated. The Student t-test and Bland and Altman plots were also used. Twelve patients had an apnea-hypopnea index (AHI) <10 in bed and 20 had values >40. The mean (± SD) AHI PSG of 37.6 (28.8) was significantly lower (P = 0.0003) than the AHI AS of 41.8 (25.3), but there was a high intraclass correlation coefficient (0.93), with a variance of 0.016. For an AHI threshold of 20, AS showed 73.0% accuracy, 97% sensitivity and 60% specificity, with positive and negative predictive values of 78% and 93%, respectively. Sensitivity, specificity and negative predictive values increased in parallel with the increase in the AHI threshold for detecting OSA. However, when the PSG-AS differences in AHI were plotted against their means, the limits of agreement between the methods (95% of the differences) were +13 and -22, showing the discrepancy between the AHI values obtained with PSG and AS. Finally, cubic regression analysis was used to better predict the AHI PSG as a function of the proposed method, i.e., the AHI AS. We conclude that, despite these differences, the AHI measured by AutoSet™ can be useful for the assessment of patients with a high pre-test clinical probability of OSA for whom standard PSG is not possible as an initial diagnostic step.
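For readers unfamiliar with the agreement statistics used here, the sketch below shows how the paired t-test, the bias and the 95% limits of agreement (Bland-Altman) would be computed from per-patient AHI values; the input arrays are hypothetical placeholders.

```python
# Minimal sketch of the agreement analysis described above: paired t-test and
# Bland-Altman limits of agreement between AHI estimated by PSG and by AutoSet.
# The arrays are hypothetical placeholders for per-patient AHI values.
import numpy as np
from scipy.stats import ttest_rel

ahi_psg = np.load("ahi_psg.npy")          # one value per patient
ahi_as = np.load("ahi_autoset.npy")

t_stat, p_value = ttest_rel(ahi_psg, ahi_as)    # paired Student t-test

diff = ahi_psg - ahi_as
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement
print(f"p = {p_value:.4f}, bias = {bias:.1f}, LoA = [{loa_low:.1f}, {loa_high:.1f}]")
```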
Abstract:
The increasing presence of products derived from genetically modified (GM) plants in human and animal diets has led to the development of detection methods to distinguish biotechnology-derived foods from conventional ones. Conventional and real-time PCR have been used, respectively, to detect and quantify GM residues in highly processed foods. DNA extraction is a critical step in the analysis process. Factors such as DNA degradation, matrix effects, and the presence of PCR inhibitors mean that a detection or quantification limit established for a given method is restricted to the matrix used during validation and cannot be extrapolated to any other matrix outside the scope of the method. In Brazil, sausage samples were the main class of processed products in which Roundup Ready® (RR) soybean residues were detected. Thus, the validation of methodologies for the detection and quantification of those residues is absolutely necessary. Sausage samples were submitted to two different DNA extraction methods: a modified Wizard method and the CTAB method. The yield and quality were compared for both methods. DNA samples were analyzed by conventional and real-time PCR for the detection and quantification of Roundup Ready® soybean. At least 200 ng of total sausage DNA was necessary for reliable quantification; reactions containing less DNA led to large variations from the expected GM percentage. In conventional PCR, the detection limit varied from 1.0 to 500 ng, depending on the GM soybean content of the sample. The precision, performance, and linearity were relatively high, indicating that the method used for the analysis was satisfactory.
Abstract:
Changes are made continuously to software source code to take customer needs into account and to fix faults. Continuous changes can lead to code and design defects. Design defects are poor solutions to recurring design or implementation problems, generally in object-oriented development. During comprehension and change activities, and because of time-to-market pressure, lack of understanding, and their experience, developers cannot always follow design standards and coding techniques such as design patterns. Consequently, they introduce design defects into their systems. In the literature, several authors have argued that design defects make object-oriented systems harder to understand, more fault-prone, and harder to change than systems without design defects. Yet only a few of these authors have conducted an empirical study of the impact of design defects on comprehension, and none of them has studied the impact of design defects on the effort developers spend fixing faults. In this thesis, we propose three main contributions. The first contribution is an empirical study providing evidence of the impact of design defects on comprehension and change. We design and carry out two experiments with 59 subjects to evaluate the impact of the combination of two occurrences of Blob or two occurrences of spaghetti code on the performance of developers performing comprehension and change tasks. We measure developer performance using: (1) the NASA task load index for their effort, (2) the time they spent completing their tasks, and (3) the percentages of correct answers. The results of the two experiments showed that two occurrences of Blob or of spaghetti code are a significant obstacle to developer performance during comprehension and change tasks. These results justify previous research on the specification and detection of design defects. Software development teams should warn developers against high numbers of design-defect occurrences and recommend refactorings at each step of the development process to remove these design defects where possible. In the second contribution, we study the relationship between design defects and faults. We study the impact of the presence of design defects on the effort required to fix faults. We measure the fault-fixing effort with three indicators: (1) the duration of the fixing period, (2) the number of fields and methods affected by the fault fix, and (3) the entropy of the fault fixes in the source code. We conduct an empirical study with 12 design defects detected in 54 releases of four systems: ArgoUML, Eclipse, Mylyn, and Rhino. Our results showed that the fixing period is longer for faults involving classes with design defects. Moreover, fixing faults in classes with design defects changes more files, more fields, and more methods.
We also observed that, after a fault is fixed, the number of design-defect occurrences in the classes involved in the fix decreases. Understanding the impact of design defects on the effort developers spend fixing faults is important to help development teams better assess and anticipate the impact of their design decisions, and thus channel their effort into improving the quality of their systems. Development teams should monitor and remove design defects from their systems, as they are likely to increase change effort. The third contribution concerns the detection of design defects. During maintenance activities, it is important to have a tool able to detect design defects incrementally and iteratively. Such an incremental and iterative detection process could reduce costs, effort, and resources by allowing practitioners to identify and handle design-defect occurrences as they encounter them during comprehension and change. Researchers have proposed approaches to detect design-defect occurrences, but these approaches currently have four limitations: (1) they require in-depth knowledge of design defects, (2) they have limited precision and recall, (3) they are not iterative and incremental, and (4) they cannot be applied to subsets of systems. To overcome these limitations, we introduce SMURF, a new approach for detecting design defects, based on a machine-learning technique, support vector machines, that takes practitioner feedback into account. Through an empirical study of three systems and four design defects, we showed that SMURF's precision and recall are higher than those of DETEX and BDTEX when detecting design-defect occurrences. We also showed that SMURF can be applied in both intra-system and inter-system configurations. Finally, we showed that SMURF's precision and recall improve when practitioner feedback is taken into account.
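As a loose illustration of an SVM-based detector of this kind, the Python sketch below trains a support-vector classifier on class-level code metrics and refits it after practitioner feedback; the metric set, file names and feedback mechanism are simplifying assumptions and do not reproduce SMURF itself.

```python
# Minimal sketch of an SVM-based design-defect detector in the spirit of
# SMURF: classes are represented by code metrics and labelled as affected or
# not by a given defect (e.g. Blob). Feature extraction, the metric set and
# the feedback loop are hypothetical simplifications, not SMURF itself.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

X = np.load("class_metrics.npy")      # shape (n_classes, n_metrics), e.g. LOC, WMC, coupling
y = np.load("blob_labels.npy")        # 1 = Blob occurrence, 0 = clean class

detector = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
detector.fit(X, y)

# Practitioner feedback can be folded in by adding the corrected labels to the
# training set and refitting -- a crude stand-in for an iterative loop.
X_new = np.load("new_metrics.npy")
y_feedback = np.load("practitioner_labels.npy")
detector.fit(np.vstack([X, X_new]), np.concatenate([y, y_feedback]))
suspects = detector.predict(X_new)    # flagged design-defect candidates
```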