83 results for incompleteness and inconsistency detection


Relevance: 100.00%

Abstract:

Sensitive detection of pathogens is critical to ensure the safety of food supplies and to prevent bacterial infection and outbreaks at the earliest onset. While conventional techniques such as cell culture, ELISA and PCR have served as the predominant detection workhorses, they are limited by time-consuming procedures, complicated sample pre-treatment, expensive analysis and operation, or an inability to be implemented in point-of-care testing. Here, we present our recently developed assay exploiting enzyme-induced aggregation of plasmonic gold nanoparticles (AuNPs) for label-free and ultrasensitive detection of bacterial DNA. In the experiments, AuNPs are first functionalized with specific single-stranded RNA probes so that they remain highly stable in solution, even at high electrolyte concentrations, and therefore appear red. When bacterial DNA is present in a sample, a DNA-RNA heteroduplex forms and is subsequently cleaved by RNase H at the RNA probe, liberating the DNA to hybridize with another RNA strand. This cycle continues until all of the RNA strands are cleaved, leaving the nanoparticles 'unprotected'. The addition of NaCl then causes the 'unprotected' nanoparticles to aggregate, initiating a colour change from red to blue. The reaction is performed in a multi-well plate format, and the distinct colour signal can be discriminated by the naked eye or by simple optical spectroscopy. As a result, bacterial DNA at concentrations as low as the picomolar range could be unambiguously detected. The enzyme-induced AuNP aggregation assay is therefore simple to perform and sensitive, and it should significantly benefit the development of fast, ultrasensitive methods for disease detection and diagnosis.

Relevance: 100.00%

Abstract:

This study describes further validation of a previously described peptide-mediated magnetic separation (PMS)-phage assay and its application to testing raw cows' milk for the presence of viable Mycobacterium avium subsp. paratuberculosis (MAP). The inclusivity and exclusivity of the PMS-phage assay were initially assessed, before the 50% limit of detection (LOD50) was determined and compared with those of PMS-qPCR (targeting both IS900 and f57) and PMS-culture. These methods were then applied in parallel to test 146 individual milk samples and 22 bulk tank milk samples from Johne's disease-affected herds. Viable MAP were detected by the PMS-phage assay in 31 (21.2%) of 146 individual milk samples (mean plaque count of 228.1 PFU/50 ml, range 6-948 PFU/50 ml) and 13 (59.1%) of 22 bulk tank milks (mean plaque count of 136.83 PFU/50 ml, range 18-695 PFU/50 ml). In contrast, only 7 (9.1%) of 77 individual milks and 10 (45.4%) of 22 bulk tank milks tested PMS-qPCR positive, and 17 (11.6%) of 146 individual milks and 11 (50%) of 22 bulk tank milks tested PMS-culture positive. The mean LOD50 values of the PMS-phage, PMS-IS900 qPCR and PMS-f57 qPCR assays, determined by testing MAP-spiked milk, were 0.93, 135.63 and 297.35 MAP CFU/50 ml of milk, respectively. Collectively, these results demonstrate that, in our laboratory, the PMS-phage assay is a sensitive and specific method for quickly detecting the presence of viable MAP cells in milk. However, owing to its complicated, multi-step nature, the method would not be a suitable MAP screening method for the dairy industry.
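The abstract does not detail how the LOD50 values were computed. As one hedged illustration of the kind of arithmetic typically involved, the sketch below fits a logistic curve to the proportion of positive results across spiking levels and reads off the concentration giving 50% detection; the spiking levels, replicate count and positive counts are invented placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(log_cfu, log_lod50, slope):
    """Probability of a positive result as a function of log10 spiking level."""
    return 1.0 / (1.0 + np.exp(-slope * (log_cfu - log_lod50)))

# Hypothetical spiking experiment: MAP CFU per 50 ml of milk, number of
# replicates tested at each level, and number of positive results (invented).
cfu        = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
positives  = np.array([0, 2, 4, 7, 8, 8])
replicates = 8

p_observed = positives / replicates
params, _ = curve_fit(logistic, np.log10(cfu), p_observed, p0=[0.0, 2.0])
print(f"estimated LOD50 ≈ {10 ** params[0]:.2f} CFU/50 ml")
```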

Relevance: 100.00%

Abstract:

The need for chemical and biological entities of predetermined selectivity and affinity towards target analytes is greater than ever, in applications such as environmental monitoring, bioterrorism detection and analysis of natural toxin contaminants in the food chain.

Relevance: 100.00%

Abstract:

In this preliminary case study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules, which incorporate regular expression (regex) pattern matching. We then identify primitive elements in these rules in order to translate the rules into their (equivalent) logical forms and to establish connections between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. Finally, we measure the degree of inconsistency in formulae of such a rule set (using the scoring function, Shapley inconsistency values and the blame measure for prioritized knowledge) and compare the informativeness of these measures. We conclude that such measures are useful for the network intrusion domain, provided that incorporating domain knowledge for the correlation of rules is feasible.
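As a toy illustration of one of the measures named above, the sketch below computes a Shapley inconsistency value under the drastic measure for a three-formula propositional encoding of IDS-style rules. The rule contents and atom names are invented; the study's actual rule set and regex-to-logic translation are far richer.

```python
from itertools import product, combinations
from math import factorial

ATOMS = ["http_traffic", "regex_hit", "alert"]

# Each "rule" is a propositional formula encoded as a predicate over a truth
# assignment (a dict mapping atom name -> bool). The three rules form a
# minimal inconsistent set: r1 and r2 jointly force `alert`, which r3 denies.
RULES = {
    "r1: http_traffic & regex_hit -> alert":
        lambda v: (not (v["http_traffic"] and v["regex_hit"])) or v["alert"],
    "r2: http_traffic & regex_hit":
        lambda v: v["http_traffic"] and v["regex_hit"],
    "r3: ~alert":
        lambda v: not v["alert"],
}

def satisfiable(formulas):
    """Brute-force satisfiability check over the small atom set."""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(f(v) for f in formulas):
            return True
    return False

def drastic(names):
    """Drastic inconsistency measure: 1 if the subset is inconsistent, else 0."""
    return 0 if satisfiable([RULES[n] for n in names]) else 1

def shapley_inconsistency(rule, kb):
    """Shapley inconsistency value of `rule` w.r.t. the drastic measure."""
    n = len(kb)
    others = [r for r in kb if r != rule]
    value = 0.0
    for k in range(len(others) + 1):
        for coalition in combinations(others, k):
            c = k + 1  # size of the coalition once `rule` joins it
            weight = factorial(c - 1) * factorial(n - c) / factorial(n)
            value += weight * (drastic(set(coalition) | {rule}) - drastic(set(coalition)))
    return value

kb = list(RULES)
for rule in kb:
    print(f"{rule}: Shapley inconsistency value = {shapley_inconsistency(rule, kb):.3f}")
```

With these three formulas only the full set is inconsistent, so each rule receives a Shapley value of 1/3, reflecting equal blame for the conflict.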

Relevance: 100.00%

Abstract:

As a class of defects in software requirements specifications, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take the related non-canonical requirements into account in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency to techniques suitable for managing non-canonical requirements, that is, ones that deal with incompleteness and redundancy in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising the requirements according to the chosen proposals. This generalization can be considered an attempt to handle non-canonical requirements alongside logic-based inconsistency handling in requirements engineering.
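As a hedged, minimal illustration of what the identification step of such a framework has to check, the sketch below tests a three-statement propositional "specification" for inconsistency, redundancy, and one simple reading of incompleteness (a relevant query left undecided). The requirement statements and the query are invented, and the framework itself works with richer measures than these Boolean checks.

```python
from itertools import product

ATOMS = ["login", "audit_log", "admin"]

# A tiny propositional "specification". R3 is redundant: it follows from R1 and R2.
REQUIREMENTS = {
    "R1: login -> audit_log": lambda v: (not v["login"]) or v["audit_log"],
    "R2: admin -> login":     lambda v: (not v["admin"]) or v["login"],
    "R3: admin -> audit_log": lambda v: (not v["admin"]) or v["audit_log"],
}

def satisfiable(formulas):
    """Brute-force satisfiability over the small atom set."""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(f(v) for f in formulas):
            return True
    return False

def entails(formulas, goal):
    """formulas |= goal  iff  formulas plus the negation of goal is unsatisfiable."""
    return not satisfiable(list(formulas) + [lambda v: not goal(v)])

spec = list(REQUIREMENTS.values())

# Inconsistency check: is the specification satisfiable as a whole?
print("consistent:", satisfiable(spec))

# Redundancy check: a requirement already entailed by the remaining ones.
for name, formula in REQUIREMENTS.items():
    rest = [f for f in spec if f is not formula]
    if entails(rest, formula):
        print("redundant:", name)

# One simple reading of incompleteness: a relevant query the specification
# neither entails nor refutes (here, "is audit logging always required?").
query = lambda v: v["audit_log"]
undecided = not entails(spec, query) and not entails(spec, lambda v: not query(v))
print("query left undecided:", undecided)
```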

Relevance: 100.00%

Abstract:

This article gives an extensive overview of the wide range of analytical procedures developed for the detection of amphenicol antibiotic residues (chloramphenicol, thiamphenicol, and florfenicol) in many different types of foodstuffs (milk, meat, eggs, honey, seafood). Screening methods such as microbial inhibition methods, antibody-based immunoassays using conventional and biosensor-based detection systems, and some methods based on alternative recognition systems are described. The relative advantages and disadvantages of these methods are discussed and compared. The current status of, and future trends and developments in, the accurate and rapid detection of this group of antimicrobials are also discussed.

Relevance: 100.00%

Abstract:

In this study, the design and development of two real-time PCR assays for the rapid, sensitive and specific detection of infectious laryngotracheitis virus (ILTV) DNA are described. A primer-probe energy transfer (PriProET) assay and a 5' conjugated minor groove binder (MGB) method are compared and contrasted. Both have been designed to target the thymidine kinase gene of the ILTV genome. Both the PriProET and MGB assays are capable of detecting 20 copies of a DNA standard per reaction and are linear from 2 × 10^8 to 2 × 10^2 copies/µl. Neither PriProET nor MGB reacted with heterologous herpesviruses, indicating the high specificity of the two methods as novel tools for virus detection and identification. This study demonstrates the suitability of PriProET and 5' conjugated MGB probes as real-time PCR chemistries for the diagnosis of respiratory diseases caused by ILTV.
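As a hedged sketch of the standard-curve arithmetic that underlies claims such as "linear from 2 × 10^8 to 2 × 10^2 copies/µl", the snippet below fits Ct against log10(copy number) for a dilution series and derives the amplification efficiency from the slope; the Ct values are invented, not taken from the study.

```python
import numpy as np

# Hypothetical standard curve: copy number of the DNA standard per reaction
# and the Ct value observed at each dilution (invented numbers).
copies = np.array([2e2, 2e3, 2e4, 2e5, 2e6, 2e7, 2e8])
ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4, 13.0])

# Linear fit of Ct against log10(copies); efficiency follows from the slope.
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0
r = np.corrcoef(np.log10(copies), ct)[0, 1]

print(f"slope = {slope:.2f}, efficiency ≈ {efficiency * 100:.0f}%, R^2 = {r ** 2:.4f}")
```

A slope near -3.32 corresponds to an amplification efficiency near 100%, which is what a linear range spanning seven orders of magnitude implies.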

Relevance: 100.00%

Abstract:

The presence of paralytic shellfish poisoning (PSP), diarrheic shellfish poisoning (DSP) and amnesic shellfish poisoning (ASP) toxins in seafood is a severe and growing threat to human health. In order to minimize the risks of human exposure, the maximum content of these toxins in seafood has been limited by legal regulations worldwide. The regulated limits are established in equivalents of the main representatives of the groups: saxitoxin (STX), okadaic acid (OA) and domoic acid (DA) for PSP, DSP and ASP, respectively. In this study, a multi-detection method to screen shellfish samples for the presence of these toxins simultaneously was developed. Multiplexing was achieved using a solid-phase microsphere assay coupled to flow-fluorimetry detection, based on the Luminex xMAP technology. The multi-detection method consists of three simultaneous competition immunoassays. Free toxins in solution compete with STX, OA or DA immobilized on the surface of three different classes of microspheres for binding to specific monoclonal antibodies. The IC50 values obtained in buffer were similar in single- and multi-detection: 5.6 ± 1.1 ng/mL for STX, 1.1 ± 0.03 ng/mL for OA and 1.9 ± 0.1 ng/mL for DA. The sample preparation protocol was optimized for the simultaneous extraction of STX, OA and DA with a mixture of methanol and acetate buffer. The three immunoassays performed well with mussel and scallop matrices, displaying adequate dynamic ranges and recovery rates (around 90% for STX, 80% for OA and 100% for DA). This microsphere-based multi-detection immunoassay provides an easy and rapid screening method capable of simultaneously detecting three regulated groups of marine toxins in the same sample.
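As a hedged illustration of how IC50 values like those quoted above are commonly obtained, the sketch below fits a four-parameter logistic curve to a competition-immunoassay calibration series. The concentrations and signal values are invented placeholders; the study's own fitting procedure is not described in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ic50, hill):
    """Four-parameter logistic: signal falls from `top` to `bottom` as free toxin rises."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Hypothetical calibration series: free STX concentration (ng/mL) against the
# median fluorescence intensity of the STX-coated microsphere class.
conc   = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
signal = np.array([980.0, 940.0, 820.0, 600.0, 340.0, 180.0, 120.0])

params, _ = curve_fit(four_pl, conc, signal, p0=[1000.0, 100.0, 5.0, 1.0])
top, bottom, ic50, hill = params
print(f"fitted IC50 ≈ {ic50:.1f} ng/mL (Hill slope {hill:.2f})")
```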

Relevance: 100.00%

Abstract:

AIMS: Mutation detection accuracy has been described extensively; however, it is surprising that pre-PCR processing of formalin-fixed paraffin-embedded (FFPE) samples has not been systematically assessed in a clinical context. We designed a RING trial to (i) investigate pre-PCR variability, (ii) correlate pre-PCR variation with EGFR/BRAF mutation testing accuracy and (iii) investigate causes for the observed variation. METHODS: 13 molecular pathology laboratories were recruited. 104 blinded FFPE curls, including engineered FFPE curls, cell-negative FFPE curls and control FFPE tissue samples, were distributed to participants for pre-PCR processing and mutation detection. Follow-up analysis was performed to assess sample purity, DNA integrity and DNA quantitation. RESULTS: The rate of mutation detection failure was 11.9%. Of these failures, 80% were attributed to pre-PCR error. Significant differences in DNA yields across all samples were seen using analysis of variance (p
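As a minimal, hypothetical illustration of the analysis-of-variance step mentioned above, the sketch below compares DNA yields reported by three laboratories for the same samples using a one-way ANOVA; the laboratory groupings and yield values are invented.

```python
from scipy.stats import f_oneway

# Hypothetical DNA yields (ng/µL) obtained by three laboratories from the
# same blinded FFPE curls (invented values).
lab_a = [42.0, 38.5, 51.2, 47.9, 44.3]
lab_b = [18.7, 22.4, 15.9, 20.1, 25.3]
lab_c = [33.0, 29.8, 36.4, 31.7, 40.2]

f_stat, p_value = f_oneway(lab_a, lab_b, lab_c)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")
```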

Relevance: 100.00%

Abstract:

BACKGROUND: Although most gastrointestinal stromal tumours (GIST) carry oncogenic mutations in KIT exons 9, 11, 13 and 17, or in platelet-derived growth factor receptor alpha (PDGFRA) exons 12, 14 and 18, around 10% of GIST are free of these mutations. Genotyping and accurate detection of KIT/PDGFRA mutations in GIST are becoming increasingly useful for clinicians in the management of the disease. METHOD: To evaluate and improve laboratory practice in GIST mutation detection, we developed a mutational screening quality control program. Eleven laboratories were enrolled in this program and 50 DNA samples were analysed, each of them by four different laboratories, giving 200 mutational reports. RESULTS: In total, eight mutations were not detected by at least one laboratory, and one false positive result was reported in one sample. Thus, the mean global rate of errors with clinical implications, based on the 200 reports, was 4.5% (9 of 200). The detection rate for specific polymorphisms varied from 0 to 100%, depending on the laboratory. The way mutations were reported was very heterogeneous, and some errors were detected. CONCLUSION: This study demonstrated that such a program is necessary for laboratories to improve the quality of their analysis, because an error rate of 4.5% may have clinical consequences for patients.

Relevance: 100.00%

Abstract:

Osteoporosis (OP) is one of the most prevalent bone diseases worldwide, with bone fracture its major clinical consequence. The effect of OP on fracture repair is disputed and, although fracture repair might be expected to be delayed in osteoporotic individuals, a definitive answer to this question still eludes us. The aim of this study was to clarify the effect of osteoporosis in a rodent fracture model. OP was induced in 3-month-old rats (n = 53) by ovariectomy (OVX), followed by an externally fixated, mid-diaphyseal femoral osteotomy at 6 months (OVX group). A further 40 animals underwent a fracture at 6 months (control group). Animals were sacrificed at 1, 2, 4, 6, and 8 weeks post-fracture, with outcome measures of histology, biomechanical strength testing, pQCT, relative BMD, and motion detection. OVX animals had significantly lower BMD, slower fracture repair (histologically), reduced stiffness in the fractured femora (8 weeks), reduced strength in the contralateral femora (6 and 8 weeks), increased body weight, and decreased motion. This study has demonstrated that OVX is associated with a decrease in BMD (particularly in trabecular bone) and a reduction in the mechanical properties of intact bone and healing fractures. The histological, biomechanical, and radiological measures of union suggest that OVX delayed fracture healing.

Relevance: 100.00%

Abstract:

With the rapid growth in the quantity and complexity of scientific knowledge available to scientists and allied professionals, the problems associated with harnessing this knowledge are well recognized. Some of these problems result from the uncertainties and inconsistencies that arise in this knowledge; others arise from heterogeneous and informal formats for this knowledge. To address these problems, developments in the application of knowledge representation and reasoning technologies allow scientific knowledge to be captured in logic-based formalisms. Using such formalisms, we can reason with the uncertainty and inconsistency, allowing automated techniques to be used for querying and combining scientific knowledge. Furthermore, by harnessing background knowledge, the querying and combining tasks can be carried out more intelligently. In this paper, we review some of the significant proposals for formalisms for representing and reasoning with scientific knowledge.
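As a toy, hedged illustration of one way such formalisms let us query inconsistent knowledge (not a formalism taken from the review itself), the sketch below answers a query from each maximal consistent subset of a small, invented knowledge base of conflicting "findings" plus one background statement.

```python
from itertools import product, combinations

ATOMS = ["drug_effective", "large_trial"]

# A tiny, invented knowledge base: two conflicting findings and background knowledge.
KB = {
    "background: large_trial -> drug_effective": lambda v: (not v["large_trial"]) or v["drug_effective"],
    "study 1: large_trial":                      lambda v: v["large_trial"],
    "study 2: ~drug_effective":                  lambda v: not v["drug_effective"],
}

def satisfiable(formulas):
    """Brute-force satisfiability over the small atom set."""
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(f(v) for f in formulas):
            return True
    return False

def entails(formulas, goal):
    """formulas |= goal  iff  formulas plus the negation of goal is unsatisfiable."""
    return not satisfiable(list(formulas) + [lambda v: not goal(v)])

def maximal_consistent_subsets(kb):
    names = list(kb)
    consistent = [set(c) for r in range(len(names), 0, -1)
                  for c in combinations(names, r)
                  if satisfiable([kb[n] for n in c])]
    return [s for s in consistent if not any(s < t for t in consistent)]

query = lambda v: v["drug_effective"]
for mcs in maximal_consistent_subsets(KB):
    answer = entails([KB[n] for n in mcs], query)
    print(sorted(mcs), "entails drug_effective:", answer)
```

An answer that holds in every maximal consistent subset is a skeptical conclusion; one that holds in at least one subset, as here, is only a credulous conclusion.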