953 results for Probability of detection
Abstract:
F. psychrophilum is the causative agent of Bacterial Cold Water Disease (BCWD) and Rainbow Trout Fry Syndrome (RTFS). To date, diagnosis relies mainly on direct microscopy or culture-based methods. Direct microscopy is fast but not very reliable, whereas culture-based methods are reliable but time-consuming and labor-intensive. So far, fluorescent in situ hybridization (FISH) has not been used in the diagnosis of flavobacteriosis, but it has the potential to rapidly and specifically detect F. psychrophilum in infected tissues. Outbreaks in fish farms, caused by pathogenic strains of Flavobacterium species, are increasingly frequent, and there is a need for reliable and cost-effective techniques to rapidly diagnose flavobacterioses. This study aimed to develop a FISH assay that could be used for the diagnosis of F. psychrophilum infections in fish. We constructed a generic probe for the genus Flavobacterium ("Pan-Flavo") and two specific probes targeting F. psychrophilum based on 16S rRNA gene sequences. We tested their specificity and sensitivity on pure cultures of different Flavobacterium and other aquatic bacterial species, established their limit of detection, and then tested the probes on infected fresh tissues (spleen and skin) and on paraffin-embedded tissues. The results showed high sensitivity and specificity of the probes (100% and 91% for the Pan-Flavo probe and 100% and 97% for the F. psychrophilum probe, respectively). FISH was able to detect F. psychrophilum in infected fish tissues; the findings from this study therefore indicate that this technique is suitable as a fast and reliable method for the detection of Flavobacterium spp. and F. psychrophilum.
Abstract:
The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data consisting of a velocity component, offsets, and white and flicker noise (1/f-spectrum noise) combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to the best manual solutions). For instance, the 5th to 95th percentile range of velocity bias for automated approaches is equal to 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is equal to 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset for the best manual and automatic solutions equal to 5 mm and 8 mm, respectively. Assuming the simulated time series noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2–0.4 mm/yr is therefore certainly not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
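A minimal sketch of the bias mechanism discussed above, assuming purely synthetic numbers (velocity, offset size and epoch; flicker noise is omitted for brevity and only white noise is simulated): fitting a plain trend through a series containing an unmodeled offset biases the velocity estimate, while adding a step term at the detected epoch removes the bias.

```python
# Synthetic illustration (assumed values): velocity bias from an unmodeled offset.
# Flicker noise is omitted for brevity; only white noise is simulated.
import numpy as np

rng = np.random.default_rng(0)
n_days = 3000
t = np.arange(n_days) / 365.25            # time in years
velocity = 2.0                            # true velocity, mm/yr (assumed)
offset_mm, offset_epoch = 5.0, 1200       # 5 mm jump at day 1200 (assumed)

series = velocity * t + 2.0 * rng.standard_normal(n_days)   # 2 mm white noise
series[offset_epoch:] += offset_mm

# Fit that ignores the offset: plain linear trend.
v_ignored = np.polyfit(t, series, 1)[0]

# Fit that models a step at the (correctly detected) epoch.
step = (np.arange(n_days) >= offset_epoch).astype(float)
A = np.column_stack([t, step, np.ones(n_days)])
v_modeled = np.linalg.lstsq(A, series, rcond=None)[0][0]

print(f"true velocity  : {velocity:.2f} mm/yr")
print(f"offset ignored : {v_ignored:.2f} mm/yr")
print(f"offset modeled : {v_modeled:.2f} mm/yr")
```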
Abstract:
Commercially available assays for the simultaneous detection of multiple inflammatory and cardiac markers in porcine blood samples are currently lacking. Therefore, this study was aimed at developing a bead-based, multiplexed flow cytometric assay to simultaneously detect porcine cytokines [interleukin (IL)-1β, IL-6, IL-10, and tumor necrosis factor alpha], chemokines (IL-8 and monocyte chemotactic protein 1), growth factors [basic fibroblast growth factor (bFGF), vascular endothelial growth factor, and platelet-derived growth factor-bb], and injury markers (cardiac troponin-I), as well as complement activation markers (C5a and sC5b-9). The method was based on the Luminex xMAP technology, resulting in the assembly of a 6- and an 11-plex from the respective individual singleplex assays. The assay was evaluated for dynamic range, sensitivity, cross-reactivity, intra-assay and inter-assay variance, spike recovery, and correlation with commercially available enzyme-linked immunosorbent assays as well as with the respective singleplex assays. The limit of detection ranged from 2.5 to 30,000 pg/ml for all analytes (6- and 11-plex assays), except for soluble C5b-9, with a detection range of 2-10,000 ng/ml (11-plex). Typically, very low cross-reactivity (<3% and <1.4% for the 11- and 6-plex, respectively) between analytes was found. Intra-assay variances ranged from 4.9 to 7.4% (6-plex) and 5.3 to 12.9% (11-plex). Inter-assay variances for cytokines were between 8.1 and 28.8% (6-plex) and 10.1 and 26.4% (11-plex). Correlation coefficients with the singleplex assays for the 6-plex as well as for the 11-plex were high, ranging from 0.988 to 0.997 and 0.913 to 0.999, respectively. In this study, bead-based porcine 11-plex and 6-plex assays with good assay sensitivity, a broad dynamic range, and low intra-assay variance and cross-reactivity were established. These assays therefore represent a new, useful tool for the analysis of samples generated from experiments with pigs.
Abstract:
Leptospirosis is a global zoonotic disease. Pathogenic Leptospira species, the causative agent of leptospirosis, colonize the renal tubules of chronically infected maintenance hosts such as dogs, rats and cattle. Maintenance hosts typically remain clinically asymptomatic and shed leptospires into the environment via urine. In contrast, accidental hosts such as humans can suffer severe acute forms of the disease. Infection results from direct contact with infected urine or indirectly, through contaminated water sources. In this study, a quantitative real-time PCR specific for lipL32 was designed to detect the urinary shedding of leptospires from dogs. The sensitivity and specificity of the assay were evaluated using both a panel of pathogenic Leptospira species and clinical microbial isolates, and urine samples collected from experimentally infected rats and non-infected controls. The lower limit of detection was approximately 3 genome equivalents per reaction. The assay was applied to canine urine samples collected from local dog sanctuaries and the University Veterinary Hospital (UVH) at University College Dublin. Of 525 canine urine samples assayed, 37 were positive, indicating a prevalence of urinary shedding of leptospires of 7.05%. These results highlight the need to provide effective canine vaccination strategies and raise public health awareness.
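For illustration, a minimal sketch of the prevalence figure quoted above (37 positives out of 525 samples, 7.05%), with a 95% Wilson score interval added by us; the interval is not reported in the abstract.

```python
# The prevalence quoted above (37/525 = 7.05%), with a 95% Wilson score interval
# added for illustration; the interval is not part of the original abstract.
from math import sqrt

positives, n = 37, 525
z = 1.96                                       # ~95% confidence
p = positives / n

centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
half = (z / (1 + z**2 / n)) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
print(f"prevalence = {100 * p:.2f}% "
      f"(95% CI {100 * (centre - half):.2f}%-{100 * (centre + half):.2f}%)")
```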
Abstract:
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework that unifies detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this, we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
Abstract:
This in vivo study aimed to evaluate the influence of contact points on approximal caries detection in primary molars by comparing the performance of the DIAGNOdent pen and visual-tactile examination after tooth separation with bitewing radiography (BW). A total of 112 children were examined and 33 children were selected. In three periods (a, b, and c), 209 approximal surfaces were examined: (a) examiner 1 performed a visual-tactile examination using the Nyvad criteria (EX1); examiner 2 used the DIAGNOdent pen (LF1) and took BW; (b) 1 week later, after tooth separation, examiner 1 performed a second visual-tactile examination (EX2) and examiner 2 used the DIAGNOdent again (LF2); (c) after tooth exfoliation, surfaces were directly examined using the DIAGNOdent (LF3). Teeth were examined by computed microtomography as the reference standard. Analyses were based on diagnostic thresholds: D1: D0 = health, D1–D4 = disease; D2: D0, D1 = health, D2–D4 = disease; D3: D0–D2 = health, D3, D4 = disease. At D1, the highest sensitivity/specificity were observed for EX1 (1.00)/LF3 (0.68), respectively. At D2, the highest sensitivity/specificity were observed for LF3 (0.69)/BW (1.00), respectively. At D3, the highest sensitivity/specificity were observed for LF3 (0.78)/EX1, EX2, and BW (1.00). EX1 showed higher accuracy values than LF1, and EX2 showed values similar to LF2. We concluded that the visual-tactile examination showed better results in detecting sound surfaces and approximal caries lesions without tooth separation. However, the effectiveness of approximal caries lesion detection of both methods was increased by the absence of contact points. Therefore, regardless of the detection method, orthodontic separating elastics should be used as a complementary tool for the diagnosis of approximal noncavitated lesions in primary molars.
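As a reminder of how the sensitivity and specificity figures above are computed against the micro-CT reference at a given threshold (e.g. D3), here is a minimal sketch with hypothetical counts; they are not the study's data.

```python
# Hypothetical counts only (not the study's data): sensitivity and specificity
# against the micro-CT reference at one diagnostic threshold.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # diseased surfaces correctly detected
    specificity = tn / (tn + fp)   # sound surfaces correctly classified
    return sensitivity, specificity

se, sp = sens_spec(tp=20, fn=6, tn=160, fp=23)
print(f"sensitivity = {se:.2f}, specificity = {sp:.2f}")
```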
Abstract:
BACKGROUND Ductal carcinoma in situ (DCIS) is a noninvasive breast lesion with uncertain risk for invasive progression. Usual care (UC) for DCIS consists of treatment upon diagnosis, thus potentially overtreating patients with low propensity for progression. One strategy to reduce overtreatment is active surveillance (AS), whereby DCIS is treated only upon detection of invasive disease. Our goal was to perform a quantitative evaluation of outcomes following an AS strategy for DCIS. METHODS Age-stratified, 10-year disease-specific cumulative mortality (DSCM) for AS was calculated using a computational risk projection model based upon published estimates for natural history parameters, and Surveillance, Epidemiology, and End Results data for outcomes. AS projections were compared with the DSCM for patients who received UC. To quantify the propagation of parameter uncertainty, a 95% projection range (PR) was computed, and sensitivity analyses were performed. RESULTS Under the assumption that AS cannot outperform UC, the projected median differences in 10-year DSCM between AS and UC when diagnosed at ages 40, 55, and 70 years were 2.6% (PR = 1.4%-5.1%), 1.5% (PR = 0.5%-3.5%), and 0.6% (PR = 0.0%-2.4%), respectively. Corresponding median numbers of patients needed to treat to avert one breast cancer death were 38.3 (PR = 19.7-69.9), 67.3 (PR = 28.7-211.4), and 157.2 (PR = 41.1-3872.8), respectively. Sensitivity analyses showed that the parameter with greatest impact on DSCM was the probability of understaging invasive cancer at diagnosis. CONCLUSION AS could be a viable management strategy for carefully selected DCIS patients, particularly among older age groups and those with substantial competing mortality risks. The effectiveness of AS could be markedly improved by reducing the rate of understaging.
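The "numbers needed to treat" figures follow from the reciprocal of the absolute 10-year DSCM difference between AS and UC. A minimal sketch using the median differences quoted above; because the published medians come from a projection distribution, the reciprocal of the median difference only approximates the reported median NNT.

```python
# NNT as the reciprocal of the absolute DSCM difference (median differences
# quoted above); reciprocals of medians only approximate the reported medians.
for age, risk_difference in [(40, 0.026), (55, 0.015), (70, 0.006)]:
    nnt = 1.0 / risk_difference
    print(f"diagnosis at {age}: DSCM difference {100 * risk_difference:.1f}% -> NNT ~ {nnt:.0f}")
```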
Abstract:
Detection of malarial sporozoites by a double antibody sandwich enzyme-linked immunosorbent assay (ELISA) is described. This investigation utilized the Anopheles stephensi-Plasmodium berghei malaria model for the generation of sporozoites. Anti-sporozoite antibody was obtained from the sera of rats which had been bitten by An. stephensi with salivary gland sporozoites. Mosquitoes were irradiated prior to feeding on the rats to render the sporozoites non-viable. The assay employed microtiter plates coated with the rat anti-sporozoite antiserum or rat anti-sporozoite IgG. Intact and sonicated sporozoites were used as antigens. Initially, sporozoites were detected by an ELISA using staphylococcal protein A conjugated with alkaline phosphatase. Sporozoites were also detected using alkaline phosphatase or horseradish peroxidase conjugated to anti-sporozoite IgG. Best results were obtained using the alkaline phosphatase conjugate. This investigation included the titration of antigen, coating antibody, and labelled antibody, as well as studies of various incubation times. A radioimmunoassay (RIA) was also developed and compared with the ELISA for detecting sporozoites. Finally, the detection of a single infected mosquito in pools of 5 to 10 whole, uninfected ones was studied using both ELISA and RIA. Sonicated sporozoites were more readily detected than intact sporozoites. The lower limit of detection was approximately 500 sporozoites per ml. Results using ELISA or RIA were similar. The ability of the ELISA to detect a single infected mosquito in a pool of uninfected ones indicates that this technique has potential use in entomological field studies which aim at determining the vector status of anopheline mosquitoes. The potential of the ELISA for identifying sporozoites of different species of malaria is discussed.
Abstract:
To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three conditions that are necessary but not sufficient: (i) all inclusion probabilities must be greater than zero in the target population to be sampled (if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed); (ii) the inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample, since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas; if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, the collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps, and in agreement with theoretical expectations, visual (qualitative) evidence, and quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ by related works, makes the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
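A minimal generic sketch (not the paper's protocol) of why inclusion probabilities matter in condition (ii): in design-based accuracy estimation each sampled unit is weighted by the inverse of its inclusion probability, as in a Horvitz-Thompson-style estimator. All values below are assumed.

```python
# Generic design-based accuracy estimate with inverse-inclusion-probability
# weights (Horvitz-Thompson style); all values below are assumed.
import numpy as np

correct = np.array([1, 1, 0, 1, 0, 1, 1], dtype=float)   # 1 = map label agrees with reference
pi      = np.array([0.8, 0.8, 0.2, 0.2, 0.2, 0.5, 0.5])  # inclusion probabilities of sampled units

weights = 1.0 / pi                         # unknown pi -> unknown weights -> no valid estimate
overall_accuracy = np.sum(weights * correct) / np.sum(weights)
print(f"estimated overall accuracy: {overall_accuracy:.3f}")
```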
Abstract:
Understanding changes over time in the distribution of interacting native and invasive species that may be symptomatic of competitive exclusion is critical to identify the need for and effectiveness of management interventions. Occupancy models greatly increase the robustness of inference that can be made from presence/absence data when species are imperfectly detected, and recent novel developments allow for the quantification of the strength of interaction between pairs of species. We used a two-species multi-season occupancy model to quantify the impact of the invasive American mink on the native European mink in Spain through the analysis of their co-occurrence pattern over twelve years (2000–2011) in the entire Spanish range of the European mink distribution, where both species were detected by live trapping but American mink were culled. We detected a negative temporal trend in the rate of occupancy of European mink and a simultaneous positive trend in the occupancy of American mink. The species co-occurred less often than expected, and the native mink was more likely to become extinct from sites occupied by the invasive species. Removal of American mink resulted in a high probability of local extinction where it co-occurred with the endemic mink, but the overall increase in its probability of occupancy over the last decade indicates that the ongoing management is failing to halt its spread. More intensive culling effort where both species co-exist, as well as in adjacent areas where the invasive American mink is found at high densities, is required in order to stop the decline of the European mink.
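For readers unfamiliar with two-species occupancy models, a minimal sketch of the species interaction factor (SIF) that quantifies co-occurrence relative to independence; the occupancy values below are assumed, not the fitted estimates from this study. SIF < 1 corresponds to the reported pattern of the two mink species co-occurring less often than expected.

```python
# Species interaction factor (SIF) from a two-species occupancy model;
# the occupancy values below are assumed, not this study's estimates.
psi_A  = 0.55   # occupancy of European mink (assumed)
psi_B  = 0.40   # occupancy of American mink (assumed)
psi_AB = 0.12   # probability that both species occupy a site (assumed)

sif = psi_AB / (psi_A * psi_B)
print(f"SIF = {sif:.2f} ({'less' if sif < 1 else 'more'} co-occurrence than expected under independence)")
```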
Abstract:
Is Benford's law a good instrument to detect fraud in reports of statistical and scientific data? For a valid test, the probabilities of "false positives" and "false negatives" have to be low. However, it is very doubtful whether the Benford distribution is an appropriate tool to discriminate between manipulated and non-manipulated estimates. Further research should focus more on the validity of the test, and test results should be interpreted more carefully.
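A minimal sketch of the kind of test being questioned, assuming toy figures: observed first-digit frequencies are compared with the Benford distribution using a chi-square goodness-of-fit statistic. As argued above, a significant (or non-significant) result is only suggestive, since false positives and false negatives are a real concern.

```python
# Toy example: chi-square goodness of fit of observed first digits against the
# Benford distribution. Illustrative only (tiny sample, small expected counts).
import numpy as np

def leading_digit(x):
    return int(str(x)[0])          # x assumed to be a positive integer

data = [846, 1273, 35, 910, 112, 47, 2210, 188, 63, 540, 129, 77]   # toy figures
first = np.array([leading_digit(v) for v in data])

observed = np.array([(first == d).sum() for d in range(1, 10)])
benford = np.log10(1 + 1 / np.arange(1, 10))     # Benford first-digit probabilities
expected = benford * len(first)

chi2 = ((observed - expected) ** 2 / expected).sum()
print(f"chi-square statistic = {chi2:.2f} (5% critical value with 8 df: 15.51)")
```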
Abstract:
An efficient approach for the simulation of ion scattering from solids is proposed. For every encountered atom, we take multiple samples of its thermal displacements among those which result in scattering with high probability to finally reach the detector. As a result, the detector is illuminated by intensive “showers,” where each event of detection must be weighted according to the actual probability of the atom displacement. The computational cost of such simulation is orders of magnitude lower than in the direct approach, and a comprehensive analysis of multiple and plural scattering effects becomes possible. We use this method for two purposes. First, the accuracy of the approximate approaches, developed mainly for ion-beam structural analysis, is verified. Second, the possibility to reproduce a wide class of experimental conditions is used to analyze some basic features of ion-solid collisions: the role of double violent collisions in low-energy ion scattering; the origin of the “surface peak” in scattering from amorphous samples; the low-energy tail in the energy spectra of scattered medium-energy ions due to plural scattering; and the degradation of blocking patterns in two-dimensional angular distributions with increasing depth of scattering. As an example of simulation for ions of MeV energies, we verify the time reversibility for channeling and blocking of 1-MeV protons in a W crystal. The possibilities of analysis that our approach offers may be very useful for various applications, in particular, for structural analysis with atomic resolution.
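A minimal generic sketch of the weighting idea (not the authors' code): when thermal displacements are drawn only from a restricted, "useful" part of their distribution, each detected event must carry an importance weight equal to the ratio of the actual displacement probability to the sampling density, so that averaged quantities remain unbiased. All numbers below are assumed.

```python
# Generic importance-sampling illustration (assumed numbers, not the authors' code):
# displacements are drawn only from a "useful" window, and each event is weighted
# by p_true(x) / q_sampling(x) so that averages stay unbiased.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
sigma = 0.08            # 1-D thermal displacement spread (assumed, arbitrary units)
lo, hi = 0.05, 0.25     # displacement window leading toward the detector (assumed)

u = rng.uniform(lo, hi, size=10_000)                  # sample only the favourable window

def gauss_pdf(x, s):
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2 * np.pi))

weights = gauss_pdf(u, sigma) / (1.0 / (hi - lo))     # p_true / q_sampling per event

# The weighted average recovers the true probability of a displacement in the window.
estimate = weights.mean()
analytic = 0.5 * (erf(hi / (sigma * sqrt(2))) - erf(lo / (sigma * sqrt(2))))
print(f"weighted estimate = {estimate:.4f}, analytic value = {analytic:.4f}")
```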
Abstract:
In this paper, the authors provide a methodology to design nonparametric permutation tests and, in particular, nonparametric rank tests for applications in detection. In the first part of the paper, the authors develop the optimization theory of both permutation and rank tests in the Neyman–Pearson sense; in the second part of the paper, they carry out a comparative performance analysis of the permutation and rank tests (detectors) against the parametric ones in radar applications. First, a brief review of some contributions on nonparametric tests is presented. Then, the optimum permutation and rank tests are derived. Finally, a performance analysis is carried out by Monte Carlo simulations for the corresponding detectors, and the results are shown as curves of detection probability versus signal-to-noise ratio.
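A minimal generic sketch of the kind of Monte Carlo analysis described (not the paper's specific detectors): detection probability versus SNR for a simple rank-sum detector whose threshold is set empirically for a target false alarm probability. All parameters are assumed.

```python
# Generic Monte Carlo sketch (assumed parameters, not the paper's detectors):
# detection probability vs SNR for a simple rank-sum detector whose threshold
# is set empirically for a target false alarm probability.
import numpy as np

rng = np.random.default_rng(2)
n_sig, n_ref, n_trials = 8, 32, 20_000
pfa_target = 0.01

def rank_sum(signal_amp):
    x = signal_amp + rng.standard_normal(n_sig)     # test cells (signal + noise)
    y = rng.standard_normal(n_ref)                  # noise-only reference cells
    ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
    return ranks[:n_sig].sum()                      # Wilcoxon-type rank-sum statistic

# Empirical threshold under H0 (no signal) for the target false alarm rate.
h0 = np.array([rank_sum(0.0) for _ in range(n_trials)])
threshold = np.quantile(h0, 1 - pfa_target)

for snr_db in (0, 3, 6, 9):
    amp = 10 ** (snr_db / 20)                       # amplitude for unit-variance noise
    h1 = np.array([rank_sum(amp) for _ in range(n_trials)])
    pd = (h1 > threshold).mean()
    print(f"SNR {snr_db:2d} dB -> detection probability {pd:.2f}")
```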
Abstract:
In recent years, several moving-object detection strategies based on non-parametric background-foreground modeling have been proposed. To combine both models and to obtain the probability that a pixel belongs to the foreground, these strategies make use of Bayesian classifiers. However, these classifiers do not make it possible to exploit additional prior information at different pixels. We therefore propose a novel and efficient alternative Bayesian classifier that is suitable for this kind of strategy and that allows the use of any available prior information. Additionally, we present an effective method to dynamically estimate the prior probabilities from the results of a particle filter-based tracking strategy.
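A minimal sketch of the underlying rule, assuming a generic Bayes classifier rather than the paper's exact formulation: the posterior foreground probability of a pixel given its non-parametric likelihoods and a per-pixel prior, where the prior could come from a particle-filter tracker instead of being constant over the image.

```python
# Generic per-pixel Bayes rule (not the paper's exact formulation): posterior
# foreground probability from likelihoods and a spatially varying prior.
import numpy as np

def foreground_posterior(lik_fg, lik_bg, prior_fg):
    """All inputs are per-pixel arrays of the same shape."""
    num = lik_fg * prior_fg
    den = num + lik_bg * (1.0 - prior_fg)
    return np.where(den > 0, num / den, 0.0)

# Toy 2x2 image: a tracker raises the foreground prior in the right column.
lik_fg   = np.array([[0.20, 0.60], [0.10, 0.70]])
lik_bg   = np.array([[0.50, 0.30], [0.60, 0.20]])
prior_fg = np.array([[0.10, 0.60], [0.10, 0.60]])

print(foreground_posterior(lik_fg, lik_bg, prior_fg))
```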
Abstract:
We have studied enhancer function in transient and stable expression assays in mammalian cells by using systems that distinguish expressing from nonexpressing cells. When expression is studied in this way, enhancers are found to increase the probability of a construct being active but not the level of expression per template. In stably integrated constructs, large differences in expression level are observed but these are not related to the presence of an enhancer. Together with earlier studies, these results suggest that enhancers act to affect a binary (on/off) switch in transcriptional activity. Although this idea challenges the widely accepted model of enhancer activity, it is consistent with much, if not all, experimental evidence on this subject. We hypothesize that enhancers act to increase the probability of forming a stably active template. When randomly integrated into the genome, enhancers may affect a metastable state of repression/activity, permitting expression in regions that would not permit activity of an isolated promoter.