17 results for false negative rate
in University of Queensland eSpace - Australia
Abstract:
Background: False-negative interpretations of dobutamine stress echocardiography (DSE) may be associated with reduced wall stress. Using measurements of contraction, we sought to determine whether these segments were actually ischemic but unrecognized, or showed normal contraction. Methods: We studied 48 patients (29 men; mean age 60 +/- 10 years) with normal regional function on the basis of standard qualitative interpretation of DSE. At coronary angiography within 6 months of DSE, 32 were identified as having true-negative and 16 as having false-negative results of DSE. Three apical views were used to measure regional function with color Doppler tissue, integrated backscatter, and strain rate imaging. Cyclic variation of integrated backscatter was measured in 16 segments, and strain rate and peak systolic strain were calculated in 6 walls at rest and peak stress. Results: Segments with false-negative results of DSE were divided into 2 groups, with and without low wall stress, according to previously published cut-off values. Age, sex, left ventricular mass, left ventricular geometric pattern, and peak workload were not significantly different between patients with true- and false-negative results of DSE. Importantly, no significant differences in cyclic variation and strain parameters at rest and peak stress were found among segments with true- and false-negative results of DSE with and without low wall stress. Stenosis severity had no influence on cyclic variation and strain parameters at peak stress. Conclusions: False-negative results of DSE reflect lack of ischemia rather than underinterpretation of regional left ventricular function. Quantitative markers are unlikely to increase the sensitivity of DSE.
Abstract:
The use of presence/absence data in wildlife management and biological surveys is widespread. There is growing interest in quantifying the sources of error associated with these data. Using simulated data, we show that false-negative errors (failure to record a species when in fact it is present) can have a significant impact on statistical estimation of habitat models. We then introduce an extension of logistic modeling, the zero-inflated binomial (ZIB) model, that permits estimation of the rate of false-negative errors and correction of estimates of the probability of occurrence by using repeated visits to the same site. Our simulations show that even relatively low rates of false negatives bias statistical estimates of habitat effects. With three repeated visits the method eliminates the bias, but estimates are relatively imprecise. Six repeated visits improve the precision of estimates to levels comparable to those achieved with conventional statistics in the absence of false-negative errors. In general, when error rates are less than or equal to 50%, greater efficiency is gained by adding more sites, whereas when error rates exceed 50% it is better to increase the number of repeated visits. We highlight the flexibility of the method with three case studies, clearly demonstrating the effect of false-negative errors for a range of commonly used survey methods.
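The repeated-visit estimator described above can be sketched as a zero-inflated binomial likelihood: a site is occupied with probability psi, and an occupied site is detected on each of K visits with probability p (so 1 - p is the per-visit false-negative rate). The simulation below is a minimal illustration with assumed values, not the paper's actual settings.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

# Simulate presence/absence surveys: each site is occupied with probability
# psi_true; an occupied site is detected on each of K visits with probability
# p_true (all values here are illustrative assumptions).
rng = np.random.default_rng(0)
psi_true, p_true, K, n_sites = 0.6, 0.7, 3, 2000
occupied = rng.random(n_sites) < psi_true
counts = np.where(occupied, rng.binomial(K, p_true, n_sites), 0)

def nll(theta, y, K):
    """Negative log-likelihood of the ZIB model (logit-parameterised)."""
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(theta)))
    lik = np.where(y > 0,
                   psi * binom.pmf(y, K, p),          # detected at least once
                   (1 - psi) + psi * (1 - p) ** K)    # never detected: absent, or missed K times
    return -np.sum(np.log(lik))

res = minimize(nll, x0=[0.0, 0.0], args=(counts, K))
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-res.x))
print(psi_hat, p_hat)   # should be close to 0.6 and 0.7
```

Without the zero-inflation term, an occupancy estimate treats every non-detection as a true absence and is biased low; the repeated visits are what make psi and p separately identifiable.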
Abstract:
Background: Sentinel node biopsy (SNB) is being increasingly used but its place outside randomized trials has not yet been established. Methods: The first 114 sentinel node (SN) biopsies performed for breast cancer at the Princess Alexandra Hospital from March 1999 to June 2001 are presented. In 111 cases axillary dissection was also performed, allowing the accuracy of the technique to be assessed. A standard combination of preoperative lymphoscintigraphy, intraoperative gamma probe and injection of blue dye was used in most cases. Results are discussed in relation to the risk and potential consequences of understaging. Results: Where both probe and dye were used, the SN was identified in 90% of patients. A significant number of patients were treated in two stages and the technique was no less effective in patients who had SNB performed at a second operation after the primary tumour had already been removed. The interval from radioisotope injection to operation was very wide (between 2 and 22 h) and did not affect the outcome. Nodal metastases were present in 42 patients in whom an SN was found, and in 40 of these the SN was positive, giving a false negative rate of 4.8% (2/42), with the overall percentage of patients understaged being 2%. For this particular group as a whole, the increased risk of death due to systemic therapy being withheld as a consequence of understaging (if SNB alone had been employed) is estimated at less than 1/500. The risk for individuals will vary depending on other features of the particular primary tumour. Conclusion: For patients who elect to have the axilla staged using SNB alone, the risk and consequences of understaging need to be discussed. These risks can be estimated by allowing for the specific surgeon's false negative rate for the technique, and considering the likelihood of nodal metastases for a given tumour. 
There appears to be no disadvantage with performing SNB at a second operation after the primary tumour has already been removed. Clearly, for a large number of patients, SNB alone will be safe, but ideally participation in randomized trials should continue to be encouraged.
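The headline figures above follow the usual sentinel-node convention, FNR = FN / (FN + TP), computed over node-positive patients only; the overall understaging rate is taken over all patients who had axillary dissection. A quick check of the arithmetic, using the counts from the abstract:

```python
def false_negative_rate(fn: int, tp: int) -> float:
    """FN / (FN + TP): node-positive patients whose sentinel node was negative."""
    return fn / (fn + tp)

fnr = false_negative_rate(fn=2, tp=40)   # 2 of 42 node-positive patients
understaged = 2 / 111                    # 2 understaged of 111 with axillary dissection
print(round(fnr * 100, 1), round(understaged * 100))  # → 4.8 2
```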
Abstract:
An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
Abstract:
An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local false discovery rate is provided for each gene, and it can be implemented so that the implied global false discovery rate is bounded as with the Benjamini-Hochberg methodology based on tail areas. The latter procedure is too conservative, unless it is modified according to the prior probability that a gene is not differentially expressed. An attractive feature of the mixture model approach is that it provides a framework for the estimation of this probability and its subsequent use in forming a decision rule. The rule can also be formed to take the false negative rate into account.
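Under the two-component mixture described above, the local FDR for a gene with z-score z is pi0·f0(z) / f(z), where f = pi0·f0 + (1 - pi0)·f1. A minimal numeric sketch follows; pi0 = 0.9 and the alternative density N(2.5, 1) are illustrative assumptions, not values from the paper.

```python
from scipy.stats import norm

pi0 = 0.9            # assumed prior probability that a gene is null
f0 = norm(0, 1)      # theoretical null density of the z-score
f1 = norm(2.5, 1)    # assumed density under differential expression

def local_fdr(z: float) -> float:
    """Posterior probability that a gene with z-score z is null."""
    num = pi0 * f0.pdf(z)
    return num / (num + (1 - pi0) * f1.pdf(z))

print(local_fdr(1.0), local_fdr(3.0))   # the larger z-score gets the smaller local FDR
```

A decision rule can then threshold local_fdr(z), and the same posterior probabilities can be aggregated to account for the false negative rate among the genes not selected.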
Abstract:
In 48 university students performing single-item spelling recognition, prior exposure to misspelled words slightly improved accuracy on correctly spelled words and markedly increased the 'false alarm' rate (classifying a misspelling seen at study as correct). In a group given a dictation test (N = 24), the only effect of exposure to misspellings was a small increment in the number of misspellings that matched the misspelling seen at study. The two test groups showed no advantage of having the same display format at study and test (AA or BB vs AB or BA). Experiment 2 (in progress) investigated a format match at study and test against a condition with a new test context (AA or BB vs AC or BC). The results to date suggest an influence of memory of the study trial rather than simply an updating by the study exposures of abstract lexical representations.
Abstract:
The Roche Cobas Amplicor system is widely used for the detection of Neisseria gonorrhoeae but is known to cross-react with some commensal Neisseria spp. Therefore, a confirmatory test is required. The most common target for confirmatory tests is the cppB gene of N. gonorrhoeae. However, the cppB gene is also present in other Neisseria spp. and is absent in some N. gonorrhoeae isolates. As a result, laboratories targeting this gene run the risk of obtaining both false-positive and false-negative results. In the study presented here, a newly developed N. gonorrhoeae LightCycler assay (NGpapLC) targeting the N. gonorrhoeae porA pseudogene was tested. The NGpapLC assay was used to test 282 clinical samples, and the results were compared to those obtained using a testing algorithm combining the Cobas Amplicor System (Roche Diagnostics, Sydney, Australia) and an in-house LightCycler assay targeting the cppB gene (cppB-LC). In addition, the specificity of the NGpapLC assay was investigated by testing a broad panel of bacteria including isolates of several Neisseria spp. The NGpapLC assay proved to have clinical sensitivity comparable to the cppB-LC assay. In addition, testing of the bacterial panel showed the NGpapLC assay to be highly specific for N. gonorrhoeae DNA. The results of this study show the NGpapLC assay is a suitable alternative to the cppB-LC assay for confirmation of N. gonorrhoeae-positive results obtained with Cobas Amplicor.
Abstract:
This study presents tympanometric normative data for Australian children at school entry in view of the lack of age-specific population-based data for this group. Participants were 327 children (164 boys, 163 girls) aged between 5 and 6 years, who had no history of middle ear infection and passed pure-tone screening at 20 dB HL. Normative values for static admittance (SA), ear canal volume (ECV), tympanometric peak pressure, tympanometric width (TW) and tympanometric gradient were established. Based on these normative data, the ASHA (1997) guidelines for medical referral (ECV > 1.0 ml in the presence of a flat tympanogram, SA < 0.3 ml, or TW > 200 daPa) may not provide the best criteria for Australian children aged between 5 and 6 years. If SA < 0.3 ml were used instead of SA < 0.16 ml, a greater proportion of Australian children would fail tympanometry, thus increasing the false alarm rate.
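The referral rule discussed above can be written as a simple predicate using the abstract's thresholds; the function name and the example values below are illustrative assumptions, not part of the study.

```python
def refer_for_medical_evaluation(sa_ml, ecv_ml, tw_dapa, flat_tympanogram,
                                 sa_cutoff=0.3):
    """ASHA (1997)-style referral criteria as stated in the abstract."""
    return ((flat_tympanogram and ecv_ml > 1.0)   # high ECV with a flat tympanogram
            or sa_ml < sa_cutoff                  # low static admittance
            or tw_dapa > 200)                     # wide tympanometric width

# An ear with SA = 0.2 ml is referred under the 0.3 ml cutoff but not under
# 0.16 ml, illustrating why the 0.3 ml cutoff refers more children (raising
# the false alarm rate in this population):
print(refer_for_medical_evaluation(0.2, 0.7, 150, False, sa_cutoff=0.3),
      refer_for_medical_evaluation(0.2, 0.7, 150, False, sa_cutoff=0.16))
# → True False
```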
Abstract:
Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have limitations due to the minimal assumptions made or, with more specific assumptions, are computationally intensive. Results: By converting to a z-score the value of the test statistic used to test the significance of each gene, we propose a simple two-component normal mixture that models adequately the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
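A sketch of the idea above, assuming scikit-learn is available and using simulated z-scores (900 null, 100 differentially expressed; illustrative numbers, not the paper's data): fit a two-component normal mixture to the z-scores and take, for each gene, the posterior probability of the component centred near zero.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
z = np.concatenate([rng.normal(0.0, 1.0, 900),   # null genes
                    rng.normal(3.0, 1.0, 100)])  # differentially expressed genes

# Two-component normal mixture fitted to the z-scores.
gm = GaussianMixture(n_components=2, random_state=0).fit(z.reshape(-1, 1))
null_comp = int(np.argmin(np.abs(gm.means_.ravel())))   # component nearest zero
post_null = gm.predict_proba(z.reshape(-1, 1))[:, null_comp]

pi0_hat = gm.weights_[null_comp]   # estimated proportion of null genes
# Genes with extreme z-scores receive a low posterior probability of being null.
```

Thresholding post_null then gives a gene selection rule, with pi0_hat playing the role of the prior null probability discussed in the abstracts above.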
Abstract:
Nucleic acid amplification tests (NAATs) for the detection of Neisseria gonorrhoeae became available in the early 1990s. Although offering several advantages over traditional detection methods, N. gonorrhoeae NAATs do have some limitations. These include cost, risk of carryover contamination, inhibition, and inability to provide antibiotic resistance data. In addition, there are sequence-related limitations that are unique to N. gonorrhoeae NAATs. In particular, false-positive results are a major consideration. These primarily stem from the frequent horizontal genetic exchange occurring within the Neisseria genus, leading to commensal Neisseria species acquiring N. gonorrhoeae genes. Furthermore, some N. gonorrhoeae subtypes may lack specific sequences targeted by a particular NAAT. Therefore, NAAT false-negative results because of sequence variation may occur in some gonococcal populations. Overall, the N. gonorrhoeae species continues to present a considerable challenge for molecular diagnostics. The need to evaluate N. gonorrhoeae NAATs before their use in any new patient population and to educate physicians on the limitations of these tests is emphasized in this review.
Abstract:
PTS1 proteins are peroxisomal matrix proteins that have a well-conserved targeting motif at the C-terminal end. However, this motif is present in many non-peroxisomal proteins as well, so predicting peroxisomal proteins requires distinguishing spurious PTS1 signals from genuine ones. In this paper we report on the development of an SVM classifier with a separately trained logistic output function. The model uses an input window containing 12 consecutive residues at the C-terminus and the amino acid composition of the full sequence. The final model gives a Matthews Correlation Coefficient of 0.77, representing an increase of 54% compared with the well-known PeroxiP predictor. We test the model by applying it to several proteomes of eukaryotes for which there is no evidence of a peroxisome, producing a false positive rate of 0.088%.
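The Matthews Correlation Coefficient quoted above uses all four cells of the confusion matrix, which matters here because true PTS1 proteins are a small minority of sequences. The counts in the example below are illustrative, not from the paper:

```python
import math

def mcc(tp: int, tn: int, fp: int, fn: int) -> float:
    """Matthews Correlation Coefficient for a binary confusion matrix."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(round(mcc(tp=80, tn=950, fp=20, fn=10), 2))  # → 0.83
```

Unlike raw accuracy, which would be dominated by the large TN count on such imbalanced data, MCC drops sharply as FP or FN grows.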
Abstract:
One reason for the neglect of the role of positive factors in cognitive-behavioural therapy (CBT) may relate to a failure to develop cognitive models that integrate positive and negative cognitions. Bandura [Psychol. Rev. 84 (1977) 191; Anxiety Res. 1 (1988) 77] proposed that self-efficacy beliefs mediate a range of emotional and behavioural outcomes. However, in panic disorder, cognitively based research to date has largely focused on catastrophic misinterpretation of bodily sensations. Although a number of studies support each of the predictions associated with the account of panic disorder that is based on the role of negative cognitions, a review of the literature indicated that a cognitively based explanation of the disorder may be considerably strengthened by inclusion of positive cognitions that emphasize control or coping. Evidence to support an Integrated Cognitive Model (ICM) of panic disorder was examined and the theoretical implications of this model were discussed in terms of both schema change and compensatory skills accounts of change processes in CBT. (C) 2004 Elsevier Ltd. All rights reserved.