957 results for "false negative rate"
Abstract:
Since the first anti-doping tests in the 1960s, the analytical aspects of testing remain challenging. The evolution of the analytical process in doping control is discussed in this paper, with particular emphasis on separation techniques such as gas chromatography and liquid chromatography. These approaches are improving in parallel with the requirements of increasing sensitivity and selectivity for detecting prohibited substances in biological samples from athletes. Moreover, fast analyses are mandatory to deal with the growing number of doping control samples and the short response time required during particular sport events. Recent developments in mass spectrometry and the expansion of accurate mass determination have improved anti-doping strategies with the possibility of using elemental composition and isotope patterns for structural identification. These techniques must be able to distinguish unequivocally between negative and suspicious samples, with no false-negative or false-positive results. Therefore, a high degree of reliability must be achieved for the identification of major metabolites corresponding to suspected analytes. In line with current trends in the pharmaceutical industry, the analysis of proteins and peptides remains an important issue in doping control. Sophisticated analytical tools are still mandatory to improve their distinction from endogenous analogs. Finally, indirect approaches are discussed in the context of anti-doping, in which recent advances aim to examine the biological response to a doping agent in a holistic way.
Abstract:
Introduction: Carbon monoxide (CO) poisoning is one of the most common causes of fatal poisoning. Symptoms of CO poisoning are nonspecific, and documentation of elevated carboxyhemoglobin (HbCO) levels in an arterial blood sample is the only standard means of confirming suspected exposure. The treatment of CO poisoning requires normobaric or hyperbaric oxygen therapy, according to the symptoms and HbCO levels. A new device, the Rad-57 pulse CO-oximeter, allows noninvasive transcutaneous measurement of the blood carboxyhemoglobin level (SpCO) by measurement of light wavelength absorptions. Methods: Prospective cohort study with a sample of patients admitted between October 2008 - March 2009 and October 2009 - March 2010 to the emergency services (ES) of a Swiss regional hospital and a Swiss university hospital (Burn Center). In cases of suspected CO poisoning, three successive noninvasive measurements were performed simultaneously with one arterial blood HbCO test. A control group included patients admitted to the ES for other complaints (cardiac insufficiency, respiratory distress, acute renal failure) but requiring arterial blood testing. Informed consent was obtained from all patients. The primary endpoint was to assess the agreement between the measurements made by the Rad-57 (SpCO) and the blood levels (HbCO). Results: 50 patients were enrolled, among whom 32 were admitted for suspected CO poisoning. Baseline demographic and clinical characteristics of the patients are presented in table 1. The median age was 37.7 ± 11.8 years, and 56% were male. Median laboratory carboxyhemoglobin (HbCO) levels were 4.25% (95% CI 0.6-28.5) for intoxicated patients and 1.8% (95% CI 1.0-5.3) for control patients. Only five patients presented with HbCO levels >=15%. The results disclose relatively fair correlations between the SpCO levels obtained by the Rad-57 and the standard HbCO, without any false-negative results. However, the Rad-57 tends to under-estimate SpCO for intoxicated patients with HbCO levels >10% (fig. 1). Conclusion: Noninvasive transcutaneous measurement of the blood carboxyhemoglobin level is easy to use. The correlation appears adequate for low to moderate levels (<15%). For higher values, we observe a trend for the Rad-57 to under-estimate the HbCO levels. Apart from this potential limitation and a few cases of false-negative results described in the literature, the Rad-57 may be useful for the initial triage and diagnosis of CO poisoning.
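The agreement endpoint above is typically quantified with a Bland-Altman analysis: the bias (mean paired difference) and the 95% limits of agreement. A minimal sketch in Python, using hypothetical paired readings rather than the study's data:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)               # systematic difference (e.g., SpCO - HbCO)
    spread = 1.96 * stdev(diffs)     # half-width of the 95% limits of agreement
    return bias, bias - spread, bias + spread

# Hypothetical paired carboxyhemoglobin readings (%), not taken from the study
spco = [5.0, 12.0, 8.0, 20.0]
hbco = [4.5, 13.0, 8.5, 24.0]
bias, lo, hi = bland_altman(spco, hbco)
```

A negative bias here would indicate that the device reads lower than the laboratory HbCO, which is the direction of the under-estimation trend reported above for higher levels.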
Abstract:
INTRODUCTION: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. METHODS: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). RESULTS: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (80%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. Nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 (74%) of 19) had a favorable outcome (CPC 1 or 2). CONCLUSIONS: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA.
These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
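The exact (Clopper-Pearson) confidence intervals quoted above have simple closed forms in the two boundary cases: when every subject is an event (x = n) and when none is (x = 0). A quick check in Python, using the counts given in the abstract (12 of 12 nonreactive patients died; 0 of 19 survivors were nonreactive):

```python
def exact_ci_all_events(n, alpha=0.05):
    # Clopper-Pearson lower bound when x = n events (the upper bound is 1.0)
    return (alpha / 2) ** (1.0 / n)

def exact_ci_no_events(n, alpha=0.05):
    # Clopper-Pearson upper bound when x = 0 events (the lower bound is 0.0)
    return 1.0 - (alpha / 2) ** (1.0 / n)

# PPV of a nonreactive background for mortality: 12 of 12 died
ppv_lower = exact_ci_all_events(12)   # ~0.74, matching the reported 74-100% CI
# False-positive rate: 0 of 19 survivors had a nonreactive background
fpr_upper = exact_ci_no_events(19)    # ~0.18, matching the reported 0-18% CI
```

This reproduces the 74% lower bound and 18% upper bound stated in the abstract, confirming that exact binomial intervals were used.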
Abstract:
BACKGROUND: Newborn screening (NBS) for cystic fibrosis (CF) has been introduced in many countries, but there is no ideal protocol suitable for all countries. This retrospective study was conducted to evaluate whether the planned two-step CF NBS with immunoreactive trypsinogen (IRT) and 7 CFTR mutations would have detected all clinically diagnosed children with CF in Switzerland. METHODS: IRT was measured in stored NBS cards using the AutoDELFIA Neonatal IRT kit. RESULTS: Between 2006 and 2009, 66 children with CF were reported, 4 of whom were excluded for various reasons (born in another country, NBS at 6 months, no informed consent). 98% (61/62) had significantly higher IRT compared with a matched control group. There was one false-negative IRT result, in an asymptomatic child with atypical CF (normal pancreatic function and sweat test). CONCLUSIONS: All children except the one with atypical CF would have been detected with the planned two-step protocol.
Abstract:
OBJECTIVE: To systematically review and meta-analyze published data on the diagnostic performance of Fluorine-18-Fluorodeoxyglucose ((18)F-FDG) positron emission tomography (PET) and PET/computed tomography (PET/CT) in the assessment of pleural abnormalities in cancer patients. METHODS: A comprehensive literature search of studies published through June 2013 on the role of (18)F-FDG-PET and PET/CT in evaluating pleural abnormalities in cancer patients was performed. All retrieved studies were reviewed and qualitatively analyzed. Pooled sensitivity, specificity, positive and negative likelihood ratios (LR+ and LR-) and diagnostic odds ratio (DOR) of (18)F-FDG-PET or PET/CT were calculated on a per-patient basis. The area under the summary ROC curve (AUC) was calculated to measure the accuracy of these methods in the assessment of pleural abnormalities. Sub-analyses considering (18)F-FDG-PET/CT only and patients with lung cancer only were carried out. RESULTS: Eight studies comprising 360 cancer patients (323 with lung cancer) were included. The meta-analysis of these selected studies provided the following results: sensitivity 86% [95% confidence interval (95%CI): 80-91%], specificity 80% [95%CI: 73-85%], LR+ 3.7 [95%CI: 2.8-4.9], LR- 0.18 [95%CI: 0.09-0.34], DOR 27 [95%CI: 13-56]. The AUC was 0.907. No significant improvement was found when considering PET/CT studies only or patients with lung cancer only. CONCLUSIONS: (18)F-FDG-PET and PET/CT proved to be useful diagnostic imaging methods in the assessment of pleural abnormalities in cancer patients; nevertheless, possible sources of false-negative and false-positive results should be kept in mind. The literature on the use of (18)F-FDG-PET and PET/CT in this setting remains limited, and prospective studies are needed.
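The per-study inputs to such a meta-analysis are the standard 2×2-table metrics. A minimal sketch of their definitions in Python, using made-up single-study counts (not the pooled data above; pooled estimates require a dedicated meta-analytic model):

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, likelihood ratios and diagnostic odds ratio
    from the counts of a single 2x2 diagnostic table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)        # LR+: odds update for a positive test
    lr_neg = (1 - sens) / spec        # LR-: odds update for a negative test
    dor = lr_pos / lr_neg             # diagnostic odds ratio
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical counts for illustration only
sens, spec, lr_pos, lr_neg, dor = diagnostic_metrics(tp=43, fn=7, fp=10, tn=40)
```

Note that DOR = LR+/LR- is equivalent to (tp·tn)/(fp·fn), which is why a DOR of 27 as reported above indicates strong overall discrimination.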
Abstract:
Purpose: To compare the performance of the Glaucoma Quality of Life-15 (GQL-15) Questionnaire, intraocular pressure measurement (IOP, Goldmann tonometry) and a measure of visual field loss, the Moorfields Motion Displacement Test (MDT), in detecting glaucomatous eyes in a self-referred population. Methods: The GQL-15 has been suggested to correlate with visual disability and psychophysical measures of visual function in glaucoma patients. The Moorfields MDT is a multi-location perimetry test with 32 white line stimuli presented on a grey background on a standard laptop computer. Each stimulus is displaced between computer frames to give the illusion of "apparent motion". Participants (N=312, 90% older than 45 years; 20.5% with a family history of glaucoma) self-referred to an advertised World Glaucoma Day event (March 2009) at the Jules Gonin Eye Hospital, Lausanne, Switzerland. Participants underwent a clinical exam (IOP, slit lamp, angle and disc examination by a general ophthalmologist), 90% completed a GQL-15 questionnaire, and over 50% completed an MDT test in both eyes. Those classified as abnormal on one or more of the following (IOP >21 mmHg / GQL-15 score >20 / MDT score >2 / clinical exam) underwent a follow-up clinical examination by a glaucoma specialist, including imaging and threshold perimetry. After the second examination, subjects were classified as "healthy" (H), "glaucoma suspect" (GS) (ocular hypertension and/or suspicious disc, angle closure with SD) or "glaucomatous" (G). Results: One hundred and ten subjects completed all 4 initial examinations; of these, 69 were referred for the second examination and were classified as 8 G, 24 GS, and 37 H. The MDT detected 7/8 G and 7/24 GS, with a false referral rate of 3.8%. IOP detected 2/8 G and 8/24 GS, with a false referral rate of 8.9%. The GQL-15 detected 4/8 G and 16/24 GS, with a false referral rate of 42%.
Conclusions: In this sample of participants attending a self-referral glaucoma detection event, the MDT performed significantly better than the GQL-15 and IOP in discriminating glaucomatous patients from healthy subjects. Further studies are required to assess the potential of the MDT as a glaucoma screening tool.
Abstract:
In this study, we report the first-ever large-scale environmental validation of a microbial reporter-based test to measure arsenic concentrations in natural water resources. A bioluminescence-producing, arsenic-inducible bacterium based on Escherichia coli was used as the reporter organism. Specific protocols were developed with the goal of avoiding the negative influence of iron in groundwater on arsenic availability to the bioreporter cells. A total of 194 groundwater samples were collected in the Red River and Mekong River Delta regions of Vietnam and were analyzed both by atomic absorption spectroscopy (AAS) and by the arsenic bioreporter protocol. The bacterial cells performed well at and above arsenic concentrations in groundwater of 7 microg/L, with an almost linearly proportional increase of the bioluminescence signal between 10 and 100 microg As/L (r2 = 0.997). Comparisons between AAS and arsenic bioreporter determinations gave an overall average of 8.0% false-negative and 2.4% false-positive identifications for the bioreporter prediction at the WHO recommended acceptable arsenic concentration of 10 microg/L, which is far better than the performance of chemical field test kits. Because of the ease of the measurement protocol and the low application cost, the microbiological arsenic test has great potential for large screening campaigns in Asia and in other areas suffering from arsenic pollution in groundwater resources.
Abstract:
Background: Despite its pervasiveness, the genetic basis of adaptation resulting in variation directly or indirectly related to temperature (climatic) gradients is poorly understood. Using 3-fold replicated laboratory thermal stocks covering much of the physiologically tolerable temperature range for the temperate (i.e., cold-tolerant) species Drosophila subobscura, we assessed whole-genome transcriptional responses after three years of thermal adaptation, when the populations had already diverged for inversion frequencies, pre-adult life history components, and morphological traits. Total mRNA from each population was compared to a reference mRNA pool in a standard, highly replicated two-colour competitive hybridization experiment using cDNA microarrays. Results: A total of 306 (6.6%) cDNA clones were identified as 'differentially expressed' (following a false discovery rate correction) after contrasting the two furthest-apart thermal selection regimes (i.e., 13°C vs. 22°C), including four previously reported candidate genes for thermotolerance in Drosophila (Hsp26, Hsp68, Fst, and Treh). On the other hand, correlated patterns of gene expression were similar in cold- and warm-adapted populations. Analysis of functional categories defined by the Gene Ontology project points to an overrepresentation of genes involved in carbohydrate metabolism, nucleic acid metabolism and regulation of transcription, among other categories. Although the location of differentially expressed genes was approximately random with respect to chromosomes, physical mapping of 88 probes to the polytene chromosomes of D. subobscura showed that a larger than expected number mapped inside inverted chromosomal segments. Conclusion: Our data suggest that a sizeable number of genes appear to be involved in thermal adaptation in Drosophila, with a substantial fraction implicated in metabolism.
This apparently illustrates the formidable challenge of understanding the adaptive evolution of complex trait variation. Furthermore, some clustering of genes within inverted chromosomal sections was detected. Disentangling the effects of inversions will obviously be required in any future approach if we want to identify the relevant candidate genes.
Abstract:
Genome-wide association studies have been instrumental in identifying genetic variants associated with complex traits such as human disease or gene expression phenotypes. It has been proposed that extending existing analysis methods by considering interactions between pairs of loci may uncover additional genetic effects. However, the large number of possible two-marker tests presents significant computational and statistical challenges. Although several strategies to detect epistasis effects have been proposed and tested for specific phenotypes, so far there has been no systematic attempt to compare their performance using real data. We made use of thousands of gene expression traits from linkage and eQTL studies to compare the performance of different strategies. We found that using information from marginal associations between markers and phenotypes to detect epistatic effects yielded a lower false discovery rate (FDR) than a strategy solely using biological annotation in yeast, whereas results from human data were inconclusive. For future studies whose aim is to discover epistatic effects, we recommend incorporating information about marginal associations between SNPs and phenotypes instead of relying solely on biological annotation. Improved methods to discover epistatic effects will result in a more complete understanding of complex genetic effects.
Abstract:
PURPOSE: To evaluate the diagnostic performance of abdominal radiography in the detection of illegal intracorporeal containers (hereafter, packets), with low-dose computed tomography (CT) as the reference standard. MATERIALS AND METHODS: This study was approved by the institutional ethical review board, with written informed consent. From July 2007 to July 2010, 330 people (296 men, 34 women; mean age, 32 years [range, 18-55 years]) suspected of having ingested drug packets underwent supine abdominal radiography and low-dose CT. The presence or absence of packets at abdominal radiography was reported, with low-dose CT as the reference standard. The density and number of packets (≤12 or >12) at low-dose CT were recorded and analyzed to determine whether those variables influence interpretation of results at abdominal radiography. RESULTS: Packets were detected at low-dose CT in 53 (16%) suspects. Sensitivity of abdominal radiography for depiction of packets was 0.77 (41 of 53), and specificity was 0.96 (267 of 277). The packets appeared isoattenuated to the bowel contents at low-dose CT in 16 (30%) of the 53 suspects with positive results. Nineteen (36%) of the 53 suspects with positive low-dose CT results had 12 or fewer packets. Packets that were isoattenuated at low-dose CT and a low number of packets (≤12) were both significantly associated with false-negative results at abdominal radiography (P = .004 and P = .016, respectively). CONCLUSION: Abdominal radiography is mainly limited by low sensitivity when compared with low-dose CT in the screening of people suspected of carrying drug packets. Low-dose CT is an effective imaging alternative to abdominal radiography.
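The reported operating characteristics follow directly from the counts in the abstract (41 of 53 CT-positive suspects detected, 267 of 277 CT-negative suspects correctly negative); a one-line check:

```python
# Counts taken from the abstract; low-dose CT is the reference standard
tp, fn = 41, 12    # CT-positive suspects detected / missed on radiography
tn, fp = 267, 10   # CT-negative suspects correctly negative / falsely positive

sensitivity = tp / (tp + fn)    # 41/53
specificity = tn / (tn + fp)    # 267/277
```

Rounded to two decimals these give the 0.77 and 0.96 reported above; the 12 false negatives are the cases driven by isoattenuated packets and low packet counts.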
Abstract:
The accuracy of peritoneoscopy and liver biopsy in the diagnosis of hepatic cirrhosis was compared in 473 consecutive patients submitted to both procedures. One hundred and fifty-two of them had cirrhosis diagnosed by one or both methods. There was 73% agreement between the two procedures. 'Apparent' false-negative rates were 17.7% for peritoneoscopy and 9.3% for liver biopsy. The incidence of false-negative results in the diagnosis of cirrhosis can be reduced by combining both procedures.
Abstract:
In this work, a previously developed, statistics-based damage-detection approach was validated for its ability to autonomously detect damage in bridges. The approach uses statistical differences between the actual and predicted behavior of the bridge under a subset of ambient truck loadings. The predicted behavior is derived from a statistics-based model trained with field data from the undamaged bridge (not a finite element model). The differences between actual and predicted responses, called residuals, are then used to construct control charts, which compare undamaged- and damaged-structure data. Validation of the damage-detection approach was achieved by using sacrificial specimens that were mounted to the bridge, exposed to ambient traffic loads, and designed to simulate actual damage-sensitive locations. Different damage types and levels were introduced to the sacrificial specimens to study the approach's sensitivity and applicability. The damage-detection algorithm was able to identify damage, but it also had a high false-positive rate. An evaluation of the sub-components of the damage-detection methodology was completed for the purpose of improving the approach. Several of the underlying assumptions within the algorithm were being violated, which was the source of the false positives. Furthermore, the lack of an automatic evaluation process was thought to be a potential impediment to widespread use. Recommendations for improving the methodology were developed and preliminarily evaluated. These recommendations are believed to improve the efficacy of the damage-detection approach.
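The residual-based control charts described above can be sketched as a Shewhart individuals chart: estimate the natural variation of the residuals from undamaged-state data, then flag monitoring residuals outside three-sigma limits. A minimal generic illustration, not the authors' exact implementation:

```python
def control_limits(baseline_residuals):
    """Three-sigma limits for an individuals chart. Sigma is estimated from
    the mean moving range (MR-bar / d2, with d2 = 1.128 for subgroups of 2)."""
    mr = [abs(b - a) for a, b in zip(baseline_residuals, baseline_residuals[1:])]
    center = sum(baseline_residuals) / len(baseline_residuals)
    sigma = (sum(mr) / len(mr)) / 1.128
    return center - 3 * sigma, center + 3 * sigma

def out_of_control(baseline_residuals, monitoring_residuals):
    """Indices of monitoring residuals falling outside the control limits."""
    lo, hi = control_limits(baseline_residuals)
    return [i for i, r in enumerate(monitoring_residuals) if not lo <= r <= hi]

# Residuals from the undamaged (training) period, then a monitoring period
baseline = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.2, -0.15]
flagged = out_of_control(baseline, [0.1, 1.5, -0.05])
```

A residual far outside the limits learned from the undamaged bridge would be treated as a possible damage indication; false positives of the kind reported above arise when the chart's assumptions (independent, identically distributed residuals) are violated.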
Abstract:
The objective of this study was to comprehensively compare the genomic profiles in the breast of parous and nulliparous postmenopausal women to identify genes that permanently change their expression following pregnancy. The study was designed as a two-phase approach. In the discovery phase, we compared breast genomic profiles of 37 parous with 18 nulliparous postmenopausal women. In the validation phase, confirmation of the genomic patterns observed in the discovery phase was sought in an independent set of 30 parous and 22 nulliparous postmenopausal women. RNA was hybridized to Affymetrix HG_U133 Plus 2.0 oligonucleotide arrays containing probes to 54,675 transcripts, scanned and the images analyzed using Affymetrix GCOS software. Surrogate variable analysis, logistic regression, and significance analysis of microarrays were used to identify statistically significant differences in expression of genes. The false discovery rate (FDR) approach was used to control for multiple comparisons. We found that 208 genes (305 probe sets) were differentially expressed between parous and nulliparous women in both discovery and validation phases of the study at an FDR of 10% and with at least a 1.25-fold change. These genes are involved in regulation of transcription, centrosome organization, RNA splicing, cell-cycle control, adhesion, and differentiation. The results provide initial evidence that full-term pregnancy induces long-term genomic changes in the breast. The genomic signature of pregnancy could be used as an intermediate marker to assess potential chemopreventive interventions with hormones mimicking the effects of pregnancy for prevention of breast cancer.
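The FDR control used above is typically the Benjamini-Hochberg step-up procedure: sort the p-values and reject all hypotheses up to the largest rank k with p(k) <= (k/m)*q. A minimal sketch of the generic procedure (not the authors' exact pipeline, which combined it with surrogate variable analysis and fold-change filters):

```python
def benjamini_hochberg(pvals, q=0.10):
    """Indices of hypotheses rejected at FDR level q (step-up procedure)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k = rank            # largest rank passing the step-up criterion
    return sorted(order[:k])

# Example: three small p-values survive at q = 0.10, one large one does not
rejected = benjamini_hochberg([0.01, 0.02, 0.03, 0.50], q=0.10)
```

With q = 0.10, as in the study, on average at most 10% of the probe sets declared differentially expressed are expected to be false discoveries.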
Abstract:
OBJECTIVE: To evaluate the pertinence of prenatal diagnosis in cases of congenital uropathy. STUDY DESIGN: Retrospective evaluation over a period of 6.5 years. METHOD: 93 cases were involved in the comparison of prenatal ultrasonographic diagnosis with neonatal findings, autopsy results, and follow-up data. RESULTS: 33 fetuses had renal parenchymal lesions, 44 had excretory system lesions, and 6 had bladder and/or urethral lesions. Seventy-three pregnancies led to live births. Eighteen terminations of pregnancy were performed at the parents' request for extremely severe malformations. Two intrauterine deaths were observed, and two infants died in the postnatal period. Prenatal diagnosis was obtained at an average of 27 weeks' gestation. Diagnostic concordance was excellent in 82% and partial in 12% of cases with renal parenchymal lesions; the false-positive rate was 6%. For excretory system lesions, concordance was excellent in 87% and partial in 7.4% of cases, with a false-positive rate of 5.6%. Finally, concordance was excellent in 100% of cases of bladder and/or urethral lesions. The overall rate of total concordance was 86%. Partial concordance cases consisted of malformations different from those previously diagnosed, but prenatal diagnosis nevertheless led to further investigations in the neonatal period and to proper management. The false-positive diagnoses (5.4%) never led to termination of pregnancy. CONCLUSION: Prenatal diagnosis of congenital uropathy is effective. A third-trimester ultrasonographic examination is necessary to ensure proper neonatal management, considering that the majority of cases are diagnosed at this gestational age.