Abstract:
OBJECTIVE: To systematically review and meta-analyze published data on the diagnostic performance of Fluorine-18-Fluorodeoxyglucose ((18)F-FDG) positron emission tomography (PET) and PET/computed tomography (PET/CT) in the assessment of pleural abnormalities in cancer patients. METHODS: A comprehensive literature search of studies published through June 2013 on the role of (18)F-FDG-PET and PET/CT in evaluating pleural abnormalities in cancer patients was performed. All retrieved studies were reviewed and qualitatively analyzed. Pooled sensitivity, specificity, positive and negative likelihood ratios (LR+ and LR-) and diagnostic odds ratio (DOR) of (18)F-FDG-PET or PET/CT on a per-patient basis were calculated. The area under the summary ROC curve (AUC) was calculated to measure the accuracy of these methods in the assessment of pleural abnormalities. Sub-analyses considering (18)F-FDG-PET/CT studies only and patients with lung cancer only were carried out. RESULTS: Eight studies comprising 360 cancer patients (323 with lung cancer) were included. The meta-analysis of these selected studies provided the following results: sensitivity 86% [95% confidence interval (95%CI): 80-91%], specificity 80% [95%CI: 73-85%], LR+ 3.7 [95%CI: 2.8-4.9], LR- 0.18 [95%CI: 0.09-0.34], DOR 27 [95%CI: 13-56]. The AUC was 0.907. No significant improvement was found when considering PET/CT studies only or patients with lung cancer only. CONCLUSIONS: (18)F-FDG-PET and PET/CT proved to be useful diagnostic imaging methods in the assessment of pleural abnormalities in cancer patients; nevertheless, possible sources of false-negative and false-positive results should be kept in mind. The literature on the use of (18)F-FDG-PET and PET/CT in this setting remains limited, and prospective studies are needed.
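The likelihood ratios and diagnostic odds ratio above are related to sensitivity and specificity by standard formulas. A minimal sketch follows; note that in a meta-analysis each metric is pooled separately across studies, so back-calculating from the pooled sensitivity and specificity only approximates the reported LR+ 3.7, LR- 0.18 and DOR 27.

```python
# Standard relations between sensitivity/specificity and likelihood ratios /
# diagnostic odds ratio, applied to the pooled estimates above.
def likelihood_ratios(sensitivity: float, specificity: float):
    lr_pos = sensitivity / (1.0 - specificity)   # LR+ = sens / (1 - spec)
    lr_neg = (1.0 - sensitivity) / specificity   # LR- = (1 - sens) / spec
    dor = lr_pos / lr_neg                        # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Pooled sensitivity 86% and specificity 80% from the meta-analysis:
lr_pos, lr_neg, dor = likelihood_ratios(0.86, 0.80)
print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}, DOR {dor:.1f}")
```

The mismatch between these back-calculated values and the reported pooled LR/DOR estimates is expected, since pooling is done per metric over heterogeneous studies.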
Abstract:
PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: a findings-based algorithm following a linear or branched pathway, a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56-, 1.44-, and 1.17-times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not be as good in other disease settings.
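The threshold-based strategies compared above rest on the idea of treating (or testing) once a disease probability crosses a harm/benefit trade-off point. The study's exact model is not given here; the sketch below assumes the classic decision-analytic treatment threshold t* = harm / (harm + benefit), with purely illustrative numbers.

```python
# Illustrative threshold-based decision rule (Pauker-Kassirer style treatment
# threshold). The harm/benefit values are hypothetical, not the study's data.
def treatment_threshold(harm_if_healthy: float, benefit_if_sick: float) -> float:
    # Probability of disease above which treating minimizes expected harm.
    return harm_if_healthy / (harm_if_healthy + benefit_if_sick)

def decide(p_disease: float, threshold: float) -> str:
    # In a parallel workup, this comparison is repeated for every candidate
    # disease, so no life-threatening diagnosis is dropped from review.
    return "treat" if p_disease >= threshold else "withhold"

t = treatment_threshold(harm_if_healthy=1.0, benefit_if_sick=9.0)
print(t, decide(0.2, t))
```

With a large benefit relative to harm, the threshold is low, which is exactly the "enhanced sensitivity, lowered specificity" behaviour the abstract describes for life-threatening diseases.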
Abstract:
Purpose: To compare the performance of the Glaucoma Quality of Life-15 (GQL-15) questionnaire, intraocular pressure measurement (IOP, Goldmann tonometry) and a measure of visual field loss, the Moorfields Motion Displacement Test (MDT), in detecting glaucomatous eyes in a self-referred population. Methods: The GQL-15 has been suggested to correlate with visual disability and psychophysical measures of visual function in glaucoma patients. The Moorfields MDT is a multi-location perimetry test with 32 white line stimuli presented on a grey background on a standard laptop computer. Each stimulus is displaced between computer frames to give the illusion of "apparent motion". Participants (N=312; 90% older than 45 years; 20.5% with a family history of glaucoma) self-referred to an advertised World Glaucoma Day event (March 2009) at the Jules Gonin Eye Hospital, Lausanne, Switzerland. Participants underwent a clinical exam (IOP, slit lamp, angle and disc examination by a general ophthalmologist); 90% completed a GQL-15 questionnaire and over 50% completed an MDT test in both eyes. Those classified as abnormal on one or more of the following (IOP >21 mmHg, GQL-15 score >20, MDT score >2, clinical exam) underwent a follow-up clinical examination by a glaucoma specialist, including imaging and threshold perimetry. After the second examination, subjects were classified as "healthy" (H), "glaucoma suspect" (GS) (ocular hypertension and/or suspicious disc, angle closure with SD) or "glaucomatous" (G). Results: One hundred and ten subjects completed all 4 initial examinations; of these, 69 were referred for the 2nd examination and were classified as 8 G, 24 GS, and 37 H. The MDT detected 7/8 G and 7/24 GS, with a false referral rate of 3.8%. IOP detected 2/8 G and 8/24 GS, with a false referral rate of 8.9%. The GQL-15 detected 4/8 G and 16/24 GS, with a false referral rate of 42%.
Conclusions: In this sample of participants attending a self-referral glaucoma detection event, the MDT performed significantly better than the GQL-15 and IOP in discriminating glaucomatous patients from healthy subjects. Further studies are required to assess the potential of the MDT as a glaucoma screening tool.
Abstract:
In this study, we report the first large-scale environmental validation of a microbial reporter-based test to measure arsenic concentrations in natural water resources. A bioluminescence-producing, arsenic-inducible bacterium based on Escherichia coli was used as the reporter organism. Specific protocols were developed with the goal of avoiding the negative influence of iron in groundwater on arsenic availability to the bioreporter cells. A total of 194 groundwater samples were collected in the Red River and Mekong River Delta regions of Vietnam and were analyzed both by atomic absorption spectroscopy (AAS) and by the arsenic bioreporter protocol. The bacterial cells performed well at and above arsenic concentrations in groundwater of 7 microg/L, with an almost linearly proportional increase of the bioluminescence signal between 10 and 100 microg As/L (r2 = 0.997). Comparisons between AAS and arsenic bioreporter determinations gave an overall average of 8.0% false negative and 2.4% false positive identifications for the bioreporter prediction at the WHO-recommended acceptable arsenic concentration of 10 microg/L, which is far better than the performance of chemical field test kits. Because of the ease of the measurement protocol and the low application cost, the microbiological arsenic test has great potential for large screening campaigns in Asia and in other areas suffering from arsenic pollution in groundwater resources.
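A bioreporter assay of this kind is quantified through a calibration curve fitted against standards, mirroring the near-linear response reported between 10 and 100 microg As/L. A minimal sketch of such a calibration follows; the luminescence counts below are synthetic, not the study's data.

```python
# Illustrative least-squares calibration of a bioreporter signal against
# arsenic standards; an unknown sample is then read off the inverted fit.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    # coefficient of determination r^2
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

arsenic = [10, 25, 50, 75, 100]        # microg As/L standards
signal = [120, 290, 610, 880, 1190]    # synthetic luminescence counts
slope, intercept, r2 = linear_fit(arsenic, signal)
unknown = (700 - intercept) / slope    # invert the fit for an unknown sample
print(f"slope {slope:.2f}, r2 {r2:.3f}, unknown ~{unknown:.0f} microg As/L")
```

In practice the decision point of interest is whether the back-calculated concentration exceeds the WHO guideline of 10 microg/L.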
The 5th anniversary of "Patient Safety in Surgery" - from the Journal's origin to its future vision.
Abstract:
A prospective study was undertaken to determine prognostic markers for patients with obstructive jaundice. Along with routine liver function tests, antipyrine clearance was determined in 20 patients. Four patients died after the basal investigations. Five patients underwent definitive surgery. The remaining 11 patients were subjected to percutaneous transhepatic biliary decompression. Four patients died during the drainage period, while surgery was carried out in seven patients within 1-3 weeks of drainage. Of the 20 patients, only six survived. Basal liver function tests were comparable in survivors and nonsurvivors. Discriminant analysis of the basal data revealed that plasma bilirubin, proteins and antipyrine half-life taken together had a strong association with mortality. A mathematical equation was derived using these variables and a score was computed for each patient. It was observed that a score of 0.84 or greater indicated survival. Omission of antipyrine half-life from the data, however, resulted in a false prediction of security in 55% of patients. This study highlights the importance of adding the antipyrine elimination test to routine liver function tests for precise identification of high-risk patients.
Abstract:
Rationale: Clinical and electrophysiological prognostic markers of brain anoxia have mostly been evaluated in comatose survivors of out-of-hospital cardiac arrest (OHCA) after standard resuscitation, but their predictive value in patients treated with mild induced hypothermia (IH) is unknown. The objective of this study was to identify a predictive score of independent clinical and electrophysiological variables in comatose OHCA survivors treated with IH, aiming at a maximal positive predictive value (PPV) and a high negative predictive value (NPV) for mortality. Methods: We prospectively studied consecutive adult comatose OHCA survivors, treated with mild IH to 33-34°C for 24 h, from April 2006 to May 2009 at the intensive care unit of the Lausanne University Hospital, Switzerland. IH was applied using an external cooling method. As soon as subjects passively rewarmed (body temperature >35°C), they underwent EEG and SSEP recordings (off sedation) and were examined by experienced neurologists at least twice. Patients with status epilepticus were treated with antiepileptic drugs (AEDs) for at least 24 h. A multivariable logistic regression was performed to identify independent predictors of mortality at hospital discharge. These were used to formulate a predictive score. Results: 100 patients were studied; 61 died. Age, gender and OHCA etiology (cardiac vs. non-cardiac) did not differ between survivors and nonsurvivors. Cardiac arrest type (non-ventricular fibrillation vs. ventricular fibrillation), time to return of spontaneous circulation (ROSC) >25 min, failure to recover all brainstem reflexes, extensor or no motor response to pain, myoclonus, presence of epileptiform discharges on EEG, EEG background unreactive to pain, and bilaterally absent N20 on SSEP were all significantly associated with mortality. Absent N20 was the only variable showing no false positive results. Multivariable logistic regression identified four independent predictors (Table).
These were used to construct the score, and its predictive values were calculated using a cut-off of 0-1 vs. 2-4 predictors. We found a PPV of 1.00 (95% CI: 0.93-1.00), an NPV of 0.81 (95% CI: 0.67-0.91) and an accuracy of 0.93 for mortality. Among the 9 patients who were predicted to survive by the score but eventually died, only 1 had absent N20. Conclusions: Pending validation in a larger cohort, this simple score represents a promising tool to identify patients who will survive, and most subjects who will not, after OHCA and IH. Furthermore, while SSEPs are 100% predictive of poor outcome but not available in most hospitals, this study identifies EEG background reactivity as an important predictor after OHCA. The score appears robust even without SSEP, suggesting that SSEP and other investigations (e.g., mismatch negativity, serum NSE) might principally be needed to enhance prognostication in the small subgroup of patients failing to improve despite a favorable score.
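The PPV, NPV and accuracy of such a score follow directly from a 2x2 table of score category (0-1 vs. 2-4 predictors) against observed mortality. A minimal sketch, with illustrative counts rather than the study's exact table:

```python
# Predictive values from a 2x2 table of score prediction vs. observed outcome.
def predictive_values(tp, fp, fn, tn):
    """tp: predicted death, died; fp: predicted death, survived;
    fn: predicted survival, died; tn: predicted survival, survived."""
    ppv = tp / (tp + fp)                        # P(death | score predicts death)
    npv = tn / (tn + fn)                        # P(survival | score predicts survival)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return ppv, npv, accuracy

# Illustrative counts: a score with no false "will die" predictions.
ppv, npv, acc = predictive_values(tp=50, fp=0, fn=10, tn=40)
print(f"PPV {ppv:.2f}, NPV {npv:.2f}, accuracy {acc:.2f}")
```

A PPV of 1.00 thus corresponds to zero false-positive "will die" predictions, which is why the study emphasizes maximal PPV for a mortality score.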
Abstract:
In subjects with normal lung mechanics, inspiratory muscle strength can be reliably and easily assessed by the sniff nasal inspiratory pressure (SNIP), which is the pressure measured in an occluded nostril during a maximal sniff performed through the contralateral nostril. The aim of this study was to assess the validity of the SNIP in patients with chronic obstructive pulmonary disease (COPD), in whom pressure transmission from the alveoli to the upper airways is likely to be dampened. Twenty-eight patients with COPD were studied (mean forced expiratory volume in one second (FEV1) = 36% of predicted). The SNIP and the sniff oesophageal pressure (sniff Poes) were measured simultaneously during maximal sniffs and compared to the maximal inspiratory pressure obtained against an occlusion (MIP). All measurements were performed from functional residual capacity in the sitting position. The ratio SNIP/sniff Poes was 0.80 and did not correlate with the degree of airflow limitation. The ratio MIP/sniff Poes was 0.87, and the ratio SNIP/MIP was 0.97. Inspiratory muscle weakness, as defined by a low sniff Poes, was present in 17 of the 28 patients. A false diagnosis of weakness was made in eight patients when MIP was considered alone, in four when SNIP was considered alone, and in only three patients when MIP and SNIP were combined. We conclude that both the sniff nasal inspiratory pressure and the maximal inspiratory pressure moderately underestimate sniff oesophageal pressure in chronic obstructive pulmonary disease. Although suboptimal in this condition, the sniff nasal inspiratory pressure appears useful as a complement to the maximal inspiratory pressure for assessing inspiratory muscle strength in patients with chronic obstructive pulmonary disease.
Abstract:
Introduction: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. Methods: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Results: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (80%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with a nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. A nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 (74%) of 19) had a favorable outcome (CPC 1 or 2). Conclusions: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA.
These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
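Confidence intervals for extreme proportions like "100% (74 to 100%)" and "0 (0 to 18%)" have closed-form exact (Clopper-Pearson) bounds. Assuming the PPV was based on 12 patients with a nonreactive background and the false-positive rate on 19 survivors, the sketch below reproduces the reported bounds.

```python
# Closed-form Clopper-Pearson bounds for the degenerate proportions x = n
# (lower bound; upper is 1) and x = 0 (upper bound; lower is 0).
def exact_lower_all_successes(n: int, alpha: float = 0.05) -> float:
    # 95% CI lower limit for an observed proportion of n/n
    return (alpha / 2) ** (1.0 / n)

def exact_upper_no_successes(n: int, alpha: float = 0.05) -> float:
    # 95% CI upper limit for an observed proportion of 0/n
    return 1 - (alpha / 2) ** (1.0 / n)

print(round(exact_lower_all_successes(12), 2))  # 0.74 -> "100% (74-100%)"
print(round(exact_upper_no_successes(19), 2))   # 0.18 -> "0 (0-18%)"
```

The upper-bound formula for 0/n is the exact version of the familiar "rule of three" (approximately 3/n for moderate n).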
Abstract:
INTRODUCTION: We describe a case of diffuse nesidioblastosis in an adult patient who presented with exclusively fasting symptoms and a focal pancreatic 111In-pentetreotide uptake mimicking an insulinoma. CASE PRESENTATION: A 23-year-old Caucasian man had severe daily fasting hypoglycemia with glucose levels below 2 mmol/L. Apart from rare neuroglycopenic symptoms (confusion, sleepiness), he was largely asymptomatic. His investigations revealed low venous plasma glucose levels, high insulin and C-peptide levels, and a 72-hour fast test result, all highly suggestive of an insulinoma. Abdominal computed tomography and magnetic resonance imaging did not reveal any lesions. The only imaging study compatible with an insulinoma was 111In-somatostatin receptor scintigraphy, which showed a faint but definite focal tracer uptake between the head and the body of the pancreas. However, this lesion could not be confirmed by endoscopic ultrasonography of the pancreas. Following duodenopancreatectomy, the histological findings were consistent with diffuse nesidioblastosis. Postoperatively, the patient continued to present with fasting hypoglycemia and was successfully treated with diazoxide. CONCLUSION: In the absence of gastrointestinal surgery, nesidioblastosis is very rare in adults. In addition, nesidioblastosis is usually characterized by post-prandial hypoglycemia, whereas this patient presented with fasting hypoglycemia. This case also illustrates the risk of a false-positive result of 111In-pentetreotide scintigraphy in the case of nesidioblastosis. Selective arterial calcium stimulation and venous sampling is the most reliable procedure for the positive diagnosis of insulinoma or nesidioblastosis and should be used to confirm any suspicion based on imaging modalities.
Abstract:
OBJECTIVE: To evaluate an automated seizure detection (ASD) algorithm in EEGs with periodic and other challenging patterns. METHODS: Selected EEGs recorded in patients over 1 year old were classified into four groups: A. Periodic lateralized epileptiform discharges (PLEDs) with intermixed electrical seizures. B. PLEDs without seizures. C. Electrical seizures and no PLEDs. D. No PLEDs or seizures. Recordings were analyzed with the Persyst P12 software and compared to the raw EEG interpreted by two experienced neurophysiologists; positive percent agreement (PPA) and false-positive rates per hour (FPR) were calculated. RESULTS: We assessed 98 recordings (Group A=21 patients; B=29; C=17; D=31). Total duration was 82.7 h (median: 1 h), containing 268 seizures. The software detected 204 (76.1%) seizures; all ictal events were captured in 29/38 (76.3%) patients, and in only 3 (7.7%) patients were no seizures detected. Median PPA was 100% (range 0-100; interquartile range 50-100), and the median FPR was 0/h (range 0-75.8; interquartile range 0-4.5); however, lower performance was seen in the groups containing periodic discharges. CONCLUSION: This analysis provides data regarding the yield of ASD in a particularly difficult subset of EEG recordings, showing that periodic discharges may bias the results. SIGNIFICANCE: Ongoing refinements in this technique might enhance its utility and lead to more extensive application.
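The seizure-level agreement figure follows directly from the reported counts (204 of 268 seizures flagged); the per-patient PPA quoted as a median is computed the same way for each recording:

```python
# Positive percent agreement between software detections and the expert-read
# reference: detected reference events / total reference events, as a percent.
def percent_agreement(detected: int, total: int) -> float:
    return 100.0 * detected / total

overall = percent_agreement(204, 268)  # counts reported in the abstract
print(f"{overall:.1f}%")  # 76.1%
```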
Abstract:
Purpose: To evaluate the long-term outcome (up to 7 years) of presumed ocular tuberculosis (TB) when the therapeutic decision was based on WHO guidelines. Methods: Twelve of 654 new uveitis patients (1998-2004) presented with choroiditis and a positive tuberculosis skin test (TST) (skin lesion diameter >15 mm). Therapy was administered according to WHO recommendations after ophthalmic and systemic investigation. The area of ocular lesions at presentation and after therapy, measured on fluorescein and indocyanine green angiography, was considered the primary outcome. Relapse of choroiditis was considered a secondary outcome. The T-SPOT.TB test was performed when it became available. Results: Visual acuity (VA) significantly improved after therapy (p=0.0357). The mean total surface of fluorescein lesions at entry was 44.8±20.9 (arbitrary units) and decreased to 32.5±16.9 after therapy (p=0.0165). The mean total surface of indocyanine green lesions at entry was 24.5±13.3 and decreased to 10.8±5.4 after therapy (p=0.0631). The T-SPOT.TB revealed 2 false TST-positive results. The mean follow-up was 4.5±1.5 years. Two relapses among the 10 cases of confirmed ocular TB were observed after complete lesion healing, 2.5 and 4.5 years after therapy, respectively. Conclusions: A decrease in mean ocular lesion size and a mean improvement in VA were observed after antituberculous therapy. Our long-term follow-up of chorioretinal lesions demonstrated relapse of ocular tuberculosis in 20% of patients with confirmed ocular TB, despite complete initial retinal scarring.
Abstract:
Genome-wide association studies have been instrumental in identifying genetic variants associated with complex traits such as human disease or gene expression phenotypes. It has been proposed that extending existing analysis methods by considering interactions between pairs of loci may uncover additional genetic effects. However, the large number of possible two-marker tests presents significant computational and statistical challenges. Although several strategies to detect epistasis effects have been proposed and tested for specific phenotypes, so far there has been no systematic attempt to compare their performance using real data. We made use of thousands of gene expression traits from linkage and eQTL studies, to compare the performance of different strategies. We found that using information from marginal associations between markers and phenotypes to detect epistatic effects yielded a lower false discovery rate (FDR) than a strategy solely using biological annotation in yeast, whereas results from human data were inconclusive. For future studies whose aim is to discover epistatic effects, we recommend incorporating information about marginal associations between SNPs and phenotypes instead of relying solely on biological annotation. Improved methods to discover epistatic effects will result in a more complete understanding of complex genetic effects.
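When screening huge numbers of two-marker tests, the false discovery rate is commonly controlled with the Benjamini-Hochberg step-up procedure. A minimal sketch follows; the study's exact FDR estimation pipeline is not described here, so this only illustrates the standard technique.

```python
# Benjamini-Hochberg: declare significant the tests whose sorted p-values fall
# below the step-up line (rank / m) * q, for FDR level q.
def benjamini_hochberg(p_values, q=0.05):
    order = sorted(range(len(p_values)), key=lambda i: p_values[i])
    m = len(p_values)
    k = 0  # largest rank with p_(rank) <= (rank / m) * q
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * q:
            k = rank
    # All tests up to rank k (in p-value order) are declared discoveries.
    return sorted(order[:k])

hits = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74], q=0.05)
print(hits)  # indices of tests declared significant at FDR 5%
```

Prioritizing pairs with marginal association (as the abstract recommends) shrinks the number of tests m, which raises the step-up line and increases power at the same FDR.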
Abstract:
PURPOSE: To evaluate the diagnostic performance of abdominal radiography in the detection of illegal intracorporeal containers (hereafter, packets), with low-dose computed tomography (CT) as the reference standard. MATERIALS AND METHODS: This study was approved by the institutional ethical review board, with written informed consent. From July 2007 to July 2010, 330 people (296 men, 34 women; mean age, 32 years [range, 18-55 years]) suspected of having ingested drug packets underwent supine abdominal radiography and low-dose CT. The presence or absence of packets at abdominal radiography was reported, with low-dose CT as the reference standard. The density and number of packets (≤12 or >12) at low-dose CT were recorded and analyzed to determine whether those variables influence interpretation of results at abdominal radiography. RESULTS: Packets were detected at low-dose CT in 53 (16%) suspects. Sensitivity of abdominal radiography for depiction of packets was 0.77 (41 of 53), and specificity was 0.96 (267 of 277). The packets appeared isoattenuated to the bowel contents at low-dose CT in 16 (30%) of the 53 suspects with positive results. Nineteen (36%) of the 53 suspects with positive low-dose CT results had 12 or fewer packets. Packets that were isoattenuated at low-dose CT and a low number of packets (≤12) were both significantly associated with false-negative results at abdominal radiography (P = .004 and P = .016, respectively). CONCLUSION: Abdominal radiography is mainly limited by low sensitivity when compared with low-dose CT in the screening of people suspected of carrying drug packets. Low-dose CT is an effective imaging alternative to abdominal radiography.
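The reported sensitivity and specificity can be recomputed from the stated counts (41 of 53 carriers detected, 267 of 277 non-carriers correctly negative), with low-dose CT as the reference standard:

```python
# Sensitivity and specificity from the 2x2 counts given in the abstract.
def sens_spec(tp: int, fn: int, tn: int, fp: int):
    sensitivity = tp / (tp + fn)   # carriers correctly flagged on radiography
    specificity = tn / (tn + fp)   # non-carriers correctly reported negative
    return sensitivity, specificity

# 41 of 53 positives detected -> fn = 12; 267 of 277 negatives -> fp = 10.
sens, spec = sens_spec(tp=41, fn=12, tn=267, fp=10)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")  # 0.77, 0.96
```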
Abstract:
Positron emission tomography (PET)/CT plays a major role in staging, assessing response to treatment and follow-up of paediatric Hodgkin's lymphoma (HL). Owing to its high sensitivity for detecting viable tumoural tissue, a negative PET/CT is highly predictive of survival. However, (18)F-FDG is not specific for malignant disease and may concentrate in numerous benign/inflammatory lesions that can cause 'false-positive' results; follow-up PET/CT studies should therefore be interpreted with caution. We report a case of pulmonary inflammatory myofibroblastic tumour that developed during follow-up in a young patient in complete remission of stage IIB HL and was successfully treated with surgical resection.