975 results for False alarms


Relevance: 10.00%

Abstract:

RATIONALE AND OBJECTIVES: To systematically review and meta-analyze published data on the diagnostic accuracy of fluorine-18-fluorodeoxyglucose ((18)F-FDG) positron emission tomography (PET) and PET/computed tomography (CT) in the differential diagnosis between malignant and benign pleural lesions. METHODS AND MATERIALS: A comprehensive literature search of studies published through June 2013 on the diagnostic performance of (18)F-FDG-PET and PET/CT in the differential diagnosis of pleural lesions was carried out. All retrieved studies were reviewed and qualitatively analyzed. Pooled sensitivity, specificity, positive and negative likelihood ratios (LR+ and LR-) and diagnostic odds ratio (DOR) of (18)F-FDG-PET or PET/CT in the differential diagnosis of pleural lesions on a per-patient basis were calculated. The area under the summary receiver operating characteristic curve (AUC) was calculated to measure the accuracy of these methods. Subanalyses considering the device used (PET or PET/CT) were performed. RESULTS: Sixteen studies comprising 745 patients were included in the systematic review. The meta-analysis of 11 selected studies provided the following results: sensitivity 95% (95% confidence interval [95%CI]: 92-97%), specificity 82% (95%CI: 76-88%), LR+ 5.3 (95%CI: 2.4-11.8), LR- 0.09 (95%CI: 0.05-0.14), and DOR 74 (95%CI: 34-161). The AUC was 0.95. No significant improvement in diagnostic accuracy was found when considering PET/CT studies only. CONCLUSIONS: (18)F-FDG-PET and PET/CT proved to be accurate diagnostic imaging methods in the differential diagnosis between malignant and benign pleural lesions; nevertheless, possible sources of false-negative and false-positive results should be kept in mind.
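For readers less familiar with these summary measures, the likelihood ratios and diagnostic odds ratio follow directly from sensitivity and specificity. A minimal sketch, illustrative only: the pooled values above come from a formal meta-analysis, so they will not reproduce exactly from the point estimates.

```python
# Illustrative only: how the reported summary measures relate to each other.
# The abstract's pooled values come from a bivariate meta-analysis, so they
# will not reproduce exactly from sensitivity and specificity alone.

def likelihood_ratios(sensitivity: float, specificity: float):
    """Return (LR+, LR-, DOR) for a binary diagnostic test."""
    lr_pos = sensitivity / (1.0 - specificity)   # how much a positive result raises the odds of disease
    lr_neg = (1.0 - sensitivity) / specificity   # how much a negative result lowers them
    dor = lr_pos / lr_neg                        # diagnostic odds ratio
    return lr_pos, lr_neg, dor

print(likelihood_ratios(0.95, 0.82))  # LR+ roughly matches the pooled 5.3 reported above
```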

Relevance: 10.00%

Abstract:

Mentally placing the self in the physical position of another person might engage social perspective taking because participants have to match their own position with that of another. We investigated the influence of personal (sex), interpersonal (siblings, parental marital status), and cultural (individualistic, collectivistic) factors on individuals' ability to mentally take the position of front-facing and back-facing figures in an online study (369 participants). Replicating findings from laboratory studies, responses were slower for front-facing than for back-facing figures. Having siblings, parents' marital status, and cultural background influenced task performance in theoretically predictable ways. The present perspective-taking task is a promising experimental paradigm for assessing social perspective taking, and one that is free from the response biases inherent in self-report.

Relevance: 10.00%

Abstract:

Background: Searching for associations between genetic variants and complex diseases has been a very active area of research for over two decades. More than 51,000 potential associations have been studied and published, a figure that keeps increasing, especially with the recent explosion of array-based Genome-Wide Association Studies. Even if the number of true associations described so far is high, many of the putative risk variants detected to date have failed to be consistently replicated and are widely considered false positives. Here, we focus on the worldwide patterns of replicability of published association studies. Results: We report three main findings. First, contrary to previous results, genes associated with complex diseases present lower degrees of genetic differentiation among human populations than average genome-wide levels. Second, also contrary to previous results, the differences in replicability of disease-associated loci between Europeans and East Asians are highly correlated with genetic differentiation between these populations. Finally, highly replicated genes present increased levels of high-frequency derived alleles in European and Asian populations when compared to African populations. Conclusions: Our findings highlight the heterogeneous nature of the genetic etiology of complex disease, confirm the importance of the recent evolutionary history of our species in current patterns of disease susceptibility, and could cast doubt on the false-positive status of some associations that have failed to replicate across populations.
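The abstract does not name the differentiation statistic used, but Wright's FST is the standard measure of genetic differentiation among populations. A minimal sketch for a single biallelic SNP; the choice of FST and the equal-population-size simplification are our assumptions.

```python
# A minimal sketch of Wright's FST, a standard measure of genetic
# differentiation between populations (illustrative only; the abstract does
# not specify which statistic the authors used).

def fst(allele_freqs):
    """FST = (HT - HS) / HT for one biallelic SNP across populations.

    allele_freqs: derived-allele frequency in each population
    (equal population sizes assumed for simplicity).
    """
    p_bar = sum(allele_freqs) / len(allele_freqs)
    h_t = 2 * p_bar * (1 - p_bar)                 # total expected heterozygosity
    h_s = sum(2 * p * (1 - p) for p in allele_freqs) / len(allele_freqs)  # mean within-population heterozygosity
    return (h_t - h_s) / h_t if h_t > 0 else 0.0

print(fst([0.3, 0.3]))   # identical frequencies: FST = 0, no differentiation
print(fst([0.1, 0.9]))   # strongly differentiated SNP: FST = 0.64
```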

Relevance: 10.00%

Abstract:

Functional RNA structures play an important role both in noncoding RNA transcripts and in regulatory elements of mRNAs. Here we present a computational study to detect functional RNA structures within the ENCODE regions of the human genome. Since structural RNAs in general lack characteristic signals in primary sequence, comparative approaches evaluating evolutionary conservation of structures are most promising. We have used three recently introduced programs based on either phylogenetic stochastic context-free grammars (EvoFold) or energy-directed folding (RNAz and AlifoldZ), yielding several thousand candidate structures (corresponding to ∼2.7% of the ENCODE regions). EvoFold has its highest sensitivity in highly conserved and relatively AU-rich regions, while RNAz favors slightly GC-rich regions, resulting in a relatively small overlap between methods. Comparison with the GENCODE annotation points to functional RNAs in all genomic contexts, with a slightly increased density in 3′-UTRs. While we estimate a significant false discovery rate of ∼50%–70%, many of the predictions can be further substantiated by additional criteria: 248 loci are predicted by both RNAz and EvoFold, and an additional 239 RNAz or EvoFold predictions are supported by the (more stringent) AlifoldZ algorithm. Five hundred seventy RNAz structure predictions fall into regions that show signs of selection pressure also at the sequence level (i.e., conserved elements). More than 700 predictions overlap with noncoding transcripts detected by oligonucleotide tiling arrays. One hundred seventy-five selected candidates were tested by RT-PCR in six tissues, and expression could be verified in 43 cases (24.6%).
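Counts such as the 248 loci predicted by both RNAz and EvoFold reduce to intersecting two sets of genomic intervals. A minimal sketch with hypothetical coordinates; real pipelines typically use dedicated tools such as bedtools.

```python
# How a "predicted by both methods" count like the 248 loci above can be
# obtained: intersect two sets of predicted genomic intervals.

def overlaps(a, b):
    """True if half-open intervals (chrom, start, end) share any base."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def loci_in_both(preds_a, preds_b):
    return [a for a in preds_a if any(overlaps(a, b) for b in preds_b)]

rnaz    = [("chr1", 100, 180), ("chr2", 500, 560)]   # hypothetical RNAz loci
evofold = [("chr1", 150, 220), ("chr3", 900, 960)]   # hypothetical EvoFold loci
print(loci_in_both(rnaz, evofold))                   # [("chr1", 100, 180)]
```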

Relevance: 10.00%

Abstract:

The completion of the sequencing of the mouse genome promises to help predict human genes with greater accuracy. While current ab initio gene prediction programs are remarkably sensitive (i.e., they predict at least a fragment of most genes), their specificity is often low, predicting a large number of false-positive genes in the human genome. Sequence conservation at the protein level with the mouse genome can help eliminate some of those false positives. Here we describe SGP2, a gene prediction program that combines ab initio gene prediction with TBLASTX searches between two genome sequences to provide both sensitive and specific gene predictions. The accuracy of SGP2 when used to predict genes by comparing the human and mouse genomes is assessed on a number of data sets, including single-gene data sets, the highly curated human chromosome 22 predictions, and entire-genome predictions from ENSEMBL. Results indicate that SGP2 outperforms purely ab initio gene prediction methods, and that it works about as well with 3x shotgun data as with fully assembled genomes. SGP2 provides a high enough specificity that its predictions can be experimentally verified at a reasonable cost. We used SGP2 to generate a complete set of gene predictions for both human and mouse by comparing the two genomes. Our results suggest that another few thousand human and mouse genes currently not in ENSEMBL are worth verifying experimentally.
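The core idea of combining ab initio prediction with TBLASTX searches can be sketched as rescoring candidate exons by how much of their length is covered by conserved segments in the second genome. This is a deliberate simplification, not SGP2's actual scoring scheme; all coordinates and weights below are hypothetical.

```python
# Sketch of comparative rescoring: boost ab initio exon scores in proportion
# to their coverage by TBLASTX hits against a second genome.

def rescore_exons(exons, tblastx_hits, weight=0.5):
    """exons: list of (start, end, ab_initio_score);
    tblastx_hits: (start, end) conserved segments on the same sequence.
    Returns exons with scores boosted by their conserved coverage."""
    rescored = []
    for start, end, score in exons:
        covered = sum(max(0, min(end, h_end) - max(start, h_start))
                      for h_start, h_end in tblastx_hits)
        coverage = covered / (end - start)          # fraction of exon under a hit
        rescored.append((start, end, score + weight * coverage))
    return rescored

exons = [(100, 250, 2.1), (400, 520, 1.3)]          # hypothetical candidates
hits  = [(120, 260)]                                # hypothetical TBLASTX HSP
print(rescore_exons(exons, hits))                   # only the first exon is boosted
```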

Relevance: 10.00%

Abstract:

The recent availability of the chicken genome sequence poses the question of whether there are human protein-coding genes conserved in chicken that are currently not included in the human gene catalog. Here, we show, using comparative gene finding followed by experimental verification of exon pairs by RT–PCR, that the addition to the multi-exonic subset of this catalog could be as little as 0.2%, suggesting that we may be closing in on the human gene set. Our protocol, however, has two shortcomings: (i) the bioinformatic screening of the predicted genes, applied to filter out false positives, cannot handle intronless genes; and (ii) the experimental verification could fail to identify expression at a specific developmental time. This highlights the importance of developing methods that could provide a reliable estimate of the number of these two types of genes.

Relevance: 10.00%

Abstract:

This review paper reports the consensus of a technical workshop hosted by the European network NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs) and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs than with conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken, in which body concentrations in organisms are related to effects and toxicity thresholds are derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of wastewater disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new tests are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.
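To illustrate the kind of calculation the DLVO software mentioned above performs, here is a minimal sketch of the total interaction energy between two equal spheres, using the standard close-approach van der Waals and weak-overlap double-layer approximations; all parameter values are illustrative, not from any specific test medium.

```python
# DLVO pair energy for two equal spheres: van der Waals attraction plus
# electrical double-layer repulsion (standard approximations, sketch only).

import math

EPS0 = 8.854e-12          # vacuum permittivity, F/m

def dlvo_energy(h, radius, hamaker, psi0, debye_length, eps_r=78.5):
    """Total DLVO pair energy (J) at surface separation h (m)."""
    v_vdw = -hamaker * radius / (12.0 * h)              # van der Waals (valid for h << radius)
    kappa = 1.0 / debye_length
    v_edl = (2.0 * math.pi * eps_r * EPS0 * radius      # double-layer repulsion
             * psi0**2 * math.exp(-kappa * h))          # (weak-overlap approximation)
    return v_vdw + v_edl

# Illustrative values: 50 nm particle, Hamaker constant 1e-20 J, surface
# potential -30 mV, Debye length 10 nm. A positive energy barrier at
# intermediate separations indicates a stable (non-aggregating) dispersion.
for h_nm in (1, 2, 5, 10, 20):
    print(h_nm, "nm:", dlvo_energy(h_nm * 1e-9, 50e-9, 1e-20, -0.030, 10e-9))
```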

Relevance: 10.00%

Abstract:

Cannabis cultivation for drug production is forbidden in Switzerland. Law enforcement authorities therefore regularly ask forensic laboratories to determine the chemotype of cannabis plants from seized material, in order to ascertain whether a plantation is legal. As required by the official EU analysis protocol, the THC content of cannabis is measured in the flowers at maturity. When laboratories are confronted with seedlings, they have to grow the plants to maturity, a time-consuming and costly procedure. This study investigated the discrimination of fibre-type from drug-type Cannabis seedlings by analysing the compounds found in their leaves and using chemometric tools. Eleven legal varieties allowed by the Swiss Federal Office for Agriculture and 13 illegal ones were greenhouse-grown and analysed using a gas chromatograph interfaced with a mass spectrometer. Compounds that show high discrimination capability in the seedlings were identified, and a support vector machine (SVM) analysis was used to classify the cannabis samples. The overall set of samples shows a classification rate above 99%, with false positive rates below 2%. This model thus allows discrimination between fibre- and drug-type Cannabis at an early stage of growth, so it is not necessary to wait for the plants to reach maturity and quantify their THC content in order to determine their chemotype. This procedure could be used for the control of legal (fibre-type) and illegal (drug-type) Cannabis production.
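The classification step can be sketched with a standard SVM pipeline. The feature matrix, class separation, kernel, and preprocessing below are placeholders, not the published model.

```python
# A minimal sketch of SVM classification of seedlings from GC-MS peak areas.
# Data are synthetic stand-ins for the study's leaf-compound measurements.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(48, 6))          # 48 seedlings x 6 hypothetical compound peak areas
y = np.repeat([0, 1], 24)             # 0 = fibre type, 1 = drug type
X[y == 1] += 1.5                      # synthetic class separation so the demo is non-trivial

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)   # cross-validated classification rate
print("mean accuracy:", scores.mean())
```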

Relevance: 10.00%

Abstract:

In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene very similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There still remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first one is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose here an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
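Computational Gestalt theory controls false detections by thresholding the Number of False Alarms (NFA): the expected number of tests in which an event at least as significant would occur by chance under a background noise model. A minimal sketch for a binomial background model; the paper's actual events and noise model differ.

```python
# A-contrario detection control: keep an event only if its expected number
# of chance occurrences, NFA = (number of tests) x P(event under the null),
# falls below a threshold (usually 1).

from scipy.stats import binom

def nfa(n_tests, n, k, p):
    """Expected number of chance detections at least this significant:
    n_tests candidate regions, k of n samples agreeing, each sample
    matching by chance with probability p."""
    return n_tests * binom.sf(k - 1, n, p)   # P(X >= k) under the null

# 10,000 candidate regions; one region has 40 of 50 samples consistent with
# an affine model that a random sample would match with probability 0.1:
print(nfa(10_000, 50, 40, 0.1))   # << 1, so the detection is kept
```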

Relevance: 10.00%

Abstract:

The rapid adoption of online media like Facebook, Twitter or Wikileaks leaves us with little time to think. Where is information technology taking us, our society and our democratic institutions? Is the Web replicating social divides that already exist offline, or does collaborative technology pave the way for a more equal society? How do we find the right balance between openness and privacy? Can social media improve civic participation, or do they breed superficial exchange and the promotion of false information? These and many other questions arise when one starts to look at the Internet, society and politics. The first part of this paper gives an overview of the social changes that occur with the rise of the Web. The second part serves as an overview of how the Web is being used for political participation in Switzerland and abroad.

Relevance: 10.00%

Abstract:

Critics of the U.S. proposal to the World Trade Organization (WTO) made in October 2005 are correct when they argue that adoption of the proposal would significantly reduce available support under the current farm program structure. Using historical prices and yields from 1980 to 2004, we estimate that loan rates would have to drop by 9 percent and target prices by 10 percent in order to meet the proposed aggregate Amber Box and Blue Box limits. While this finding should cheer those who think that reform of U.S. farm programs is long overdue, it alarms those who want to maintain a strong safety net for U.S. agriculture. The dilemma of needing to reform farm programs while maintaining a strong safety net could be resolved by redesigning programs so that they target revenue rather than price. Building on a base of 70 percent Green Box income insurance, a program that provides a crop-specific revenue guarantee equal to 98 percent of the product of the current effective target price and expected county yield would fit into the proposed aggregate Amber and Blue Box limits. Payments would be triggered whenever the product of the season-average price and county average yield fell below this 98 percent revenue guarantee. Adding the proposed crop-specific constraints lowers the coverage level to 95 percent. Moving from programs that target price to ones that target revenue would eliminate the rationale for ad hoc disaster payments. Program payments would automatically arrive whenever significant crop losses or economic losses caused by low prices occurred. Also, much of the need for the complicated mechanism (the Standard Reinsurance Agreement) that transfers most of the risk of the U.S. crop insurance program to the federal government would be eliminated, because the federal government would directly assume the risk through farm programs. Changing the focus of federal farm programs from price targeting to revenue targeting would not be easy. Farmers have long relied on price supports and the knowledge that crop losses are often adequately covered by heavily subsidized crop insurance or by ad hoc disaster payments. Farmers and their leaders would only be willing to support a change to revenue targeting if they see that the current system is untenable in an era of tight federal budgets and WTO limits.
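The proposed revenue trigger reduces to simple arithmetic. A minimal sketch: the guarantee formula follows the description above, while the assumption that the payment equals the full shortfall is ours, made for illustration.

```python
# Revenue-targeting trigger: pay when realized county revenue falls below
# 98% of (effective target price x expected county yield).

def revenue_payment(target_price, expected_county_yield,
                    season_avg_price, county_avg_yield, coverage=0.98):
    guarantee = coverage * target_price * expected_county_yield
    actual_revenue = season_avg_price * county_avg_yield
    return max(0.0, guarantee - actual_revenue)   # paid only when revenue falls short

# Hypothetical corn numbers: $2.63/bu effective target price, 150 bu/ac
# expected county yield, $2.20 season-average price, 140 bu/ac realized yield:
print(revenue_payment(2.63, 150, 2.20, 140))      # payment per acre
```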

Relevance: 10.00%

Abstract:

BACKGROUND: Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. METHODS: We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship 'Prevalence = Incidence x Duration' in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship 'incident = true incident + false incident', and also to the IIR derived from the BED incidence assay. RESULTS: Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R² = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for the performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. CONCLUSIONS: IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts.
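The window-based estimator reduces to the relationship stated above: with Prevalence = Incidence × Duration, the incident infection rate among notifications is the proportion classified recent divided by the window expressed in years. A minimal sketch with illustrative numbers, not the study's data.

```python
# Window-based incidence estimation from a test for recent infection.

def window_based_iir(n_recent, n_notifications, window_days):
    prevalence = n_recent / n_notifications       # fraction classified incident
    duration_years = window_days / 365.25         # window = mean time spent "recent"
    return prevalence / duration_years            # incident infections per person-year

# Hypothetical: 120 of 748 notifications classified recent by an algorithm
# with a 130.1-day window:
print(window_based_iir(n_recent=120, n_notifications=748, window_days=130.1))
```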

Relevance: 10.00%

Abstract:

Interferences with the Olympus immunoturbidimetric assay for ferritin have been reported because the antibodies used in the immunoassay are derived from rabbits. Rabbits are popular pets, and exposure to them is a known risk factor for developing heterophilic (interfering) antibodies. This report shows how the current Olympus Ferritin assay has been improved to eliminate interference from heterophilic antibodies.

Relevance: 10.00%

Abstract:

Testosterone abuse is conventionally assessed by the urinary testosterone/epitestosterone (T/E) ratio, levels above 4.0 being considered suspicious. A deletion polymorphism in the gene coding for UGT2B17 is strongly associated with reduced testosterone glucuronide (TG) levels in urine. Many of the individuals devoid of the gene would not reach a T/E ratio of 4.0 after testosterone intake. Future test programs will most likely shift from population-based to individual-based T/E cut-off ratios using Bayesian inference. A longitudinal analysis is dependent on an individual's true negative baseline T/E ratio. The aim was to investigate whether it is possible to increase the sensitivity and specificity of the T/E test by adding UGT2B17 genotype information in a Bayesian framework. A single intramuscular dose of 500 mg testosterone enanthate was given to 55 healthy male volunteers with either two, one or no allele (ins/ins, ins/del or del/del) of the UGT2B17 gene. Urinary excretion of TG and the T/E ratio were measured during 15 days. A Bayesian analysis was conducted to calculate the individual T/E cut-off ratio. When the genotype information was added, the program returned lower individual cut-off ratios for all del/del subjects, increasing the sensitivity of the test considerably. It will be difficult, if not impossible, to discriminate between a true negative baseline T/E value and a false negative one without knowledge of the UGT2B17 genotype. UGT2B17 genotype information is crucial, both for deciding which initial cut-off ratio to use for an individual and for increasing the sensitivity of the Bayesian analysis.
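A minimal sketch of the Bayesian idea: model log(T/E) as normal with a genotype-specific prior mean, update with the athlete's own baseline values, and place the individual cut-off at an upper quantile of the posterior predictive distribution. All numerical values (prior means, variances, quantile) are hypothetical, not the published model's parameters.

```python
# Individual T/E cut-off via conjugate normal updating of log(T/E),
# with a genotype-informed prior. Sketch only; all numbers are hypothetical.

import math
from statistics import NormalDist

PRIOR_MEAN_LOG_TE = {"ins/ins": math.log(1.5),   # hypothetical genotype priors;
                     "ins/del": math.log(1.0),   # del/del carriers excrete far
                     "del/del": math.log(0.1)}   # less testosterone glucuronide

def individual_cutoff(genotype, baseline_te, sigma=0.4, tau=0.6, q=0.99):
    """Upper q-quantile of the posterior predictive T/E given baselines.
    sigma: within-subject SD of log(T/E); tau: prior SD of subject means."""
    m0 = PRIOR_MEAN_LOG_TE[genotype]
    n = len(baseline_te)
    xbar = sum(math.log(x) for x in baseline_te) / n
    post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)          # conjugate normal update
    post_mean = post_var * (m0 / tau**2 + n * xbar / sigma**2)
    pred_sd = math.sqrt(post_var + sigma**2)                # predictive spread
    return math.exp(post_mean + NormalDist().inv_cdf(q) * pred_sd)

# A del/del athlete's cut-off sits far below the population 4.0 threshold:
print(individual_cutoff("del/del", [0.08, 0.12, 0.10]))   # ~0.3
print(individual_cutoff("ins/ins", [1.2, 1.6, 1.4]))      # ~4.0
```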

Relevance: 10.00%

Abstract:

Overdiagnosis is the diagnosis of an abnormality that is not associated with a substantial health hazard and that patients derive no benefit from being aware of. It is neither a misdiagnosis (diagnostic error) nor a false positive result (a positive test in the absence of a real abnormality). It mainly results from screening, the use of increasingly sensitive diagnostic tests, incidental findings on routine examinations, and the widening of diagnostic criteria to define a condition requiring an intervention. The blurring boundaries between risk and disease, physicians' fear of missing a diagnosis, and patients' need for reassurance are further causes of overdiagnosis. Overdiagnosis often implies procedures to confirm or exclude the presence of the condition and is by definition associated with useless treatments and interventions, generating harm and costs without any benefit. Overdiagnosis also diverts healthcare professionals from caring for other health issues. Preventing overdiagnosis requires increasing the awareness of healthcare professionals and patients of its occurrence, avoiding unnecessary and untargeted diagnostic tests, and avoiding screening without demonstrated benefits. Furthermore, systematically accounting for the harms and benefits of screening and diagnostic tests, and determining risk-factor thresholds based on the expected absolute risk reduction, would also help prevent overdiagnosis.