906 results for Web-Assisted Error Detection


Relevance:

40.00%

Publisher:

Abstract:

Oxidation of cholesterol (Ch) by a variety of reactive oxygen species gives rise mainly to hydroperoxides and aldehydes. Despite the growing interest in Ch oxidation products, their detection and characterization remain a matter of concern. In this work, the main Ch oxidation products, namely, 3 beta-hydroxycholest-5-ene-7 alpha-hydroperoxide (7 alpha-OOH), 3 beta-hydroxy-5 alpha-cholest-6-ene-5-hydroperoxide (5 alpha-OOH), 3 beta-hydroxycholest-4-ene-6 alpha-hydroperoxide (6 alpha-OOH), 3 beta-hydroxycholest-4-ene-6 beta-hydroperoxide (6 beta-OOH), and 3 beta-hydroxy-5 beta-hydroxy-B-norcholestane-6 beta-carboxaldehyde (ChAld), were detected in a single analysis using high-performance liquid chromatography (HPLC) coupled to dopant-assisted atmospheric pressure photoionization tandem mass spectrometry. Selected reaction monitoring (SRM) mode allowed sensitive detection of each oxidation product, while the enhanced product ion (EPI) mode improved the confidence of the analyses. Isotopic labeling experiments made it possible to elucidate mechanistic features of the fragmentation processes. The characteristic fragmentation pattern of Ch oxidation products is the consecutive loss of H2O molecules, yielding cationic fragments at m/z 401, 383, and 365. Homolytic scissions of the peroxide bond are also observed. With the (18)O-labeling approach, it was possible to establish a fragmentation order for each isomer. The SRM transition ratios, together with the EPI and (18)O-labeling experiments, give detailed information about differences in water elimination, allowing proper discrimination between the isomers. This is of special interest considering the emerging role of Ch oxidation products in the development of diseases.
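
The fragment arithmetic can be checked directly: for 7 alpha-OOH (C27H46O3), the protonated molecule lies near m/z 419, and three successive losses of H2O (about 18.01 u each) reproduce the reported series. A minimal Python sketch; note the precursor m/z is inferred from the fragment series above, not stated in the abstract:

    # Successive H2O losses from protonated 7alpha-OOH (masses in u).
    # The precursor m/z is inferred from the fragment series in the
    # abstract, not quoted from the paper.
    H2O = 18.011
    mz = 419.35  # approximate [M+H]+ of C27H46O3 (monoisotopic)
    for n in (1, 2, 3):
        mz -= H2O
        print(f"[M+H-{n}H2O]+ ~ m/z {mz:.1f}")
    # Prints ~401.3, ~383.3 and ~365.3, matching m/z 401, 383 and 365.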

Relevance:

40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVES To find the pairing of first and second reader that yields the highest sensitivity for detecting lung nodules with CT at various dose levels. MATERIALS AND METHODS An anthropomorphic lung phantom and artificial lung nodules were used to simulate screening CT examinations at standard dose (100 mAs, 120 kVp) and eight different low dose levels, using 120, 100 and 80 kVp combined with 100, 50 and 25 mAs. At each dose level, 40 phantoms were randomly filled with 75 solid and 25 ground glass nodules (5-12 mm). Two radiologists and three computer-aided detection (CAD) software packages were paired to find the highest sensitivity. RESULTS Sensitivities at standard dose were 92%, 90%, 84%, 79% and 73% for reader 1, reader 2, CAD1, CAD2 and CAD3, respectively. The combined sensitivity of human readers 1 and 2 improved to 97% (p1 = 0.063, p2 = 0.016). The highest sensitivities, between 97% and 99%, were achieved by combining any radiologist with any CAD at any dose level. Combining any two CADs yielded sensitivities between 85% and 88%, significantly lower than those of radiologists combined with CAD (p < 0.03). CONCLUSIONS Combining a human observer with any of the tested CAD systems provides optimal sensitivity for lung nodule detection, even at a reduced dose of 25 mAs/80 kVp.
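
For intuition, a pair's combined sensitivity is the fraction of nodules found by at least one member of the pair, computed from per-nodule detections rather than from the individual sensitivities alone, since the overlap between the two members matters. A minimal sketch with invented detection sets:

    # Combined (paired) sensitivity: a nodule counts as detected if at
    # least one member of the pair found it. Detection sets are invented.
    nodules = set(range(100))                 # 100 ground-truth nodules
    reader1 = set(range(92))                  # 92% individual sensitivity
    cad1 = set(range(8, 92)) | {95, 96, 97}   # 87%, largely overlapping

    def sensitivity(found):
        return len(found & nodules) / len(nodules)

    print(sensitivity(reader1))           # 0.92
    print(sensitivity(cad1))              # 0.87
    print(sensitivity(reader1 | cad1))    # 0.95 for the pair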

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVES The aim of this phantom study was to minimize radiation dose by finding the combination of low tube current and low voltage that would still yield accurate volume measurements, compared to standard CT imaging, without significantly decreasing the sensitivity of lung nodule detection, both with and without CAD assistance. METHODS An anthropomorphic chest phantom containing artificial solid and ground glass nodules (GGNs, 5-12 mm) was examined with a 64-row multi-detector CT scanner at three tube currents (100, 50 and 25 mAs) combined with three tube voltages (120, 100 and 80 kVp). This yielded eight different protocols that were then compared to the standard CT protocol (100 mAs/120 kVp). For each protocol, at least 127 different nodules were scanned in 21-25 phantoms. The nodules were analyzed in two separate sessions by three independent, blinded radiologists and by computer-aided detection (CAD) software. RESULTS The mean sensitivity of the radiologists for identifying solid lung nodules on standard CT was 89.7% ± 4.9%. Sensitivity was not significantly impaired when the tube current and voltage were lowered at the same time, except at the lowest exposure level of 25 mAs/80 kVp (80.6% ± 4.3%, p = 0.031). Compared to standard CT, the sensitivity for detecting GGNs was significantly lower at all dose levels with a voltage of 80 kVp, independent of the tube current. CAD significantly increased the radiologists' sensitivity for detecting solid nodules at all dose levels (by 5-11%). No significant volume measurement errors (VMEs) were documented for the radiologists or the CAD software at any dose level. CONCLUSIONS Our results suggest that a CT protocol with 25 mAs and 100 kVp is optimal for detecting solid and ground glass nodules in lung cancer screening. The use of CAD software is highly recommended at all dose levels.
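
For a rough sense of the dose saving, CT tube output scales linearly with mAs and approximately with the square of the tube voltage; this is a common rule of thumb, not a figure from the study. Under that approximation, the recommended protocol delivers roughly 17% of the standard dose:

    # Relative dose under the common approximation dose ~ mAs * kVp^2.
    # This rule of thumb is an assumption, not a value from the study.
    def relative_dose(mas, kvp, ref_mas=100, ref_kvp=120):
        return (mas / ref_mas) * (kvp / ref_kvp) ** 2

    print(f"{relative_dose(25, 100):.0%}")  # ~17% (recommended protocol)
    print(f"{relative_dose(25, 80):.0%}")   # ~11% (where sensitivity dropped)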

Relevance:

40.00%

Publisher:

Abstract:

Ontology antipatterns are structures that reflect ontology modelling problems: they lead to inconsistencies, poor reasoning performance or poor formalisation of domain knowledge. We propose four methods for detecting antipatterns using SPARQL queries, and we conduct experiments to detect antipatterns in a corpus of OWL ontologies.
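
The four methods themselves are not described in this abstract, but the general approach can be illustrated with one well-known antipattern: a class asserted as a subclass of two mutually disjoint classes, which is therefore unsatisfiable. A sketch using rdflib; the query, the antipattern chosen and the file name are illustrative, not taken from the paper:

    # Illustrative antipattern detection via SPARQL with rdflib: find
    # classes subsumed by two disjoint classes (hence unsatisfiable).
    # Not one of the paper's four methods; the file name is hypothetical.
    from rdflib import Graph

    g = Graph()
    g.parse("ontology.owl")  # hypothetical input ontology

    QUERY = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX owl:  <http://www.w3.org/2002/07/owl#>
    SELECT ?c ?a ?b WHERE {
        ?c rdfs:subClassOf ?a .
        ?c rdfs:subClassOf ?b .
        ?a owl:disjointWith ?b .
    }
    """

    for c, a, b in g.query(QUERY):
        print(f"unsatisfiable: {c} is a subclass of disjoint {a} and {b}")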

Relevance:

40.00%

Publisher:

Abstract:

Doctoral thesis with European mention in natural language processing, completed at the Universidad de Alicante by Ester Boldrini under the supervision of Dr. Patricio Martínez-Barco. The thesis defense took place at the Universidad de Alicante on 23 January 2012 before a committee composed of Dr. Manuel Palomar (Universidad de Alicante), Dr. Paloma Moreda (UA), Dr. Mariona Taulé (Universidad de Barcelona), Dr. Horacio Saggion (Universitat Pompeu Fabra) and Dr. Mike Thelwall (University of Wolverhampton). Grade: Sobresaliente Cum Laude by unanimous decision.

Relevance:

40.00%

Publisher:

Abstract:

Web APIs have gained increasing popularity in recent Web service technology development owing to the simplicity of their technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and their documentation on the Web remains a challenging task, even with the best resources available. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API related or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from the data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that feaLDA outperforms three strong supervised baselines (naive Bayes, support vector machines, and the maximum entropy model) by over 3% in classification accuracy. In addition, feaLDA gives superior performance when compared against other existing supervised topic models.
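
feaLDA itself is the authors' contribution; for orientation, the kind of supervised baseline it is compared against can be sketched in a few lines with scikit-learn: a naive Bayes bag-of-words classifier over page text, with invented training examples:

    # Minimal naive Bayes baseline for classifying a page as Web API
    # documentation or not, analogous to one of the paper's baselines.
    # Training pages and labels are invented placeholders.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    pages = [
        "GET /v1/users returns a JSON list of users; requires an API key",
        "endpoint reference: POST /orders creates an order, returns 201",
        "our team enjoyed the company picnic last weekend",
        "read our latest press release about quarterly earnings",
    ]
    labels = [1, 1, 0, 0]  # 1 = Web API documentation

    clf = make_pipeline(CountVectorizer(), MultinomialNB())
    clf.fit(pages, labels)
    print(clf.predict(["DELETE /v1/items/{id} removes an item"]))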

Relevance:

40.00%

Publisher:

Abstract:

We theoretically analyze the interplay between optical return-to-zero signal degradation due to timing jitter and additive amplified-spontaneous-emission noise, and investigate the impact of these two factors on the performance of a square-law direct-detection receiver. We derive an analytical expression for the bit-error probability and quantitatively determine the conditions under which the contributions of timing jitter and additive noise to the bit error rate can be treated separately. An analysis of patterning effects is also presented. © 2007 IEEE.
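
The derived expression itself is not reproduced in the abstract; for context, under the usual Gaussian approximation the bit-error probability of a direct-detection receiver is often written as BER = (1/2) erfc(Q / sqrt(2)), with Q the receiver quality factor. A quick numerical check of that textbook formula (the Q values are illustrative):

    # Textbook Gaussian-approximation BER for a direct-detection
    # receiver, not the paper's derived expression. Q values are
    # illustrative.
    from math import erfc, sqrt

    def ber(q):
        return 0.5 * erfc(q / sqrt(2))

    for q in (6, 7, 8):
        print(f"Q = {q}: BER ~ {ber(q):.1e}")
    # Q = 6 gives ~1e-9, the classical benchmark for optical receivers.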

Relevance:

40.00%

Publisher:

Abstract:

Safeguarding organizations against opportunism and severe deception in computer-mediated communication (CMC) presents a major challenge to CIOs and IT managers. New insights into linguistic cues of deception derive from the speech acts innate to CMC. Applying automated text analysis to archival email exchanges in a CMC system that was part of a reward program, we assess the ability of word-use (micro-level), message-development (macro-level), and intertextual-exchange (meta-level) cues to detect severe deception by business partners. We empirically assess the predictive ability of our framework using an ordinal multilevel regression model. Results indicate that deceivers minimize the use of referencing and self-deprecation but include more superfluous descriptions and flattery. Deceitful channel partners also over-structure their arguments and rapidly mimic the linguistic style of the account manager across dyadic email exchanges. Thanks to its diagnostic value, the proposed framework can support firms' decision-making and guide the development of compliance monitoring systems.
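
As a toy illustration of the micro-level (word use) cues, one can measure per-message rates of, say, self-reference and flattery terms; the word lists and the message below are invented stand-ins, not the study's validated dictionaries:

    # Toy micro-level cue extraction: per-token rates of self-reference
    # and flattery terms. Word lists and the sample message are invented,
    # not the study's dictionaries.
    import re

    SELF_REFERENCE = {"i", "me", "my", "mine", "myself"}
    FLATTERY = {"great", "wonderful", "brilliant", "excellent", "amazing"}

    def cue_rates(message):
        tokens = re.findall(r"[a-z']+", message.lower())
        n = len(tokens) or 1
        return {
            "self_reference": sum(t in SELF_REFERENCE for t in tokens) / n,
            "flattery": sum(t in FLATTERY for t in tokens) / n,
        }

    print(cue_rates("What a brilliant plan, your team is amazing as always"))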

Relevance:

30.00%

Publisher:

Abstract:

Person tracking systems to date have relied on either motion detection or optical flow as the basis for person detection and tracking; systems that utilise both techniques have not yet been developed. We propose a person tracking system that uses both, made possible by a novel hybrid optical flow-motion detection technique that we have developed. This gives the system two methods of person detection, helping to avoid missed detections and the need to predict position, which can lead to tracking errors and mistakes when handling occlusions. Our results show that our system tracks people accurately, with an average error of less than four pixels, and that it outperforms the current CAVIAR benchmark system.
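
The hybrid technique is the paper's own contribution; a generic way to combine the two cues with OpenCV (background subtraction for motion detection, dense Farneback optical flow) can nevertheless be sketched, with the video file and thresholds as placeholders:

    # Generic combination of motion detection (background subtraction)
    # and dense optical flow in OpenCV. Not the paper's hybrid method;
    # the file name and thresholds are placeholders.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("people.avi")
    bg = cv2.createBackgroundSubtractorMOG2()
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        motion_mask = bg.apply(frame) > 0
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        flow_mask = np.linalg.norm(flow, axis=2) > 1.0
        person_mask = motion_mask & flow_mask  # pixels where both cues agree
        prev_gray = gray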

Relevance:

30.00%

Publisher:

Abstract:

Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns of 964,780 Web search sessions, comprising 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another and to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each for prediction accuracy, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the first- and second-order models provide the best predictability, between 28% and 40% overall and above 70% for some patterns. The implication is that the n-gram approach can be used to improve search systems and search assistance.
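
For concreteness, a first-order model of this kind is a transition-frequency table over reformulation states, predicting the most likely next state given the current one. A minimal sketch; two of the state names (Reformulation, Assistance) appear in the abstract, while the remaining names and all session data are invented:

    # First-order (bigram) model over query-reformulation states:
    # estimate P(next | current) from session logs, predict the argmax.
    # Sessions below are invented.
    from collections import Counter, defaultdict

    sessions = [
        ["New", "Reformulation", "Reformulation", "Assistance"],
        ["New", "Assistance", "Reformulation"],
        ["New", "Reformulation", "Content Change"],
    ]

    transitions = defaultdict(Counter)
    for states in sessions:
        for cur, nxt in zip(states, states[1:]):
            transitions[cur][nxt] += 1

    def predict_next(state):
        return transitions[state].most_common(1)[0][0]

    print(predict_next("New"))  # -> "Reformulation" on this toy data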