965 results for Web-Assisted Error Detection
Abstract:
OBJECTIVES The aim of this phantom study was to minimize the radiation dose by finding the best combination of low tube current and low voltage that would result in accurate volume measurements, compared to standard CT imaging, without significantly decreasing the sensitivity of detecting lung nodules, both with and without the assistance of CAD. METHODS An anthropomorphic chest phantom containing artificial solid and ground-glass nodules (GGNs, 5-12 mm) was examined with a 64-row multi-detector CT scanner at three tube currents of 100, 50 and 25 mAs in combination with three tube voltages of 120, 100 and 80 kVp. This resulted in eight different protocols that were then compared to the standard CT protocol (100 mAs/120 kVp). For each protocol, at least 127 different nodules were scanned in 21-25 phantoms. The nodules were analyzed in two separate sessions by three independent, blinded radiologists and by computer-aided detection (CAD) software. RESULTS The mean sensitivity of the radiologists for identifying solid lung nodules on standard CT was 89.7% ± 4.9%. Sensitivity was not significantly impaired when the tube current and voltage were lowered simultaneously, except at the lowest exposure level of 25 mAs/80 kVp [80.6% ± 4.3% (p = 0.031)]. Compared to standard CT, the sensitivity for detecting GGNs was significantly lower at all dose levels when the voltage was 80 kVp; this result was independent of the tube current. CAD significantly increased the radiologists' sensitivity for detecting solid nodules at all dose levels (by 5-11%). No significant volume measurement errors (VMEs) were documented for the radiologists or the CAD software at any dose level. CONCLUSIONS Our results suggest that a CT protocol with 25 mAs and 100 kVp is optimal for detecting solid and ground-glass nodules in lung cancer screening. The use of CAD software is highly recommended at all dose levels.
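For reference, the sensitivity figures above follow the standard definition (not restated in the abstract), with TP the number of nodules detected and FN the number missed:

$$\mathrm{sensitivity} = \frac{TP}{TP + FN}$$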
Abstract:
Ontology antipatterns are structures that reflect ontology modelling problems: they lead to inconsistencies, poor reasoning performance or poor formalisation of domain knowledge. We propose four methods for the detection of antipatterns using SPARQL queries. We conduct experiments to detect antipatterns in a corpus of OWL ontologies.
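A minimal sketch of the query-based approach, not the authors' four methods: a SPARQL query (run here with Python and rdflib) flagging one classic antipattern, a class asserted as a subclass of two classes declared disjoint. The file name "ontology.owl" is a hypothetical input.

```python
from rdflib import Graph

g = Graph()
g.parse("ontology.owl", format="xml")  # hypothetical OWL (RDF/XML) file

QUERY = """
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?cls ?a ?b WHERE {
    ?cls rdfs:subClassOf ?a , ?b .
    ?a owl:disjointWith ?b .
}
"""

for cls, a, b in g.query(QUERY):
    print(f"antipattern: {cls} is a subclass of disjoint classes {a} and {b}")
```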
Abstract:
Doctoral thesis with European mention in natural language processing, carried out at the Universidad de Alicante by Ester Boldrini under the supervision of Dr. Patricio Martínez-Barco. The thesis defence took place at the Universidad de Alicante on 23 January 2012, before a committee composed of Dr. Manuel Palomar (Universidad de Alicante), Dr. Paloma Moreda (UA), Dr. Mariona Taulé (Universidad de Barcelona), Dr. Horacio Saggion (Universitat Pompeu Fabra) and Dr. Mike Thelwall (University of Wolverhampton). Grade: Sobresaliente Cum Laude, awarded unanimously.
Abstract:
Web APIs have gained increasing popularity in recent Web service technology development owing to their simple technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and their documentation on the Web remains a challenging task even with the best resources available. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API associated or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines (naive Bayes, support vector machines, and the maximum entropy model) by over 3% in classification accuracy. In addition, feaLDA gives superior performance when compared against other existing supervised topic models.
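A minimal sketch of the classification task using one of the reported baselines (multinomial naive Bayes), not feaLDA itself; the page texts and labels below are hypothetical placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

pages = [
    "GET /v1/users returns a JSON array of user objects",   # API documentation
    "Our company was founded in 1998 and values teamwork",  # other page
]
labels = [1, 0]  # 1 = Web API documentation, 0 = not

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(pages, labels)
print(clf.predict(["POST /v2/orders creates an order and returns 201"]))
```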
Abstract:
We theoretically analyze the interplay between two sources of optical return-to-zero signal degradation: timing jitter and additive amplified-spontaneous-emission noise. The impact of these two factors on the performance of a square-law direct-detection receiver is also investigated. We derive an analytical expression for the bit-error probability and quantitatively determine the conditions under which the contributions of timing jitter and additive noise to the bit error rate can be treated separately. An analysis of patterning effects is also presented.
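The derived expression itself is not reproduced in the abstract; for orientation, the standard Gaussian approximation for the bit-error probability of a square-law direct-detection receiver, which analyses of this kind refine, is

$$P_b = \tfrac{1}{2}\,\mathrm{erfc}\!\left(\frac{Q}{\sqrt{2}}\right), \qquad Q = \frac{I_1 - I_0}{\sigma_1 + \sigma_0},$$

where $I_{0,1}$ and $\sigma_{0,1}$ are the mean detected currents and noise standard deviations of the zero and one levels; in the simplest treatment, timing jitter adds an extra contribution to the mark-level variance $\sigma_1^2$.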
Abstract:
Safeguarding organizations against opportunism and severe deception in computer-mediated communication (CMC) presents a major challenge to CIOs and IT managers. New insights into linguistic cues of deception derive from the speech acts innate to CMC. Applying automated text analysis to archival email exchanges in a CMC system that was part of a reward program, we assess the ability of word-use (micro-level), message-development (macro-level), and intertextual-exchange (meta-level) cues to detect severe deception by business partners. We empirically assess the predictive ability of our framework using an ordinal multilevel regression model. Results indicate that deceivers minimize the use of referencing and self-deprecation but include more superfluous descriptions and flattery. Deceitful channel partners also over-structure their arguments and rapidly mimic the linguistic style of the account manager across dyadic email exchanges. Thanks to its diagnostic value, the proposed framework can support firms' decision-making and guide the development of compliance monitoring systems.
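A sketch of micro-level (word-use) cue extraction from a single message; the cue lexicons and rates below are illustrative placeholders, not the paper's validated feature set.

```python
import re

FLATTERY = {"great", "wonderful", "brilliant", "amazing"}
SELF_REFERENCE = {"i", "me", "my", "mine", "we", "our"}

def word_use_cues(message: str) -> dict:
    """Return normalized word-use cue rates for one email message."""
    tokens = re.findall(r"[a-z']+", message.lower())
    n = max(len(tokens), 1)
    return {
        "flattery_rate": sum(t in FLATTERY for t in tokens) / n,
        "self_reference_rate": sum(t in SELF_REFERENCE for t in tokens) / n,
        "length": len(tokens),
    }

print(word_use_cues("Your amazing team did a wonderful job, as I said."))
```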
Abstract:
Alternative splicing of gene transcripts greatly expands the functional capacity of the genome, and certain splice isoforms may indicate specific disease states such as cancer. Splice junction microarrays interrogate thousands of splice junctions, but data analysis is difficult and error-prone because of the increased complexity compared to differential gene expression analysis. We present Rank Change Detection (RCD) as a method to identify differential splicing events based on a straightforward probabilistic model comparing the over- or under-representation of two or more competing isoforms. RCD has advantages over commonly used methods because it is robust to false-positive errors due to nonlinear trends in microarray measurements. Further, RCD does not depend on prior knowledge of splice isoforms, yet it takes advantage of the inherent structure of mutually exclusive junctions, and it is conceptually generalizable to other types of splicing arrays or RNA-Seq. RCD specifically identifies the biologically important cases in which a splice junction becomes more or less prevalent relative to other mutually exclusive junctions. The example data are from glioblastoma tumor cell lines assayed with Agilent microarrays.
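A minimal sketch of the rank-change idea (not the published RCD implementation): rank mutually exclusive junctions within each condition and flag junctions whose rank changes between conditions. The intensities are hypothetical log-scale values.

```python
import numpy as np

# rows = mutually exclusive splice junctions, columns = two conditions
intensities = np.array([
    [8.1, 3.2],   # junction 0: dominant in condition 0 only
    [3.0, 7.9],   # junction 1: dominant in condition 1 only
    [2.5, 2.4],   # junction 2: minor in both
])

ranks = intensities.argsort(axis=0).argsort(axis=0)   # 0 = lowest intensity
for j, d in enumerate(ranks[:, 1] - ranks[:, 0]):
    if d != 0:
        print(f"junction {j}: rank changed by {d:+d} between conditions")
```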
Abstract:
The simultaneous use of different sensor technologies is an efficient way to increase the performance of chemical sensor systems. Among the available technologies, mass and capacitance transducers are particularly interesting because they can also exploit non-conductive sensing layers, such as many of the most interesting molecular recognition systems. In this paper, an array of quartz microbalance sensors is complemented by an array of capacitors obtained from a commercial biometric fingerprint detector. The two sets of transducers, properly functionalized with sensitive molecular and polymeric films, are used to estimate adulteration in gasolines, and in particular to quantify the ethanol content of gasoline, an application of importance for the Brazilian market. Results indicate that the hybrid system outperforms the individual sensor arrays, even though the quantification of ethanol in gasoline, owing to the variability of gasoline formulations, is affected by a barely acceptable error.
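A minimal sketch of the data-fusion step with simulated responses: features from the QMB array and the capacitive array are concatenated and fed to a single multivariate calibration model. All values and array sizes are placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
qmb = rng.random((20, 8))           # 20 samples x 8 quartz microbalances
cap = rng.random((20, 12))          # 20 samples x 12 capacitive cells
ethanol = rng.random(20) * 30.0     # simulated % ethanol in gasoline

X = np.hstack([qmb, cap])           # hybrid feature vector
model = PLSRegression(n_components=4).fit(X, ethanol)
print(model.predict(X[:3]))
```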
Abstract:
In this work, a broad analysis of local search multiuser detection (LS-MUD) for direct-sequence code division multiple access (DS/CDMA) systems under multipath channels is carried out with respect to the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, Eb/N0, near-far effect, number of Rake receiver fingers and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) versus complexity trade-off is carried out for LS, the genetic algorithm (GA) and particle swarm optimization (PSO). Based on the deterministic behavior of the LS algorithm, simplifications of the cost function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and creating new perspectives for MUD implementation. Computational complexity is expressed in terms of the number of operations needed to converge. Our conclusion is that the simplified LS (s-LS) method is always more efficient, independent of the system conditions, achieving better performance with lower complexity than the other heuristic detectors. Together with this, the deterministic strategy and the absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem.
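A minimal sketch of 1-opt local search multiuser detection for a simplified synchronous CDMA model (the paper treats multipath channels and Rake reception). Here b holds one ±1 bit per user, y the matched-filter outputs, R the code correlation matrix, and the maximized maximum-likelihood metric is f(b) = 2 bᵀy - bᵀRb; the numeric values are illustrative.

```python
import numpy as np

def ls_mud(y, R, b0):
    """Flip single bits while any flip improves the ML metric."""
    metric = lambda b: 2.0 * b @ y - b @ R @ b
    b = b0.copy()
    best = metric(b)
    improved = True
    while improved:
        improved = False
        for k in range(len(b)):
            b[k] = -b[k]              # trial flip of user k's bit
            trial = metric(b)
            if trial > best:
                best, improved = trial, True
            else:
                b[k] = -b[k]          # revert the flip
    return b

# toy example: 3 users with mildly correlated spreading codes
R = np.array([[1.0, 0.2, 0.1], [0.2, 1.0, 0.3], [0.1, 0.3, 1.0]])
y = R @ np.array([1.0, -1.0, 1.0]) + 0.1 * np.random.randn(3)
print(ls_mud(y, R, np.sign(y)))       # start from the conventional detector
```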
Abstract:
This work proposes the use of evolutionary computation to jointly solve, at maximum likelihood, the multiuser channel estimation (MuChE) and detection problems in direct-sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods in the literature. Simulation results for a genetic algorithm (GA) applied to multipath DS/CDMA MuChE and multiuser detection (MuD) show that the proposed genetic algorithm multiuser channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum likelihood multiuser channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multiuser detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multiuser detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was (jointly) analyzed in terms of the number of operations needed to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future.
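A sketch of a GA multiuser detector for the same simplified synchronous CDMA metric used in the previous sketch; the paper additionally evolves channel estimates and treats multipath fading. All parameter values are illustrative.

```python
import numpy as np

def ga_mud(y, R, pop_size=30, generations=50, p_mut=0.05, seed=0):
    rng = np.random.default_rng(seed)
    K = len(y)
    metric = lambda B: 2.0 * B @ y - np.einsum("ik,kl,il->i", B, R, B)
    pop = rng.choice([-1.0, 1.0], size=(pop_size, K))
    for _ in range(generations):
        parents = pop[np.argsort(metric(pop))[-pop_size // 2:]]  # keep top half
        kids = []
        for _ in range(pop_size):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            c = rng.integers(1, K)                   # one-point crossover
            kids.append(np.concatenate((p1[:c], p2[c:])))
        pop = np.array(kids)
        pop[rng.random(pop.shape) < p_mut] *= -1.0   # bit-flip mutation
    return pop[np.argmax(metric(pop))]

R = np.array([[1.0, 0.2, 0.1], [0.2, 1.0, 0.3], [0.1, 0.3, 1.0]])
y = R @ np.array([1.0, -1.0, 1.0])                   # noiseless toy output
print(ga_mud(y, R))
```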
Abstract:
This paper describes a simple method for mercury speciation in seafood samples by LC-ICP-MS with a fast sample preparation procedure. Prior to analysis, mercury species were extracted from the samples with a solution containing mercaptoethanol, L-cysteine and HCl, assisted by 15 min of sonication. Separation of the mercury species was accomplished in less than 5 min on a C8 reversed-phase column with a mobile phase containing 0.05% v/v mercaptoethanol, 0.4% m/v L-cysteine and 0.06 mol L⁻¹ ammonium acetate. The method detection limits were 0.25, 0.20 and 0.1 ng g⁻¹ for inorganic mercury, ethylmercury and methylmercury, respectively. Method accuracy is traceable to Certified Reference Materials (DOLT-3 and DORM-3) from the National Research Council Canada (NRCC). The proposed method considerably reduces sample preparation time. Finally, the method was applied to the speciation of mercury in seafood samples purchased on the Brazilian market.
Abstract:
A simple and fast method is described for the simultaneous determination of methylmercury (MeHg), ethylmercury (Et-Hg) and inorganic mercury (Ino-Hg) in blood samples by capillary gas chromatography-inductively coupled plasma mass spectrometry (GC-ICP-MS) after derivatization and alkaline digestion. Closed-vessel microwave-assisted digestion conditions with tetramethylammonium hydroxide (TMAH) were optimized. Derivatization by ethylation and by propylation was also evaluated and compared. The absolute detection limits (using a 1 µL injection) obtained by GC-ICP-MS were 40 fg for both MeHg and Ino-Hg with ethylation, and 50, 20 and 50 fg for MeHg, Et-Hg and Ino-Hg, respectively, with propylation. Method accuracy is traceable to Standard Reference Material (SRM) 966, Toxic Metals in Bovine Blood, from the National Institute of Standards and Technology (NIST). Additional validation is provided by comparing the results obtained for mercury speciation in blood samples with the proposed procedure and with a previously reported LC-ICP-MS method. The proposed procedure requires no tedious clean-up steps and considerably reduces analysis time compared to other methods using GC separation.
Abstract:
The histopathological counterpart of white matter hyperintensities is a matter of debate; methodological and ethical limitations have prevented this question from being elucidated. We introduce a protocol applying state-of-the-art methods to resolve fundamental questions regarding the neuroimaging-neuropathological uncertainties surrounding the most common white matter hyperintensities (WMHs) seen in aging. With this protocol, the correlation between signal features from in situ post mortem MRI-derived methods, including DTI and MTR, and quantitative and qualitative histopathology can be investigated. We are mainly interested in determining the precise neuroanatomical substrate of incipient WMHs. A major issue in this protocol is the exact co-registration of small lesions in a three-dimensional coordinate system that compensates for tissue deformations arising during histological processing. The protocol is based on four principles: post mortem MRI in situ performed within a short post mortem interval, minimal brain deformation during processing, thick serial histological sections, and computer-assisted 3D reconstruction of the histological sections. This protocol will greatly facilitate a systematic study of the location, pathogenesis, clinical impact, prognosis and prevention of WMHs.
Abstract:
In this paper, methods are presented for automatic detection of the nipple and the pectoral muscle edge in mammograms via image processing in the Radon domain. Radon-domain information was used for the detection of straight-line candidates with high gradient. The longest straight-line candidate was used to identify the pectoral muscle edge. The nipple was detected as the convergence point of breast tissue components, indicated by the largest response in the Radon domain. Percentages of false-positive (FP) and false-negative (FN) areas were determined by comparing the areas of the pectoral muscle regions delimited manually by a radiologist and by the proposed method applied to 540 mediolateral-oblique (MLO) mammographic images. The average FP and FN were 8.99% and 9.13%, respectively. In the detection of the nipple, an average error of 7.4 mm was obtained with reference to the nipple as identified by a radiologist on 1,080 mammographic images (540 MLO and 540 craniocaudal views).
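A minimal sketch of straight-line detection in the Radon domain, illustrative of the approach rather than the paper's full pipeline (no mammograms here; the input is a synthetic edge image standing in for the pectoral muscle boundary).

```python
import numpy as np
from skimage.transform import radon

image = np.zeros((128, 128))
for i in range(128):                          # synthetic oblique band
    image[i, max(0, i - 2):i + 2] = 1.0

grad = np.hypot(*np.gradient(image))          # high-gradient (edge) map
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(grad, theta=theta, circle=False)

# the strongest Radon-domain response corresponds to the dominant line
r, t = np.unravel_index(np.argmax(sinogram), sinogram.shape)
print(f"dominant straight line: angle {theta[t]:.1f} deg, offset bin {r}")
```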