987 results for Error detection
Abstract:
In this paper, methods are presented for automatic detection of the nipple and the pectoral muscle edge in mammograms via image processing in the Radon domain. Radon-domain information was used for the detection of straight-line candidates with high gradient. The longest straight-line candidate was used to identify the pectoral muscle edge. The nipple was detected as the convergence point of breast tissue components, indicated by the largest response in the Radon domain. Percentages of false-positive (FP) and false-negative (FN) areas were determined by comparing the areas of the pectoral muscle regions delimited manually by a radiologist and by the proposed method applied to 540 mediolateral-oblique (MLO) mammographic images. The average FP and FN were 8.99% and 9.13%, respectively. In the detection of the nipple, an average error of 7.4 mm was obtained with reference to the nipple as identified by a radiologist on 1,080 mammographic images (540 MLO and 540 craniocaudal views).
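The Radon-domain line search can be illustrated with a brief sketch (using skimage; the pre-processing and parameters are illustrative, not the authors' implementation): every straight line in the image maps to a point in the Radon domain, so high-gradient straight-line candidates appear as peaks of the Radon transform of a gradient image.

```python
import numpy as np
from skimage.filters import sobel
from skimage.transform import radon

def strongest_line(image):
    """Find the dominant straight-line candidate as the peak of the
    Radon transform of the gradient-magnitude image."""
    edges = sobel(image)                             # high-gradient pixels
    angles = np.arange(0.0, 180.0, 0.5)              # projection angles (deg)
    sinogram = radon(edges, theta=angles, circle=False)
    offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram),
                                             sinogram.shape)
    # Projection offsets are indexed about the padded image centre.
    offset = offset_idx - sinogram.shape[0] // 2
    return angles[angle_idx], offset                 # line angle and offset

# The pectoral muscle edge would correspond to the longest such
# candidate; the nipple, to the convergence point of tissue components.
```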
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparing simulation data with that of commercial models leads only to the detection, not the isolation, of errors, and identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: first, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Second, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM1 activated sludge model, in which a newly coded model was verified against a known implementation. The method is also applicable to the simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
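The isolation step admits a minimal sketch: given the residual vector produced by the observer and one feature matrix per error class (both assumed available as numpy arrays; the observer design itself is not reproduced here), the residual is assigned to the class whose subspace it lies closest to.

```python
import numpy as np

def isolate_error_class(residual, feature_matrices):
    """Assign a residual vector to the error class whose feature
    subspace (the span of its feature matrix) lies closest to it."""
    best_label, best_dist = None, np.inf
    for label, F in feature_matrices.items():
        # Project the residual onto span(F) via least squares and
        # measure the orthogonal distance to that subspace.
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        dist = np.linalg.norm(residual - F @ coeffs)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist
```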
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important since the disease can evolve to cirrhosis. Steatosis is usually a diffuse liver disease, in which the organ is globally affected; however, it can also be focal, affecting only some foci that are difficult to discriminate. In both cases, steatosis is detected by laboratory analysis and visual inspection of ultrasound images of the hepatic parenchyma. Liver biopsy is the most accurate diagnostic method, but its invasive nature favours the use of non-invasive methods, while visual inspection of the ultrasound images is subjective and prone to error. In this paper, a new Computer-Aided Diagnosis (CAD) system for steatosis classification and analysis is presented, in which the Bayes factor, obtained from objective intensity and textural features extracted from US images of the liver, is computed on a local or global basis. The main goal is to provide the physician with an application that makes the diagnosis and quantification of steatosis faster and more accurate, namely in a screening approach. The results showed an overall accuracy of 93.54%, with a sensitivity of 95.83% and 85.71% for the normal and steatosis classes, respectively. The proposed CAD system is suitable as a graphical display for steatosis classification, and a comparison with some of the most recent works in the literature is also presented.
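As a hedged illustration of the classification rule (the paper's actual feature set and class models are only summarized in the abstract), a Bayes factor can be computed as a likelihood ratio under Gaussian class-conditional models of the intensity and texture features:

```python
import numpy as np
from scipy.stats import multivariate_normal

def bayes_factor(features, steat_mean, steat_cov, normal_mean, normal_cov):
    """Likelihood ratio of the steatosis hypothesis against the
    normal-liver hypothesis for one feature vector, assuming
    Gaussian class-conditional models (an illustrative choice)."""
    l_steat = multivariate_normal.pdf(features, steat_mean, steat_cov)
    l_normal = multivariate_normal.pdf(features, normal_mean, normal_cov)
    return l_steat / l_normal

# BF > 1 favours steatosis; the decision threshold trades off the
# sensitivities of the normal and steatosis classes.
```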
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering (Engenharia Informática).
Abstract:
A novel reusable molecularly imprinted polymer (MIP) assembled on a polymeric layer of carboxylated poly(vinyl chloride) (PVC-COOH) for myoglobin (Myo) detection was developed. This polymer was cast on the gold working area of a screen-printed electrode (Au-SPE), creating a novel disposable device relying on plastic antibodies. Electrochemical impedance spectroscopy (EIS), cyclic voltammetry (CV) and Fourier transform infrared spectroscopy (FTIR) studies confirmed the surface modification. The MIP/Au-SPE devices displayed linear behaviour in EIS from 0.852 to 4.26 μg mL⁻¹, with a positive slope of 6.50 ± 1.48 kΩ mL μg⁻¹. The limit of detection was 2.25 μg mL⁻¹. Square wave voltammetric (SWV) assays were made in parallel and showed linear responses between 1.1 and 2.98 μg mL⁻¹. A current decrease was observed against Myo concentration, producing an average slope of −0.28 ± 0.038 μA mL μg⁻¹. The MIP/Au-SPE also showed good selectivity: the percentage errors found for the interfering species were 7% for troponin T (TnT), 11% for bovine serum albumin (BSA) and 2% for creatine kinase MB (CKMB). Overall, the technical modification of the Au-SPE was found to be a suitable approach for screening Myo in biological fluids.
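For illustration only, the reported EIS calibration can be inverted to read a Myo concentration off an impedance change; the intercept is not given in the abstract, so it is an assumed parameter here, and the estimate is only meaningful inside the stated linear range.

```python
def myo_concentration(delta_r_kohm, slope=6.50, intercept=0.0):
    """Invert the linear EIS calibration dR = slope*c + intercept
    (slope in kOhm mL/ug; intercept assumed, not reported) to
    estimate Myo concentration c in ug/mL. Only valid within the
    reported linear range of 0.852-4.26 ug/mL."""
    return (delta_r_kohm - intercept) / slope

# e.g. a 13 kOhm impedance change maps to roughly 2 ug/mL of Myo.
```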
Abstract:
Dissertation submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering.
Abstract:
Eradication of code smells is often pointed out as a way to improve readability, extensibility and design in existing software. However, code smell detection remains time-consuming and error-prone, partly due to the inherent subjectivity of the detection processes presently available. To mitigate this subjectivity problem, this dissertation presents a tool, developed as an Eclipse plugin, that automates a technique for the detection and assessment of code smells in Java source code. The technique is based upon a binary logistic regression model that uses complexity metrics as independent variables and is calibrated by experts' knowledge. An overview of the technique is provided, and the tool is described and validated through an example case study.
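The statistical core of such a technique can be sketched as follows; the metric set, training data and calibration below are invented placeholders, not the dissertation's model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: rows are classes/methods, columns are
# complexity metrics (e.g. LOC, cyclomatic complexity, coupling);
# labels encode expert judgements (1 = smelly, 0 = clean).
X = np.array([[120, 14, 7], [30, 3, 2], [450, 41, 19], [60, 5, 4]])
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
p_smell = model.predict_proba([[200, 22, 9]])[0, 1]
print(f"P(code smell) = {p_smell:.2f}")
```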
Abstract:
Existing 3D scanning cameras and microscopes on the market use digital or discrete sensors, such as CCDs or CMOS, for object detection applications. However, these combined systems are not fast enough for some application scenarios, since they require large data-processing resources and can be cumbersome. There is therefore a clear interest in exploring the possibilities and performance of analogue sensors, such as arrays of position sensitive detectors (PSDs), with the final goal of integrating them in 3D scanning cameras or microscopes for object detection purposes.

The work performed in this thesis deals with the implementation of prototype systems to explore object detection using amorphous silicon position sensors of 32 and 128 lines, produced in the clean room at CENIMAT-CEMOP. During the first phase of this work, the fabrication and the study of the static and dynamic characteristics of the sensors, as well as their conditioning in relation to the existing scientific and technological knowledge, served as the starting point. Subsequently, the relevant data acquisition and suitable signal-processing electronics were assembled. Various prototypes were developed for the 32- and 128-line PSD sensors, and appropriate optical solutions were integrated to work with them, allowing the required experiments to be carried out and the results presented in this thesis to be achieved. All control, data acquisition and 3D rendering platform software was implemented for the existing systems, and all these components were combined to form several integrated systems for the 32- and 128-line PSD 3D sensors.

The performance of the 32-line PSD array sensor and system was evaluated for machine vision applications, such as 3D object rendering, as well as for microscopy applications, such as micro-object movement detection. Trials were also performed with the 128-line PSD sensor systems. Sensor channel non-linearities of approximately 4 to 7% were obtained. The overall results show the possibility of using a linear array of 32/128 1D line sensors based on amorphous silicon technology to render 3D profiles of objects. The system and setup presented allow 3D rendering at high speeds and high frame rates. The minimum detail or gap that can be detected by the sensor system is approximately 350 μm with the current setup. It is also possible to render an object in 3D within a scanning angle range of 15º to 85º and to identify its real height as a function of the scanning angle and the image displacement distance on the sensor. Simple and not-so-simple objects, such as a rubber and a plastic fork, can be rendered in 3D properly, accurately and at high resolution using this sensor and system platform. The n-i-p structure sensor system can detect primary and even derived colours of objects by proper adjustment of the system's integration time and by combining white, red, green and blue (RGB) light sources; a mean colorimetric error of 25.7 was obtained. It is also possible to detect the movement of micrometre-scale objects using the 32-line PSD sensor system. This kind of setup can detect whether a micro-object is moving, its dimensions and its position in two dimensions, even at high speeds. Results show a non-linearity of about 3% and a spatial resolution of < 2 µm.
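The height-from-displacement relation mentioned above follows standard sheet-of-light triangulation; the sketch below uses the textbook geometry (laser at an angle to the camera axis, assumed magnification), not the thesis' exact calibration.

```python
import math

def object_height(displacement_mm, scan_angle_deg, magnification=1.0):
    """Sheet-of-light triangulation: a surface of height h shifts the
    laser spot laterally by h*tan(theta), which the optics image as a
    displacement d = M*h*tan(theta) on the PSD line sensor."""
    theta = math.radians(scan_angle_deg)
    return displacement_mm / (magnification * math.tan(theta))

# e.g. a 0.5 mm image displacement at a 45 degree scan angle and unit
# magnification corresponds to a 0.5 mm object height.
```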
Abstract:
Failure to detect a species in an area where it is present is a major source of error in biological surveys. We assessed whether it is possible to optimize single-visit biological monitoring surveys of highly dynamic freshwater ecosystems by framing them a priori within a particular period of time. Alternatively, we also searched for the optimal number of visits and when they should be conducted. We developed single-species occupancy models to estimate the monthly probability of detection of pond-breeding amphibians during a four-year monitoring program. Our results revealed that detection probability was species-specific and changed among sampling visits within a breeding season and also among breeding seasons. Optimizing biological surveys down to minimal survey effort (a single visit) is therefore not feasible, as it proved impossible to select a priori a sampling period that remains robust across years. Alternatively, a combination of two surveys at the beginning of the sampling season yielded optimal results and constituted an acceptable compromise between sampling efficacy and survey effort. Our study provides evidence of the variability and uncertainty that likely affect the efficacy of monitoring surveys, highlighting the need for repeated sampling in both ecological studies and conservation management.
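The benefit of a second visit follows directly from how per-visit detection probabilities combine; the probabilities in the sketch are illustrative, not the study's estimates.

```python
def cumulative_detection(per_visit_p):
    """Probability of detecting an occupying species at least once
    over independent visits with per-visit probabilities p_i:
    p* = 1 - prod(1 - p_i)."""
    miss = 1.0
    for p in per_visit_p:
        miss *= (1.0 - p)
    return 1.0 - miss

# e.g. two early-season visits at p = 0.6 each give
# cumulative_detection([0.6, 0.6]) = 0.84, versus 0.6 for one visit.
```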
Abstract:
Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rockslope characterization and monitoring. Landslide and rockfall movements can be detected by means of comparison of sequential scans. One of the most pressing challenges of natural hazards is combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable detection of precursory displacements of millimetric magnitude. The experiment consisted of a known displacement of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by analysis of the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
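A minimal sketch of NN averaging on a scan-to-scan comparison (the authors' exact neighbourhood size and workflow are not given in the abstract): each point's change value is replaced by the mean over its k nearest neighbours, so uncorrelated instrumental noise shrinks roughly as 1/sqrt(k), consistent with an error reduction of up to a factor of 6 for a few dozen neighbours.

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_average(points, values, k=36):
    """Smooth per-point scan-to-scan differences by averaging each
    point's value over its k nearest neighbours; for uncorrelated
    noise the standard deviation drops roughly as 1/sqrt(k)."""
    tree = cKDTree(points)               # spatial index on (N, 3) coords
    _, idx = tree.query(points, k=k)     # neighbour indices, shape (N, k)
    return values[idx].mean(axis=1)      # averaged change per point
```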
Abstract:
Blood pressure (BP) is a heritable, quantitative trait with intraindividual variability and susceptibility to measurement error. Genetic studies of BP generally use single-visit measurements and thus cannot remove variability occurring over months or years. We leveraged the idea that averaging BP measured across time would improve phenotypic accuracy and thereby increase statistical power to detect genetic associations. We studied systolic BP (SBP), diastolic BP (DBP), mean arterial pressure (MAP), and pulse pressure (PP) averaged over multiple years in 46,629 individuals of European ancestry. We identified 39 trait-variant associations across 19 independent loci (p < 5 × 10⁻⁸); five associations (in four loci) uniquely identified by our long-term average (LTA) analyses included those of SBP and MAP at 2p23 (rs1275988, near KCNK3), DBP at 2q11.2 (rs7599598, in FER1L5), and PP at 6p21 (rs10948071, near CRIP3) and 7p13 (rs2949837, near IGFBP3). Replication analyses conducted in cohorts with single-visit BP data showed positive replication of the associations at nominal significance (p < 0.05). We estimated a 20% gain in statistical power with LTA as compared to single-visit BP association studies. Using LTA analysis, we identified genetic loci influencing BP. LTA might be one way of increasing the power of genetic associations for continuous traits in extant samples for other phenotypes that are measured serially over time.
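The power gain rests on a standard measurement-error attenuation argument; the sketch below uses assumed variance ratios, not the paper's calculation: averaging n visits shrinks the error variance by 1/n, so less of the genetic signal is attenuated.

```python
def effective_r2(true_r2, error_var_ratio, n_visits):
    """Observed variance explained by a variant when the trait is
    measured with noise: averaging n visits divides the error
    variance (expressed relative to the true trait variance) by n."""
    return true_r2 / (1.0 + error_var_ratio / n_visits)

# With error variance equal to the true BP variance (ratio 1.0), one
# visit retains 50% of the signal and four visits retain 80%.
```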
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detection are performed in the same array, which can be hardware-efficient. The all-swap lattice reduction algorithm (ASLR), a variant of the LLL algorithm that processes all lattice basis vectors within one iteration, is considered for the systolic design. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms has very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
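For context, the detection stage that follows either reduction algorithm can be sketched as below; the reduction routine itself (LLL or ASLR) is assumed given, and the sketch assumes the transmit symbols have already been mapped onto the integer lattice, as is usual in lattice-reduction-aided detection.

```python
import numpy as np

def lr_aided_zf(H_red, T, y, constellation):
    """Lattice-reduction-aided zero-forcing: equalize in the reduced
    basis H_red = H @ T (T unimodular, from LLL or ASLR), round to
    the integer lattice, then map back to the original basis."""
    z = np.linalg.pinv(H_red) @ y        # ZF equalization, reduced basis
    z_hat = np.round(z)                  # integer quantization
    s_hat = T @ z_hat                    # back to the symbol basis
    # Snap each entry to the nearest valid constellation point.
    return np.array([min(constellation, key=lambda c: abs(c - s))
                     for s in s_hat])
```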
Abstract:
In the assessment of medical malpractice, imaging methods can be used to document crucial morphological findings that speak for or against an iatrogenically caused injury. The clarification of deaths in this context can be usefully supported by postmortem imaging (primarily native computed tomography, angiography and magnetic resonance imaging). Compared with autopsy, postmortem imaging offers significant additional information in the detection of iatrogenic air embolisms and in the documentation of misplaced medical aids before dissection, which carries an inherent danger of relocating them. Postmortem imaging also supplies additional information in the search for sources of bleeding and in the documentation of perfusion after cardiovascular surgery. Key criteria for the decision to perform postmortem imaging can be obtained from the necessary preliminary inspection of the clinical documentation.
Abstract:
Short-TE MRS has been proposed recently as a method for the in vivo detection and quantification of γ-aminobutyric acid (GABA) in the human brain at 3 T. In this study, we investigated the accuracy and reproducibility of short-TE MRS measurements of GABA at 3 T using both simulations and experiments. LCModel analysis was performed on a large number of simulated spectra with known metabolite input concentrations. Simulated spectra were generated using a range of spectral linewidths and signal-to-noise ratios to investigate the effect of varying experimental conditions, and analyses were performed using two different baseline models to investigate the effect of an inaccurate baseline model on GABA quantification. The results of these analyses indicated that, under experimental conditions corresponding to those typically observed in the occipital cortex, GABA concentration estimates are reproducible (mean reproducibility error, <20%), even when an incorrect baseline model is used. However, simulations indicate that the accuracy of GABA concentration estimates depends strongly on the experimental conditions (linewidth and signal-to-noise ratio). In addition to simulations, in vivo GABA measurements were performed using both spectral editing and short-TE MRS in the occipital cortex of 14 healthy volunteers. Short-TE MRS measurements of GABA exhibited a significant positive correlation with edited GABA measurements (R = 0.58, p < 0.05), suggesting that short-TE measurements of GABA correspond well with measurements made using spectral editing techniques. Finally, within-session reproducibility was assessed in the same 14 subjects using four consecutive short-TE GABA measurements in the occipital cortex. Across all subjects, the average coefficient of variation of these four GABA measurements was 8.7 ± 4.9%. This study demonstrates that, under some experimental conditions, short-TE MRS can be employed for the reproducible detection of GABA at 3 T, but that the technique should be used with caution, as the results are dependent on the experimental conditions.