923 results for Image quality perception
Abstract:
Despite the efficacy of minutia-based fingerprint matching techniques for good-quality images captured by optical sensors, minutia-based techniques often do not perform well on poor-quality images or on fingerprint images captured by small solid-state sensors. Solid-state fingerprint sensors are being increasingly deployed in a wide range of applications for user authentication purposes. It is therefore necessary to develop new fingerprint-matching techniques that utilize other features to deal with fingerprint images captured by solid-state sensors. This paper presents a new fingerprint matching technique based on fingerprint ridge features. The technique was assessed on the MSU-VERIDICOM database, which consists of fingerprint impressions obtained from 160 users (4 impressions per finger) using a solid-state sensor. Combining the ridge-based matching scores computed by the proposed technique with minutia-based matching scores leads to a reduction of the false non-match rate by approximately 1.7% at a false match rate of 0.1%. © 2005 IEEE.
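The score-level fusion described in this abstract can be sketched in a few lines; the min-max normalization and weighted sum rule below are common choices for combining matchers and are an assumption here, not the paper's exact scheme.

```python
import numpy as np

def min_max_normalize(scores):
    """Rescale matching scores to [0, 1] (min-max normalization)."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse_scores(minutiae_scores, ridge_scores, w=0.5):
    """Weighted sum-rule fusion of two normalized score sets."""
    m = min_max_normalize(minutiae_scores)
    r = min_max_normalize(ridge_scores)
    return w * m + (1.0 - w) * r

# Toy example: scores for five candidate matches from each matcher.
minutiae = [120, 95, 40, 210, 60]       # hypothetical raw minutiae scores
ridge = [0.62, 0.55, 0.20, 0.91, 0.33]  # hypothetical ridge-feature scores
print(fuse_scores(minutiae, ridge))
```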
Abstract:
A CMOS/SOI circuit to decode PWM signals is presented as part of a body-implanted neurostimulator for visual prosthesis. Since the encoded data is the sole input to the circuit, the decoding technique is based on a double-integration concept and does not require dc filtering. Nonoverlapping control phases are internally derived from the incoming pulses, and a fast-settling comparator ensures good discrimination accuracy in the megahertz range. The circuit was integrated in a 2 μm single-metal SOI fabrication process and has an effective area of 2 mm². Typically, the measured resolution of the encoding parameter α was better than 10% at 6 MHz and V_DD = 3.3 V. Stand-by consumption is around 340 μW. Pulses with frequencies up to 15 MHz and α = 10% can be discriminated for V_DD spanning from 2.3 V to 3.3 V. Such excellent immunity to V_DD deviations meets a design specification with respect to inherent coupling losses when transmitting data and power through a transcutaneous link.
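As an illustration of the double-integration idea, the discrete-time sketch below recovers the duty cycle α as the ratio of the integral of the waveform to the integral of a full-scale pulse over the same interval; it is a numerical analogue only, not the analog circuit described.

```python
import numpy as np

def decode_pwm_duty(signal, fs):
    """Estimate the PWM duty cycle as the ratio of two integrals:
    the area under the (offset-free) waveform divided by the area
    of a full-scale pulse over the same interval."""
    sig = np.array(signal, dtype=float)
    sig -= sig.min()                         # remove any dc offset
    duration = (len(sig) - 1) / fs
    return np.trapz(sig, dx=1.0 / fs) / (sig.max() * duration)

# Toy waveform: alpha = 10% duty cycle at 6 MHz, sampled at 600 MS/s.
fs, f_pwm, alpha = 600e6, 6e6, 0.10
t = np.arange(0, 10 / f_pwm, 1 / fs)         # ten PWM periods
pwm = ((t * f_pwm) % 1.0 < alpha) * 3.3      # 3.3 V logic swing
print(decode_pwm_duty(pwm, fs))              # ~0.10
```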
Abstract:
Over the last 30 years, atomic force microscopy has become the most powerful tool for surface probing at the atomic scale. The tapping-mode atomic force microscope is used to generate high-quality, accurate images of the sample surface. In this mode of operation, however, the microcantilever frequently exhibits chaotic motion due to the nonlinear characteristics of the tip-sample force interactions, degrading image quality. This kind of irregular motion must be avoided by the control system. In this work, the tip-sample interaction is modelled considering the Lennard-Jones potential and a two-term Galerkin approximation. Additionally, State-Dependent Riccati Equation and Time-Delayed Feedback Control techniques are used to force the motion of the tapping-mode atomic force microscope onto a periodic orbit, preventing chaotic motion of the microcantilever.
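A minimal sketch of the 12-6 Lennard-Jones force law that underlies tip-sample interaction models of this kind is given below; the parameter values are illustrative, and the paper's specific sphere-plane model and Galerkin reduction are not reproduced.

```python
import numpy as np

def lennard_jones_force(z, epsilon=1.0, sigma=1.0):
    """Force derived from the 12-6 Lennard-Jones potential
    V(z) = 4*eps*((sigma/z)**12 - (sigma/z)**6): attractive at long
    range, strongly repulsive at short range.
    F(z) = -dV/dz = 24*eps/z * (2*(sigma/z)**12 - (sigma/z)**6)."""
    s = sigma / z
    return 24.0 * epsilon / z * (2.0 * s**12 - s**6)

# Force felt by the tip over a range of tip-sample separations.
z = np.linspace(0.9, 3.0, 8)
print(np.round(lennard_jones_force(z), 4))
```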
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
BACKGROUND: Lowering the heart rate (HR) is essential for image quality in coronary computed tomography angiography (CCTA). The efficacy of calcium channel blockers as an alternative for patients with contraindications to beta-blockers has not been established. OBJECTIVES: To compare the efficacy of metoprolol and diltiazem in lowering HR and RR variability in CCTA. METHODS: This prospective, randomized, open-label study included patients with a clinical indication for CCTA, in sinus rhythm, with HR > 70 bpm, and not using agents that interfere with HR. Fifty patients were randomized into two groups: IV metoprolol 5-15 mg or until HR ≤ 60 bpm (M), and IV diltiazem 0.25-0.60 mg/kg or until HR ≤ 60 bpm (D). Blood pressure (BP) and HR were measured at baseline, at 1, 3, and 5 min after drug administration, during acquisition, and after CCTA. RESULTS: The absolute HR reduction was greater in group M than in group D (at 1, 3, and 5 min, during acquisition, and after the exam). The percentage HR reduction was significantly greater in group M only at 1 and 3 min after drug administration; there was no difference at 5 min, during acquisition, or after the exam. The percentage RR variability in group D was statistically lower than in group M during acquisition (RR variability / mean HR during acquisition). A single case of 2:1 atrioventricular block (Mobitz I), which reverted spontaneously, occurred in group D. CONCLUSION: We conclude that diltiazem is an effective and safe alternative to beta-blockers for lowering HR in coronary computed tomography angiography. (Arq Bras Cardiol. 2012; [online]. ahead of print, PP. 0-0)
Abstract:
The theoretical framework that underpins this research is based on Prospect Theory, formulated by Kahneman and Tversky, and Thaler's Mental Accounting Theory. The research evaluates consumer behavior when different patterns of discount are offered (in percentage and absolute value, and for larger and smaller discounts). Two experiments were conducted to explore these patterns of behavior, and the results supported the view that the framing effect was a common occurrence. The patterns of choice of the individuals sampled differed according to how the discounts were offered: the different ways of presenting discounts influenced purchase intentions, recommendations, and quality perception.
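Framing effects of this kind are often analyzed with the Tversky-Kahneman (1992) value function; the sketch below, using their published parameter estimates (α = 0.88, λ = 2.25), compares a discount presented as a segregated gain against the same discount folded into the purchase outlay. The prices are hypothetical and the calculation is illustrative, not taken from the experiments.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Tversky-Kahneman (1992) value function: concave for gains,
    steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Hypothetical purchase: a 100.00 item with a 20.00 discount, framed
# either as a separate gain or folded into the amount paid.
price, discount = 100.0, 20.0
segregated = prospect_value(-price) + prospect_value(discount)
integrated = prospect_value(-(price - discount))
print(segregated, integrated)  # which framing feels better depends on magnitudes
```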
Abstract:
In this work, a Monte Carlo code was used to investigate the performance of different x-ray spectra in digital mammography through a figure of merit (FOM), defined as FOM = CNR²/D̄g, where CNR is the contrast-to-noise ratio in the image and D̄g is the average glandular dose. The FOM was studied for breasts with different thicknesses t (2 cm ≤ t ≤ 8 cm) and glandular contents (25%, 50% and 75% glandularity). The anode/filter combinations evaluated were those traditionally employed in mammography (Mo/Mo, Mo/Rh, Rh/Rh), and a W anode combined with Al or K-edge filters (Zr, Mo, Rh, Pd, Ag, Cd, Sn), for tube potentials between 22 and 34 kVp. Results show that the W anode combined with K-edge filters provides higher FOM values for all breast thicknesses investigated. Nevertheless, the most suitable filter and tube potential depend on the breast thickness and, for t ≥ 6 cm, also on breast glandularity. Particularly for thick and dense breasts, a W anode combined with K-edge filters can greatly improve the digital technique, with FOM values up to 200% greater than those obtained with the anode/filter combinations and tube potentials traditionally employed in mammography. For breasts with t < 4 cm, generally good performance was obtained with the W anode combined with 60 μm of Mo filter at 24-25 kVp, while 60 μm of Pd filter provided generally good performance at 24-26 kVp for t = 4 cm, and at 28-30 and 29-31 kVp for t = 6 and 8 cm, respectively.
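The figure of merit is straightforward to compute once CNR and dose are known; the sketch below applies the abstract's definition to hypothetical spectra (the CNR and dose numbers are made up for illustration).

```python
import numpy as np

def figure_of_merit(cnr, mean_glandular_dose):
    """FOM = CNR^2 / Dg, as defined in the abstract: image quality
    (contrast-to-noise ratio) per unit average glandular dose."""
    return cnr**2 / mean_glandular_dose

# Hypothetical spectra: (label, CNR, average glandular dose in mGy).
spectra = [("Mo/Mo 26 kVp", 6.0, 1.60),
           ("W/Pd 28 kVp", 5.5, 0.95),
           ("W/Ag 30 kVp", 5.2, 0.85)]
for label, cnr, dose in spectra:
    print(f"{label}: FOM = {figure_of_merit(cnr, dose):.1f}")
```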
Abstract:
Objectives: To compare, in vivo, the accuracy of conventional and digital radiographic methods in determining root canal working length. Material and Methods: Twenty-five maxillary incisor or canine teeth from 22 patients were used in this study. With the preoperative radiographs as the baseline, a #25 K-file was inserted into the root canal to the point where the Root ZX electronic apex locator indicated the APEX measurement on the screen. From this measurement, 1 mm was subtracted for positioning the file. The radiographic measurements of the distance between the file tip and the radiographic apex were made using a digital sensor (Digora 1.51) or conventional E-speed films, size 2, following the paralleling technique. Results: Student's t-test indicated mean distances of 1.11 mm for the conventional method and 1.20 mm for the digital method, a statistically significant difference (p < 0.05). Conclusions: The conventional radiographic method was found to be superior to the digital one in determining the working length of the root canal.
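A comparison of this kind can be reproduced with a paired t-test; the sketch below uses synthetic distances with the reported means (the abstract does not state the exact test design, so the pairing and spread are assumptions).

```python
import numpy as np
from scipy import stats

# Synthetic file-tip-to-apex distances (mm) for 25 teeth, with the
# reported means (1.11 mm conventional, 1.20 mm digital); the spread
# and pairing are assumptions, not study data.
rng = np.random.default_rng(42)
conventional = rng.normal(1.11, 0.30, 25)
digital = rng.normal(1.20, 0.30, 25)

t, p = stats.ttest_rel(conventional, digital)
print(f"t = {t:.2f}, p = {p:.3f} (significant if p < 0.05)")
```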
Abstract:
Context. The ESO public survey VISTA Variables in the Vía Láctea (VVV) started in 2010. VVV targets 562 sq. deg in the Galactic bulge and an adjacent plane region and is expected to run for about five years. Aims. We describe the progress of the survey observations in the first observing season, the observing strategy, and the quality of the data obtained. Methods. The observations are carried out on the 4-m VISTA telescope in the ZYJHKs filters. In addition to the multi-band imaging, the variability monitoring campaign in the Ks filter has started. Data reduction is carried out using the pipeline at the Cambridge Astronomical Survey Unit. The photometric and astrometric calibration is performed via the numerous 2MASS sources observed in each pointing. Results. The first data release contains the aperture photometry and astrometric catalogues for 348 individual pointings in the ZYJHKs filters taken in the 2010 observing season. The typical image quality is ~0.9-1.0 arcsec. The stringent photometric and image quality requirements of the survey are satisfied in 100% of the JHKs images in the disk area and 90% of the JHKs images in the bulge area. The completeness in the Z and Y images is 84% in the disk and 40% in the bulge. The first season catalogues contain 1.28 × 10^8 stellar sources in the bulge and 1.68 × 10^8 in the disk area detected in at least one of the photometric bands. The combined, multi-band catalogues contain more than 1.63 × 10^8 stellar sources. About 10% of these are double detections because of overlapping adjacent pointings. These overlapping multiple detections are used to characterise the quality of the data. The images in the JHKs bands extend typically ~4 mag deeper than 2MASS. The magnitude limit and photometric quality depend strongly on crowding in the inner Galactic regions. The astrometry for Ks = 15-18 mag has an rms of ~35-175 mas. Conclusions. The VVV Survey data products offer a unique dataset to map the stellar populations in the Galactic bulge and the adjacent plane, and provide an exciting new tool for the study of the structure, content, and star-formation history of our Galaxy, as well as for investigations of newly discovered star clusters, star-forming regions in the disk, high proper motion stars, asteroids, planetary nebulae, and other interesting objects.
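Flagging double detections in overlap regions reduces to cross-matching sources by angular separation; the sketch below computes the separation with the haversine formula, with a ~1 arcsec match radius and coordinates that are illustrative, not VVV data.

```python
import numpy as np

def angular_sep_arcsec(ra1, dec1, ra2, dec2):
    """Angular separation between two sky positions (degrees in,
    arcseconds out), via the haversine formula."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    dra, ddec = ra2 - ra1, dec2 - dec1
    a = np.sin(ddec / 2)**2 + np.cos(dec1) * np.cos(dec2) * np.sin(dra / 2)**2
    return np.degrees(2 * np.arcsin(np.sqrt(a))) * 3600.0

# Hypothetical detections of one star in two overlapping pointings.
sep = angular_sep_arcsec(266.41683, -29.00781, 266.41690, -29.00779)
print(f"separation = {sep:.3f} arcsec -> duplicate if below ~1 arcsec")
```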
Abstract:
OBJECTIVE: To evaluate a comprehensive MRI protocol that screens for cancer, vascular disease, and degenerative/inflammatory disease from the head to the pelvis in less than 40 minutes on a new-generation 48-channel 3T system. MATERIALS AND METHODS: All MR studies were performed on a 48-channel 3T MR scanner. A 20-channel head/neck coil, two 18-channel body arrays, and a 32-channel spine array were employed. A total of 4 healthy individuals were studied. The designed protocol combined single-shot T2-weighted sequences with T1-weighted 3D gradient-echo sequences acquired pre- and post-gadolinium. All images were retrospectively and independently evaluated by two radiologists for overall image quality. RESULTS: Image quality for cancer was rated excellent in the liver, pancreas, kidneys, lungs, pelvic organs, and brain, and fair in the colon and breast. For vascular disease, ratings were excellent in the aorta, major branch vessel origins, inferior vena cava, and portal and hepatic veins, good in the pulmonary arteries, and poor in the coronary arteries. For degenerative/inflammatory disease, ratings were excellent in the brain, liver, and pancreas. Inter-observer agreement was excellent. CONCLUSION: Comprehensive and time-efficient screening for important categories of disease processes may be achieved with high-quality imaging on a new-generation 48-channel 3T system.
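Inter-observer agreement for categorical quality ratings is commonly quantified with Cohen's kappa; the sketch below implements it for two raters (the abstract does not name the statistic used, and the ratings shown are hypothetical).

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical quality ratings by two radiologists for ten regions.
rad1 = ["excellent"] * 6 + ["good", "good", "fair", "poor"]
rad2 = ["excellent"] * 5 + ["good", "good", "good", "fair", "poor"]
print(f"kappa = {cohen_kappa(rad1, rad2):.2f}")
```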
Abstract:
The main problem with cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 × 70 mm², and that it depends strongly on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors in achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool for optimizing the design of the CT system and for evaluating the contribution of the scattered radiation to the image. Moreover, it has provided the basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
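A simple subtractive scatter correction illustrates the principle behind such algorithms (the patented Empa method itself is not reproduced here); the toy projection below combines a primary signal with a slowly varying scatter field and reports the scatter-to-primary ratio.

```python
import numpy as np

def scatter_corrected(projection, scatter_estimate):
    """Subtract an estimated scatter field from a measured projection
    and clip to positive intensities before the log transform used in
    CT reconstruction (a simple subtractive correction)."""
    return np.clip(projection - scatter_estimate, 1e-6, None)

# Toy 1D projection through a uniform object: primary + smooth scatter.
x = np.linspace(-1, 1, 256)
primary = np.exp(-2.0 * (np.abs(x) < 0.5))  # attenuated inside the object
scatter = 0.3 * np.exp(-x**2)               # slowly varying scatter field
measured = primary + scatter
spr = scatter / primary                     # scatter-to-primary ratio
print(f"max SPR = {spr.max():.2f}")
print(scatter_corrected(measured, scatter)[:4])
```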
Abstract:
Biological processes are very complex mechanisms, most of them accompanied by, or manifested as, signals that reflect their essential characteristics and qualities. The development of diagnostic techniques based on signal and image acquisition from the human body is widely regarded as one of the driving factors behind the recent advances in medicine and the biosciences. The instruments used for biological signal and image recording, like any other acquisition system, are affected by non-idealities that, to different degrees, negatively impact the accuracy of the recording. This work discusses how these effects can be attenuated, and ideally removed, with particular attention to ultrasound imaging and extracellular recordings. Original algorithms developed during the Ph.D. research activity are examined and compared to those in the literature tackling the same problems; results are drawn from comparative tests on both synthetic and in-vivo acquisitions, evaluating standard metrics in the respective fields of application. All the developed algorithms share an adaptive approach to signal analysis, meaning that their behavior is driven not only by designer choices but also by the characteristics of the input signal. Performance comparisons against the state of the art in image quality assessment, contrast gain estimation, and resolution gain quantification, as well as visual inspection, highlighted the very good results of the proposed ultrasound image deconvolution and restoration algorithms: axial resolution up to 5 times better than that of algorithms in the literature is achievable. For extracellular recordings, the proposed denoising technique, compared to other signal processing algorithms, improved on the state of the art by almost 4 dB.
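As a baseline for the kind of restoration discussed, the sketch below runs a classic (non-adaptive) Wiener deconvolution on a toy RF line and reports SNR before and after; it stands in for, and does not reproduce, the thesis' adaptive algorithms.

```python
import numpy as np

def wiener_deconvolve(y, psf_wrapped, snr=100.0):
    """1-D Wiener deconvolution, conj(H)*Y / (|H|^2 + 1/SNR); the PSF
    must be a full-length array with its peak wrapped to index 0."""
    H, Y = np.fft.fft(psf_wrapped), np.fft.fft(y)
    return np.real(np.fft.ifft(np.conj(H) * Y / (np.abs(H) ** 2 + 1.0 / snr)))

def snr_db(ref, sig):
    """SNR of `sig` against a clean reference, in dB."""
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((ref - sig) ** 2))

# Toy RF line: two reflectors blurred by a Gaussian PSF plus noise.
rng = np.random.default_rng(0)
n = 128
x = np.zeros(n); x[40], x[64] = 1.0, 0.6
psf = np.exp(-0.5 * ((np.arange(9) - 4) / 1.5) ** 2); psf /= psf.sum()
h = np.zeros(n); h[:9] = psf; h = np.roll(h, -4)   # wrap peak to index 0
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))
y += 0.01 * rng.standard_normal(n)
x_hat = wiener_deconvolve(y, h)
print(f"SNR: {snr_db(x, y):.1f} dB blurred -> {snr_db(x, x_hat):.1f} dB restored")
```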
Abstract:
Hospital radiological workflow is currently completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are taking advantage of the natural turnover of devices to replace conventional screen-film systems with digital ones. The transition process is complex and involves not just the replacement of equipment but also new arrangements for image transmission, display (and reporting), and storage. This work focuses on the characterization of 2D digital detectors with attention to specific clinical applications; the system features linked to image quality are analyzed to assess the clinical performance, the conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section reviews digital detector technologies, focusing on recent and promising technological developments. The second section describes the characterization methods considered in this thesis, categorized as physical, psychophysical, and clinical; the underlying theory, models, and procedures are described as well. The third section contains a set of characterizations performed on new equipment representing some of the most advanced technologies available to date. The fourth section deals with procedures and schemes employed for quality assurance programs.
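Conversion efficiency is typically summarized by the detective quantum efficiency; the sketch below evaluates the standard estimate DQE(f) = MTF(f)² / (q · NNPS(f)) on hypothetical measured values (the abstract does not list the metrics used, so this is an illustration only).

```python
import numpy as np

def dqe(mtf, nnps, q):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)): MTF from an edge test,
    NNPS the normalized noise power spectrum (mm^2), q the photon
    fluence at the detector (photons/mm^2)."""
    return mtf ** 2 / (q * nnps)

# Hypothetical measured values at a few spatial frequencies (cycles/mm).
f = np.array([0.5, 1.0, 2.0, 3.0])
mtf = np.array([0.92, 0.78, 0.48, 0.28])
nnps = np.array([5.2e-6, 4.4e-6, 3.6e-6, 3.4e-6])
q = 2.5e5
for fi, d in zip(f, dqe(mtf, nnps, q)):
    print(f"DQE({fi:.1f} cycles/mm) = {d:.2f}")
```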