998 results for Diagnostic Algorithms


Relevance: 20.00%

Abstract:

This study aimed to determine and evaluate the diagnostic accuracy of visual screening tests for detecting vision loss in the elderly. It is designed as a diagnostic performance study. The diagnostic accuracy of five visual tests - near point of convergence, near point of accommodation, stereopsis, contrast sensitivity and Amsler grid - was evaluated by means of ROC (receiver operating characteristic) curves, sensitivity, specificity, and positive and negative likelihood ratios (LR+/LR−). Visual acuity was used as the reference standard. A sample of 44 institutionalized elderly persons with a mean age of 76.7 years (±9.32) was collected. The contrast sensitivity and stereopsis curves were the most accurate (areas under the curve of 0.814, p = 0.001, 95% CI [0.653; 0.975], and 0.713, p = 0.027, 95% CI [0.540; 0.887], respectively). The scores with the best diagnostic validity for the stereopsis test were 0.605 (sensitivity 0.87, specificity 0.54; LR+ 1.89, LR− 0.24) and 0.610 (sensitivity 0.81, specificity 0.54; LR+ 1.75, LR− 0.36). The score with the highest diagnostic validity for the contrast sensitivity test was 0.530 (sensitivity 0.94, specificity 0.69; LR+ 3.04, LR− 0.09). The contrast sensitivity and stereopsis tests proved to be clinically useful for detecting vision loss in the elderly.
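
As a quick illustration of how these indices follow from a chosen cutoff, here is a minimal sketch in Python; the scores, labels and the assumption that lower scores indicate impairment are hypothetical, not the study's data:

```python
import numpy as np

def diagnostic_validity(scores, has_vision_loss, cutoff):
    """Sensitivity, specificity, LR+ and LR- for a screening test at a
    given cutoff. The comparison direction (score <= cutoff is a positive
    test) is an assumption; flip it for tests where higher scores mean
    impairment."""
    positive = scores <= cutoff
    tp = np.sum(positive & has_vision_loss)
    fn = np.sum(~positive & has_vision_loss)
    tn = np.sum(~positive & ~has_vision_loss)
    fp = np.sum(positive & ~has_vision_loss)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1 - spec), (1 - sens) / spec

# Hypothetical sample of 44 scores and reference-standard labels
rng = np.random.default_rng(0)
scores = rng.uniform(0.4, 0.8, 44)
labels = rng.random(44) < 0.5
print(diagnostic_validity(scores, labels, cutoff=0.605))
```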

Relevance: 20.00%

Abstract:

The aging of the Portuguese population is characterized by an increase in the number of individuals aged over 65 years. Preventable visual loss in older persons is an important public health problem. Tests used for vision screening should have a high degree of diagnostic validity, confirmed by means of clinical trials. The primary aim of a screening program is the early detection of visual disease. Between 20% and 50% of older people in the UK have undetected reduced vision, and in most cases it is correctable. Elderly patients do not receive a systematic eye examination unless a problem arises with their glasses or vision loss is suspected. This study aimed to determine and evaluate the diagnostic accuracy of visual screening tests for detecting vision loss in the elderly. Furthermore, it aims to determine the ability to identify subjects affected by vision loss as positive and subjects not affected as negative. The ideal vision screening method should have high sensitivity and specificity for the early detection of risk factors. It should also be low-cost and easy to implement across all geographic and socioeconomic regions. Sensitivity is the ability of an examination to identify the presence of a given disease, and specificity is the ability of the examination to identify its absence. It was not an aim of this study to detect the abnormalities that affect visual acuity; the aim was to find out which test best identifies any vision loss.
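
For reference, the standard definitions behind these statements can be written as (general formulas, not results of this study):

```latex
\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP},
\]
\[
LR^{+} = \frac{\text{Sensitivity}}{1 - \text{Specificity}}, \qquad
LR^{-} = \frac{1 - \text{Sensitivity}}{\text{Specificity}}
\]
```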

Relevance: 20.00%

Abstract:

This thesis aims to contribute to the study and analysis of the factors related to digital radiographic image acquisition techniques, diagnostic quality and the management of radiation dose in digital radiology systems. The methodology is organized into two components. The observational component is based on a retrospective, cross-sectional study design. Data collected from CR and DR systems allowed the evaluation of the technical exposure parameters used in digital radiology, of the absorbed dose and of the detector exposure index. Within this methodological classification (retrospective and cross-sectional), it was also possible to develop diagnostic quality studies of digital systems: observer studies based on images archived in the PACS. The experimental component of the thesis was based on phantom experiments to evaluate the relationship between dose and image quality. These experiments made it possible to characterize the physical properties of digital radiology systems by manipulating the variables related to the exposure parameters and evaluating their influence on dose and image quality. Using a contrast-detail phantom, anthropomorphic phantoms and an animal-bone phantom, it was possible to obtain objective measures of diagnostic quality and of object detectability. From this investigation, several conclusions can be highlighted. Quantitative measures of detector performance are the basis of the optimization process, allowing the physical parameters of digital radiology systems to be measured and determined. The exposure parameters used in clinical practice show that practice is not in accordance with the European reference framework. There is a need to evaluate, improve and implement a reference standard for the optimization process, through new good-practice references adjusted to digital systems. The exposure parameters influence patient dose, but the perception of digital image quality does not appear to be affected by variations in exposure. The studies carried out, involving both phantom and patient images, show that overexposure is a potential risk in digital radiology. The evaluation of the diagnostic quality of the images showed that, as exposure was varied, no substantial degradation of image quality was observed when the dose was reduced. The study and implementation of new diagnostic reference levels adjusted to digital radiology systems is proposed. As a contribution of the thesis, a model (STDI) for the optimization of digital radiology systems is proposed.
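
For context, here is a minimal sketch of one common objective measure used in this kind of dose/image-quality optimization - the contrast-to-noise ratio (CNR) between a detail region and the phantom background. The image, ROIs and values are illustrative assumptions, and this is not the thesis's STDI model:

```python
import numpy as np

def contrast_to_noise_ratio(image, detail_mask, background_mask):
    """CNR between a detail ROI and a background ROI of a phantom image.
    Higher CNR generally means the detail is easier to detect at a given
    dose; tracking CNR against exposure supports optimization."""
    detail = image[detail_mask]
    background = image[background_mask]
    return abs(detail.mean() - background.mean()) / background.std()

# Hypothetical phantom image with a low-contrast detail in the centre
rng = np.random.default_rng(1)
img = rng.normal(100.0, 5.0, (64, 64))
img[24:40, 24:40] += 8.0                                  # simulated detail
detail = np.zeros_like(img, dtype=bool); detail[24:40, 24:40] = True
background = np.zeros_like(img, dtype=bool); background[:16, :16] = True
print(round(contrast_to_noise_ratio(img, detail, background), 2))
```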

Relevance: 20.00%

Abstract:

Master's degree in Radiation Applied to Health Technologies. Specialization area: Digital Imaging with X-rays.

Relevance: 20.00%

Abstract:

This chapter provides a theoretical background on image quality in diagnostic radiology. It discusses digital image representation and the primary physical image quality parameters, and gives an overview of methods for evaluating the quality of diagnostic imaging procedures, including objective image quality measurements and observer performance methods.
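
As an illustration of the objective-measurement side, a generic sketch of a simple metric - the signal-to-noise ratio of a nominally uniform region of interest (the image and ROI are simulated assumptions):

```python
import numpy as np

def roi_snr(image, roi_mask):
    """Simple objective image quality measure: mean divided by standard
    deviation inside a nominally uniform region of interest."""
    roi = image[roi_mask]
    return roi.mean() / roi.std()

# Hypothetical noisy flat-field image with a central ROI
rng = np.random.default_rng(2)
flat = rng.normal(200.0, 10.0, (128, 128))
mask = np.zeros_like(flat, dtype=bool)
mask[32:96, 32:96] = True
print(round(roi_snr(flat, mask), 1))
```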

Relevance: 20.00%

Abstract:

OBJECTIVE: To evaluate the discriminative and diagnostic value of neuropsychological tests for identifying schizophrenia patients. METHODS: A cross-sectional study with 36 male schizophrenia outpatients and 72 matched healthy volunteers was carried out. Participants underwent the following neuropsychological tests: Wisconsin Card Sorting Test, Verbal Fluency, Stroop Test, Mini-Mental State Examination, and Spatial Recognition Span. Diagnostic value was estimated by sensitivity and specificity, with cutoffs obtained from Receiver Operating Characteristic (ROC) curves. The latent class model (diagnosis of schizophrenia) was used as the gold standard. RESULTS: Although patients presented lower scores in most tests, the highest canonical function in the discriminant analysis was 0.57 (Verbal Fluency M). The best sensitivity and specificity were obtained with the Verbal Fluency M test (75 and 65, respectively). CONCLUSIONS: The neuropsychological tests showed moderate diagnostic value for identifying schizophrenia patients. These findings suggest that the cognitive impairment measured by these tests may not be homogeneous among schizophrenia patients.
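
A minimal sketch of how such cutoffs can be derived from a ROC curve, here using scikit-learn, simulated scores (not the study's data) and the Youden index as the cutoff criterion:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical test scores for patients (label 1) and controls (label 0)
rng = np.random.default_rng(3)
labels = np.r_[np.ones(36), np.zeros(72)].astype(int)
scores = np.r_[rng.normal(18, 5, 36), rng.normal(24, 5, 72)]  # patients score lower

# Lower scores indicate impairment, so negate them to get the
# "higher score = more likely diseased" ordering expected by roc_curve.
fpr, tpr, thresholds = roc_curve(labels, -scores)
youden = tpr - fpr
best = np.argmax(youden)
print("AUC:", round(roc_auc_score(labels, -scores), 2))
print("cutoff:", round(-thresholds[best], 1),
      "sensitivity:", round(tpr[best], 2),
      "specificity:", round(1 - fpr[best], 2))
```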

Relevance: 20.00%

Abstract:

This work investigates the impact of treating breast cancer with different radiation therapy (RT) techniques - forwardly planned intensity-modulated RT (f-IMRT), inversely planned IMRT and dynamic conformal arc RT (DCART) - and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast patients submitted to breast-conserving surgery were enrolled in the study. For each patient, four RT techniques - f-IMRT, IMRT using two fields and five fields (IMRT2 and IMRT5, respectively) and DCART - were applied. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and also in the pattern of dose spread into normal tissues. IMRT5 and DCART spread low doses over larger volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better target conformity and homogeneity and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found when comparing the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
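
For reference, a minimal sketch of how a cumulative dose–volume histogram can be computed for one structure (the dose grid, organ mask and V20Gy read-out are simulated assumptions, not the planning-system data used in the study):

```python
import numpy as np

def cumulative_dvh(dose, structure_mask, bins=100):
    """Cumulative DVH: fraction of the structure volume receiving at
    least each dose level."""
    d = dose[structure_mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_fraction = [(d >= level).mean() for level in levels]
    return levels, np.asarray(volume_fraction)

# Hypothetical 3D dose grid and a spherical "organ at risk" mask
rng = np.random.default_rng(4)
dose = rng.gamma(shape=2.0, scale=5.0, size=(40, 40, 40))    # Gy, simulated
z, y, x = np.ogrid[:40, :40, :40]
oar = (x - 20) ** 2 + (y - 20) ** 2 + (z - 20) ** 2 <= 10 ** 2
levels, vol = cumulative_dvh(dose, oar)
idx = min(np.searchsorted(levels, 20.0), len(levels) - 1)
print("V20Gy (fraction of OAR receiving >= 20 Gy):", round(vol[idx], 2))
```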

Relevance: 20.00%

Abstract:

Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Upsampling is performed by manufacturers so that the acquired images fit the display screen adequately, and it is applied whenever there is a need to increase - or decrease - the total number of pixels. This paper compares the “hqnx” and “nxSaI” magnification algorithms with two interpolation algorithms – “nearest neighbor” and “bicubic interpolation” – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not particularly noteworthy, although it was clear that bicubic interpolation gave the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx resized images presented better quality than the 4xSaI and nearest neighbor interpolated images; however, their intense “halo effect” greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation appears to be the most suitable, and its ever wider range of applications seems to confirm this, establishing it as an efficient algorithm for multiple image types.
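
A minimal sketch of the two interpolation approaches compared above, using scipy on a synthetic image (the hqnx and nxSaI algorithms are not available in standard scientific Python libraries and are omitted; scipy's cubic spline is used here as a stand-in for bicubic interpolation):

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical low-resolution Nuclear Medicine-like image (Poisson counts)
rng = np.random.default_rng(5)
low_res = rng.poisson(20.0, size=(64, 64)).astype(float)

# 4x upsampling: order=0 is nearest neighbour, order=3 is cubic spline
nearest = zoom(low_res, 4, order=0)
cubic = zoom(low_res, 4, order=3)

print(nearest.shape, cubic.shape)          # both (256, 256)
# Nearest neighbour replicates pixels (blocky); cubic interpolation
# smooths transitions between neighbouring pixel values.
print("unique values, nearest:", np.unique(nearest).size,
      "cubic:", np.unique(cubic).size)
```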

Relevance: 20.00%

Abstract:

Introduction: A major focus of the data mining process - and of machine learning research in particular - is to learn automatically to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques such as MPI – Myocardial Perfusion Imaging in Nuclear Cardiology – can account for a large part of the daily workflow and generate gigabytes of data, computerized analysis may offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively inexpensive, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI Stress studies and in the decision-making process concerning whether or not to continue the evaluation of each patient. The objective was to automatically classify each patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” patients would proceed directly to the Rest part of the exam, “Negative” patients would be exempted from continuation, and only the “Indeterminate” group would require the clinician's analysis, thus saving clinicians' effort, increasing workflow fluidity at the technologist's level and probably saving patients' time. Methods: The WEKA v3.6.2 open-source software was used to perform a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study of the “SPECT Heart Dataset”, available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Areas” were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm had the best performance among the three selected algorithms. Conclusions: It is believed - and the findings apparently support this - that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of the scintigraphic data obtained in MPI, namely after Stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
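
By way of illustration, a minimal sketch of an analogous classifier comparison in Python with scikit-learn - a stand-in for the WEKA workflow actually used, on synthetic data, with DecisionTreeClassifier playing the role of a J48-style tree and GaussianNB standing in for Naïve Bayes:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary dataset standing in for the SPECT Heart features
X, y = make_classification(n_samples=267, n_features=22, n_informative=8,
                           random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "Decision tree (J48-style)": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc").mean()
    print(f"{name}: accuracy={acc:.2f}, ROC AUC={auc:.2f}")
```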

Relevance: 20.00%

Abstract:

Dissertation submitted to obtain the Master's degree in Informatics Engineering

Relevance: 20.00%

Abstract:

The paper formulates a genetic algorithm that evolves two types of objects in a plane. The fitness function promotes a relationship between the objects that is optimal when some kind of interface between them occurs. Furthermore, the algorithm adopts a hexagonal tessellation of the two-dimensional space to provide an efficient method of neighbour modelling. The genetic algorithm produces special patterns resembling those revealed in percolation phenomena or in the symbiosis found in lichens. Besides the analysis of the spatial layout, the time evolution is modelled by adopting a distance measure and by modelling in the Fourier domain from the perspective of fractional calculus. The results reveal a consistent and easy-to-interpret set of model parameters for distinct operating conditions.
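
A minimal sketch of the general idea, under simplifying assumptions (a small hexagonal grid in odd-row offset coordinates, two cell types, and a fitness that simply counts type interfaces; this is not the paper's exact formulation):

```python
import random

ROWS, COLS = 12, 12
random.seed(0)

def hex_neighbours(r, c):
    """Neighbours on a pointy-top hexagonal grid in odd-row offset coordinates."""
    shifts = ([(-1, 0), (-1, 1), (0, -1), (0, 1), (1, 0), (1, 1)] if r % 2 else
              [(-1, -1), (-1, 0), (0, -1), (0, 1), (1, -1), (1, 0)])
    return [(r + dr, c + dc) for dr, dc in shifts
            if 0 <= r + dr < ROWS and 0 <= c + dc < COLS]

def fitness(grid):
    """Count hexagonal edges whose endpoints hold different object types."""
    return sum(grid[r][c] != grid[nr][nc]
               for r in range(ROWS) for c in range(COLS)
               for nr, nc in hex_neighbours(r, c)) // 2  # each edge counted twice

def mutate(grid, rate=0.02):
    return [[1 - cell if random.random() < rate else cell for cell in row]
            for row in grid]

# Simple evolutionary loop: keep the best individuals and mutate them
population = [[[random.randint(0, 1) for _ in range(COLS)] for _ in range(ROWS)]
              for _ in range(20)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best interface count:", fitness(max(population, key=fitness)))
```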

Relevance: 20.00%

Abstract:

To avoid additional hardware deployment, indoor localization systems have to be designed so that they rely only on existing infrastructure. Besides processing measurements between nodes, the localization procedure can incorporate all available environment information. In order to enhance the performance of Wi-Fi based localization systems, the innovative solution presented in this paper also takes negative information into account. An indoor tracking method inspired by Kalman filtering is also proposed.
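
For orientation, a minimal sketch of the standard Kalman predict/update cycle that such a tracking method builds on (a generic constant-velocity example with simulated position fixes; the paper's specific formulation and its use of negative information are not reproduced here):

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],               # constant-velocity state model
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only x, y position is observed
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise
R = 4.0 * np.eye(2)                        # measurement noise (e.g. a Wi-Fi fix)

x = np.zeros(4)                            # state: [x, y, vx, vy]
P = np.eye(4)

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the position measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

for z in [np.array([1.0, 0.5]), np.array([2.1, 1.1]), np.array([3.0, 1.4])]:
    x, P = kalman_step(x, P, z)
print("estimated position:", np.round(x[:2], 2))
```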