908 results for Software ASF MTF NPS DBT FFDM
Abstract:
In this thesis work, the characteristics of a Fujifilm AMULET Innovality tomosynthesis unit in use at the Istituto Scientifico Romagnolo per lo Studio e la Cura dei Tumori (I.R.S.T.) in Meldola were studied. The evaluations were performed using several phantoms, one of which was built during the thesis work. For the evaluation of digital mammography and tomosynthesis images, the guidelines of the International Electrotechnical Commission (IEC) and of the European Reference Organisation for Quality Assured Breast Screening and Diagnostic Services (EUREF) were followed. For the study of digital mammograms, the NPS, MTF and DQE parameters were evaluated using the COQ and ImageJ software. For the study of tomosynthesis images, dedicated algorithms were developed in Java and then integrated into the COQ software. The resulting program made it possible to evaluate the ASF, MTF, NPS and homogeneity of the reconstructed images.
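As a minimal illustration of how the z-resolution index described above can be extracted from a reconstructed stack, the following Python/NumPy sketch computes the FWHM of an ASF profile by linear interpolation of the half-maximum crossings. It is an illustrative simplification, not the thesis's actual Java/COQ code, and the ASF profile used here is synthetic.

    import numpy as np

    def fwhm(z, asf):
        """Full Width at Half Maximum of an ASF profile sampled at plane positions z (mm)."""
        z, asf = np.asarray(z, float), np.asarray(asf, float)
        half = asf.max() / 2.0
        above = np.where(asf >= half)[0]            # planes at or above half maximum
        i0, i1 = above[0], above[-1]
        # Linearly interpolate the two half-maximum crossings (assumes the peak is interior).
        z_left = np.interp(half, [asf[i0 - 1], asf[i0]], [z[i0 - 1], z[i0]])
        z_right = np.interp(half, [asf[i1 + 1], asf[i1]], [z[i1 + 1], z[i1]])
        return z_right - z_left

    # Synthetic example: a Gaussian-like ASF over 21 planes spaced 1 mm apart (sigma = 3 mm).
    z = np.arange(-10.0, 11.0)
    asf = np.exp(-0.5 * (z / 3.0) ** 2)
    print(f"FWHM = {fwhm(z, asf):.2f} mm")          # ~7.1 mm (analytic value 2.355 * 3 = 7.06 mm)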
Abstract:
This work compares the detector performance and image quality of the new Kodak Min-R EV mammography screen-film system with the Fuji CR Profect detector and with other current mammography screen-film systems from Agfa, Fuji and Kodak. Basic image quality parameters (MTF, NPS, NEQ and DQE) were evaluated for a 28 kV Mo/Mo (HVL = 0.646 mm Al) beam using different mAs exposure settings. Compared with the other screen-film systems, the new Kodak Min-R EV detector has the highest contrast and a low intrinsic noise level, giving better NEQ and DQE results, especially at high optical density. The properties of the new mammography film thus approach those of a fine mammography detector, especially in the low-frequency range. Screen-film systems provide the best resolution. The presampling MTF of the digital detector has a value of 15% at the Nyquist frequency and, because of the spread of the laser beam, a smaller pixel size would not significantly improve the detector resolution. The dual collection reading technology significantly increases the low-frequency DQE of the Fuji CR system, which can now compete with the most efficient mammography screen-film systems.
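For context, these four metrics are usually related as follows for a digital detector (conventions differ in detail for screen-film systems, where the film gradient enters the large-area signal term). With S the large-area signal, q the incident photon fluence and NPS the noise power spectrum,

    \mathrm{NEQ}(f) = \frac{S^{2}\,\mathrm{MTF}^{2}(f)}{\mathrm{NPS}(f)},
    \qquad
    \mathrm{DQE}(f) = \frac{\mathrm{NEQ}(f)}{q}

so a lower intrinsic noise level (smaller NPS) at a given contrast directly raises NEQ and DQE.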
Abstract:
The characterization of the physical properties of digital imaging systems requires the determination and measurement of detector physical performance. Measures such as the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) provide objective evaluations of digital detector performance. Computing the MTF, NPS, and DQE from raw-data images requires a method with two major steps: (1) image acquisition and (2) determination of the quantitative measures. This chapter provides a comprehensive description of such a method for measuring the performance of digital radiography detectors.
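A minimal Python/NumPy sketch of the second step for the NPS, following the usual ensemble average of squared Fourier transforms of flat-field ROIs. This is an illustrative simplification of the standard procedure: the ROI stack below is synthetic white noise, the pixel pitch is a placeholder, and detrending is limited to mean subtraction.

    import numpy as np

    def nps_2d(rois, pixel_pitch_mm):
        """2-D noise power spectrum from a stack of flat-field ROIs (shape: M x N x N)."""
        rois = np.asarray(rois, float)
        m, ny, nx = rois.shape
        spectra = np.zeros((ny, nx))
        for roi in rois:
            fluct = roi - roi.mean()                    # simple detrending: remove the ROI mean
            spectra += np.abs(np.fft.fft2(fluct)) ** 2
        # Normalization in the style of IEC 62220-1: pixel area / (Nx * Ny), averaged over M ROIs.
        return (pixel_pitch_mm ** 2) * spectra / (m * nx * ny)

    # Synthetic white noise: the resulting 2-D NPS should be roughly flat.
    rng = np.random.default_rng(0)
    rois = rng.normal(loc=100.0, scale=2.0, size=(64, 128, 128))
    nps = nps_2d(rois, pixel_pitch_mm=0.1)
    print(nps.mean())                                   # ~ variance * pixel area = 4 * 0.01 = 0.04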
Abstract:
Computed tomography (CT) is an imaging technique whose use has grown steadily since its introduction in the early 1970s. In the clinical environment it has become a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even though CT brings a direct benefit to patient care, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To keep the benefit-risk balance in favour of the patient, image quality and dose must be balanced so that no dose is delivered that does not contribute to the diagnosis.

If this balance is important for adults, it must be an absolute priority for children undergoing CT examinations, especially for patients whose disease requires several follow-up examinations over their lifetime. Children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults, so their risk of developing a radiation-induced cancer, whose latency period can exceed twenty years, is significantly higher. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols so that the patient is not irradiated unnecessarily. Technological progress in CT has been rapid, and since 2009 new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose delivered during CT examinations of children and young adults as much as possible while preserving an image quality that still allows diagnosis, in order to propose optimized protocols.

Optimizing a CT protocol requires evaluating both the delivered dose and the image quality useful for diagnosis. While the dose was estimated using CT indices (CTDIvol and DLP), the particularity of this work was to use two radically different approaches to evaluate image quality. The first, "physical" approach is based on physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach is limited because it does not take the radiologist's perception into account, it characterizes certain image properties in a quick and simple way. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step were asked to score the quality of these structures from a diagnostic point of view using a simple rating scale. This approach is laborious to implement and time-consuming, but it has the advantage of being close to the radiologist's practice and can be considered a reference method.

Among the main results of this work, the statistical iterative reconstruction algorithms studied in clinical use (ASIR, VEO) were shown to have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism they modify the appearance of the image, producing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the frequency spectrum of the noise, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also shows that these new reconstruction techniques cannot be integrated into clinical practice simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the tools developed can guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
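The texture change mentioned above shows up as a shift of the noise power spectrum toward lower spatial frequencies. One common scalar descriptor of such a shift (an assumption here; the thesis may use a different descriptor) is the mean frequency of the radially averaged NPS,

    \bar{f} = \frac{\int_0^{f_{\mathrm{Ny}}} f\,\mathrm{NPS}(f)\,df}{\int_0^{f_{\mathrm{Ny}}} \mathrm{NPS}(f)\,df}

where f_Ny is the Nyquist frequency; a decrease of this mean frequency between a conventional and an iterative reconstruction indicates a coarser, blotchier noise texture.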
Abstract:
The aim of this work is the physical characterization of the Varian PaxScan4030CB flat panel, an X-ray detector used in a wide range of clinical applications, from general radiography to interventional radiology. In the clinical setting, an accurate diagnosis requires good radiological image quality while keeping the dose delivered to the patient as low as possible. A key element in achieving this is the choice of the X-ray detector, which must guarantee physical performance (contrast, spatial resolution and noise) adequate for the specific procedure. The objective metrics that measure these characteristics are the SNR (Signal-to-Noise Ratio), MTF (Modulation Transfer Function) and NPS (Noise Power Spectrum), which together contribute to the measurement of the DQE (Detective Quantum Efficiency), the most complete parameter for establishing the performance of an imaging system. The objectivity of these measurements also makes it possible to compare different detection systems. These parameters must be measured following precise medical physics protocols, which were applied to the PaxScan4030CB detector in the laboratory of the Centro di Coordinamento di Fisica Medica, Policlinico S.Orsola. The results obtained, consistent with those declared by the manufacturer, were successfully compared with works in the literature and form the necessary basis for verifying radiological image optimization procedures through interventions on the X-ray emission process and on the digital processing of the image (Digital Subtraction Angiography).
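As an illustration of one of these measurements, the following Python/NumPy sketch derives a presampled MTF from an oversampled edge spread function (differentiate to the line spread function, window, Fourier transform). It is a simplified version of the slanted-edge procedure, not the exact protocol applied to the PaxScan4030CB, and the edge profile here is synthetic.

    import numpy as np
    from scipy.special import erf

    def mtf_from_esf(esf, sample_spacing_mm):
        """Presampled MTF from an (oversampled) edge spread function."""
        esf = np.asarray(esf, float)
        lsf = np.gradient(esf, sample_spacing_mm)       # differentiate the ESF into the LSF
        lsf = lsf * np.hanning(lsf.size)                # window to suppress noise in the tails
        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]                                   # normalize to 1 at zero frequency
        freqs = np.fft.rfftfreq(lsf.size, d=sample_spacing_mm)   # cycles/mm
        return freqs, mtf

    # Synthetic edge blurred by a Gaussian of sigma = 0.1 mm, sampled every 0.025 mm.
    x = np.arange(-5.0, 5.0, 0.025)
    esf = 0.5 * (1.0 + erf(x / (0.1 * np.sqrt(2.0))))
    freqs, mtf = mtf_from_esf(esf, 0.025)
    print(np.interp(2.0, freqs, mtf))                   # ~0.45 at 2 cycles/mm for this blur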
Abstract:
Digital Breast Tomosynthesis (DBT) is an advanced mammography technique based on the reconstruction of a pseudo-volumetric image. To date, image quality remains the least developed section of DBT quality control protocols: the related tests are not yet characterized by either action levels or typical values. This thesis work focuses on one aspect of image quality, the z-resolution, studied in terms of the Artifact Spread Function (ASF), a function that describes the spread of a detail's signal along the reconstructed focal planes. To quantify the ASF numerically, its Full Width at Half Maximum (FWHM) is calculated and used as a representative index of z-resolution. Experimental measurements were acquired on 24 DBT systems of 7 different models, currently in use in 20 hospital facilities in Italy. The analysis, performed on clinically reconstructed images of 5 different commercial phantoms, led to the identification of characteristic FWHM values for each type of DBT system. The ASF clearly depends on the size of the detail, with higher FWHM values for larger objects. The z-resolution was found to be positively influenced by the acquisition angle: Fujifilm systems systematically showed wider ASF profiles in ST mode (15°) than in HR mode (40°). However, no clear relationship between angular range and ASF was found across different DBT systems, owing to the peculiarities of each reconstruction algorithm. The experimental approach presented in this thesis can be proposed as a z-resolution quality control test procedure, and the values found could serve as a starting point for identifying the typical values to be included in such a test within a DBT protocol. A statistically significant number of images would clearly be needed for this; since the equipment involved is located in hospitals and is not available for research purposes, only a limited amount of data could be acquired and processed.
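The ASF used for this purpose is commonly defined (conventions vary slightly between protocols, so the exact form used here is an assumption) as the contrast of the detail in the reconstructed plane at depth z, normalized to its contrast in the in-focus plane z_0:

    \mathrm{ASF}(z) = \frac{\bar{S}_{\mathrm{detail}}(z) - \bar{S}_{\mathrm{bkg}}(z)}{\bar{S}_{\mathrm{detail}}(z_0) - \bar{S}_{\mathrm{bkg}}(z_0)}

where the barred quantities denote mean pixel values in an ROI on the detail and in a nearby background ROI; the FWHM of ASF(z) along z is then the z-resolution index quoted above.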
Abstract:
Day-to-day medical practice is marked by a constant search for accurate diagnosis and therapeutic assessment. For this purpose, physicians rely on a wide variety of imaging techniques; however, methods based on ionizing radiation are still the most widely used, as they are considered cheaper and, above all, very effective when used under proper control and quality assurance. Optimization of the risk-benefit ratio is considered a major advance over conventional radiology, but this is not yet the reality for computed and digital radiography, for which Brazil has not established standards and protocols. This work aims to optimize computed chest radiographs (anterior-posterior projection, AP). To achieve this objective, homogeneous phantoms were used that simulate the absorption and scattering characteristics of the chest of a standard patient. Another factor studied was the subjective evaluation of image quality, carried out by visual grading assessment (VGA) by specialists in radiology, using an anthropomorphic phantom to identify the best image for a particular pathology (fracture or pneumonia). The images indicated by the radiologists were then quantified in terms of physical parameters (Detective Quantum Efficiency - DQE, Modulation Transfer Function - MTF and Noise Power Spectrum - NPS) using MatLab® software. © 2013 Springer-Verlag.
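A minimal Python sketch of how the three physical parameters combine numerically into the DQE (a generic digital-radiography formulation, not the specific MatLab® routines used in the study; the frequency grid, MTF, normalized NPS and fluence values are hypothetical):

    import numpy as np

    def dqe(mtf, nnps, q):
        """DQE(f) from the presampled MTF, the normalized NPS (NNPS = NPS / large-area signal^2)
        and the incident photon fluence q (photons per mm^2)."""
        return np.asarray(mtf, float) ** 2 / (q * np.asarray(nnps, float))

    # Hypothetical measured values on a small frequency grid (0.5, 1, 2, 3 cycles/mm).
    mtf = np.array([0.95, 0.85, 0.60, 0.40])
    nnps = np.array([5.0e-6, 4.5e-6, 3.5e-6, 3.0e-6])   # mm^2
    print(dqe(mtf, nnps, q=300_000))                    # roughly [0.60, 0.54, 0.34, 0.18]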
Abstract:
Based on the empirical evidence that the ratio of messages in public mailing lists to versioning-system commits has remained relatively constant throughout the history of the Apache Software Foundation (ASF), the goal of this paper is to study what can be inferred from such a metric for ASF projects. We have found that the metric appears to be an intensive metric, as it is independent of the size of the project, its activity, or the number of developers, and remains relatively independent of the technology or functional area of the project. Our analysis provides evidence that the metric is related to the technical effervescence and popularity of a project, and as such can be a good candidate for measuring its healthy evolution. Other similar metrics, such as the ratio of developer messages to commits and the ratio of issue-tracker messages to commits, are studied for several projects as well, in order to see whether they have similar characteristics.
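A minimal Python sketch of the kind of ratio the paper studies, computed per year for a single hypothetical project (real analyses would pull these counts from the mailing-list archives and the versioning system); a small coefficient of variation corresponds to the "relatively constant" behaviour described above.

    # Hypothetical yearly counts for one project: (mailing-list messages, commits).
    yearly = {
        2008: (2_100, 1_050),
        2009: (2_600, 1_240),
        2010: (3_050, 1_500),
        2011: (2_900, 1_480),
    }

    ratios = [msgs / commits for msgs, commits in yearly.values()]
    mean = sum(ratios) / len(ratios)
    cv = (sum((r - mean) ** 2 for r in ratios) / len(ratios)) ** 0.5 / mean
    print([round(r, 2) for r in ratios])          # messages per commit, year by year
    print(f"coefficient of variation: {cv:.2%}")  # small value = ratio roughly constant over time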
Abstract:
Thesis (Master, Computing) -- Queen's University, 2016-05-29.
Abstract:
Over the past few years, logging has evolved from simple printf statements to more complex and widely used logging libraries. Today, logging information is used to support various development activities such as fixing bugs, analyzing the results of load tests, monitoring performance and transferring knowledge. Recent research has examined how to improve logging practices by informing developers what to log and where to log. Furthermore, the strong dependence on logging has led to the development of logging libraries that have reduced the intricacies of logging, which has resulted in an abundance of log information. Two challenges have emerged as modern software systems start to treat logging as a core aspect of their software: 1) infrastructural challenges, due to the plethora of logging libraries available today, and 2) processing challenges, due to the large number of log processing tools that ingest logs and produce useful information from them. In this thesis, we explore these two challenges. We first explore the infrastructural challenges that arise from the plethora of logging libraries available today. As systems evolve, their logging infrastructure has to evolve as well (commonly by migrating to new logging libraries). We explore logging library migrations within Apache Software Foundation (ASF) projects and find that close to 14% of the projects within the ASF migrate their logging libraries at least once. For the processing challenges, we explore the different factors that can affect the likelihood of a logging statement changing in the future in four open source systems, namely ActiveMQ, Camel, Cloudstack and Liferay. Such changes are likely to negatively impact the log processing tools that must be updated to accommodate them. We find that 20%-45% of the logging statements within the four systems are changed at least once. We construct random forest classifiers and Cox models to determine the likelihood of both just-introduced and long-lived logging statements changing in the future. We find that file ownership, developer experience, log density and SLOC are important factors in determining the stability of logging statements.
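A minimal scikit-learn sketch of the kind of classifier described (illustrative only: the data are synthetic, the feature set names only a few of the factors studied, and this is not the thesis's actual pipeline):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 2_000
    # Synthetic per-logging-statement features: file ownership (share of the file written by the
    # top author), developer experience (prior commits), log density (logging statements / SLOC)
    # and file SLOC.
    X = np.column_stack([
        rng.uniform(0.0, 1.0, n),
        rng.integers(1, 500, n),
        rng.uniform(0.0, 0.2, n),
        rng.integers(50, 5_000, n),
    ])
    y = rng.integers(0, 2, n)                      # 1 = the logging statement was later changed

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(dict(zip(["ownership", "experience", "log_density", "sloc"],
                   clf.feature_importances_.round(3))))
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")   # ~0.5 here: the labels are random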
Abstract:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was re-evaluated to assess reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two evaluation periods; one-way analysis of variance with the post-hoc Dunnett test was used to compare each software-derived measurement with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
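A minimal Python sketch of the comparison design (synthetic measurements seeded with the mean differences reported above, not the study's data; a plain one-way ANOVA over the differences stands in for the Dunnett post-hoc test against the gold standard):

    import numpy as np
    from scipy import stats

    # Hypothetical linear measurements (mm) for the same 8 sites.
    gold = np.array([10.2, 11.5, 9.8, 12.0, 10.9, 11.1, 9.5, 10.7])
    ondemand = gold + np.random.default_rng(0).normal(-0.11, 0.3, 8)
    kdis = gold + np.random.default_rng(1).normal(-0.14, 0.3, 8)
    xorancat = gold + np.random.default_rng(2).normal(0.25, 0.3, 8)

    for name, values in [("OnDemand3D", ondemand), ("KDIS3D", kdis), ("XoranCAT", xorancat)]:
        print(f"{name}: mean difference from gold standard = {np.mean(values - gold):+.2f} mm")

    # One-way ANOVA over the differences from the gold standard (the study used Dunnett's test).
    f_stat, p_value = stats.f_oneway(ondemand - gold, kdis - gold, xorancat - gold)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")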
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that is systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and related work are discussed.
Abstract:
Thousands of Free and Open Source Software Projects (FSP) have been, and continue to be, created on the Internet. This scenario increases the number of opportunities to collaborate to the same extent that it promotes competition for users and contributors, who can take projects to levels their founders could not reach alone. Thus, given that the main goal of FSP founders is to improve their projects by means of collaboration, it becomes important to understand and manage the project's capacity to attract users and contributors. To support researchers and founders in this challenge, this paper introduces the concept of attractiveness and develops a theoretical-managerial toolkit on the causes, indicators and consequences of attractiveness, enabling its strategic management.
Abstract:
Objective: To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting: The study was developed in Brazil. Method: Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard in the identification of drug-drug interactions. Main outcome: Sensitivity, specificity, positive and negative predictive values. Results: The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion: The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions, optimizing the safety and quality of care delivered in intensive care units.
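For clarity, a short Python sketch of how these accuracy measures follow from the counts of correctly and incorrectly flagged interactions (the counts below are hypothetical, not the study's data):

    def accuracy_measures(tp, fp, fn, tn):
        """Sensitivity, specificity, positive and negative predictive values."""
        return {
            "sensitivity": tp / (tp + fn),   # flagged among truly interacting pairs
            "specificity": tn / (tn + fp),   # not flagged among non-interacting pairs
            "ppv": tp / (tp + fp),           # flagged pairs that truly interact
            "npv": tn / (tn + fn),           # unflagged pairs that truly do not interact
        }

    # Hypothetical counts for 100 clinically important and 100 unimportant interactions.
    print(accuracy_measures(tp=88, fp=10, fn=12, tn=90))
    # sensitivity 0.88, specificity 0.90, ppv ~0.90, npv ~0.88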
Abstract:
Support for the interoperability and interchangeability of software components that are part of a fieldbus automation system relies on the definition of open architectures, most of which involve proprietary technologies. Concurrently, standard, open and non-proprietary technologies such as XML, SOAP and Web Services have greatly evolved and become widespread in computing. This article presents a FOUNDATION fieldbus (TM) device description technology named Open-EDD, based on XML and related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus (TM) protocol, implementing a compiler and a parser, and finally integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that the new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems. (c) 2008 Elsevier B.V. All rights reserved.
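A minimal Python sketch of the DOM-style processing step described above (the element and attribute names in this Open-EDDML fragment are invented for illustration; the original work used Java with Xerces rather than Python):

    import xml.dom.minidom as minidom

    # Invented Open-EDDML fragment; the real element names are defined by the Open-EDD proposal.
    fragment = """
    <deviceDescription manufacturer="AcmeField" deviceType="pressure-transmitter">
      <block name="TRANSDUCER">
        <parameter name="PRIMARY_VALUE" type="float" units="kPa"/>
        <parameter name="SENSOR_RANGE" type="float" units="kPa"/>
      </block>
    </deviceDescription>
    """

    doc = minidom.parseString(fragment)
    for param in doc.getElementsByTagName("parameter"):
        print(param.getAttribute("name"), param.getAttribute("type"), param.getAttribute("units"))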