973 results for Microfluidic Analytical Techniques


Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

An analysis of the thermal degradation products evolved during the melt processing of organo-layered silicates (OLS) was carried out using a solid phase microextraction (SPME) technique. Two commercial OLSs and one produced in-house were prepared for comparison. SPME proved very effective for investigating the degradation of an OLS at a specific processing temperature. The results showed that most available OLSs will degrade under the conditions typically required for the melt processing of many polymers, including thermoplastic polyurethanes. It is suggested that these degradation products may lead to changes in the structure and properties of the final polymer, particularly in thermoplastic polyurethanes, which appear significantly susceptible to the presence of these products. It is also suggested that many commercially available OLSs are produced in a way that leaves an excess of unbound organic modifier, giving rise to a greater quantity of degradation products. All OLSs were compared and characterised by TGA and GC-MS. (c) 2004 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Microbiological diagnosis of catheter-related bloodstream infection (CR-BSI) is often based on isolation of indistinguishable micro-organisms from an explanted catheter tip and blood culture, confirmed by antibiograms. Whether phenotypic identification of coagulase-negative staphylococci (CoNS) allows an accurate diagnosis of CR-BSI to be established was evaluated. Eight patients with a diagnosis of CR-BSI had CoNS isolated from pure blood cultures and explanted catheter tips that were considered indistinguishable strains by routine microbiological methods. For each patient, an additional three colonies of CoNS isolated from the blood and five from the catheter tip were subcultured and further characterized by antibiogram profiles, analytical profile index (API) biotyping and PFGE. PFGE distinguished more strains of CoNS than API biotyping or antibiograms (17, 10 and 11, respectively). By PFGE, indistinguishable micro-organisms were isolated from pure blood and catheter tip cultures in only four out of eight (50%) patients, thus supporting the diagnosis of CR-BSI. In another patient, indistinguishable micro-organisms were identified in both cultures; however, other strains of CoNS were also present. The remaining three patients had multiple strains of CoNS, none of which were indistinguishable between the tip and blood cultures, thus questioning the diagnosis of CR-BSI. Phenotypic characterization of CoNS lacked discriminatory power. Current routine methods of characterizing a limited number of pooled colonies may generate misleading results, as multiple strains may be present in the cultures. Multiple colonies should be studied using a rapid genotypic characterization method to confirm or refute the diagnosis of CR-BSI. © 2007 SGM.
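
The discriminatory power of typing methods such as PFGE and API biotyping is often quantified with Simpson's index of discrimination; a minimal sketch (the strain-count data below are hypothetical, not the study's):

```python
def simpson_discriminatory_index(type_counts):
    """Simpson's index of discrimination for a typing method: the
    probability that two isolates drawn at random from the population
    belong to different types. D = 1 - sum(n_j*(n_j-1)) / (N*(N-1))."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical: 64 isolates resolved into strain types by two methods.
# A method that splits the isolates into more, smaller types scores higher.
d_pfge = simpson_discriminatory_index([8, 6, 6, 5, 5, 4, 4, 4, 4, 3, 3, 3, 3, 2, 2, 1, 1])
d_api = simpson_discriminatory_index([12, 10, 9, 8, 7, 6, 5, 4, 2, 1])
```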

Relevance:

30.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed over the last four decades. Most of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model that combines part of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, together with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model; it is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique.
Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of the thesis.
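
As a rough illustration of the simulation-based network approach (not the thesis's actual VERT/CYCLONE hybrid; the activities and durations below are invented), a Monte Carlo pass over a small precedence network might look like this:

```python
import random

# Toy activity network: activity -> (predecessors, (min, mode, max) duration, days).
# Listed in topological order, so plain insertion-order iteration is safe.
ACTIVITIES = {
    "excavate":   ([],                    (3, 5, 9)),
    "foundation": (["excavate"],          (4, 6, 10)),
    "frame":      (["foundation"],        (8, 12, 20)),
    "services":   (["foundation"],        (5, 7, 12)),
    "finish":     (["frame", "services"], (6, 9, 15)),
}

def simulate_duration(rng):
    """One Monte Carlo pass: sample each activity's duration from a
    triangular distribution and propagate earliest finish times."""
    finish = {}
    for act, (preds, (lo, mode, hi)) in ACTIVITIES.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[act] = start + rng.triangular(lo, hi, mode)
    return max(finish.values())

rng = random.Random(42)
runs = [simulate_duration(rng) for _ in range(5000)]
mean_duration = sum(runs) / len(runs)
p90 = sorted(runs)[int(0.9 * len(runs))]  # 90th-percentile completion time
```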

Relevance:

30.00%

Publisher:

Abstract:

Ultrasonics offers the possibility of developing sophisticated fluid manipulation tools in lab-on-a-chip technologies. Here we demonstrate the ability to shape ultrasonic fields by using phononic lattices, patterned on a disposable chip, to carry out the complex sequence of fluidic manipulations required to detect the rodent malaria parasite Plasmodium berghei in blood. To illustrate the different tools that are available to us, we used acoustic fields to produce the rotational vortices required to mechanically lyse both the red blood cells and the parasitic cells present in a drop of blood. This procedure was followed by the amplification of parasitic genomic sequences, using different acoustic fields and frequencies to heat the sample and perform a real-time PCR amplification. The system requires neither lytic reagents nor enrichment steps, making it suitable for further integration into lab-on-a-chip point-of-care devices. This acoustic sample preparation and PCR enables us to detect ca. 30 parasites in a microliter-sized blood sample, the same order of magnitude in sensitivity as lab-based PCR tests. Unlike other lab-on-a-chip methods, where the sample moves through channels, here we use our ability to shape the acoustic fields in a frequency-dependent manner to provide different analytical functions. The methods also provide a clear route toward the integration of PCR to detect pathogens in a single handheld system.

Relevance:

30.00%

Publisher:

Abstract:

A comprehensive investigation of sensitive ecosystems in South Florida, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments, is reported. This study presents the development and validation of a method for the fractionation and isolation from surface water of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including e.g. 2,4-D, MCPA, dichlorprop, mecoprop and picloram. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode (ESI-) in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest. The application of laser ablation ICP-MS (LA-ICP-MS) methodology to the analysis of soils and sediments is also reported. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine and rapid method to monitor potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida to conduct a screening of baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of the elemental composition as a tool for environmental forensics. A LA-ICP-MS protocol was also developed and optimized for the analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed a rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions.
These distributions were used to assess, qualitatively and quantitatively, differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
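
An internal/external standardization strategy of the kind mentioned above is commonly implemented through a relative sensitivity factor (RSF); a minimal sketch with invented intensities and concentrations (not the study's data):

```python
def rsf(i_analyte_std, i_is_std, c_analyte_std, c_is_std):
    """Relative sensitivity factor of the analyte versus the internal
    standard, measured on a certified reference material (external standard)."""
    return (i_analyte_std / i_is_std) * (c_is_std / c_analyte_std)

def quantify(i_analyte, i_is, c_is, rsf_value):
    """Analyte concentration in the sample from the measured intensity
    ratio, the known internal-standard concentration, and the RSF."""
    return (i_analyte / i_is) * c_is / rsf_value

# Hypothetical numbers: an analyte quantified against a matrix-element
# internal standard of known concentration (all values in counts and ppm).
f = rsf(i_analyte_std=5.0e4, i_is_std=2.0e5, c_analyte_std=25.0, c_is_std=80000.0)
c_sample = quantify(i_analyte=1.2e4, i_is=1.8e5, c_is=75000.0, rsf_value=f)
```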

Relevance:

30.00%

Publisher:

Abstract:

There are situations in which it is very important to quickly and positively identify an individual. Examples include suspects detained in the neighborhood of a bombing or terrorist incident, individuals detained while attempting to enter or leave the country, and victims of mass disasters. Systems utilized for these purposes must be fast, portable, and easy to maintain. The goal of this project was to develop an ultra-fast, direct PCR method for forensic genotyping of oral swabs. The procedure developed eliminates the need for cellular digestion and extraction of the sample by performing those steps in the PCR tube itself. Special high-speed polymerases are then added, capable of amplifying a newly developed 7-locus multiplex in under 16 minutes. Following the amplification, a postage-stamp-sized microfluidic device, equipped with a specially designed entangled-polymer separation matrix, yields a complete genotype in 80 seconds. The entire process is rapid and reliable, reducing the time from sample to genotype from 1-2 days to under 20 minutes. Operation requires minimal equipment and can easily be performed with a small high-speed thermal cycler, reagents, a microfluidic device and a laptop. The system was optimized and validated using a number of test parameters and a small test population. The overall precision was better than 0.17 bp and provided a power of discrimination greater than 1 in 10^6. The small footprint and ease of use will permit this system to be an effective tool to quickly screen and identify individuals detained at ports of entry, police stations and remote locations. The system is robust and portable, and demonstrates to the forensic community a simple solution to the problem of rapid determination of genetic identity.
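
A power of discrimination of this kind follows from multiplying per-locus genotype frequencies across independent, unlinked loci; a sketch with hypothetical frequencies for a 7-locus multiplex:

```python
def random_match_probability(genotype_freqs):
    """Overall random match probability: the product of per-locus
    genotype frequencies (independence assumed across unlinked loci)."""
    p = 1.0
    for f in genotype_freqs:
        p *= f
    return p

# Hypothetical per-locus genotype frequencies for a 7-locus multiplex.
freqs = [0.12, 0.09, 0.15, 0.08, 0.11, 0.10, 0.13]
rmp = random_match_probability(freqs)
power_of_discrimination = 1.0 / rmp  # "1 in N" figure of merit
```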

Relevance:

30.00%

Publisher:

Abstract:

Stable isotope analysis has emerged as one of the primary means for examining the structure and dynamics of food webs, and numerous analytical approaches are now commonly used in the field. Techniques range from simple, qualitative inferences based on the isotopic niche, to Bayesian mixing models that can be used to characterize food-web structure at multiple hierarchical levels. We provide a comprehensive review of these techniques, and thus a single reference source to help identify the most useful approaches to apply to a given data set. We structure the review around four general questions: (1) what is the trophic position of an organism in a food web?; (2) which resource pools support consumers?; (3) what additional information does relative position of consumers in isotopic space reveal about food-web structure?; and (4) what is the degree of trophic variability at the intrapopulation level? For each general question, we detail different approaches that have been applied, discussing the strengths and weaknesses of each. We conclude with a set of suggestions that transcend individual analytical approaches, and provide guidance for future applications in the field.
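
The simplest instance of question (2), which resource pools support consumers, is the two-source, one-isotope linear mixing model; a minimal sketch with hypothetical delta-13C values (per mil):

```python
def two_source_mixing(d_consumer, d_source_a, d_source_b, trophic_shift=0.0):
    """Proportion of source A in the consumer's diet from a single isotope
    tracer, after correcting the consumer value for trophic discrimination:
    p_A = (d_consumer_corrected - d_B) / (d_A - d_B)."""
    corrected = d_consumer - trophic_shift
    return (corrected - d_source_b) / (d_source_a - d_source_b)

# Hypothetical values: consumer at -22.0, sources at -28.0 and -14.0,
# with a 0.5 per-mil trophic discrimination factor for carbon.
p_a = two_source_mixing(d_consumer=-22.0, d_source_a=-28.0,
                        d_source_b=-14.0, trophic_shift=0.5)
```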

Relevance:

30.00%

Publisher:

Abstract:

This review discusses menu analysis models in depth to identify the models' strengths and weaknesses in an attempt to discover opportunities to enhance existing models and evolve menu analysis toward a comprehensive analytical model.

Relevance:

30.00%

Publisher:

Abstract:

Capillary electrophoresis (CE) is a modern analytical technique in which electrokinetic separation, driven by a high voltage, takes place inside small capillaries. In this dissertation, several advanced capillary electrophoresis methods are presented using different CE approaches, with UV absorbance and mass spectrometry utilized as the detection methods. Capillary electrochromatography (CEC), one of the CE modes, is a recently developed technique that is a hybrid of capillary electrophoresis and high-performance liquid chromatography (HPLC) and exhibits advantages of both techniques. In Chapter 2, a monolithic capillary column is fabricated using an in situ photoinitiated polymerization method; the column was then applied to the separation of six antidepressant compounds. Meanwhile, a simple chiral separation method is developed and presented in Chapter 3. Beta-cyclodextrin was utilized to achieve chiral separation: not only were twelve cathinone analytes separated, but the isomers of several analytes were also enantiomerically resolved. To obtain better molecular information on the analytes, a TOF-MS system was coupled with the CE. A sheath liquid and a partial-filling technique (PFT) were employed to reduce contamination of the MS ionization source, and accurate molecular information was obtained. It is necessary to propose, develop, and optimize new techniques that are suitable for trace-level analysis of samples in forensic, pharmaceutical, and environmental applications. Capillary electrophoresis was selected for this task, as it requires smaller amounts of sample, simplifies sample preparation, and has the flexibility to separate neutral and charged molecules as well as enantiomers. Overall, the study demonstrates the versatility of capillary electrophoresis methods in forensic, pharmaceutical, and environmental applications.
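
For context, the basic quantity reported in CE method development is the apparent electrophoretic mobility, computed from the capillary geometry, applied voltage, and migration time; a sketch with hypothetical run parameters:

```python
def apparent_mobility(l_detector_cm, l_total_cm, voltage_v, t_mig_s):
    """Apparent electrophoretic mobility (cm^2 V^-1 s^-1):
    mu_app = (L_d * L_t) / (V * t), where L_d is the capillary length
    to the detector, L_t the total length, V the applied voltage,
    and t the observed migration time."""
    return (l_detector_cm * l_total_cm) / (voltage_v * t_mig_s)

# Hypothetical run: 50 cm to detector, 60 cm total, 25 kV, 4.5 min migration.
mu = apparent_mobility(l_detector_cm=50.0, l_total_cm=60.0,
                       voltage_v=25000.0, t_mig_s=4.5 * 60.0)
```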

Relevance:

30.00%

Publisher:

Abstract:

This paper details methodologies that have been explored for the fast proofing of on-chip architectures for circular dichroism techniques. Flow-cell devices fabricated from UV-transparent quartz are used for these experiments. The complexity of flow-cell production typically results in lead times of six months from order to delivery. Only at that point can the on-chip architecture be tested empirically and any required modifications determined, ready for the next six-month iteration phase. By using the proposed 3D printing and PDMS moulding techniques for fast proofing of on-chip architectures, the optimum design can be determined within a matter of hours prior to committing to quartz chip production.

Relevance:

30.00%

Publisher:

Abstract:

As the pressure on Diamond and the world's synchrotrons for higher throughput of diffraction experiments continues to grow, new and novel techniques are required for presenting micron-dimension crystals to the X-ray beam. Currently this task is both labour-intensive and primarily a serial process. Diffraction measurements typically take milliseconds, but sample preparation and presentation can reduce throughput to 4 measurements an hour. With beamline waiting times as long as two years, it is of key importance for researchers to capitalize on available beam time, generating as much data as possible. Other approaches detailed in the literature [1] [2] [3] are very much skewed towards automating, with robotics, the actions of human protocols. The work detailed here is the development and discussion of a bottom-up approach relying on standing surface acoustic wave (SSAW) self-assembly, covering material selection, microfluidic integration and tuning of the acoustic cavity to order the protein crystals.

Relevance:

30.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
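
Under the task-based definition, the observer's performance is often summarized by a detectability index d'. A minimal sketch of the non-prewhitening matched-filter form, d'^2 = (s^T s)^2 / (s^T K s), on an invented toy signal in white noise (not the dissertation's actual observer implementation):

```python
import numpy as np

def dprime_npw(signal, noise_cov):
    """Non-prewhitening matched-filter detectability index:
    d'^2 = (s^T s)^2 / (s^T K s), with s the expected signal
    (difference of class means) and K the noise covariance."""
    s = signal.ravel()
    num = (s @ s) ** 2
    den = s @ noise_cov @ s
    return float(np.sqrt(num / den))

# Toy case: a small Gaussian blob signal (10 HU contrast) in white
# noise with standard deviation 2 HU, on a 9x9 pixel grid.
x = np.arange(-4, 5)
xx, yy = np.meshgrid(x, x)
blob = 10.0 * np.exp(-(xx ** 2 + yy ** 2) / 4.0)
cov = 4.0 * np.eye(blob.size)  # white noise, variance 4 HU^2
d = dprime_npw(blob, cov)
```

For white noise the expression collapses to d' = ||s|| / sigma, which makes the toy case easy to check by hand.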

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
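
An image-subtraction technique of this kind rests on the fact that subtracting two repeated scans cancels the deterministic structure and leaves a difference image whose noise is inflated by sqrt(2); a sketch on synthetic data:

```python
import numpy as np

def quantum_noise_from_pair(img_a, img_b):
    """Noise magnitude from two repeated scans of the same phantom:
    the (identical) structure cancels in the difference image, whose
    standard deviation is sqrt(2) times the single-image noise."""
    diff = img_a.astype(float) - img_b.astype(float)
    return diff.std() / np.sqrt(2.0)

# Synthetic check: identical background plus independent Gaussian noise
# with standard deviation 8 HU in each scan.
rng = np.random.default_rng(0)
background = 50.0 * np.ones((256, 256))
scan_1 = background + rng.normal(0.0, 8.0, background.shape)
scan_2 = background + rng.normal(0.0, 8.0, background.shape)
sigma = quantum_noise_from_pair(scan_1, scan_2)
```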

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
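
For reference, the standard rectangular-ROI NPS estimate (which the dissertation's contribution extends to irregularly shaped ROIs; that extension is not reproduced here) can be sketched as:

```python
import numpy as np

def nps_2d(rois, pixel_size_mm):
    """Ensemble-averaged 2-D noise power spectrum from square, detrended
    noise ROIs: NPS = <|DFT(roi - mean)|^2> * (dx * dy) / (Nx * Ny)."""
    n = rois[0].shape[0]
    acc = np.zeros((n, n))
    for roi in rois:
        detrended = roi - roi.mean()
        acc += np.abs(np.fft.fft2(detrended)) ** 2
    return acc / len(rois) * (pixel_size_mm ** 2) / (n * n)

# Synthetic sanity check: white noise gives a flat NPS whose integral
# over frequency recovers the pixel variance (sigma^2 = 25 here).
rng = np.random.default_rng(1)
rois = [rng.normal(0.0, 5.0, (64, 64)) for _ in range(200)]
nps = nps_2d(rois, pixel_size_mm=0.5)
# Frequency bin area is (1 / (N * dx))^2, so summing NPS * df^2 ~ variance.
recovered_var = nps.sum() * (1.0 / (64 * 0.5)) ** 2
```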

To move beyond assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.
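
The clustered-lumpy-background idea, blob positions scattered around randomly placed cluster centers, can be sketched as follows (this toy version uses symmetric Gaussian blobs and fixed counts, unlike the published CLB model, which uses asymmetric blob profiles and Poisson-distributed counts):

```python
import numpy as np

def clustered_lumpy_background(shape, n_clusters, blobs_per_cluster,
                               cluster_sigma, blob_sigma, amplitude, rng):
    """Toy clustered lumpy background: cluster centers are placed
    uniformly at random, blob centers scatter around them, and each
    blob adds a symmetric Gaussian lump to the image."""
    img = np.zeros(shape)
    yy, xx = np.indices(shape)
    for _ in range(n_clusters):
        cy, cx = rng.uniform(0, shape[0]), rng.uniform(0, shape[1])
        for _ in range(blobs_per_cluster):
            by = cy + rng.normal(0.0, cluster_sigma)
            bx = cx + rng.normal(0.0, cluster_sigma)
            img += amplitude * np.exp(-((yy - by) ** 2 + (xx - bx) ** 2)
                                      / (2.0 * blob_sigma ** 2))
    return img

rng = np.random.default_rng(7)
texture = clustered_lumpy_background((128, 128), n_clusters=20,
                                     blobs_per_cluster=8, cluster_sigma=6.0,
                                     blob_sigma=3.0, amplitude=5.0, rng=rng)
```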

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
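
A lesion model of the kind described, size, contrast, and a smooth edge profile expressed as an analytical equation, might be voxelized like this (2-D slice for brevity; all parameter values invented):

```python
import numpy as np

def lesion_model(shape, center, radius_mm, contrast_hu, edge_width_mm, pixel_mm):
    """Radially symmetric lesion: a plateau of `contrast_hu` inside
    `radius_mm`, falling off through a sigmoid edge of width `edge_width_mm`."""
    yy, xx = np.indices(shape)
    r = np.hypot(yy - center[0], xx - center[1]) * pixel_mm
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_width_mm))

# Hypothetical subtle hypodense liver lesion: -15 HU contrast, 5 mm radius.
lesion = lesion_model((64, 64), (32, 32), radius_mm=5.0,
                      contrast_hu=-15.0, edge_width_mm=0.8, pixel_mm=0.7)
# "Hybrid" image: the model added to a (here synthetic) background patch.
background = np.full((64, 64), 60.0)
hybrid = background + lesion
```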

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Also, lesion-less images were reconstructed. Noise, contrast, CNR, and detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
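
The link between an observer model's d' and a 2AFC experiment is PC = Phi(d'/sqrt(2)), with Phi the standard normal CDF; a small sketch, including a bisection inverse for estimating d' from a measured percent correct:

```python
import math

def two_afc_percent_correct(d_prime):
    """Expected 2AFC percent correct for a given detectability index:
    PC = Phi(d' / sqrt(2)) = 0.5 * (1 + erf(d' / 2))."""
    return 0.5 * (1.0 + math.erf(d_prime / 2.0))

def d_prime_from_pc(pc, tol=1e-10):
    """Invert PC -> d' by bisection (PC is monotone in d' >= 0)."""
    lo, hi = 0.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if two_afc_percent_correct(mid) < pc:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

pc_chance = two_afc_percent_correct(0.0)  # d' = 0 gives chance performance
```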

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.