976 results for: colombo itetris ns-3 VANET monitoraggio traffico veicoli ITS Intelligent Transport System


Relevance: 100.00%

Abstract:

Despite the typically low population densities and animal-mediated pollination of tropical forest trees, outcrossing and long-distance pollen dispersal are the norm. We reviewed the genetic literature on mating systems and pollen dispersal for neotropical trees to identify the ecological and phylogenetic correlates. The 36 studies surveyed found >90% outcrossed mating for 45 hermaphroditic or monoecious species. Self-fertilization rates varied inversely with population density and showed phylogenetic and geographic trends. The few direct measures of pollen flow (N = 11 studies) suggest that pollen dispersal is widespread among low-density tropical trees, ranging from a mean of 200 m to over 19 km for species pollinated by small insects or bats. Future research needs to examine (1) the effect of inbreeding depression on observed outcrossing rates, (2) pollen dispersal in a wide range of pollination syndromes and ecological classes, and (3) the range of variation of mating system expression at different hierarchical levels, including individual, seasonal, population, ecological, landscape, and range-wide scales.

Relevance: 100.00%

Abstract:

The cueO gene of Escherichia coli encodes a multi-copper oxidase, which contributes to copper tolerance in this bacterium. It was observed that a cueO mutant was highly sensitive to killing by copper ions when cells were grown on defined minimal media. Copper sensitivity was correlated with accumulation of copper in the mutant strain. Growth of the cueO mutant in the presence of copper could be restored by addition of divalent zinc and manganese ions or ferrous iron, but not by other first-row transition metal ions or magnesium ions. Copper toxicity towards a cueO mutant could also be suppressed by addition of the superoxide quencher 1,2-dihydroxybenzene-3,5-disulfonic acid (tiron), suggesting that a primary cause of copper toxicity is the copper-catalyzed production of superoxide anions in the cytoplasm. (C) 2005 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

Large and powerful ocean predators such as swordfishes, some tunas, and several shark species are unique among fishes in that they are capable of maintaining elevated body temperatures (endothermy) when hunting for prey in deep and cold water [1-3]. In these animals, warming the central nervous system and the eyes is the one common feature of this energetically costly adaptation [4]. In the swordfish (Xiphias gladius), a highly specialized heating system located in an extraocular muscle specifically warms the eyes and brain up to 10°C-15°C above ambient water temperatures [2, 5]. Although the function of neural warming in fishes has been the subject of considerable speculation [1, 6, 7], the biological significance of this unusual ability has until now remained unknown. We show here that warming the retina significantly improves temporal resolution, and hence the detection of rapid motion, in fast-swimming predatory fishes such as the swordfish. Depending on diving depth, temporal resolution can be more than ten times greater in these fishes than in fishes with eyes at the same temperature as the surrounding water. The enhanced temporal resolution allowed by heated eyes provides warm-blooded and highly visual oceanic predators, such as swordfishes, tunas, and sharks, with a crucial advantage over their agile, cold-blooded prey.

Relevance: 100.00%

Abstract:

The uptake and metabolism profiles of ginsenoside Rh2 and its aglycon protopanaxadiol (ppd) were studied in the human epithelial Caco-2 cell line. High-performance liquid chromatography-mass spectrometry was applied to determine the concentrations of Rh2 and its aglycon ppd in the cells at different pH values, temperatures and concentration levels, and in the presence or absence of inhibitors. Rh2 uptake was time- and concentration-dependent, and its uptake rates were reduced by metabolic inhibitors and influenced by low temperature, indicating that the absorption process was energy-dependent. Drug uptake was maximal at an extracellular pH of 7.0 for Rh2 and 8.0 for ppd. Kinetic analysis of Rh2 showed that a non-saturable component (K_d = 0.17 nmol·h⁻¹·mg⁻¹ protein) and an active transport system with a K_m of 3.95 µmol·l⁻¹ and a V_max of 4.78 nmol·h⁻¹·mg⁻¹ protein were responsible for the drug uptake. Kinetic analysis of ppd showed a non-saturable component (K_d = 0.78 nmol·h⁻¹·mg⁻¹ protein). It was suggested that active extrusion by P-glycoprotein and drug degradation in the intestine may influence Rh2 bioavailability.
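The reported constants correspond to the standard two-component uptake model, a saturable Michaelis-Menten carrier plus a non-saturable passive term. The sketch below simply evaluates that model with the reported values; treating K_d as a first-order coefficient per unit substrate concentration is an assumption made for illustration, not something stated in the abstract.

```python
# Minimal sketch of a two-component uptake model: a saturable carrier-mediated
# term plus a non-saturable (passive) term. Parameter values are taken from the
# abstract; interpreting K_d as a first-order coefficient per unit concentration
# is an assumption for illustration only.

def rh2_uptake_rate(c_umol_per_l,
                    vmax=4.78,   # nmol / h / mg protein
                    km=3.95,     # umol / l
                    kd=0.17):    # nmol / h / mg protein per umol/l (assumed)
    """Total uptake rate (nmol/h/mg protein) at substrate concentration c."""
    saturable = vmax * c_umol_per_l / (km + c_umol_per_l)   # carrier-mediated transport
    non_saturable = kd * c_umol_per_l                       # passive diffusion component
    return saturable + non_saturable

if __name__ == "__main__":
    for c in (0.5, 2.0, 5.0, 20.0):
        print(f"C = {c:5.1f} umol/l -> uptake ~ {rh2_uptake_rate(c):.2f} nmol/h/mg protein")
```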

Relevance: 100.00%

Abstract:

Essential hypertension is one of the most common diseases in the Western world, affecting about 26.4% of the adult population, and it is increasing (1). Its causes are heterogeneous and include genetic and environmental factors (2), but several observations point to an important role of the kidney in its genesis (3). In addition to variations in tubular transport mechanisms that could, for example, affect salt handling, structural characteristics of the kidney might also contribute to hypertension. The burden of chronic kidney disease is also increasing worldwide, due to population growth, increasing longevity, and changing risk factors. Although single-cause models of disease are still widely promoted, multideterminant or multihit models that can accommodate multiple risk factors in an individual or in a population are probably more applicable (4,5). In such a framework, nephron endowment is one potential determinant of disease susceptibility. Some time ago, Brenner and colleagues (6,7) proposed that lower nephron numbers predispose both to essential hypertension and to renal disease. They also proposed that hypertension and progressive renal insufficiency might be initiated and accelerated by glomerular hypertrophy and intraglomerular hypertension that develop as nephron number is reduced (8). In this review, we summarize data from recent studies that shed more light on these hypotheses. The data supply a new twist to possible mechanisms of the Barker hypothesis, which proposes that intrauterine growth retardation predisposes to chronic disease in later life (9). The review describes how nephron number is estimated, its range, and some of its determinants and morphologic correlates. It then considers possible causes of low nephron numbers. Finally, associations of hypertension and renal disease with reduced nephron numbers are considered, and some potential clinical implications are discussed.

Relevance: 100.00%

Abstract:

Researchers often use 3-way interactions in moderated multiple regression analysis to test the joint effect of 3 independent variables on a dependent variable. However, further probing of significant interaction terms varies considerably and is sometimes error prone. The authors developed a significance test for slope differences in 3-way interactions and illustrate its importance for testing psychological hypotheses. Monte Carlo simulations revealed that sample size, magnitude of the slope difference, and data reliability affected test power. Application of the test to published data yielded detection of some slope differences that were undetected by alternative probing techniques and led to changes of results and conclusions. The authors conclude by discussing the test's applicability for psychological research. Copyright 2006 by the American Psychological Association.
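To make the probing procedure concrete, the sketch below fits a three-way interaction model on simulated data and tests the difference between two simple slopes of x across levels of w at a fixed level of z as a linear contrast on the fitted coefficients. Variable names, probing values, and the data are illustrative; this is a generic contrast-based version of the idea, not necessarily the authors' exact test.

```python
# Sketch: probing a 3-way interaction (x * z * w) by testing the difference
# between two simple slopes of x across levels of w at a fixed level of z.
# Variable names and the simulated data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"x": rng.normal(size=n),
                   "z": rng.normal(size=n),
                   "w": rng.normal(size=n)})
df["y"] = 0.3 * df.x + 0.2 * df.z + 0.1 * df.w + 0.4 * df.x * df.z * df.w \
          + rng.normal(size=n)

model = smf.ols("y ~ x * z * w", data=df).fit()

# Simple slope of x at (z0, w0) is b_x + b_xz*z0 + b_xw*w0 + b_xzw*z0*w0, so the
# slope difference between w_hi and w_lo at fixed z0 is
# (b_xw + b_xzw*z0) * (w_hi - w_lo); testing the bracketed contrast against
# zero is the slope-difference test.
z0 = df.z.std()                      # probe at +1 SD of z
names = list(model.params.index)
L = np.zeros(len(names))
L[names.index("x:w")] = 1.0
L[names.index("x:z:w")] = z0
print(model.t_test(L))
```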

Relevance: 100.00%

Abstract:

Recently identified genes, msmGH, located downstream (3') of the msmEF (transport-encoding) gene cluster and 5' of the structural genes for methanesulfonate monooxygenase (MSAMO), are described from Methylosulfonomonas methylovora. Sequence analysis of the derived polypeptide sequences encoded by these genes revealed a high degree of identity to ABC-type transporters. MsmE showed similarity to a putative periplasmic substrate-binding protein, MsmF resembled an integral membrane-associated protein, and MsmG was a putative ATP-binding enzyme. MsmH was thought to be the cognate permease component of the sulfonate transport system. The close association of these putative transport genes with the MSAMO structural genes msmABCD suggested a role for these genes in the transport of methanesulfonic acid (MSA) into M. methylovora. msmEFGH and msmABCD constituted two operons for the coordinated expression of MSAMO and the MSA transporter systems. Reverse transcription-PCR analysis of msmABCD and msmEFGH revealed differential expression of these genes during growth on MSA and methanol. The msmEFGH operon was constitutively expressed, whereas MSA induced expression of msmABCD. A mutant defective in msmE had considerably slower growth rates than the wild type, thus supporting the proposed role of MsmE in the transport of MSA into M. methylovora.

Relevance: 100.00%

Abstract:

This thesis is concerned with the means by which the state in Britain has attempted to influence the technological development of private industry in the period 1945-1979. Particular emphasis is laid on assessing the abilities of technology policy measures to promote innovation. With that objective, the innovation literature is selectively reviewed to draw up an analytical framework to evaluate the innovation content of policy (Chapter 2). Technology policy is taken to consist of the specific measures utilised by government and its agents that affect the technological behaviour of firms. The broad sweep of policy during the period under consideration is described in Chapter 3 which concentrates on elucidating its institutional structure and the activities of the bodies involved. The empirical core of the thesis consists of three parallel case studies of policy toward the computer, machine tool and textile machinery industries (Chapters 4-6). The studies provide detailed historical accounts of the development and composition of policy, relating it to its specific institutional and industrial contexts. Each reveals a different pattern and level of state intervention. The thesis concludes with a comparative review of the findings of the case studies within a discussion centred on the arguments presented in Chapter 2. Topics arising include the state's differential support for the range of activities involved in innovation, the location of state-funded R&D, the encouragement of supplier-user contact, and the difficulties raised in adoption and diffusion.

Relevance: 100.00%

Abstract:

The thesis aims to provide empirical studies of Chinese corporate governance. Since China established its stock exchange system in the 1990s, it has gone through different stages of change to become a more market-oriented system. Extensive studies have been conducted on Chinese corporate governance; however, many were theoretical discussions focusing on the early stages, and there is a general lack of empirical analysis. This thesis provides three empirical analyses of Chinese corporate governance: the overall efficiency of market discipline, the impact of capital structure on agency costs, and the status of the 2005-2006 reform that substantially modified the ownership structure of Chinese listed firms and separated ownership and control. The three empirical studies were selected to reflect four key issues that need to be answered. The first empirical study uses an event study to detect market discipline at the collective level; it fills a gap in the Chinese stock market literature as the first to use cross-market data to test market discipline. The second empirical study endeavoured to contribute to the existing corporate governance literature regarding capital structure and agency costs. Two conclusions can be drawn from this study: 1) for Chinese listed firms, higher gearing means higher asset turnover ratios and ROE, i.e. more debt seems to reduce agency costs; 2) the concentration level of shares appears to be unrelated to company performance, and controlling shareholders did not seem committed to improving the utilization of corporate assets or to reducing agency costs. This study addressed a key issue in Chinese corporate governance, since the state has significant shareholdings in most big listed companies. The discussion of corporate governance in the Chinese context would be completely meaningless without discussing the state's role, given that about two-thirds of all shares were non-circulating shares controlled by the state before the 2005-2006 ownership reform. The third study focused on the 2005-2006 reform of the ownership of Chinese listed firms. By collecting large-scale data covering all 64 groups of Chinese listed companies that had gone through the reform by the end of 2006 (accounting for about 97.86% and 96.76% of the total market value of the Shanghai (SSE) and Shenzhen (SZSE) Stock Exchanges, respectively), a comprehensive study of the ownership reform was conducted; this is the first and most comprehensive empirical study in this area. The study of the separation of ownership and control of listed firms is the first conducted using the ultimate ownership concept in the Chinese context.
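For readers unfamiliar with the method named in the first study, the sketch below shows a minimal market-model event study (estimation-window regression, abnormal returns, and a cumulative abnormal return test) on simulated series; the window lengths and the simple t-test are generic placeholders, not the thesis's cross-market design.

```python
# Minimal market-model event study sketch: estimate alpha/beta over an
# estimation window, then compute abnormal returns (AR) and the cumulative
# abnormal return (CAR) over the event window. Data are simulated; window
# lengths and the test are generic, not the thesis's actual design.
import numpy as np

rng = np.random.default_rng(1)
T_est, T_evt = 200, 11                       # estimation window, event window (days)
r_mkt = rng.normal(0.0004, 0.01, T_est + T_evt)
r_stk = 0.0002 + 0.9 * r_mkt + rng.normal(0, 0.015, T_est + T_evt)

# Market model fit on the estimation window: r_stk = alpha + beta * r_mkt + e
X = np.column_stack([np.ones(T_est), r_mkt[:T_est]])
beta_hat, *_ = np.linalg.lstsq(X, r_stk[:T_est], rcond=None)
resid = r_stk[:T_est] - X @ beta_hat
sigma = resid.std(ddof=2)

# Abnormal returns over the event window and a simple CAR t-statistic
ar = r_stk[T_est:] - (beta_hat[0] + beta_hat[1] * r_mkt[T_est:])
car = ar.sum()
t_car = car / (sigma * np.sqrt(T_evt))
print(f"CAR = {car:.4f}, t = {t_car:.2f}")
```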

Relevance: 100.00%

Abstract:

This paper presents a Web-Centric [3] extension to a previously developed glaucoma expert system that will provide access for doctors and patients from any part of the world. Once implemented, this telehealth solution will publish the services of the Glaucoma Expert System on the World Wide Web, allowing patients and doctors to interact with it from their own homes. This web extension will also allow the expert system itself to be proactive and to send diagnosis alerts to the registered user or doctor and to the patient, informing each of any emergencies and thereby allowing them to take immediate action. The existing Glaucoma Expert System uses fuzzy logic learning algorithms applied to historical patient data to update and improve its diagnosis rule set. This process, collectively called the learning process, would benefit greatly from a web-based framework that could provide services such as patient data transfer and web-based distribution of updated rules [1].
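A minimal sketch of how such a web-centric front end might expose the expert system is given below; the endpoint path, payload fields, and the diagnose/send_alert helpers are hypothetical stand-ins, not the actual Glaucoma Expert System interface.

```python
# Hypothetical sketch of a web front end for an expert system. Endpoint path,
# payload fields, and the diagnose()/send_alert() helpers are illustrative
# assumptions, not the actual Glaucoma Expert System interface.
from flask import Flask, request, jsonify

app = Flask(__name__)

ALERT_THRESHOLD = 0.8  # assumed risk score above which an alert is raised

def diagnose(patient_record: dict) -> dict:
    """Placeholder for the fuzzy-logic expert system call."""
    risk = min(1.0, patient_record.get("intraocular_pressure", 15) / 40.0)
    return {"glaucoma_risk": risk}

def send_alert(patient_id: str, result: dict) -> None:
    """Placeholder for notifying the registered doctor and the patient."""
    print(f"ALERT for {patient_id}: {result}")

@app.route("/diagnosis", methods=["POST"])
def diagnosis():
    record = request.get_json()
    result = diagnose(record)
    if result["glaucoma_risk"] >= ALERT_THRESHOLD:
        send_alert(record.get("patient_id", "unknown"), result)
    return jsonify(result)

if __name__ == "__main__":
    app.run()  # serves the expert system over HTTP for remote patients/doctors
```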

Relevance: 100.00%

Abstract:

In the proof of Lemma 3.1 in [1] we need to show that we may take the two points p and q with p ≠ q such that p + q + (b-2)g^1_2(C′) ∼ 2(q_1 + … + q_{b-1}), where q_1, …, q_{b-1} are points of C′; however, in [1] we did not show that p ≠ q. Moreover, we have not been able to prove this using the method of [1]. We must therefore add an additional assumption to Lemma 3.1 and rewrite the statements of our paper that follow it. The following is the correct version of Lemma 3.1 in [1], with its proof.

Relevance: 100.00%

Abstract:

Ocean acidification, recognized as a major threat to marine ecosystems, has developed into one of the fastest growing fields of research in marine sciences. Several studies on fish larval stages point to abnormal behaviours, malformations and increased mortality rates as a result of exposure to increased levels of CO2. However, other studies fail to detect any consequence, suggesting species-specific sensitivity to increased levels of CO2 and highlighting the need for further research. In this study we investigated the effects of exposure to elevated pCO2 on the behaviour, development, oxidative stress and energy metabolism of sand smelt larvae, Atherina presbyter. Larvae were caught at Arrábida Marine Park (Portugal) and exposed to different pCO2 levels (control: 600 µatm, pH = 8.03; medium: 1000 µatm, pH = 7.85; high: 1800 µatm, pH = 7.64) for up to 15 days, after which critical swimming speed (Ucrit), morphometric traits and biochemical biomarkers were determined. The measured biomarkers covered (1) oxidative stress (superoxide dismutase and catalase enzyme activities, levels of lipid peroxidation and DNA damage, and levels of superoxide anion production) and (2) energy metabolism (total carbohydrate levels, electron transport system activity, and lactate dehydrogenase and isocitrate dehydrogenase enzyme activities). Swimming speed was not affected by treatment, but exposure to increasing levels of pCO2 led to higher energetic costs and morphometric changes, with larger larvae in the high pCO2 treatment and smaller larvae in the medium pCO2 treatment. The efficient antioxidant response and the increase in energy metabolism registered only at the medium pCO2 treatment may indicate that, at higher pCO2 levels, the larvae's capacity to restore their internal balance is impaired. Our findings illustrate the need for multiple approaches when exploring the consequences of future pCO2 levels for organisms.

Relevance: 100.00%

Abstract:

During Leg ANT-XXIII/9, on 2007-04-04, the German research vessel POLARSTERN mapped a significant bathymetric feature with its swath sonar system in the area of the Indian Ridge in the Southern Indian Ocean. The feature is a volcano located 800 km northwest of Crozet Island. The volcano has an absolute height of 1370 m, rising from the surrounding sea floor at a mean depth of 3100 m to a depth of 1730 m at the top of the crater rim; the crater itself is about 135 m deep. Because the feature was discovered just a month after the fourth International Polar Year (IPY) 2007/2009 had started, it was named "IPY Seamount". The undersea feature name proposal was submitted to the International Hydrographic Organisation (IHO) and the Intergovernmental Oceanographic Commission (IOC of UNESCO) on 2007-05-11. The name was officially accepted by the GEBCO Sub-Committee on Undersea Feature Names (SCUFN) at its 20th meeting in July and was added to the GEBCO Gazetteer of Undersea Feature Names.

Relevance: 100.00%

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result of this, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
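As an illustration of how a Fourier-domain observer model turns measured resolution and noise into a single figure of merit, the sketch below computes a non-prewhitening detectability index from a task function, a resolution model (TTF), and a noise power spectrum. The Gaussian task and the TTF/NPS shapes are illustrative placeholders, not measured Mercury Phantom data.

```python
# Sketch of a Fourier-domain detectability index for a non-prewhitening (NPW)
# model observer: d'^2 = [sum(W^2 TTF^2)]^2 / sum(W^2 TTF^2 NPS). The task,
# TTF, and NPS below are illustrative placeholders, not measured data.
import numpy as np

n, pixel_mm = 256, 0.5
f = np.fft.fftfreq(n, d=pixel_mm)            # spatial frequencies (cycles/mm)
fx, fy = np.meshgrid(f, f)
fr = np.hypot(fx, fy)

# Illustrative task: detection of a Gaussian lesion, 20 HU contrast, sigma 2.5 mm
contrast_hu, sigma_mm = 20.0, 2.5
W = contrast_hu * 2 * np.pi * sigma_mm**2 * np.exp(-2 * (np.pi * sigma_mm * fr)**2)

# Illustrative system models: Gaussian-shaped TTF and a ramp-like CT NPS
ttf = np.exp(-(fr / 0.6)**2)
nps = 50.0 * fr * np.exp(-(fr / 0.8)**2) + 1e-3

dA = (f[1] - f[0])**2                        # frequency bin area
num = (np.sum(W**2 * ttf**2) * dA)**2
den = np.sum(W**2 * ttf**2 * nps) * dA
d_prime = np.sqrt(num / den)
print(f"NPW detectability index d' ~ {d_prime:.1f}")
```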

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
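The sketch below illustrates one member of the channelized Hotelling family: rotationally symmetric Laguerre-Gauss channels applied to simulated signal-present and signal-absent ensembles, with detectability computed from the channelized Hotelling test statistic. The noise model, signal, and channel parameters are illustrative, not the study's data.

```python
# Sketch of a channelized Hotelling observer (CHO) with Laguerre-Gauss channels,
# applied to simulated signal-present / signal-absent image ensembles. The noise
# model, signal, and channel parameters are illustrative, not the study's data.
import numpy as np
from numpy.polynomial.laguerre import lagval

rng = np.random.default_rng(42)
n, n_imgs = 64, 400                                   # image size, images per class
x = np.arange(n) - n / 2
xx, yy = np.meshgrid(x, x)
r2 = xx**2 + yy**2

signal = 1.5 * np.exp(-r2 / (2 * 3.0**2))             # Gaussian lesion template

def correlated_noise():
    """White noise blurred with a Gaussian kernel to add spatial correlation."""
    w = rng.normal(size=(n, n))
    kernel = np.exp(-r2 / (2 * 1.5**2))
    return np.real(np.fft.ifft2(np.fft.fft2(w) * np.fft.fft2(np.fft.ifftshift(kernel))))

absent  = np.stack([correlated_noise() for _ in range(n_imgs)])
present = np.stack([correlated_noise() + signal for _ in range(n_imgs)])

# Laguerre-Gauss channels, a common rotationally symmetric CHO channel choice
a, n_ch = 15.0, 6
gr = 2 * np.pi * r2 / a**2
channels = np.stack([np.exp(-gr / 2) * lagval(gr, [0] * j + [1]) for j in range(n_ch)])
U = channels.reshape(n_ch, -1).T                      # (pixels, channels)

va = absent.reshape(n_imgs, -1) @ U                   # channelized data, absent class
vp = present.reshape(n_imgs, -1) @ U                  # channelized data, present class
S = 0.5 * (np.cov(va.T) + np.cov(vp.T))               # intra-class channel covariance
w = np.linalg.solve(S, vp.mean(0) - va.mean(0))       # Hotelling template

ta, tp = va @ w, vp @ w                               # decision variables
d_a = (tp.mean() - ta.mean()) / np.sqrt(0.5 * (tp.var() + ta.var()))
print(f"CHO detectability d_a ~ {d_a:.2f}")
```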

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
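The subtraction idea can be summarized in a few lines: for two registered repeat scans, the deterministic phantom background cancels in the difference image, and the per-image quantum noise is the standard deviation of the difference divided by sqrt(2). A minimal sketch with simulated arrays:

```python
# Sketch of the repeated-scan subtraction technique for isolating quantum noise:
# subtracting two registered repeat images cancels the (deterministic) phantom
# background, and dividing the difference's standard deviation by sqrt(2)
# recovers the per-image noise. Arrays here are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
background = rng.normal(0, 30, (256, 256)).cumsum(axis=0) / 10   # stand-in texture
scan1 = background + rng.normal(0, 12, background.shape)         # repeat scan 1
scan2 = background + rng.normal(0, 12, background.shape)         # repeat scan 2

diff = scan1 - scan2                     # texture cancels, noise adds in quadrature
noise_std = diff.std() / np.sqrt(2)
print(f"Estimated quantum noise ~ {noise_std:.1f} HU (true value: 12 HU)")
```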

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing the image quality of iterative algorithms.
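For context, the sketch below shows the standard ensemble NPS estimate from square noise-only ROIs (periodogram averaging with the usual pixel-area normalization); the dissertation's contribution extends this idea to irregularly shaped ROIs, which the sketch does not attempt.

```python
# Sketch of a standard ensemble noise power spectrum (NPS) estimate from square
# ROIs; the dissertation's method extends this idea to irregularly shaped ROIs.
# Noise realizations here are simulated white noise with sigma = 10 HU.
import numpy as np

rng = np.random.default_rng(3)
n_roi, roi, pix = 200, 64, 0.5                      # ROIs, ROI size (px), pixel size (mm)
noise_rois = rng.normal(0, 10, (n_roi, roi, roi))   # stand-in noise-only ROIs

nps = np.zeros((roi, roi))
for g in noise_rois:
    g = g - g.mean()                                # remove the DC/mean component
    nps += np.abs(np.fft.fft2(g))**2
nps *= pix * pix / (roi * roi * n_roi)              # normalize to HU^2 * mm^2

# Radially averaged 1-D NPS for reporting noise texture
f = np.fft.fftfreq(roi, d=pix)
fr = np.hypot(*np.meshgrid(f, f))
bins = np.linspace(0, f.max(), 20)
idx = np.digitize(fr.ravel(), bins)
nps_1d = np.array([nps.ravel()[idx == i].mean() for i in range(1, len(bins))])

print("Radial NPS (first bins):", np.round(nps_1d[:5], 3))
print("Integral of NPS ~ noise variance:", round(nps.sum() * (f[1] - f[0])**2, 1), "vs 100")
```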

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that, at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.
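To give a feel for this class of textures, the sketch below implements a simplified lumpy background (Gaussian blobs at Poisson-random positions summed into an image); the full Clustered Lumpy Background model adds a second, clustered placement level and anisotropic blob profiles, and the parameters here are illustrative rather than genetically optimized ones.

```python
# Simplified lumpy-background generator in the spirit of the Clustered Lumpy
# Background: a Poisson-random number of blobs at uniform random positions is
# summed to form a texture. The full CLB model adds clustered placement and
# anisotropic blob profiles; parameters here are illustrative.
import numpy as np

def lumpy_background(n=256, mean_blobs=150, amp=10.0, sigma=6.0, seed=0):
    rng = np.random.default_rng(seed)
    img = np.zeros((n, n))
    xx, yy = np.meshgrid(np.arange(n), np.arange(n))
    for _ in range(rng.poisson(mean_blobs)):          # Poisson number of blobs
        cx, cy = rng.uniform(0, n, size=2)            # uniform random blob centre
        img += amp * np.exp(-((xx - cx)**2 + (yy - cy)**2) / (2 * sigma**2))
    return img

texture = lumpy_background()
print("texture mean/std:", round(texture.mean(), 1), round(texture.std(), 1))
```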

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
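A minimal sketch of such an analytical lesion model is shown below: a spherical lesion parameterized by radius, contrast, and edge sharpness through a sigmoid radial profile, voxelized so it could be added to a patient volume to form a hybrid image. All parameter values are illustrative placeholders, not the dissertation's fitted models.

```python
# Sketch of an analytical lesion model: a spherical lesion parameterized by
# size (radius), contrast, and edge sharpness via a sigmoid radial profile,
# voxelized so it could be added to a patient image volume. Parameter values
# are illustrative placeholders.
import numpy as np

def voxelize_lesion(shape=(64, 64, 64), voxel_mm=(0.5, 0.7, 0.7),
                    radius_mm=6.0, contrast_hu=-15.0, edge_mm=0.8,
                    center_mm=None):
    """Return a voxelized lesion (HU offsets) with a sigmoid edge profile."""
    if center_mm is None:
        center_mm = [s * v / 2 for s, v in zip(shape, voxel_mm)]
    z, y, x = [np.arange(s) * v for s, v in zip(shape, voxel_mm)]
    zz, yy, xx = np.meshgrid(z, y, x, indexing="ij")
    r = np.sqrt((zz - center_mm[0])**2 + (yy - center_mm[1])**2 + (xx - center_mm[2])**2)
    # Sigmoid edge: ~full contrast inside the radius, smooth roll-off of width edge_mm
    return contrast_hu / (1.0 + np.exp((r - radius_mm) / edge_mm))

lesion = voxelize_lesion()
# A "hybrid" image would be formed by adding this array to a patient volume:
# hybrid = patient_volume + lesion
print("peak lesion contrast (HU):", round(float(lesion.min()), 1))
```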

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.