963 results for multi-source noise


Relevance: 30.00%

Abstract:

The purpose of this article was to review the strategies to control patient dose in adult and pediatric computed tomography (CT), taking into account the change of technology from single-detector row CT to multi-detector row CT. First, the relationships between the computed tomography dose index, the dose length product, and the effective dose in adult and pediatric CT are reviewed, along with the diagnostic reference level concept. Then the effect of image noise as a function of volume computed tomography dose index, reconstructed slice thickness, and patient size is described. Finally, the potential of tube current modulation CT is discussed.
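
The relationships between these dose descriptors are commonly summarized as DLP = CTDIvol × scan length and E ≈ k × DLP, where k is a body-region- and age-dependent conversion coefficient. The abstract does not give code; the following is a minimal sketch with illustrative adult k values (placeholders, not values from the article):

```python
# Illustrative sketch of the standard CT dose relationships discussed above.
# The k conversion coefficients (mSv per mGy*cm) are commonly cited adult
# values; actual values depend on body region, patient age and the reference
# used, so treat them as placeholders.

K_FACTORS = {
    "head": 0.0021,
    "chest": 0.014,
    "abdomen": 0.015,
}

def dose_length_product(ctdi_vol_mgy: float, scan_length_cm: float) -> float:
    """DLP (mGy*cm) = CTDIvol (mGy) * scan length (cm)."""
    return ctdi_vol_mgy * scan_length_cm

def effective_dose(dlp_mgy_cm: float, region: str) -> float:
    """Effective dose (mSv) ~= k * DLP, with k taken from the table above."""
    return K_FACTORS[region] * dlp_mgy_cm

if __name__ == "__main__":
    dlp = dose_length_product(ctdi_vol_mgy=8.0, scan_length_cm=30.0)  # 240 mGy*cm
    print(f"DLP = {dlp:.0f} mGy*cm, E = {effective_dose(dlp, 'chest'):.1f} mSv")
```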

Relevance: 30.00%

Abstract:

A major issue in the application of waveform inversion methods to crosshole ground-penetrating radar (GPR) data is the accurate estimation of the source wavelet. Here, we explore the viability and robustness of incorporating this step into a recently published time-domain inversion procedure through an iterative deconvolution approach. Our results indicate that, at least in non-dispersive electrical environments, such an approach provides remarkably accurate and robust estimates of the source wavelet even in the presence of strong heterogeneity of both the dielectric permittivity and electrical conductivity. Our results also indicate that the proposed source wavelet estimation approach is relatively insensitive to ambient noise and to the phase characteristics of the starting wavelet. Finally, there appears to be little to no trade-off between the wavelet estimation and the tomographic imaging procedures.
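
The paper's exact iterative time-domain procedure is not reproduced here; the sketch below only illustrates the underlying idea with a generic Wiener-style frequency-domain deconvolution: given an observed trace and a forward-modelled impulse response for the current model, the source wavelet estimate is obtained by deconvolving one from the other. The array names and the regularization constant are assumptions.

```python
import numpy as np

def estimate_wavelet(observed: np.ndarray, synthetic_impulse: np.ndarray,
                     eps: float = 1e-3) -> np.ndarray:
    """Wiener-style deconvolution: estimate a source wavelet w such that
    observed ~= synthetic_impulse (*) w  (convolution).

    observed          -- recorded trace (e.g. a crosshole GPR trace)
    synthetic_impulse -- forward-modelled response for a delta-like source
    eps               -- water-level regularization relative to peak spectral power
    """
    n = len(observed)
    D = np.fft.rfft(observed, n)
    G = np.fft.rfft(synthetic_impulse, n)
    water = eps * np.max(np.abs(G) ** 2)
    W = D * np.conj(G) / (np.abs(G) ** 2 + water)
    return np.fft.irfft(W, n)
```

In an iterative scheme of the kind described above, the wavelet estimate and the tomographic model update would alternate until both stabilize.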

Relevance: 30.00%

Abstract:

Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
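
As an illustration of the kind of network analysis such a toolkit delegates to scientific Python libraries, a weighted connectivity matrix can be turned into a graph and summarized with NetworkX. This is generic NetworkX usage on a synthetic matrix, not the Connectome File Format Library API:

```python
import numpy as np
import networkx as nx

# Hypothetical weighted, symmetric connectivity matrix (e.g. fiber counts
# between regions of interest); in practice this would come from a
# Diffusion MRI processing pipeline such as the Connectome Mapper.
rng = np.random.default_rng(0)
A = rng.random((8, 8))
A = np.triu(A, 1) + np.triu(A, 1).T      # symmetric, zero diagonal

G = nx.from_numpy_array(A)               # weighted undirected graph

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("mean clustering:", nx.average_clustering(G, weight="weight"))
print("node strengths:", dict(G.degree(weight="weight")))
```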

Relevance: 30.00%

Abstract:

Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in Diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008), yielding so-called Connectomes (Hagmann 2005, Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports both clinical researchers and neuroscientists in investigating such connectome data. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, which specifies networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to access a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, functionality can easily be extended for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve additional node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant data types and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
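
The interactive thresholding of edge properties mentioned in the feature list can be mimicked in a few lines of plain NetworkX. This is a generic sketch; the file name and the edge attribute name are assumptions for illustration, not part of the Connectome File Format:

```python
import networkx as nx

def threshold_edges(graph: nx.Graph, attr: str, minimum: float) -> nx.Graph:
    """Return a copy of `graph` keeping only edges whose attribute `attr`
    is >= `minimum` (e.g. a fiber count or a mean fractional anisotropy)."""
    kept = [(u, v) for u, v, d in graph.edges(data=True)
            if d.get(attr, 0.0) >= minimum]
    return graph.edge_subgraph(kept).copy()

# Hypothetical usage: networks in a connectome file are stored as GraphML.
# G = nx.read_graphml("average_connectome.graphml")
# G_strong = threshold_edges(G, attr="number_of_fibers", minimum=10)
```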

Relevance: 30.00%

Abstract:

The aim of this study was to compare the diagnostic value of post-mortem computed tomography angiography (PMCTA) with conventional ante-mortem computed tomography (CT), CT angiography (CTA) and digital subtraction angiography (DSA) in the detection and localization of the source of bleeding in cases of acute hemorrhage with fatal outcomes. The medical records and imaging scans of nine individuals who underwent a conventional ante-mortem CT scan, CTA or DSA and later died in the hospital as a result of an acute hemorrhage were reviewed. Multi-phase post-mortem CT angiography as well as medico-legal autopsies were performed. Localization accuracy of the bleeding was assessed by comparing the diagnostic findings of the different techniques. The results revealed that data from ante-mortem and post-mortem radiological examinations were similar, though PMCTA showed a higher sensitivity for detecting the hemorrhage source than did the ante-mortem radiological investigations. Comparing the results of PMCTA and conventional autopsy, a much higher sensitivity was noted for PMCTA in identifying the source of the bleeding: the vessels involved were identified in eight of nine cases using PMCTA but in only three cases through conventional autopsy. Our study showed that PMCTA, like clinical radiological investigations, is able to precisely identify lesions of arterial and/or venous vessels and thus determine the source of bleeding in cases of acute hemorrhage with fatal outcomes.

Relevance: 30.00%

Abstract:

Cases of fatal outcome after surgical intervention are autopsied to determine the cause of death and to investigate whether medical error caused or contributed to the death. For medico-legal purposes, it is imperative that autopsy findings are documented clearly. Modern imaging techniques such as multi-detector computed tomography (MDCT) and postmortem CT angiography, which is used for vascular system imaging, are useful tools for determining cause of death. The aim of this study was to determine the utility of postmortem CT angiography for the medico-legal death investigation. This study investigated 10 medico-legal cases with a fatal outcome after surgical intervention using multi-phase postmortem whole body CT angiography. A native CT scan was performed as well as three angiographic phases (arterial, venous, and dynamic) using a Virtangio® perfusion device and the oily contrast agent, Angiofil®. The results of conventional autopsy were compared to those from the radiological investigations. We also investigated whether the radiological findings affected the final interpretation of the cause of death. Causes of death were hemorrhagic shock, intracerebral hemorrhage, septic shock, and a combination of hemorrhage and blood aspiration. The diagnoses were made by conventional autopsy as well as by postmortem CT angiography. Hemorrhage played an important role in eight of ten cases. The radiological exam revealed the exact source of bleeding in seven of the eight cases, whereas conventional autopsy localized the source of bleeding only generally in five of the seven cases. In one case, neither conventional autopsy nor CT angiography identified the source of hemorrhage. We conclude that postmortem CT angiography is extremely useful for investigating deaths following surgical interventions. This technique helps document autopsy findings and allows a second examination if it is needed; specifically, it detects and visualizes the sources of hemorrhages in detail, which is often of particular interest in such cases.

Relevance: 30.00%

Abstract:

This paper discusses the qualitative comparative evaluation performed on the results of two machine translation systems with different approaches to the processing of multi-word units. It proposes a solution for overcoming the difficulties that multi-word units present to machine translation by adopting a methodology that combines the lexicon-grammar approach with the OpenLogos ontology and semantico-syntactic rules. The paper also discusses the importance of qualitative evaluation metrics to correctly evaluate the performance of machine translation engines with regard to multi-word units.

Relevance: 30.00%

Abstract:

To date, published studies of alluvial bar architecture in large rivers have been restricted mostly to case studies of individual bars and single locations. Relatively little is known about how the depositional processes and sedimentary architecture of kilometre-scale bars vary within a multi-kilometre reach or over several hundreds of kilometres downstream. This study presents Ground Penetrating Radar and core data from 11 kilometre-scale bars from the Rio Parana, Argentina. The investigated bars are located between 30 km upstream and 540 km downstream of the Rio Parana - Rio Paraguay confluence, where a significant volume of fine-grained suspended sediment is introduced into the network. Bar-scale cross-stratified sets, with lengths and widths up to 600 m and thicknesses up to 12 m, enable the distinction of large river deposits from stacked deposits of smaller rivers, but are only present in half the surface area of the bars. Up to 90% of bar-scale sets are found on top of finer-grained ripple-laminated bar-trough deposits. Bar-scale sets make up as much as 58% of the volume of the deposits in small, incipient mid-channel bars, but this proportion decreases significantly with increasing age and size of the bars. Contrary to what might be expected, a significant proportion of the sedimentary structures found in the Rio Parana is similar in scale to those found in much smaller rivers. In other words, large river deposits are not always characterized by big structures that allow a simple interpretation of river scale. However, the large scale of the depositional units in big rivers causes small-scale structures, such as ripple sets, to be grouped into thicker cosets, which indicate river scale even when no obvious large-scale sets are present. The results also show that the composition of bars differs between the studied reaches upstream and downstream of the confluence with the Rio Paraguay. Relative to other controls on downstream fining, the tributary input of fine-grained suspended material from the Rio Paraguay causes a marked change in the composition of the bar deposits. Compared to the upstream reaches, the sedimentary architecture of the downstream reaches in the top ca 5 m of mid-channel bars shows: (i) an increase in the abundance and thickness (up to metre-scale) of laterally extensive (hundreds of metres) fine-grained layers; (ii) an increase in the percentage of deposits comprised of ripple sets (to >40% in the upper bar deposits); and (iii) an increase in bar-trough deposits and a corresponding decrease in bar-scale cross-strata (<10%). The thalweg deposits of the Rio Parana are composed of dune sets, even directly downstream from the Rio Paraguay where the upper channel deposits are dominantly fine-grained. Thus, the change in sedimentary facies due to a tributary point-source of fine-grained sediment is primarily expressed in the composition of the upper bar deposits.

Relevance: 30.00%

Abstract:

After the release of the gamma-ray source catalog produced by the Fermi satellite during its first two years of operation, a significant fraction of sources still remains unassociated at lower energies. In addition to well-known high-energy emitters (pulsars, blazars, supernova remnants, etc.), theoretical expectations predict new classes of gamma-ray sources. In particular, gamma-ray emission could be associated with some of the early phases of stellar evolution, but this interesting possibility is still poorly understood. Aims: The aim of this paper is to assess the possibility of the Fermi gamma-ray source 2FGL J0607.5-0618c being associated with the massive star forming region Monoceros R2. Methods: A multi-wavelength analysis of the Monoceros R2 region is carried out using archival data at radio, infrared, X-ray, and gamma-ray wavelengths. The resulting observational properties are used to estimate the physical parameters needed to test the different physical scenarios. Results: We confirm the 2FGL J0607.5-0618c detection with improved confidence over the Fermi two-year catalog. We find that the combined effect of the multiple young stellar objects in Monoceros R2 provides a viable picture for the nature of the source.

Relevance: 30.00%

Abstract:

Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on coronary artery disease in patients. The major challenge of cardiac MRI, though, is dealing with all sources of motion that can corrupt the images and degrade the diagnostic information they provide. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) towards an easy-to-use, multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion through gold-standard imaging techniques (x-ray angiography) in patients, in order to measure the precision with which the coronary arteries assume the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur in peak systole and in mid diastole, and we responded with a new pulse sequence (T2-post) that enables peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR). The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI.
The proposed technique acquires the data continuously (free-running), instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional images of the whole heart, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows for anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle based 3D radial trajectory, which allows for continuous sampling of k-space and a retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset, while at the same time increasing the overall image quality by removing undersampling artifacts. The obtained 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow for the assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries provided by the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine images. Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows for the detection of coronary artery plaque calcification ex vivo, since it is able to detect the short-T2 components of the calcification. The motion of the heart, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired simultaneously to extract the short-T2 components of the calcification, while a water-fat separation technique allows for proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification. In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization of coronary artery plaques.
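
The key reconstruction step described above, sorting continuously acquired golden-angle radial readouts into cardiac phases after the scan, can be sketched in a few lines. This is a schematic illustration only: it uses the 2D golden-angle increment for simplicity and simulated ECG trigger times, not the thesis' 3D radial trajectory or reconstruction code.

```python
import numpy as np

GOLDEN_ANGLE_DEG = 180.0 * (3.0 - np.sqrt(5.0))   # ~111.25 deg (2D golden angle)

def bin_spokes_by_cardiac_phase(spoke_times, r_peak_times, n_phases):
    """Assign each continuously acquired radial spoke to a cardiac phase,
    retrospectively.

    spoke_times  -- acquisition time of each readout (s)
    r_peak_times -- detected R-peak times (s), e.g. from the recorded ECG
    n_phases     -- number of reconstructed cardiac phases; the temporal
                    resolution is chosen a posteriori, as in the free-running scheme
    Returns an integer phase index per spoke (-1 if outside any R-R interval).
    """
    spoke_times = np.asarray(spoke_times)
    r_peak_times = np.asarray(r_peak_times)
    idx = np.searchsorted(r_peak_times, spoke_times, side="right") - 1
    phase = np.full(spoke_times.shape, -1, dtype=int)
    valid = (idx >= 0) & (idx < len(r_peak_times) - 1)
    rr = r_peak_times[idx[valid] + 1] - r_peak_times[idx[valid]]
    frac = (spoke_times[valid] - r_peak_times[idx[valid]]) / rr
    phase[valid] = np.minimum((frac * n_phases).astype(int), n_phases - 1)
    return phase

# Example: 3000 spokes acquired every 3 ms, simulated heartbeat every 0.9 s.
t = np.arange(3000) * 0.003
angles = np.mod(np.arange(3000) * GOLDEN_ANGLE_DEG, 360.0)   # spoke orientations
phases = bin_spokes_by_cardiac_phase(t, np.arange(0.0, 10.0, 0.9), n_phases=20)
```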

Relevance: 30.00%

Abstract:

The purpose of this master's thesis was to examine how trust affects a firm's competitive advantage. Trust is a multidisciplinary phenomenon that has received considerable attention in recent research. In the continuously evolving theory of the firm, the firm's competitiveness and its sustainability have also been objects of interest. Nevertheless, the relationship between trust and competitive advantage has been studied only little. Trust has been treated implicitly in the theory of the firm, but not explicitly. This thesis analyses theories of trust and of the firm's competitive advantage, as well as their points of contact. As the outcome of the study, a model connecting these theories is presented. The thesis is theoretical and partly concept-analytical; the research approach is synthesizing and exploratory, since it aims to show the points of contact of previously separate theories. The presented model shows that the theories of trust and of the firm's sustained competitive advantage can be linked to one another, the most important connecting factor being collaboration capability. Trust is a necessary precondition for collaboration capability. The ability to collaborate is not only a very important dynamic capability in itself, but also a factor that facilitates the creation and exploitation of other dynamic capabilities. In that sense, collaboration can be seen in the theory of the firm as a meta-capability that enables the other dynamic capabilities, with trust as its precondition.

Relevance: 30.00%

Abstract:

In recent years, massive protostars have turned out to be a possible population of high-energy emitters. Among the best candidates is IRAS 16547-4247, a protostar that presents a powerful outflow with clear signatures of interaction with its environment. This source has been revealed to be a potential high-energy source because it displays non-thermal radio emission of synchrotron origin, which is evidence of relativistic particles. To improve our understanding of IRAS 16547-4247 as a high-energy source, we analyzed XMM-Newton archival data and found that IRAS 16547-4247 is a hard X-ray source. We discuss these results in the context of a refined one-zone model and previous radio observations. From our study we find that it may be difficult to explain the X-ray emission as non-thermal radiation coming from the interaction region, but it might be produced by thermal Bremsstrahlung (plus photo-electric absorption) by a fast shock at the jet end. In the high-energy range, the source might be detectable by the present generation of Cherenkov telescopes, and may eventually be detected by Fermi in the GeV range.

Relevance: 30.00%

Abstract:

Context. There are a number of very high energy sources in the Galaxy that remain unidentified. Multi-wavelength and variability studies, and catalogue searches, are powerful tools to identify the physical counterpart, given the uncertainty in the source location and extension. Aims. This work carries out a thorough multi-wavelength study of the unidentified, very high energy source HESS J1858+020 and its environs. Methods. We have performed Giant Metrewave Radio Telescope observations at 610 MHz and 1.4 GHz to obtain a deep, low-frequency radio image of the region surrounding HESS J1858+020. We analysed archival radio, infrared, and X-ray data as well. This observational information is combined with molecular data, catalogue sources, and a nearby Fermi gamma-ray detection of unidentified origin to explore possible counterparts to the very high energy source. Results. We provide a deep radio image of a supernova remnant that might be related to the GeV and TeV emission in the region. We confirm the presence of an H ii region next to the supernova remnant and coincident with molecular emission. A potential region of star formation is also identified. We identify several radio and X-ray sources in the surroundings. Some of these sources are known planetary nebulae, whereas others may be non-thermal extended emitters and embedded young stellar objects. Three old, background Galactic pulsars also neighbour HESS J1858+020 along the line of sight. Conclusions. The region surrounding HESS J1858+020 is rich in molecular structures and non-thermal objects that may potentially be linked to this unidentified very high energy source. In particular, a supernova remnant interacting with nearby molecular clouds may be a good candidate, but a star forming region, or a non-thermal radio source of yet unclear nature, may also be behind the gamma-ray source. The neighbouring pulsars, despite being old and distant, cannot be discarded as candidates. Further observational studies are needed, however, to narrow the search for a counterpart to the HESS source.

Relevance: 30.00%

Abstract:

This thesis presents the design and implementation of a GPS signal source suitable for receiver measurements. The developed signal source is based on direct digital synthesis, which generates the intermediate frequency. The intermediate frequency is transferred to the final frequency with the aid of an in-phase/quadrature (I/Q) modulator. The modulating GPS data were generated with MATLAB. The signal source was duplicated to form a multi-channel source. It was shown that GPS signals meant for civil navigation are easy to generate in the laboratory; the hardware does not need to be technically advanced if a high level of navigation accuracy is not needed. It was also shown that the I/Q modulator can function as a single-sideband upconverter even with a high intermediate frequency, a concept that reduces the demands on the output filtering.
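
The single-sideband property of the I/Q modulator mentioned above follows from s(t) = I(t)cos(2*pi*f_c*t) - Q(t)sin(2*pi*f_c*t): feeding it the in-phase and quadrature components of a complex intermediate-frequency signal produces only the upper sideband at f_c + f_IF. A small numeric sketch of this idea (the sample rate and frequencies are arbitrary illustration values, not those of the actual hardware):

```python
import numpy as np

fs = 1_000_000.0          # sample rate (Hz), illustration only
f_if = 50_000.0           # intermediate frequency from the DDS
f_c = 200_000.0           # carrier fed to the I/Q modulator
t = np.arange(0, 0.01, 1.0 / fs)

# Complex IF signal: I = cos, Q = sin (a 90-degree-shifted copy).
i_sig = np.cos(2 * np.pi * f_if * t)
q_sig = np.sin(2 * np.pi * f_if * t)

# I/Q (single-sideband) upconversion: only f_c + f_if remains.
rf = i_sig * np.cos(2 * np.pi * f_c * t) - q_sig * np.sin(2 * np.pi * f_c * t)

spectrum = np.abs(np.fft.rfft(rf))
freqs = np.fft.rfftfreq(len(rf), 1.0 / fs)
print("strongest component at", freqs[np.argmax(spectrum)], "Hz")  # ~ f_c + f_if
```

Because the image at f_c - f_IF is suppressed by the modulator itself, the analog output filter only needs to remove residual carrier and harmonic content, which is the relaxation of filtering demands noted in the abstract.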

Relevance: 30.00%

Abstract:

OBJECTIVES: To determine the inter-session and intra-/inter-individual variations of the attenuation of aortic blood and myocardium with MDCT in the context of calcium scoring, and to evaluate whether these variations depend on patient characteristics. METHODS: Fifty-four volunteers were evaluated with non-enhanced calcium scoring CT. We measured the attenuation (inter-individual variation) and the standard deviation (SD, intra-individual variation) of the blood in the ascending aorta and of the myocardium of the left ventricle. Every volunteer was examined twice to study the inter-session variation. The fat pad thickness at the sternum and the noise (SD of air) were measured as well. These values were correlated with the measured aortic/ventricular attenuations and their SDs (Pearson). The historically fixed thresholds (90 and 130 HU) were tested against different models based on the attenuation of blood and myocardium. RESULTS: The mean attenuation was 46 HU (range, 17-84 HU) with a mean SD of 23 HU for the blood, and 39 HU (10-82 HU) with a mean SD of 18 HU for the myocardium. The attenuation and SD of the blood were significantly higher than those of the myocardium (p < 0.01). The inter-session variation was not significant. There was only a poor correlation between the SD of aortic blood/ventricle and fat thickness/noise. Based on the existing models, the 90 HU threshold offers a confidence interval of approximately 95% and the 130 HU threshold more than 99%. CONCLUSIONS: The historical thresholds offer high confidence intervals for the exclusion of aortic blood/myocardium and thus for detecting calcifications. Nevertheless, considering the large variations of blood/myocardium CT values and the influence of patient characteristics, a better approach might be an adaptive threshold.
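
Under a normality assumption for the blood-pool attenuation, the confidence levels quoted for the fixed thresholds can be reproduced approximately from the reported mean and SD. The normal model is an assumption made here only to illustrate the relationship; the input numbers are the means and SDs reported in the abstract:

```python
from scipy.stats import norm

def fraction_below(threshold_hu: float, mean_hu: float, sd_hu: float) -> float:
    """Fraction of voxels expected below `threshold_hu` if the attenuation is
    normally distributed with the given mean and SD (one-sided)."""
    return norm.cdf((threshold_hu - mean_hu) / sd_hu)

# Reported blood-pool values: mean 46 HU, mean SD 23 HU.
for thr in (90, 130):
    pct = 100 * fraction_below(thr, mean_hu=46, sd_hu=23)
    print(f"{thr} HU threshold excludes ~{pct:.1f}% of blood voxels")
# Roughly 97% at 90 HU and essentially 100% at 130 HU, broadly consistent with
# the ~95% and >99% confidence levels quoted above (the study's own models may differ).
```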