991 results for X-ray test


Relevance: 100.00%

Publisher:

Abstract:

In this Thesis we have presented our work on the analysis of galaxy clusters through their X-ray emission and the gravitational lensing effect they induce. Our research was mainly aimed at verifying and, where possible, explaining the observed mismatch between the galaxy cluster mass distributions estimated through two of the most promising techniques, namely the X-ray and the gravitational lensing analyses. Moreover, it is well established that combined, multi-wavelength analyses are extremely effective in addressing and explaining the open issues in astronomy; to follow this approach, however, it is crucial to test the reliability and the limitations of the individual analysis techniques. In this Thesis we therefore also assessed the impact of several factors that can affect both the X-ray and the strong lensing analyses.
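For context (standard background, not stated in the abstract): X-ray mass estimates assume the intracluster gas is in hydrostatic equilibrium, while lensing masses require no such assumption, which is one reason the two techniques can disagree. The hydrostatic mass within radius r reads

```latex
M(<r) \;=\; -\,\frac{k_B T(r)\, r}{G\, \mu m_p}
\left( \frac{d\ln \rho_g}{d\ln r} + \frac{d\ln T}{d\ln r} \right),
```

so any non-thermal pressure support (e.g. turbulence or merging) biases the X-ray mass low relative to the lensing mass.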

Relevance: 100.00%

Publisher:

Abstract:

X-ray absorption spectroscopy (XAS) is a powerful means of investigating structural and electronic properties in condensed-matter physics. Analysis of the near-edge part of the XAS spectrum, the so-called X-ray Absorption Near Edge Structure (XANES), can typically provide the following information on the photoexcited atom:
- oxidation state and coordination environment;
- speciation of transition metal compounds;
- conduction band DOS projected on the excited atomic species (PDOS).
Analysis of XANES spectra is greatly aided by simulations. In the most common scheme, the multiple scattering framework is used with the muffin-tin approximation for the scattering potential, and the spectral simulation is based on a hypothetical reference structure. This approach has the advantage of requiring relatively little computing power, but in many cases the assumed structure is quite different from the actual system measured, and the muffin-tin approximation is not adequate for low-symmetry structures or highly directional bonds. It is therefore well justified to develop alternative methods. In one approach, the spectral simulation is based on atomic coordinates obtained from a DFT (Density Functional Theory) optimized structure. In another approach, which is the object of this thesis, the XANES spectrum is calculated directly from an ab initio DFT calculation of the atomic and electronic structure. This method takes full advantage of the true many-electron final wavefunction, which can be computed with DFT algorithms that include a core hole in the absorbing atom. To calculate the many-electron final wavefunction, the Projector Augmented Wave (PAW) method is used.
In this scheme, the absorption cross section is written as a function of several contributions, including the many-electron wavefunction of the final state; it is calculated starting from the pseudo-wavefunction and reconstructing the all-electron wavefunction by means of a transformation operator built from so-called partial waves and projector functions. The aim of my thesis is to apply and test the PAW methodology for the calculation of the XANES cross section. I have focused on iron and silicon structures and on some biological target molecules (myoglobin and cytochrome c). Other inorganic and biological systems could be considered for future applications of this methodology, which could become an important improvement over the multiple-scattering approach.
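A brief sketch of the formulas behind this scheme (standard dipole-approximation and PAW expressions, paraphrased rather than quoted from the thesis): the XANES cross section and the PAW reconstruction of the all-electron wavefunction from the pseudo-wavefunction are

```latex
\sigma(\omega) \;\propto\; \omega \sum_f \big| \langle \Psi_f \,|\, \hat{\varepsilon}\cdot\mathbf{r} \,|\, \Psi_i \rangle \big|^2 \,
\delta\!\left(E_f - E_i - \hbar\omega\right),
\qquad
| \Psi \rangle \;=\; | \tilde{\Psi} \rangle \;+\; \sum_i \big( |\phi_i\rangle - |\tilde{\phi}_i\rangle \big)\, \langle \tilde{p}_i \,|\, \tilde{\Psi} \rangle,
```

where the partial waves (phi_i), their pseudized counterparts, and the projector functions (p_i) are the parameters of the transformation operator, and the core hole enters through the final states.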

Relevance: 100.00%

Publisher:

Abstract:

We propose a new method for fully automatic landmark detection and shape segmentation in X-ray images. Our algorithm works by estimating the displacements from image patches to the (unknown) landmark positions and then integrating them via voting. The fundamental contribution is that we jointly estimate the displacements from all patches to multiple landmarks together, considering not only the training data but also geometric constraints on the test image. The various constraints constitute a convex objective function that can be solved efficiently. Validated on three challenging datasets, our method achieves high accuracy in landmark detection and, combined with a statistical shape model, gives better performance in shape segmentation than state-of-the-art methods.

Relevance: 100.00%

Publisher:

Abstract:

Aviation security strongly depends on screeners' performance in detecting threat objects in X-ray images of passenger bags. We examined for the first time the effects of stress and stress-induced cortisol increases on the detection of hidden weapons in an X-ray baggage screening task. We randomly assigned 48 participants to either a stress or a nonstress group. The stress group was exposed to a standardized psychosocial stress test (TSST). Before and after the stress/nonstress condition, participants had to detect threat objects in a computer-based object recognition test (X-ray ORT); salivary cortisol and X-ray ORT performance were measured repeatedly. Cortisol increases in response to psychosocial stress induction, but not to the nonstress condition, independently impaired X-ray detection performance. Our results suggest that stress-induced cortisol increases at peak reactivity impair X-ray screening performance.

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we propose a new method for fully automatic landmark detection and shape segmentation in X-ray images. To detect landmarks, we estimate the displacements from some randomly sampled image patches to the (unknown) landmark positions, and then we integrate these predictions via a voting scheme. Our key contribution is a new algorithm for estimating these displacements. Unlike other methods, in which each image patch independently predicts its displacement, we jointly estimate the displacements from all patches together in a data-driven way, considering not only the training data but also geometric constraints on the test image. The displacement estimation is formulated as a convex optimization problem that can be solved efficiently. Finally, we use the sparse shape composition model as a priori information to regularize the landmark positions and thus generate the segmented shape contour. We validate our method on X-ray image datasets of three different anatomical structures: the complete femur, the proximal femur and the pelvis. Experiments show that our method is accurate and robust in landmark detection and, combined with the shape model, gives better or comparable performance in shape segmentation compared to state-of-the-art methods. Finally, a preliminary study using CT data shows the extensibility of our method to 3D data.
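As a toy illustration of the voting step only (not the paper's joint, geometrically constrained estimator): each patch casts a vote at its centre plus its predicted displacement, and the votes are averaged.

```python
import numpy as np

# Toy sketch of displacement voting for a single landmark: each sampled
# patch predicts a displacement from its centre to the landmark, and the
# votes are averaged. Names and data are illustrative, not the paper's
# actual (jointly estimated) displacements.
def vote_landmark(patch_centres, displacements, weights=None):
    """Integrate per-patch displacement predictions into one landmark."""
    votes = np.asarray(patch_centres, float) + np.asarray(displacements, float)
    if weights is None:
        weights = np.ones(len(votes))
    weights = np.asarray(weights, float)
    return (votes * weights[:, None]).sum(axis=0) / weights.sum()

# Three patches whose predicted displacements all point at (50, 40).
centres = [(10, 10), (60, 30), (45, 70)]
disps = [(40, 30), (-10, 10), (5, -30)]
print(vote_landmark(centres, disps))  # -> [50. 40.]
```

The paper's contribution replaces the independent per-patch predictions assumed here with displacements estimated jointly for all patches and landmarks under convex constraints.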

Relevance: 100.00%

Publisher:

Abstract:

The Schwalbenberg II loess-paleosol sequence (LPS) is a key site for Marine Isotope Stage 3 (MIS 3) in western Europe owing to eight successive cambisols, which primarily constitute the Ahrgau Subformation. This LPS therefore qualifies as a test candidate for the potential of temporally high-resolution geochemical data obtained by X-ray fluorescence (XRF) scanning of discrete samples, a fast and non-destructive tool for determining element composition. The geochemical data are first contextualized with existing proxy data such as magnetic susceptibility (MS) and organic carbon (Corg), and then aggregated into element log ratios characteristic of weathering intensity [LOG(Ca/Sr), LOG(Rb/Sr), LOG(Ba/Sr), LOG(Rb/K)] and dust provenance [LOG(Ti/Zr), LOG(Ti/Al), LOG(Si/Al)]. In general, the interpretation of rock magnetic particles is challenging in western Europe, where not only magnetic enhancement but also depletion plays a role. Our data indicate MS depletion induced by leaching and top-soil erosion at the Schwalbenberg II LPS. Besides weathering, LOG(Ca/Sr) is susceptible to secondary calcification, and LOG(Rb/Sr) and LOG(Ba/Sr) are likewise shown to be influenced by calcification dynamics. Consequently, LOG(Rb/K) appears to be the most suitable weathering index, identifying the Sinzig Soils S1 and S2 as the most pronounced paleosols at this site. Sinzig Soil S3 is enclosed by gelic gleysols and, in contrast to S1 and S2, is only initially weathered, pointing to colder climatic conditions. The Remagen Soils are also characterized by subtle to moderate positive excursions in the weathering indices. Comparing the Schwalbenberg II LPS with the nearby Eifel Lake Sediment Archive (ELSA) and other, more distant German, Austrian and Czech LPSs, and discussing time and climate as limiting factors for pedogenesis, we suggest that the lithologically determined paleosols are in-situ soil formations.
The provenance indices document a Zr enrichment at the transition from the Ahrgau to the Hesbaye Subformation. This is explained by a conceptual model incorporating multiple sediment recycling and sorting effects in the eolian and fluvial domains.
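The aggregation into log ratios is simple to state; the sketch below computes the listed weathering and provenance indices from hypothetical XRF element intensities (the counts are made up for illustration and are not Schwalbenberg II data).

```python
import math

# Illustrative XRF element intensities (counts); values are hypothetical.
counts = {"Ca": 12000.0, "Sr": 300.0, "Rb": 150.0, "Ba": 900.0,
          "K": 5000.0, "Ti": 2500.0, "Zr": 250.0, "Al": 8000.0, "Si": 30000.0}

def log_ratio(a, b):
    """Base-10 logarithm of the intensity ratio of element a over element b."""
    return math.log10(counts[a] / counts[b])

weathering = {p: log_ratio(*p.split("/")) for p in ("Ca/Sr", "Rb/Sr", "Ba/Sr", "Rb/K")}
provenance = {p: log_ratio(*p.split("/")) for p in ("Ti/Zr", "Ti/Al", "Si/Al")}
print(round(provenance["Ti/Zr"], 3))  # -> 1.0  (log10 of 2500/250)
```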

Relevance: 100.00%

Publisher:

Abstract:

This paper studies the fracturing process in low-porosity rocks during uniaxial compressive tests, considering both the original defects and the new mechanical cracks in the material. For this purpose, five different kinds of rocks with carbonate mineralogy and low porosity (below 2%) were chosen. The characterization of the fracture damage is carried out using three different techniques: ultrasound, mercury porosimetry and X-ray computed tomography. The proposed methodology allows quantifying the evolution of the porous system as well as locating new cracks in the rock samples. Intercrystalline porosity (the smallest pores, with pore radius < 1 μm) shows limited development during loading, disappearing rapidly from the porosimetry curves, and is directly related to the initial plastic behaviour in the stress-strain patterns. The biggest pores (corresponding to the cracks), however, undergo continuous enlargement until the unstable propagation of fractures. The measured crack initiation stress varies between 0.25 σp and 0.50 σp for marbles and between 0.50 σp and 0.85 σp for micrite limestones. The unstable propagation of cracks is assumed to occur very close to the peak strength. Crack propagation through the sample is completely independent of pre-existing defects (porous bands, stylolites, fractures and veins). The ultrasonic response in the time domain is less sensitive to the fracture damage than that in the frequency domain. P-wave velocity increases during the loading test until the beginning of unstable crack propagation. This increase is higher for marbles (between 15% and 30% of the initial vp values) and lower for micrite limestones (between 5% and 10%). When the mechanical cracks propagate unstably, the velocity stops increasing, and it decreases only when rock damage is very high. Frequency analysis of the ultrasonic signals shows clear changes during the loading process.
The spectra of the processed waveforms show two main frequency peaks centred at low (~20 kHz) and high (~35 kHz) values. When new fractures appear and grow, the amplitude of the high-frequency peak decreases while that of the low-frequency peak increases. In addition, a slight shift towards higher frequencies is observed.
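The frequency-domain analysis can be illustrated with a minimal sketch: locate the two dominant spectral peaks of a waveform via an FFT. The synthetic two-tone signal and the 1 MHz sampling rate are assumptions for illustration only, chosen so both tones fall exactly on FFT bins.

```python
import numpy as np

# Two-tone test signal: 20 kHz and 35 kHz, matching the peak positions
# reported in the abstract. Sampling rate and record length are assumed.
fs = 1_000_000                      # sampling rate in Hz (assumed)
t = np.arange(1000) / fs            # 1 ms record -> 1 kHz bin spacing
signal = 1.0 * np.sin(2 * np.pi * 20e3 * t) + 0.6 * np.sin(2 * np.pi * 35e3 * t)

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest spectral peaks, in ascending frequency.
order = np.argsort(spec)[::-1]
peaks = sorted(float(f) for f in freqs[order[:2]])
print(peaks)  # the 20 kHz and 35 kHz tones are recovered
```

Tracking the relative amplitudes of these two peaks over the loading history is then what reveals the growth of new fractures.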

Relevance: 100.00%

Publisher:

Abstract:

Context. Classical supergiant X-ray binaries (SGXBs) and supergiant fast X-ray transients (SFXTs) are two types of high-mass X-ray binaries (HMXBs) that have similar donors but, at the same time, show very different behavior in the X-rays. The reason for this dichotomy of wind-fed HMXBs is still a matter of debate. Among the several explanations that have been proposed, some invoke specific stellar wind properties of the donor stars. Only dedicated empirical analyses of the donors' stellar winds can provide the information required for an adequate test of these theories; however, such analyses are scarce. Aims. To close this gap, we perform a comparative analysis of the optical companions in two important systems: IGR J17544-2619 (an SFXT) and Vela X-1 (an SGXB). We analyze the spectra of each star in detail and derive their stellar and wind properties. As a next step, we compare the wind parameters, giving us an excellent chance of recognizing key differences between donor winds in SFXTs and SGXBs. Methods. We use archival infrared, optical and ultraviolet observations, and analyze them with the non-local thermodynamic equilibrium (NLTE) Potsdam Wolf-Rayet model atmosphere code. We derive the physical properties of the stars and their stellar winds, accounting for the influence of X-rays on the winds. Results. We find that the stellar parameters derived from the analysis generally agree well with the spectral types of the two donors: O9I (IGR J17544-2619) and B0.5Iae (Vela X-1). The distances to the sources have been revised and also agree well with estimates already available in the literature. For IGR J17544-2619 we are able to narrow the uncertainty to d = 3.0 ± 0.2 kpc. From the stellar radius of the donor and its X-ray behavior, the eccentricity of IGR J17544-2619 is constrained to e < 0.25. The derived chemical abundances point to some mixing during the lifetime of the donors.
An important difference between the stellar winds of the two stars is their terminal velocities (v∞ = 1500 km/s in IGR J17544-2619 and v∞ = 700 km/s in Vela X-1), which have important consequences for the X-ray luminosity of these sources. Conclusions. The donors of IGR J17544-2619 and Vela X-1 have similar spectral types as well as similar parameters that physically characterize them and their spectra. In addition, the orbital parameters of the systems are similar too, with nearly circular orbits and short orbital periods. However, they show moderate differences in their stellar wind velocities and in the spin periods of their neutron stars, which have a strong impact on the X-ray luminosity of the sources. This specific combination of wind speed and pulsar spin favors an accretion regime with persistently high luminosity in Vela X-1, while it favors an accretion-inhibiting mechanism in IGR J17544-2619. Our study demonstrates that the relative wind velocity is critical for class determination in HMXBs hosting a supergiant donor, given that, combined with other parameters, it may shift the accretion mechanism from direct accretion to propeller regimes.
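The strong dependence of X-ray luminosity on wind speed follows from the standard Bondi-Hoyle-Lyttleton accretion scaling (a textbook estimate, not the paper's detailed modeling):

```latex
\dot{M}_{\rm acc} \sim \pi R_{\rm acc}^{2}\, \rho\, v_{\rm rel}, \qquad
R_{\rm acc} = \frac{2GM_{\rm NS}}{v_{\rm rel}^{2}}
\quad\Longrightarrow\quad
L_X \;\propto\; \dot{M}_{\rm acc} \;\propto\; \rho\, v_{\rm rel}^{-3},
```

with the relative velocity combining wind and orbital motion. Since the local wind density itself scales roughly inversely with the wind speed, the effective dependence steepens towards the fourth power, so the roughly twofold difference in terminal velocity between the two donors corresponds to an order-of-magnitude difference in expected accretion luminosity.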

Relevance: 100.00%

Publisher:

Abstract:

Contrast-enhanced magnetic resonance imaging (CE MRI) is the most sensitive tool for screening women who are at high familial risk of breast cancer. Our aim in this study was to assess the cost-effectiveness of X-ray mammography (XRM), CE MRI, or both strategies combined. In total, 649 women were enrolled in the MARIBS study and screened with both CE MRI and mammography, resulting in 1881 screens and 1-7 individual annual screening events per woman. Women aged 35-49 years at high risk of breast cancer, either because they have a strong family history of breast cancer, are tested carriers of a BRCA1, BRCA2 or TP53 mutation, or are at 50% risk of having inherited such a mutation, were recruited from 22 centres and offered annual MRI and XRM for between 2 and 7 years. Information on the number and type of further investigations was collected, and specifically calculated unit costs were used to derive the incremental cost per cancer detected. The number of cancers detected was 13 for mammography, 27 for CE MRI and 33 for mammography and CE MRI combined. In the subgroup of BRCA1 (BRCA2) mutation carriers or of women having a first-degree relative with a mutation in BRCA1 (BRCA2), the corresponding numbers were 3 (6), 12 (7) and 12 (11), respectively. For all women, the incremental cost per cancer detected with CE MRI and mammography combined was £28,284 compared to mammography alone. When only the BRCA1 or BRCA2 groups were considered, this cost was reduced to £11,731 (CE MRI vs mammography) and £15,302 (CE MRI and mammography vs mammography), respectively. Results were most sensitive to the unit cost estimate for a CE MRI screening test. Contrast-enhanced MRI might be a cost-effective screening modality for women at high risk, particularly in the BRCA1 and BRCA2 subgroups. Further work is needed to assess the impact of screening on mortality and health-related quality of life.
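The incremental cost-effectiveness arithmetic is a simple ratio: extra programme cost divided by extra cancers detected. The extra-cost figure below is hypothetical, chosen only to be consistent with the quoted 20 additional cancers (33 combined vs 13 mammography alone); the actual MARIBS unit costs are not reproduced here.

```python
# Incremental cost per cancer detected = extra cost / extra cancers detected.
# The 565,680 figure is a hypothetical total, NOT taken from the study.
def incremental_cost_per_cancer(extra_cost, extra_cancers):
    return extra_cost / extra_cancers

print(incremental_cost_per_cancer(565_680, 33 - 13))  # -> 28284.0
```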

Relevance: 100.00%

Publisher:

Abstract:

Fluoroscopic images exhibit severe signal-dependent quantum noise, due to the reduced X-ray dose involved in image formation, which is generally modelled as Poisson-distributed. However, gray-level transformations, commonly applied by fluoroscopic devices to enhance contrast, modify the noise statistics and the relationship between image noise variance and expected pixel intensity. Image denoising is essential to improve the quality of fluoroscopic images and their clinical information content. Simple averaging filters are commonly employed in real-time processing, but they tend to blur edges and details. An extensive comparison of advanced denoising algorithms specifically designed for signal-dependent noise (AAS, BM3Dc, HHM, TLS) and for independent additive noise (AV, BM3D, K-SVD) is presented. Simulated test images degraded by various levels of Poisson quantum noise, as well as real clinical fluoroscopic images, were considered. Typical gray-level transformations (e.g. white compression) were also applied in order to evaluate their effect on the denoising algorithms. The performance of the algorithms was evaluated in terms of peak signal-to-noise ratio (PSNR), signal-to-noise ratio (SNR), mean square error (MSE), structural similarity index (SSIM) and computational time. On average, the filters designed for signal-dependent noise provided better image restorations than those assuming additive white Gaussian noise (AWGN). The collaborative denoising strategy was found to be the most effective in denoising both simulated and real data, also in the presence of gray-level transformations. White compression, by inherently reducing the greater noise variance of brighter pixels, appeared to help the denoising algorithms perform more effectively.
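Two of the evaluation metrics are easy to state exactly; a minimal sketch for 8-bit images (peak value 255) follows. SSIM requires a windowed computation and is omitted here.

```python
import numpy as np

# MSE and PSNR for 8-bit images (peak value 255); PSNR is expressed in dB.
def mse(ref, test):
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    return float(np.mean((ref - test) ** 2))

def psnr(ref, test, peak=255.0):
    return 10.0 * np.log10(peak ** 2 / mse(ref, test))

ref = np.zeros((8, 8))
noisy = ref + 5.0                  # uniform error of 5 gray levels -> MSE = 25
print(round(psnr(ref, noisy), 2))  # -> 34.15
```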

Relevance: 100.00%

Publisher:

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other X-ray imaging modalities; this fact, coupled with its popularity, makes CT the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., a computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would yield image quality metrics that best correlated with human detection performance. The models included simple metrics of image quality, such as the contrast-to-noise ratio (CNR), as well as more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that both the non-prewhitening matched filter observers and the channelized Hotelling observers correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
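The gap between CNR and model observers can be illustrated with the white-noise special case of the non-prewhitening (NPW) matched filter, where the detectability index for a known signal s in noise of standard deviation sigma is sqrt(sum(s^2))/sigma. This is a simplification for illustration (real CT noise is correlated), not the dissertation's full observer models.

```python
import numpy as np

# CNR vs NPW matched-filter detectability in the white-noise special case.
def cnr(signal, sigma):
    return float(signal.max() / sigma)

def d_npw(signal, sigma):
    return float(np.sqrt(np.sum(signal ** 2)) / sigma)

sigma = 10.0
small = np.full((4, 4), 20.0)   # 20 HU lesion over 16 pixels
large = np.full((8, 8), 20.0)   # same contrast, four times the area

print(cnr(small, sigma), cnr(large, sigma))      # -> 2.0 2.0
print(d_npw(small, sigma), d_npw(large, sigma))  # -> 8.0 16.0
```

Unlike CNR, the detectability index grows with lesion area, which is one reason matched-filter observers track human performance better than CNR when reconstruction algorithms change the noise texture.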

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, given their simplicity and the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
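The image-subtraction technique can be sketched in a few lines: two repeated scans share the same deterministic background, so their difference isolates the quantum noise, whose standard deviation must be divided by sqrt(2) to recover the per-image value. The synthetic background and noise level are illustrative stand-ins.

```python
import numpy as np

# Two "repeated scans" with identical background and independent noise.
rng = np.random.default_rng(0)
background = np.linspace(0, 100, 256 * 256).reshape(256, 256)
scan1 = background + rng.normal(0.0, 12.0, background.shape)
scan2 = background + rng.normal(0.0, 12.0, background.shape)

# The difference removes the background; std of the difference is
# sqrt(2) times the per-image noise.
noise_std = float(np.std(scan1 - scan2) / np.sqrt(2))
print(noise_std)  # close to the true per-image sigma of 12.0
```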

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft-tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., the NPS) was also highly dependent on the background texture for SAFIRE. It was therefore concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and that texture should be considered when assessing the image quality of iterative algorithms.
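For square ROIs, an ensemble NPS estimate is straightforward; the sketch below shows the standard form (the dissertation's novelty, handling irregularly shaped ROIs, is not attempted here). The 0.5 mm pixel size is an assumption.

```python
import numpy as np

# Ensemble NPS from square noise-only ROIs:
#   NPS = <|FFT2(roi - mean)|^2> * (dx * dy) / (Nx * Ny)
def nps_2d(noise_rois, pixel_size=0.5):
    rois = np.asarray(noise_rois, float)
    _, nx, ny = rois.shape
    rois = rois - rois.mean(axis=(1, 2), keepdims=True)  # detrend each ROI
    ft = np.fft.fft2(rois)
    return (np.abs(ft) ** 2).mean(axis=0) * (pixel_size ** 2) / (nx * ny)

rng = np.random.default_rng(1)
rois = rng.normal(0.0, 10.0, size=(50, 64, 64))   # white noise, sigma = 10
nps = nps_2d(rois)

# Sanity check (Parseval): the NPS integrates back to the pixel variance.
variance = float(nps.sum() / (0.5 ** 2 * 64 * 64))
print(variance)  # close to 100
```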

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms were designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized with a genetic algorithm to match the texture in the liver regions of actual patient CT images. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
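A hypothetical instance of such an analytical lesion model follows; the parameter names and the sigmoid edge profile are illustrative choices, not the dissertation's exact equations.

```python
import numpy as np

# Radially symmetric lesion parameterized by radius, peak contrast, and a
# smooth sigmoid edge profile, voxelized onto a pixel grid so that it could
# be inserted into a patient image to form a "hybrid" image.
def lesion_model(shape, centre, radius, contrast, edge_width):
    yy, xx = np.indices(shape)
    r = np.hypot(yy - centre[0], xx - centre[1])
    # ~contrast inside r < radius, rolling off to 0 over ~edge_width pixels.
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

lesion = lesion_model((64, 64), centre=(32, 32), radius=10,
                      contrast=-15.0, edge_width=1.5)
print(round(float(lesion[32, 32]), 1))  # -> -15.0 (full contrast at centre)
```

Because the model is analytical, the ground-truth size, contrast, and edge profile of every inserted lesion are known exactly, which is what makes hybrid images useful for estimability studies.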

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5), and lesion-free images were reconstructed as well. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance: 100.00%

Publisher:

Abstract:

The goal of this research was to determine the composition of boron deposits produced by pyrolysis of boron tribromide, and to use the results to (a) determine the experimental conditions (reaction temperature, etc.) necessary to produce alpha-rhombohedral boron and (b) guide the development and refinement of the pyrolysis experiments such that large, high-purity crystals of alpha-rhombohedral boron can be produced with consistency. Developing a method for producing large, high-purity alpha-rhombohedral boron crystals is of interest because such crystals could potentially be used to realize an alpha-rhombohedral-boron-based neutron detector design (a solid-state detector) that could serve as an alternative to existing neutron detector technologies. The supply of neutron detectors in the United States has been constrained for a number of years by the ongoing shortage of helium-3 (a gas used in many existing neutron detector technologies); the development of alternative neutron detector technology such as an alpha-rhombohedral-boron-based detector would help provide a more sustainable supply of neutron detectors in this country. In addition, the concept of an alpha-rhombohedral-boron-based neutron detector is attractive because it offers the possibility of a design that is smaller, longer-lived, less power-consuming, and potentially more sensitive than existing neutron detectors. The main difficulty associated with creating such a detector is that producing large, high-purity crystals of alpha-rhombohedral boron is extremely challenging. Past researchers have successfully made alpha-rhombohedral boron via a number of methods, but no one has developed a method for consistently producing large, high-purity crystals.
Alpha-rhombohedral boron is difficult to make because it is only stable at temperatures below around 1100-1200 °C, its formation is very sensitive to impurities, and the conditions necessary for its formation are not fully understood or agreed upon in the literature. In this research, the method of pyrolysis of boron tribromide (hydrogen reduction of boron tribromide) was used to deposit boron on a tantalum filament. The goal was to refine this method, or potentially use it in combination with a second method (amorphous boron crystallization), to the point where it is possible to grow large, high-purity alpha-rhombohedral boron crystals with consistency. A pyrolysis apparatus was designed and built, and a number of trials were run to determine the conditions (reaction temperature, etc.) necessary for alpha-rhombohedral boron production. This work was focused on the X-ray diffraction analysis of the boron deposits; X-ray diffraction was performed on a number of samples to determine the types of boron (and other compounds) formed in each trial and to guide the choices of test conditions for subsequent trials. It was found that at low reaction temperatures (in the range of around 830-950 °C), amorphous boron was the primary form of boron produced. Reaction temperatures in the range of around 950-1000 °C yielded various combinations of crystalline boron and amorphous boron. In the first trial performed at a temperature of 950 °C, a mix of amorphous boron and alpha-rhombohedral boron was formed. Using a scanning electron microscope, it was possible to see small alpha-rhombohedral boron crystals (on the order of ~1 micron in size) embedded in the surface of the deposit.
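The X-ray diffraction phase identification described above rests on Bragg's law, nλ = 2d sin θ: each lattice d-spacing of a phase produces a peak at a predictable 2θ angle. A minimal sketch, assuming a Cu Kα source; the d-spacings used are illustrative placeholders, not the actual α-rhombohedral boron reference pattern:

```python
# Predict powder-diffraction peak positions (2-theta) from d-spacings
# using Bragg's law with n = 1.
import math

CU_K_ALPHA = 1.5406  # Cu K-alpha wavelength in angstroms (common lab source)

def two_theta_deg(d_spacing, wavelength=CU_K_ALPHA):
    """Bragg's law, n*lambda = 2*d*sin(theta): peak position 2-theta in degrees."""
    return 2 * math.degrees(math.asin(wavelength / (2 * d_spacing)))

# Hypothetical d-spacings (angstroms) for candidate phases in a deposit.
for d in (4.05, 2.54, 2.00):
    print(f"d = {d:.2f} A -> 2-theta = {two_theta_deg(d):.2f} deg")
```

Matching the measured peak list against reference patterns computed this way is what distinguishes the alpha-rhombohedral, beta-rhombohedral, and amorphous (broad-halo) deposits.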
In subsequent trials carried out at reaction temperatures in the range of 950-1000 °C, it was found that various combinations of alpha-rhombohedral boron, beta-rhombohedral boron, and amorphous boron were produced; the results tended to be unpredictable (alpha-rhombohedral boron was not produced in every trial), and the factors leading to success or failure were difficult to pinpoint. These results illustrate how sensitive a process producing alpha-rhombohedral boron can be, and indicate that further improvements to the test apparatus and test conditions (for example, higher purity and cleanliness) may be necessary to optimize the boron deposition. Although alpha-rhombohedral boron crystals of large size were not achieved, this research was successful in (a) developing a pyrolysis apparatus and test procedure that can serve as a platform for future testing, (b) determining reaction temperatures at which alpha-rhombohedral boron can form, and (c) developing a consistent process for analyzing the boron deposits and determining their composition. Further experimentation is necessary to achieve a pyrolysis apparatus and test procedure that can yield large alpha-rhombohedral boron crystals with consistency.
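The temperature-to-product trend reported across these trials can be summarized as a small lookup; the thresholds and phase names are taken from the text, with the caveat that outcomes in the 950-1000 °C window were mixed and not fully reproducible:

```python
# Encode the reported pyrolysis temperature ranges and the boron forms
# observed in each (per this work's trials; not a general phase diagram).
def expected_deposit(reaction_temp_c):
    """Map a reaction temperature (deg C) to the deposit forms observed."""
    if 830 <= reaction_temp_c < 950:
        return "primarily amorphous boron"
    if 950 <= reaction_temp_c <= 1000:
        # Trials here variously yielded alpha-rhombohedral,
        # beta-rhombohedral, and amorphous boron.
        return "mixed crystalline and amorphous boron"
    return "outside the temperature range studied"

print(expected_deposit(900))
print(expected_deposit(975))
```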

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Objective: Apply dual X-ray absorptiometry (DXA) to determine the amount of fat mass, lean mass, and bone mineral density in Mexican schoolchildren with and without obesity. Material and methods: We performed an observational, analytical, comparative, cross-sectional study of 80 Mexican schoolchildren who attended the Nutrition Clinic of the Pediatric Medical Center in Monterrey, Mexico during the period of January to April 2005. Body mass index (BMI) was determined to classify the participants according to the growth charts of the Centers for Disease Control and Prevention. Two groups of 40 children each (with and without obesity) were formed and DXA was carried out on each individual. Cronbach's alpha was used to determine instrument reliability and the Kolmogorov-Smirnov test was used to test the normality of numerical variables. Means were compared using Student's t test. Results: Statistically significant differences were found in fat mass (p≤0.001) and lean mass (p≤0.001), but not in bone mineral content (p=0.051) between the two groups. Conclusions: Differences exist between the groups in fat mass and lean mass, but not in bone mineral content. A significant positive correlation was found between fat mass, determined by DXA, and BMI in schoolchildren with and without obesity.
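The group comparison above uses a two-sample Student's t test with pooled variance. A minimal sketch using only the standard library; the sample values are illustrative placeholders, not the study's DXA measurements:

```python
# Two-sample Student's t statistic (pooled variance, equal-variance form):
# t = (mean1 - mean2) / sqrt(sp2 * (1/n1 + 1/n2)),
# sp2 = ((n1-1)*s1^2 + (n2-1)*s2^2) / (n1 + n2 - 2).
from math import sqrt
from statistics import mean, variance

def student_t(group_a, group_b):
    """Return the pooled two-sample t statistic and degrees of freedom."""
    n1, n2 = len(group_a), len(group_b)
    sp2 = ((n1 - 1) * variance(group_a) + (n2 - 1) * variance(group_b)) / (n1 + n2 - 2)
    t = (mean(group_a) - mean(group_b)) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Hypothetical fat-mass values (kg) for two small groups of schoolchildren.
t, df = student_t([12.1, 10.8, 13.0, 11.5], [18.2, 20.1, 19.4, 17.8])
print(f"t = {t:.2f}, df = {df}")
```

The resulting t statistic is compared against the t distribution with the returned degrees of freedom to obtain the p-values reported in the Results.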