1000 results for X-ray rocking curve
Abstract:
We present an analysis of a pointed 141 ks Chandra high-resolution transmission gratings observation of the Be X-ray-emitting star HD 110432, a prominent member of the γ Cas analogs. This observation represents the first high-resolution spectrum taken for this source, as well as the longest uninterrupted observation of any γ Cas analog. The Chandra light curve shows high variability, but its analysis fails to detect any coherent periodicity up to a frequency of 0.05 Hz. Hardness ratio versus intensity analyses demonstrate that the relative contributions of the [1.5-3] Å, [3-6] Å, and [6-16] Å energy bands to the total flux change rapidly in the short term. The analysis of the Chandra High Energy Transmission Grating (HETG) spectrum shows that three model components are needed to describe the spectrum correctly. Two of those components are optically thin thermal plasmas of different temperatures (kT ≈ 8-9 keV and 0.2-0.3 keV, respectively) described by the models vmekal or bvapec. The Fe abundance in these two components is equal within the errors and slightly subsolar, with Z ≈ 0.75 Z⊙. The bvapec model better describes the Fe L transitions, although it cannot fit the Na XI Lyα line at 10.02 Å well, which appears to be overabundant. Two different models seem to describe the third component equally well. One possibility is a third hot optically thin thermal plasma at kT = 16-21 keV with an Fe abundance Z ≈ 0.3 Z⊙, definitely smaller than for the other two thermal components. Furthermore, the bvapec model describes the Fe K shell transitions well because it accounts for the turbulence broadening of the Fe XXV and Fe XXVI lines, with v_turb ≈ 1200 km s⁻¹. These two lines, contributed mainly by the hot thermal plasma, are significantly wider than the Fe Kα line, whose FWHM < 5 mÅ is not resolved by Chandra. Alternatively, the third component can be described by a power law with a photon index of Γ = 1.56. In either case, the Chandra HETG spectrum establishes that each of these components must be modified by a distinct absorption column. The analysis of a noncontemporaneous 25 ks Suzaku observation shows the presence of a hard tail extending up to at least 33 keV. The Suzaku spectrum is described by the sum of two components: an optically thin thermal plasma at kT ≈ 9 keV and Z ≈ 0.74 Z⊙, plus either a very hot second plasma with kT ≈ 33 keV or, alternatively, a power law with a photon index of Γ = 1.58. In either case, each of the two components must be affected by a different absorption column. Therefore, the kT = 8-9 keV component is definitely needed, while the nature of the harder emission cannot be unambiguously established with the present data sets. The analysis of the Si XIII and S XV He-like triplets present in the Chandra spectrum points to a very dense (n_e ∼ 10¹³ cm⁻³) plasma located either close to the stellar surface (r < 3 R∗) of the Be star or, alternatively, very close (r ∼ 1.5 R_WD) to the surface of a (hypothetical) white dwarf companion. We argue, however, that the available data support the first scenario.
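The hardness-ratio analysis mentioned above reduces to a short computation once band-resolved light curves are available. Below is a minimal Python sketch, assuming the events have already been binned into the three wavelength bands; the synthetic count rates, bin counts, and HR definition are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical band-resolved count-rate light curves (counts per time bin),
# standing in for Chandra HETG events binned into the [1.5-3], [3-6],
# and [6-16] A bands.
rng = np.random.default_rng(0)
hard = rng.poisson(5.0, size=1000).astype(float)    # [1.5-3] A band
medium = rng.poisson(8.0, size=1000).astype(float)  # [3-6] A band
soft = rng.poisson(3.0, size=1000).astype(float)    # [6-16] A band

total = hard + medium + soft

# One common hardness-ratio definition: HR = (H - S) / (H + S).
# Plotting HR against total intensity shows whether the relative band
# contributions change as the source brightens or fades.
valid = (hard + soft) > 0
hr = (hard[valid] - soft[valid]) / (hard[valid] + soft[valid])
intensity = total[valid]

print(f"mean HR = {hr.mean():.3f}, HR-intensity correlation = "
      f"{np.corrcoef(hr, intensity)[0, 1]:.3f}")
```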
Abstract:
We report near-infrared radial velocity (RV) measurements of the recently identified donor star in the high-mass X-ray binary (HMXB) system OAO 1657−415, obtained in the H band using ISAAC on the Very Large Telescope. Cross-correlation methods were employed to construct an RV curve with a semi-amplitude of 22.1 ± 3.5 km s−1. Combined with other measured parameters of this system, this provides a dynamically determined neutron star (NS) mass of 1.42 ± 0.26 M⊙ and a mass of 14.3 ± 0.8 M⊙ for the Ofpe/WN9 highly evolved donor star. OAO 1657−415 is an eclipsing HMXB pulsar with the largest eccentricity and orbital period of any within its class. Of the 10 known eclipsing X-ray binary pulsars, OAO 1657−415 becomes the ninth with a dynamically determined NS mass solution and only the second in an eccentric system. Furthermore, the donor star in OAO 1657−415 is much more highly evolved than the majority of the supergiant donors in other HMXBs, joining a small but growing list of HMXB donors with extensively hydrogen-depleted atmospheres. Considering the evolutionary development of OAO 1657−415, we have estimated the binding energy of the envelope of the mass donor and find that there is insufficient energy for the removal of the donor's envelope via spiral-in, ruling out a common-envelope evolutionary scenario. With its non-zero eccentricity and relatively large orbital period, the identification of a definitive evolutionary pathway for OAO 1657−415 remains problematic; we conclude by proposing two scenarios that may account for its current orbital configuration.
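The route from the measured RV semi-amplitude to a dynamical NS mass runs through the binary mass function, f(M) = K³P(1−e²)^(3/2)/(2πG) = (M_X sin i)³/(M_X + M_opt)². The sketch below solves this for the compact-object mass; only K and the donor mass come from the abstract, while the period, eccentricity, and inclination are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
M_SUN = 1.989e30       # solar mass [kg]

# From the abstract: donor RV semi-amplitude and donor mass.
K_opt = 22.1e3                      # m/s
M_donor = 14.3 * M_SUN              # kg

# Illustrative orbital parameters (NOT quoted in the abstract):
P = 10.44 * 86400.0                 # orbital period [s], assumed
e = 0.107                           # eccentricity, assumed
i = np.radians(80.0)                # inclination; eclipsing -> near edge-on

# Mass function of the compact object from the donor's RV curve:
# f(M_X) = K_opt^3 P (1 - e^2)^(3/2) / (2 pi G)
#        = (M_X sin i)^3 / (M_X + M_donor)^2
f_m = K_opt**3 * P * (1.0 - e**2)**1.5 / (2.0 * np.pi * G)

def residual(m_x):
    return (m_x * np.sin(i))**3 / (m_x + M_donor)**2 - f_m

m_ns = brentq(residual, 0.1 * M_SUN, 10.0 * M_SUN)
print(f"f(M) = {f_m / M_SUN:.4f} Msun, NS mass ~ {m_ns / M_SUN:.2f} Msun")
```

With these assumed orbital elements, the recovered mass lands near the 1.42 M⊙ value quoted above, illustrating how strongly the result leans on the measured semi-amplitude.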
Abstract:
Context. The current generation of X-ray satellites has discovered many new X-ray sources that are difficult to classify within the well-described subclasses. The hard X-ray source IGR J11215−5952 is a peculiar transient, displaying very short X-ray outbursts every 165 days. Aims. To characterise the source, we obtained high-resolution spectra of the optical counterpart, HD 306414, at different epochs spanning a total of three months, before and around the 2007 February outburst, with the combined aims of deriving its astrophysical parameters and searching for orbital modulation. Methods. We fit model atmospheres generated with the fastwind code to the spectrum, and used the interstellar lines in the spectrum to estimate its distance. We also cross-correlated each individual spectrum against the best-fit model to derive radial velocities. Results. From its spectral features, we classify HD 306414 as B0.5 Ia. From the model fit, we find Teff ≈ 24 700 K and log g ≈ 2.7, in good agreement with the morphological classification. Using the interstellar lines in its spectrum, we estimate a distance to HD 306414 of d ≳ 7 kpc. Assuming this distance, we derive R∗ ≈ 40 R⊙ and Mspect ≈ 30 M⊙ (consistent, within errors, with Mevol ≈ 38 M⊙, and in good agreement with calibrations for the spectral type). Analysis of the radial velocity curve reveals that the radial velocity changes are not dominated by the orbital motion, and provides an upper limit on the semi-amplitude for the optical component of Kopt ≲ 11 ± 6 km s⁻¹. Large variations in the depth and shape of the photospheric lines suggest the presence of strong pulsations, which may be the main cause of the radial velocity changes. Very significant variations, uncorrelated with those of the photospheric lines, are seen in the shape and position of the Hα emission feature around the time of the X-ray outburst, but large excursions are also observed at other times. Conclusions. HD 306414 is a normal B0.5 Ia supergiant. Its radial velocity curve is dominated by an effect other than binary motion, most likely stellar pulsations. The available data suggest that the X-ray outbursts are caused by the close passage of the neutron star in a very eccentric orbit, perhaps leading to localised mass outflow.
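The cross-correlation step used to derive such radial velocities can be sketched compactly: on a uniform log-wavelength grid a Doppler shift becomes a constant pixel shift, so the velocity follows from the lag that maximises the cross-correlation function. The synthetic template, absorption line, and grid below are illustrative stand-ins for the fastwind model and the observed spectra.

```python
import numpy as np

C_KMS = 299792.458  # speed of light [km/s]

# Illustrative synthetic data: a template with one absorption line, and an
# "observed" spectrum Doppler-shifted by +30 km/s.
loglam = np.linspace(np.log(4000.0), np.log(4500.0), 4000)
template = 1.0 - 0.5 * np.exp(-0.5 * ((np.exp(loglam) - 4200.0) / 0.8) ** 2)

v_true = 30.0
obs = np.interp(loglam - v_true / C_KMS, loglam, template)

# On a uniform log-wavelength grid, a Doppler shift is a constant pixel
# shift, so RV = best lag x (velocity per pixel).
t = template - template.mean()
o = obs - obs.mean()
lags = np.arange(-50, 51)
ccf = np.array([np.sum(o * np.roll(t, k)) for k in lags])

step_kms = (loglam[1] - loglam[0]) * C_KMS  # velocity per pixel
rv = lags[np.argmax(ccf)] * step_kms
# (A parabolic fit around the CCF maximum would give sub-pixel precision.)
print(f"recovered RV ~ {rv:.1f} km/s (true {v_true} km/s)")
```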
Abstract:
Slag composition determines the physical and chemical properties, as well as the application performance, of molten oxide mixtures. It is therefore necessary to establish a routine instrumental technique that produces accurate and precise analytical results for better process and production control. In the present paper, a multi-component analysis technique for powdered metallurgical slag samples using an X-ray Fluorescence Spectrometer (XRFS) is demonstrated. This technique provides rapid and accurate results with minimal sample preparation. It eliminates the requirement for a fused disc by using briquetted samples protected by a layer of Borax®. While the use of theoretical alpha coefficients has allowed accurate calibrations to be made using fewer standard samples, the application of a pseudo-Voigt function to curve fitting makes it possible to resolve overlapped peaks in X-ray spectra that cannot be physically separated. The analytical results for both certified reference materials and industrial slag samples measured using the present technique are comparable to those obtained for the same samples by conventional fused-disc measurements.
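The pseudo-Voigt profile referred to above is a weighted sum of a Gaussian and a Lorentzian of common width, which is what allows overlapped XRF peaks to be separated numerically. A minimal curve-fitting sketch follows; the peak positions, widths, and noise level are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_voigt(x, amp, cen, fwhm, eta):
    """Weighted Gaussian/Lorentzian mixture sharing one FWHM."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    gamma = fwhm / 2.0
    gauss = np.exp(-((x - cen) ** 2) / (2.0 * sigma**2))
    lorentz = gamma**2 / ((x - cen) ** 2 + gamma**2)
    return amp * (eta * lorentz + (1.0 - eta) * gauss)

def two_peaks(x, a1, c1, w1, e1, a2, c2, w2, e2):
    return pseudo_voigt(x, a1, c1, w1, e1) + pseudo_voigt(x, a2, c2, w2, e2)

# Synthetic spectrum with two heavily overlapped peaks plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 100.0, 500)
y = two_peaks(x, 100, 48, 8, 0.5, 60, 56, 8, 0.5) + rng.normal(0, 1.0, x.size)

# Fit both peaks simultaneously; p0 gives rough initial guesses.
p0 = [90, 46, 6, 0.5, 50, 58, 6, 0.5]
popt, _ = curve_fit(two_peaks, x, y, p0=p0)
print(f"peak centres: {popt[1]:.2f} and {popt[5]:.2f} (true: 48 and 56)")
```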
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other X-ray imaging modalities; as a result of this, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. There is therefore a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for assessing modern CT scanners that implement the aforementioned dose reduction technologies.
Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection, FBP, vs. Advanced Modeled Iterative Reconstruction, ADMIRE). A mathematical observer model (i.e., a computational detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, for large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163%, depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
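For context, a detectability index of the kind reported above can be computed in the Fourier domain from a task function and a noise power spectrum. The sketch below uses the non-prewhitening formulation d′² = [∫|ΔS|²df]² / ∫|ΔS|²·NPS df with invented Gaussian task, MTF, and NPS shapes; none of the numbers correspond to the Mercury Phantom measurements.

```python
import numpy as np

# Minimal non-prewhitening (NPW) model-observer detectability sketch on a
# common 2-D frequency grid.
n, pix = 128, 0.5          # ROI size [px], pixel size [mm] (illustrative)
f = np.fft.fftfreq(n, d=pix)
fx, fy = np.meshgrid(f, f)
fr = np.hypot(fx, fy)      # radial spatial frequency [1/mm]

# Illustrative task: a 3 mm Gaussian lesion of 20 HU contrast, blurred by
# an assumed Gaussian system MTF.
contrast, radius = 20.0, 3.0
task = contrast * np.pi * radius**2 * np.exp(-np.pi**2 * radius**2 * fr**2)
mtf = np.exp(-((fr / 0.6) ** 2))
dS = task * mtf            # expected difference signal in the image

# Illustrative ramp-shaped, FBP-like NPS with a small floor.
nps = 40.0 * fr * np.exp(-((fr / 0.5) ** 2)) + 1.0

df = (f[1] - f[0]) ** 2    # area of one frequency bin
num = (np.sum(dS**2) * df) ** 2
den = np.sum(dS**2 * nps) * df
d_prime = np.sqrt(num / den)
print(f"NPW detectability index d' = {d_prime:.2f}")
```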
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (≤6 mm) low-contrast (≤20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would yield image quality metrics that best correlated with human detection performance. The models included naïve metrics of image quality, such as the contrast-to-noise ratio (CNR), and more sophisticated observer models, such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that the non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
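A channelized Hotelling observer of the kind used here reduces each ROI to a handful of channel outputs and computes a Hotelling (prewhitened) detectability from their class means and pooled covariance. The sketch below uses radially symmetric Gaussian channels and synthetic correlated noise; the channel choice and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(1)

n = 64
y, x = np.mgrid[:n, :n] - n // 2
r = np.hypot(x, y)

signal = 15.0 * np.exp(-r**2 / (2 * 3.0**2))      # illustrative lesion [HU]

def correlated_noise(n_img):
    """White noise smoothed by a small kernel, mimicking CT noise texture."""
    k = np.exp(-(np.arange(-3, 4) ** 2) / 4.0)
    k2 = np.outer(k, k)
    k2 /= k2.sum()
    w = rng.normal(0.0, 12.0, (n_img, n, n))
    return np.stack([convolve(img, k2) for img in w])

absent = correlated_noise(200)                     # signal-absent ROIs
present = correlated_noise(200) + signal           # signal-present ROIs

# Radially symmetric Gaussian channels of increasing width (Gabor or
# Laguerre-Gauss channels are the usual choices in practice).
chans = np.stack([np.exp(-r**2 / (2 * s**2)).ravel()
                  for s in (2.0, 4.0, 8.0, 16.0)], axis=1)

va = absent.reshape(len(absent), -1) @ chans       # channel outputs, absent
vp = present.reshape(len(present), -1) @ chans     # channel outputs, present

s_pooled = 0.5 * (np.cov(va.T) + np.cov(vp.T))     # pooled channel covariance
dv = vp.mean(axis=0) - va.mean(axis=0)
d_prime = float(np.sqrt(dv @ np.linalg.solve(s_pooled, dv)))
print(f"CHO detectability d' = {d_prime:.2f}")
```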
The uniform-background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
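The image-subtraction technique rests on a simple identity: two repeated scans share the same deterministic phantom background, so their difference contains only (doubled) quantum noise, and the per-image noise is std(diff)/√2. A synthetic sketch, with invented background and noise levels:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two repeated "scans" of the same phantom: identical deterministic
# background, independent quantum noise realisations.
background = 50.0 * np.sin(np.linspace(0, 3 * np.pi, 256))[None, :] * \
             np.ones((256, 1))                        # "textured" phantom
scan1 = background + rng.normal(0, 10.0, (256, 256))  # repeat 1
scan2 = background + rng.normal(0, 10.0, (256, 256))  # repeat 2

diff = scan1 - scan2
# Var(diff) = 2 * Var(noise)  ->  sigma = std(diff) / sqrt(2)
sigma = diff.std(ddof=1) / np.sqrt(2.0)
print(f"estimated quantum noise = {sigma:.2f} HU (true 10.00 HU)")
```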
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft-tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., the NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and that texture should be considered when assessing the image quality of iterative algorithms.
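For the rectangular-ROI case, the ensemble NPS estimate is the averaged squared Fourier magnitude of mean-subtracted noise ROIs, scaled by pixel area over ROI size; the dissertation's contribution was generalising this to irregularly shaped ROIs, which this sketch does not attempt. All inputs below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Ensemble NPS estimate from repeated scans, rectangular-ROI form:
#   NPS(f) = (pix^2 / N^2) * < |FFT(ROI - mean)|^2 >
pix = 0.5                                  # pixel size [mm], illustrative
rois = rng.normal(0, 10.0, (50, 64, 64))   # 50 repeat-scan noise ROIs

nx = rois.shape[-1]
rois = rois - rois.mean(axis=(1, 2), keepdims=True)  # detrend each ROI
dft = np.fft.fft2(rois)
nps = (pix**2 / (nx * nx)) * np.mean(np.abs(dft) ** 2, axis=0)

# Sanity check: integrating the NPS over frequency recovers pixel variance.
df = 1.0 / (nx * pix)
var_from_nps = nps.sum() * df**2
print(f"variance from NPS = {var_from_nps:.1f} (direct: {rois.var():.1f})")
```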
To move beyond assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “clustered lumpy background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in the textured phantoms.
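A stripped-down version of clustered lumpy background synthesis: cluster centres are scattered at random, each spawning a handful of small blobs, and the superposition yields a tissue-like texture. The Gaussian blob shape and every parameter here are simplified, illustrative choices rather than the genetic-algorithm-optimized values.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simplified clustered-lumpy-background texture on a 256 x 256 grid.
n = 256
img = np.zeros((n, n))
yy, xx = np.mgrid[:n, :n]

n_clusters = rng.poisson(60)
for _ in range(n_clusters):
    cx, cy = rng.uniform(0, n, 2)              # cluster centre
    for _ in range(rng.poisson(5)):            # blobs per cluster
        bx = cx + rng.normal(0, 6.0)           # blob offset within cluster
        by = cy + rng.normal(0, 6.0)
        img += 5.0 * np.exp(-((xx - bx) ** 2 + (yy - by) ** 2)
                            / (2 * 2.5**2))

print(f"texture mean {img.mean():.1f}, std {img.std():.1f}")
```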
The final component of this project aimed to develop methods for mathematically modeling lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled from images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to accompany the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework can produce reasonably realistic lesion images.
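As an example of the kind of analytical lesion model described, the sketch below parameterises size, contrast, and edge sharpness with a logistic radial profile (an assumed functional form, not the dissertation's equations), voxelises it, and adds it to a background ROI to form a hybrid image.

```python
import numpy as np

def lesion_model(shape, center, radius, contrast, edge_width):
    """Analytical lesion: logistic edge profile on the voxel radius."""
    axes = [np.arange(s) for s in shape]
    grids = np.meshgrid(*axes, indexing="ij")
    r = np.sqrt(sum((g - c) ** 2 for g, c in zip(grids, center)))
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

# Voxelise a 4-voxel-radius, -15 HU lesion and insert it into a synthetic
# liver-like ROI; in practice the insertion can also be done in projection
# (sinogram) space before reconstruction.
roi = np.full((32, 32, 32), 60.0)                  # background [HU]
hybrid = roi + lesion_model(roi.shape, (16, 16, 16), 4.0, -15.0, 0.8)
print(f"lesion centre value: {hybrid[16, 16, 16] - 60.0:.1f} HU")
```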
Based on that result, two studies were conducted to demonstrate the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affect the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used here. That database contained images of the same patients at two dose levels (50% and 100%) along with three reconstruction algorithms from a GE 750HD CT system (GE Healthcare): FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected 5, 3, and 4 of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = −15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5); lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that, compared to FBP, SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased the detectability index by 65%. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% relative to the standard-of-care dose.
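For reference, 2AFC percent correct and the detectability index are linked by Pc = Φ(d′/√2) in the equal-variance Gaussian, signal-known-exactly case, which is how human 2AFC results can be placed on the same scale as model-observer d′ values:

```python
import numpy as np
from scipy.stats import norm

# Conversions between 2AFC percent correct (Pc) and detectability d'.
def pc_to_dprime(pc):
    return np.sqrt(2.0) * norm.ppf(pc)

def dprime_to_pc(dprime):
    return norm.cdf(dprime / np.sqrt(2.0))

for pc in (0.70, 0.80, 0.90):
    print(f"Pc = {pc:.2f}  ->  d' = {pc_to_dprime(pc):.2f}")
```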
In conclusion, this dissertation provides the scientific community with a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality on modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Abstract:
The Body Mass Index (BMI) has been used worldwide as an indicator of fatness. However, the universal cut-off points of the World Health Organisation (WHO) classification may not be appropriate for every ethnic group when considering the relationship with actual total body fatness (%BF). The application of population-specific classifications to assess BMI may be more relevant to public health. Ethnic differences in the BMI-%BF relationship between 45 Japanese and 42 Australian-Caucasian males were assessed using whole-body dual-energy X-ray absorptiometry (DXA) scans and anthropometry following a standard protocol. Japanese males had significantly (p < 0.05) greater %BF at given BMI values than Australian males. When this is taken into account, the newly proposed Asia-Pacific BMI classification, with BMI 23 as overweight and 25 as obese, may better assess the level of obesity associated with increased health risks in this population. To clarify the current findings, further studies comparing these relationships across other Japanese populations are recommended.
Abstract:
By using the Rasch model, developers of survey and assessment instruments, and the researchers who use them, gain access to much detailed diagnostic information. We outline an approach to the analysis of data obtained from the administration of survey instruments that can enable researchers to recognise and diagnose difficulties with those instruments and then to suggest remedial actions that can improve the measurement properties of the scales included in questionnaires. We illustrate the approach using examples drawn from recent research and demonstrate how it can be used to generate figures that make the results of Rasch analyses accessible to non-specialists.
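For readers unfamiliar with it, the dichotomous Rasch model gives the probability of a correct response as P(X_ni = 1) = exp(θ_n − b_i) / (1 + exp(θ_n − b_i)), with person ability θ_n and item difficulty b_i on a common logit scale. The sketch below simulates responses and recovers item difficulties with a deliberately crude logit-of-proportions approximation; production Rasch analyses use joint, conditional, or marginal maximum likelihood instead.

```python
import numpy as np

rng = np.random.default_rng(5)

# Dichotomous Rasch model: P(person n answers item i correctly)
#   P_ni = exp(theta_n - b_i) / (1 + exp(theta_n - b_i))
def rasch_p(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

# Simulate 500 persons x 10 items.
theta = rng.normal(0.0, 1.0, 500)                 # person abilities
b_true = np.linspace(-2.0, 2.0, 10)               # item difficulties
responses = (rng.random((500, 10)) < rasch_p(theta, b_true)).astype(int)

# Crude difficulty estimate: negative logit of the proportion correct,
# centred on the scale. Expect some compression relative to b_true.
p_item = responses.mean(axis=0)
b_est = -np.log(p_item / (1.0 - p_item))
b_est -= b_est.mean()
print(np.round(np.c_[b_true, b_est], 2))
```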
Abstract:
Computed tomography has been used to image and reconstruct in 3-D an Egyptian mummy from the collection of the British Museum. This study of Tjentmutengebtiu, a priestess from the 22nd dynasty (945-715 BC), revealed invaluable information of a scientific, Egyptological, and palaeopathological nature without mutilation or destruction of the painted cartonnage case or linen wrappings. Precise details on the removal of the brain through the nasal cavity and of the viscera from the abdominal cavity were obtained. The nature and composition of the false eyes were investigated. Detailed analysis of the teeth provided a much closer approximation of age at death. The identification of the materials used for the various amulets, including the figures placed in the viscera, was graphically demonstrated using this technique.
Abstract:
X-ray computed tomography (CT) is a medical imaging technique that produces images of trans-axial planes through the human body. Compared with a conventional radiograph, which is an image of many planes superimposed on each other, a CT image exhibits significantly improved contrast, although at the expense of reduced spatial resolution. A CT image is reconstructed mathematically from a large number of one-dimensional projections of the chosen plane. These projections are acquired electronically using a linear array of solid-state detectors and an X-ray source that rotates around the patient. X-ray computed tomography is used routinely in radiological examinations. It has also been found useful in special applications such as radiotherapy treatment planning and three-dimensional imaging for surgical planning.
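The reconstruction-from-projections idea can be demonstrated in a few lines with scikit-image's Radon transform utilities: simulate the one-dimensional projections of a test plane over many angles, then apply ramp-filtered back-projection. The phantom, angle count, and scale are arbitrary choices for illustration.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Filtered back-projection in miniature: forward-project a head-like
# phantom over 180 angles, then reconstruct the trans-axial plane.
image = rescale(shepp_logan_phantom(), 0.5)        # 200 x 200 test plane
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(image, theta=angles)              # 1-D projections
recon = iradon(sinogram, theta=angles, filter_name="ramp")

err = np.sqrt(np.mean((recon - image) ** 2))
print(f"RMS reconstruction error: {err:.4f}")
```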
Abstract:
The aggregate structure that occurs in aqueous smectitic suspensions is responsible for poor water clarification, difficulties in sludge dewatering, and the unusual rheological behaviour of smectite-rich soils. These macroscopic properties are dictated by the 3-D structural arrangement of the finest smectite fraction within flocculated aggregates. Here, we report results from a relatively new technique, Transmission X-ray Microscopy (TXM), which makes it possible to investigate the internal structure and obtain a 3-D tomographic reconstruction of smectite clay aggregates modified by the Al13 Keggin macro-molecule [Al13(O)4(OH)24(H2O)12]7+. Three different treatment methods were shown to result in three different micro-structural environments in the resulting flocculated aggregates.
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code comprises the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written.

The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers.

To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual-energy X-ray photon absorptiometry techniques.

Monte Carlo models of dual-energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual-energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated here as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components: bone mineral, fat, and lean soft tissue. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
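The two-component decomposition at the heart of DEXA, and the reason DPA(+) needs one extra measurement, can be seen in a small worked example: log-transmission at two energies is linear in the areal densities of bone mineral and soft tissue, giving a 2×2 system, while a path-length measurement supplies the third equation needed for a fat/lean/bone split. The attenuation coefficients and densities below are illustrative placeholders, not values from the thesis.

```python
import numpy as np

# Two-component dual-energy decomposition: transmission at two energies
# gives two equations that are linear in the areal densities of bone
# mineral and soft tissue (mass attenuation coefficients are placeholders).
mu = np.array([[0.57, 0.25],    # low energy:  [bone, soft] [cm^2/g]
               [0.18, 0.16]])   # high energy: [bone, soft]

rho_true = np.array([1.2, 20.0])           # areal densities [g/cm^2]
log_t = mu @ rho_true                      # -ln(I/I0) at each energy

# Invert the 2x2 system to recover the areal densities from the
# (noise-free) measurements.
rho_est = np.linalg.solve(mu, log_t)
print(f"bone mineral: {rho_est[0]:.2f} g/cm^2, "
      f"soft tissue: {rho_est[1]:.2f} g/cm^2")
```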
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue, and bone mineral;
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities, and associated models represent a set of significant, accurate, and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.