987 results for X-ray methods


Relevance:

100.00%

Publisher:

Abstract:

Biological soil crusts (BSCs) are formed by aggregates of soil particles and communities of microbial organisms and are common in all drylands. The role of BSCs in infiltration remains uncertain owing to a lack of data on how they affect soil physical properties such as porosity and structure. Quantitative assessment of these properties is hindered primarily by the fragile nature of the crusts. Here we show how combining non-destructive X-ray microtomography (XMT) imaging with the Lattice Boltzmann method (LBM) enables quantification of key soil physical parameters and modeling of water flow through BSC samples from Kalahari Sands, Botswana. We quantify changes in porosity and flow that result from mechanical disturbance of such a fragile cyanobacteria-dominated crust. Results show significant variations in porosity between different crust types and in how they affect flow, and that disturbance of a cyanobacteria-dominated crust breaks down larger pore spaces and reduces flow rates through the surface layer. We conclude that the XMT–LBM approach is well suited to the study of fragile surface crust samples whose physical and hydraulic properties cannot easily be quantified using conventional methods.
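As a rough illustration of the kind of quantity XMT imaging yields before any flow modeling, the sketch below computes porosity, and a slice-by-slice porosity profile, from a segmented binary volume. The volume here is synthetic random data rather than crust imagery, and `porosity_profile` is a hypothetical helper name.

```python
import numpy as np

# Hypothetical segmented micro-CT volume: 1 = pore voxel, 0 = solid voxel.
rng = np.random.default_rng(0)
volume = (rng.random((64, 64, 64)) < 0.35).astype(np.uint8)

def porosity(binary_volume):
    """Overall porosity: the fraction of pore voxels in the volume."""
    return binary_volume.mean()

def porosity_profile(binary_volume, axis=0):
    """Porosity of each slice along one axis (e.g. depth through a crust)."""
    other_axes = tuple(i for i in range(binary_volume.ndim) if i != axis)
    return binary_volume.mean(axis=other_axes)

phi = porosity(volume)
profile = porosity_profile(volume, axis=0)
```

Disturbance of a crust could then be quantified as the change in `phi` (or in the near-surface portion of `profile`) between scans taken before and after the disturbance.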


Objective: The aims of this study were to establish the structure of the potent anticonvulsant enaminone methyl 4-(4′-bromophenyl)amino-6-methyl-2-oxocyclohex-3-en-1-oate (E139) and to determine the energetically preferred conformation of the molecule, which is responsible for its biological activity. Materials and Methods: The structure of the molecule was determined by X-ray crystallography. Theoretical ab initio calculations with different basis sets were used to compare the energies of the different enantiomers and of other structurally related compounds. Results: The X-ray crystal structure revealed two independent molecules of E139, both with absolute configuration C11(S), C12(R), and their inverse. Ab initio calculations with the 6-31G, 3-21G and STO-3G basis sets confirmed that the C11(S), C12(R) enantiomer with both substituents equatorial had the lowest energy. Compared with relevant crystal structures, the geometry of the theoretical structures shows longer C-N and shorter C=O distances, with more cyclohexene ring puckering in the isolated molecule. Conclusion: Based on a pharmacophoric model, it is suggested that the enaminone system HN-C=C-C=O and the 4-bromophenyl group in E139 are necessary to confer the anticonvulsant properties, and that this could lead to the design of new and improved anticonvulsant agents. Copyright © 2003 S. Karger AG, Basel.


Aims. We report results of an X-ray study of the supernova remnant (SNR) G344.7-0.1 and the point-like X-ray source located at the geometrical center of the SNR radio structure. Methods. The morphology and spectral properties of the remnant and the central X-ray point-like source were studied using data from the XMM-Newton and Chandra satellites. Archival radio data and infrared Spitzer observations at 8 and 24 μm were used to compare and study its multi-band properties at different wavelengths. Results. The XMM-Newton and Chandra observations reveal that the overall X-ray emission of G344.7-0.1 is extended and correlates very well with regions of bright radio and infrared emission. The X-ray spectrum is dominated by prominent atomic emission lines. These characteristics suggest that the X-ray emission originated in a thin thermal plasma, whose radiation is well represented by a plane-parallel shock plasma model (PSHOCK). Our study favors the scenario in which G344.7-0.1 is a 6 × 10^3-year-old SNR expanding in a medium with a high density gradient and is most likely encountering a molecular cloud on the western side. In addition, we report the discovery of a soft point-like X-ray source located at the geometrical center of the radio SNR structure. The object presents some characteristics of the so-called compact central objects (CCOs). However, its neutral hydrogen absorption column (N_H) is inconsistent with that of the SNR. Coincident with the position of the source, we found infrared and optical objects with typical early-K star characteristics. The X-ray source may be a foreground star or the CCO associated with the SNR. If the latter possibility were confirmed, the point-like source would be the farthest CCO detected so far and the eighth member of the new population of isolated and weakly magnetized neutron stars.


X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
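The three components above can be sketched in a minimal simulation: a disk-detection task in white noise, a matched-filter "observer" that scores each image, and the area under the ROC curve (AUC) as the performance measure. All parameters here are hypothetical and only illustrate the structure of a task-based assessment.

```python
import numpy as np

rng = np.random.default_rng(1)

# (1) Task: detect a known disk signal in white noise (hypothetical values).
n, amp, sigma = 32, 1.5, 1.0
yy, xx = np.mgrid[:n, :n]
signal = amp * (((xx - n // 2) ** 2 + (yy - n // 2) ** 2) < 25)

absent = rng.normal(0.0, sigma, (200, n, n))
present = rng.normal(0.0, sigma, (200, n, n)) + signal

# (2) Observer: a matched-filter score (inner product with the known signal).
scores_absent = (absent * signal).sum(axis=(1, 2))
scores_present = (present * signal).sum(axis=(1, 2))

# (3) Performance: AUC from all pairwise score comparisons
#     (the probability that a signal-present image outscores an absent one).
auc = (scores_present[:, None] > scores_absent[None, :]).mean()
```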

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
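A minimal sketch of the non-prewhitening matched filter idea follows, assuming uncorrelated noise and a known Gaussian lesion profile (both simplifications; real CT noise is correlated, which is exactly why the channelized models matter). The detectability index d' is the separation of the test-statistic distributions for signal-present and signal-absent images; a naive CNR is computed alongside for contrast.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 32
yy, xx = np.mgrid[:n, :n]
profile = np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / 18.0)  # hypothetical lesion

absent = rng.normal(0.0, 1.0, (500, n, n))
present = rng.normal(0.0, 1.0, (500, n, n)) + 0.4 * profile

# Non-prewhitening matched filter: the template is the signal itself,
# and the test statistic is the inner product of image and template.
t_absent = (absent * profile).sum(axis=(1, 2))
t_present = (present * profile).sum(axis=(1, 2))
d_prime = (t_present.mean() - t_absent.mean()) / np.sqrt(
    0.5 * (t_present.var() + t_absent.var()))

# A naive CNR for comparison: peak contrast over background noise.
cnr = (0.4 * profile.max()) / 1.0
```

Because CNR ignores the lesion's spatial extent and the noise correlation structure, two reconstruction algorithms can produce identical CNR yet very different d', which is consistent with the weak CNR-human correlation reported above.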

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
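The image subtraction technique mentioned above can be sketched as follows: subtracting two repeated scans cancels the deterministic background (anatomy or phantom texture), leaving only the quantum noise of both scans, so the noise of a single scan is the standard deviation of the difference divided by sqrt(2). The data below are synthetic, with a known noise level of 10 for checking.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical deterministic background (stands in for phantom texture).
background = np.sin(np.linspace(0.0, 3.0, 256))[:, None] * 40.0

# Two repeated scans: same background, independent quantum noise.
scan1 = background + rng.normal(0.0, 10.0, (256, 256))
scan2 = background + rng.normal(0.0, 10.0, (256, 256))

# The difference image contains noise from both scans and no background,
# so a single scan's noise is std(diff) / sqrt(2).
diff = scan1 - scan2
noise = diff.std() / np.sqrt(2.0)
```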

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to get ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
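For square ROIs, the standard ensemble NPS estimate looks like the sketch below; the dissertation's contribution is extending this to irregularly shaped ROIs, which this sketch does not attempt. Pixel size and noise level are hypothetical, and a Parseval-style check confirms that integrating the NPS recovers the pixel variance.

```python
import numpy as np

rng = np.random.default_rng(4)
pixel = 0.5  # hypothetical pixel size, mm
rois = rng.normal(0.0, 8.0, (100, 64, 64))  # ensemble of noise-only ROIs

# 2D NPS: detrend each ROI, take |FFT|^2, scale by pixel area / ROI size,
# and average over the ensemble.
detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
nps = (pixel ** 2 / (64 * 64)) * np.abs(np.fft.fft2(detrended)) ** 2
nps = nps.mean(axis=0)

# Parseval check: integrating the NPS over frequency gives the variance.
variance = nps.sum() / (64 * 64 * pixel ** 2)
```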

To move beyond assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom compared to textured phantoms.

The final part of this project aimed to develop methods to mathematically model lesions as a means to help assess image quality directly from patient images. The mathematical modeling framework is presented first. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
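A minimal sketch of the voxelized-lesion idea: an analytical model parameterized by size, contrast, and a smooth edge profile, inserted additively into a background volume to form a "hybrid" image. The sigmoid edge profile and all parameter values are illustrative assumptions, not the dissertation's actual models.

```python
import numpy as np

def lesion_model(shape, center, radius, contrast, edge_width):
    """Voxelize an analytical lesion: a spherical core with a smooth
    sigmoid edge profile, in HU relative to the background."""
    grid = np.indices(shape).astype(float)
    r = np.sqrt(sum((g - c) ** 2 for g, c in zip(grid, center)))
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

# Hypothetical hybrid image: a subtle -15 HU lesion in a fake liver background.
rng = np.random.default_rng(5)
patient = rng.normal(50.0, 10.0, (32, 32, 32))  # stand-in for patient voxels, HU
lesion = lesion_model((32, 32, 32), (16, 16, 16), 5.0, -15.0, 1.0)
hybrid = patient + lesion
```

Because the inserted lesion is known analytically, the ground-truth size, contrast, and location are available exactly when `hybrid` is later used for detection or estimation experiments.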

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patients at two dose levels (50% and 100%) along with three reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.


The goal of this research was to determine the composition of boron deposits produced by pyrolysis of boron tribromide, and to use the results to (a) determine the experimental conditions (reaction temperature, etc.) necessary to produce alpha-rhombohedral boron and (b) guide the development and refinement of the pyrolysis experiments so that large, high purity crystals of alpha-rhombohedral boron can be produced with consistency. Developing a method for producing large, high purity alpha-rhombohedral boron crystals is of interest because such crystals could potentially be used to achieve an alpha-rhombohedral boron based neutron detector design (a solid-state detector) that could serve as an alternative to existing neutron detector technologies. The supply of neutron detectors in the United States has been hampered for a number of years by the current shortage of helium-3 (a gas used in many existing neutron detector technologies); the development of alternative neutron detector technology such as an alpha-rhombohedral boron based detector would help provide a more sustainable supply of neutron detectors in this country. In addition, the concept of an alpha-rhombohedral boron based neutron detector is attractive because it offers the possibility of a design that is smaller, longer-lived, less power-consuming, and potentially more sensitive than existing neutron detectors. The main difficulty associated with creating an alpha-rhombohedral boron based neutron detector is that producing large, high purity crystals of alpha-rhombohedral boron is extremely challenging. Past researchers have successfully made alpha-rhombohedral boron via a number of methods, but no one has developed a method for consistently producing large, high purity crystals.
Alpha-rhombohedral boron is difficult to make because it is only stable at temperatures below around 1100-1200 °C, its formation is very sensitive to impurities, and the conditions necessary for its formation are not fully understood or agreed upon in the literature. In this research, the method of pyrolysis of boron tribromide (hydrogen reduction of boron tribromide) was used to deposit boron on a tantalum filament. The goal was to refine this method, or potentially use it in combination with a second method (amorphous boron crystallization), to the point where it is possible to grow large, high purity alpha-rhombohedral boron crystals with consistency. A pyrolysis apparatus was designed and built, and a number of trials were run to determine the conditions (reaction temperature, etc.) necessary for alpha-rhombohedral boron production. This work focused on the x-ray diffraction analysis of the boron deposits; x-ray diffraction was performed on a number of samples to determine the types of boron (and other compounds) formed in each trial and to guide the choices of test conditions for subsequent trials. It was found that at low reaction temperatures (in the range of around 830-950 °C), amorphous boron was the primary form of boron produced. Reaction temperatures in the range of around 950-1000 °C yielded various combinations of crystalline boron and amorphous boron. In the first trial performed at a temperature of 950 °C, a mix of amorphous boron and alpha-rhombohedral boron was formed. Using a scanning electron microscope, it was possible to see small alpha-rhombohedral boron crystals (on the order of ~1 micron in size) embedded in the surface of the deposit.
In subsequent trials carried out at reaction temperatures in the range of 950-1000 °C, it was found that various combinations of alpha-rhombohedral boron, beta-rhombohedral boron, and amorphous boron were produced; the results tended to be unpredictable (alpha-rhombohedral boron was not produced in every trial), and the factors leading to success or failure were difficult to pinpoint. These results illustrate how sensitive a process producing alpha-rhombohedral boron can be, and indicate that further improvements to the test apparatus and test conditions (for example, higher purity/cleanliness) may be necessary to optimize the boron deposition. Although alpha-rhombohedral boron crystals of large size were not achieved, this research was successful in (a) developing a pyrolysis apparatus and test procedure that can serve as a platform for future testing, (b) determining reaction temperatures at which alpha-rhombohedral boron can form, and (c) developing a consistent process for analyzing the boron deposits and determining their composition. Further experimentation is necessary to achieve a pyrolysis apparatus and test procedure that can yield large alpha-rhombohedral boron crystals with consistency.
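Phase identification from x-ray diffraction rests on Bragg's law, n·λ = 2d·sin θ: each diffraction peak position maps to an interplanar spacing d that can be matched against the known patterns of alpha-rhombohedral, beta-rhombohedral, and amorphous boron. A minimal sketch of that conversion (the peak position below is illustrative, not from the study):

```python
import numpy as np

wavelength = 1.5406  # Cu K-alpha wavelength in angstroms, a common XRD source

def d_spacing(two_theta_deg, n=1):
    """Interplanar spacing (angstroms) from a diffraction peak position
    via Bragg's law: n * lambda = 2 * d * sin(theta)."""
    theta = np.radians(two_theta_deg / 2.0)
    return n * wavelength / (2.0 * np.sin(theta))

# Illustrative peak at 2-theta = 20 degrees.
d = d_spacing(20.0)
```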


Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy-dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables full spectrum CT, in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and are very noisy due to photon starvation. In this work, we proposed two methods based on machine learning to address the spectral distortion issue and to improve the material decomposition. The first approach is to model distortions using an artificial neural network (ANN) and compensate for the distortion in a statistical reconstruction. The second approach is to directly correct for the distortion in the projections. Both techniques can be implemented as a calibration process in which the neural network is trained using 3D-printed phantom data to learn the distortion model or the correction model of the spectral distortion. This replaces the need for the synchrotron measurements required in conventional techniques to derive the distortion model parametrically, which can be costly and time consuming. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion modeling and correction for more accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
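A much-simplified sketch of the calibration idea: learn a mapping from distorted to true spectra using measurements of phantoms whose composition is known. Here a linear least-squares correction stands in for the ANN, and the distortion is an invented mixing matrix over energy bins, not real detector physics; everything below is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(6)
n_bins = 8

# Hypothetical detector distortion (e.g. spectral blurring, charge sharing),
# modeled as a fixed mixing matrix over the energy bins.
D = 0.7 * np.eye(n_bins) + 0.3 * rng.dirichlet(np.ones(n_bins), n_bins)

# Calibration data: known phantom spectra and their noisy, distorted readings.
true_spectra = rng.random((500, n_bins)) * 100.0
measured = true_spectra @ D.T + rng.normal(0.0, 0.5, (500, n_bins))

# Learn the correction (distorted -> true) by least squares; an ANN would
# replace this step to capture nonlinear distortions.
C, *_ = np.linalg.lstsq(measured, true_spectra, rcond=None)

# Apply the learned correction to a new distorted measurement.
test_true = rng.random(n_bins) * 100.0
corrected = (test_true @ D.T) @ C
err = np.abs(corrected - test_true).max()
```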


This work focuses on the construction and application of coded apertures to compressive X-ray tomography. Coded apertures can be made in a number of ways, each method having an impact on system background and signal contrast. Methods of constructing coded apertures for structuring X-ray illumination and scatter are compared and analyzed. Apertures can create structured X-ray bundles that investigate specific sets of object voxels. The tailored bundles of rays form a code (or pattern) and are later estimated through computational inversion. Structured illumination can be used to subsample object voxels and make inversion feasible for low dose computed tomography (CT) systems, or it can be used to reduce background in limited angle CT systems.

On the detection side, coded apertures modulate X-ray scatter signals to determine the position and radiance of scatter points. By forming object dependent projections in measurement space, coded apertures multiplex modulated scatter signals onto a detector. The multiplexed signals can be inverted with knowledge of the code pattern and system geometry. This work shows two systems capable of determining object position and type in a 2D plane, by illuminating objects with an X-ray 'fan beam', using coded apertures and compressive measurements. Scatter tomography can help identify materials in security and medicine that may be ambiguous with transmission tomography alone.
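A toy 1D version of the multiplexing-and-inversion idea: each detector element integrates a known 0/1 coded combination of scene positions, and the scene is recovered by least squares given knowledge of the code. Dimensions, code density, and noise level below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 1D scene of scatter radiances at 16 positions (sparse).
n = 16
scene = np.zeros(n)
scene[[3, 11]] = [5.0, 2.0]

# Coded aperture: each of 32 detector elements sees a known 0/1 combination
# of scene positions, multiplexing the scatter signal onto the detector.
code = (rng.random((32, n)) < 0.5).astype(float)
measurement = code @ scene + rng.normal(0.0, 0.05, 32)

# Computational inversion using knowledge of the code pattern.
estimate, *_ = np.linalg.lstsq(code, measurement, rcond=None)
```

With more scene positions than measurements the same inversion becomes underdetermined, which is where the compressive-sensing formulation (sparsity-promoting recovery) replaces plain least squares.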


We used X-ray fluorescence (XRF) scanning on Site U1338 sediments from Integrated Ocean Drilling Program Expedition 321 to measure sediment geochemical compositions at 2.5 cm resolution for the 450 m of the Site U1338 spliced sediment column. This spatial resolution is equivalent to ~2 k.y. age sampling in the 0-5 Ma section and ~1 k.y. resolution from 5 to 17 Ma. Here we report the data and describe data acquisition conditions to measure Al, Si, K, Ca, Ti, Fe, Mn, and Ba in the solid phase. We also describe a method to convert the data from volume-based raw XRF scan data to a normalized mass measurement ready for calibration by other geochemical methods. Both the raw and normalized data are reported along the Site U1338 splice.
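One simple form of the volume-to-mass normalization described above is to divide each measurement by its total counts, converting raw intensities into relative fractions that are ready for calibration against discrete geochemical measurements. The counts below are invented, and this is only one plausible normalization scheme, not necessarily the one applied to the Site U1338 data.

```python
import numpy as np

elements = ["Al", "Si", "K", "Ca", "Ti", "Fe", "Mn", "Ba"]

# Hypothetical raw XRF scanner counts for three measurement depths.
raw = np.array([
    [120.0, 4100.0, 300.0, 9000.0, 80.0, 700.0, 40.0, 60.0],
    [100.0, 3900.0, 280.0, 9500.0, 75.0, 650.0, 35.0, 55.0],
    [140.0, 4400.0, 330.0, 8400.0, 90.0, 760.0, 45.0, 70.0],
])

# Normalize each depth's measurement by its total counts, removing
# volume/geometry effects and yielding relative mass-like fractions.
normalized = raw / raw.sum(axis=1, keepdims=True)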


Context: The stellar population of the 30 Doradus star-forming region in the Large Magellanic Cloud contains a subset of apparently single, rapidly rotating O-type stars. The physical processes leading to the formation of this cohort are currently uncertain. 

Aims: One member of this group, the late O-type star VFTS 399, is found to be unexpectedly X-ray bright for its bolometric luminosity; in this study we aim to determine its physical nature and the cause of this behaviour. 

Methods: To accomplish this we performed a time-resolved analysis of optical, infrared and X-ray observations. 

Results: We found VFTS 399 to be an aperiodic photometric variable with an apparent near-IR excess. Its optical spectrum demonstrates complex emission profiles in the lower Balmer series and select He I lines; taken together, these suggest an OeBe classification. The highly variable X-ray luminosity is too great to be produced by a single star, while its hard, non-thermal nature suggests the presence of an accreting relativistic companion. Finally, the detection of periodic modulation of the X-ray lightcurve is most naturally explained under the assumption that the accretor is a neutron star. 

Conclusions: VFTS 399 appears to be the first high-mass X-ray binary identified within 30 Dor, sharing many observational characteristics with classical Be X-ray binaries. Comparison of the current properties of VFTS 399 to binary-evolution models suggests a progenitor mass of 25 M☉ for the putative neutron star, which may host a magnetic field comparable in strength to those of magnetars. VFTS 399 is now the second member of the cohort of rapidly rotating "single" O-type stars in 30 Dor to show evidence of binary interaction resulting in spin-up, suggesting that this may be a viable evolutionary pathway for the formation of a subset of this stellar population.


We evaluate the integration of 3D preoperative computed tomography angiography of the coronary arteries with intraoperative 2D X-ray angiographies by a recently proposed registration-by-regression method. The method relates image features of 2D projection images to the transformation parameters of the 3D image. We compared different sets of features and studied the influence of preprocessing the training set. For the registration evaluation, a gold standard was developed from eight X-ray angiography sequences from six different patients. The alignment quality was measured using the 3D mean target registration error (mTRE). The registration-by-regression method achieved moderate accuracy (median mTRE of 15 mm) on real images. It therefore does not yet provide a complete solution to the 3D–2D registration problem, but it could be used as an initialisation method to eliminate the need for manual initialisation.
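The mTRE metric used above can be sketched directly: map a set of target points with both the estimated and the gold-standard transforms, then average the point-to-point distances. The 4x4 homogeneous matrices and the point cloud below are illustrative, with the estimate differing from the gold standard by a pure 5 mm translation so the expected error is known.

```python
import numpy as np

def mtre(points_3d, transform_est, transform_gold):
    """Mean target registration error: mean distance between target points
    mapped by the estimated vs. the gold-standard 4x4 rigid transform."""
    p_est = points_3d @ transform_est[:3, :3].T + transform_est[:3, 3]
    p_gold = points_3d @ transform_gold[:3, :3].T + transform_gold[:3, 3]
    return np.linalg.norm(p_est - p_gold, axis=1).mean()

# Illustrative example: the estimate is off by a 5 mm translation in x.
gold = np.eye(4)
est = np.eye(4)
est[0, 3] = 5.0
points = np.random.default_rng(8).random((100, 3)) * 50.0  # target points, mm
error = mtre(points, est, gold)
```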


An important part of computed tomography is the calculation of a three-dimensional reconstruction of an object from a series of X-ray images. Unfortunately, some applications do not provide sufficient X-ray images; the reconstructed objects then no longer truly represent the original, and inside the volumes the accuracy seems to vary unpredictably. In this paper, we introduce a novel method to evaluate any reconstruction, voxel by voxel. The evaluation is based on a sophisticated probabilistic handling of the measured X-rays, as well as the inclusion of a priori knowledge about the materials of which the examined object consists. For each voxel, the proposed method outputs a numerical value that represents the probability that a predefined material exists at the position of that voxel. Such a probabilistic quality measure was lacking so far. In our experiment, falsely reconstructed areas are detected by their low probability, while in correctly reconstructed areas a high probability predominates. Receiver operating characteristic analysis not only confirms the reliability of our quality measure but also demonstrates that existing methods are less suitable for evaluating a reconstruction.
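The per-voxel probability idea can be illustrated with a single-voxel Bayes rule over a known material list, using a Gaussian likelihood around each material's attenuation coefficient. This is a drastic simplification of the paper's ray-based method; the materials, priors, and noise level are invented for the sketch.

```python
import numpy as np

# A priori knowledge: the object consists of two materials with known
# attenuation coefficients (hypothetical values, arbitrary units).
materials = {"air": 0.0, "aluminium": 1.5}
prior = {"air": 0.5, "aluminium": 0.5}
sigma = 0.2  # assumed noise of the reconstructed voxel value

def material_probability(voxel_value, material):
    """Posterior probability that the voxel contains `material`, via a
    Gaussian likelihood around each material's attenuation coefficient."""
    def lik(m):
        return np.exp(-((voxel_value - materials[m]) ** 2) / (2.0 * sigma ** 2))
    evidence = sum(prior[m] * lik(m) for m in materials)
    return prior[material] * lik(material) / evidence

p_good = material_probability(1.45, "aluminium")  # well-reconstructed voxel
p_bad = material_probability(0.70, "aluminium")   # ambiguous / false voxel
```

A voxel whose value sits near a known material coefficient receives a high probability, while a falsely reconstructed voxel between materials receives a low one, mirroring the detection behaviour reported in the abstract.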


Objective: To apply dual X-ray absorptiometry (DXA) to determine the amount of fat mass, lean mass, and bone mineral density in Mexican schoolchildren with and without obesity. Material and methods: We performed an observational, analytical, comparative, cross-sectional study of 80 Mexican schoolchildren who attended the Nutrition Clinic of the Pediatric Medical Center in Monterrey, Mexico during the period of January to April 2005. Body mass index (BMI) was determined to classify the participants according to the growth charts of the Centers for Disease Control and Prevention. Two groups of 40 children each (with and without obesity) were formed and DXA was carried out on each individual. Cronbach’s alpha was used to determine instrument reliability and the Kolmogorov-Smirnov test was used to test the normality of numerical variables. Means were compared using Student’s t test. Results: Statistically significant differences were found in fat mass (p≤0.001) and lean mass (p≤0.001), but not in bone mineral content (p=0.051), between the two groups. Conclusions: Differences exist in fat mass and lean mass, but not in bone mineral content, between the two groups. A significant positive correlation was found between fat mass, determined by DXA, and BMI in schoolchildren with and without obesity.
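The group comparison reduces to a two-sample Student's t statistic with pooled variance, sketched here on synthetic fat-mass data (the study's actual measurements are not reproduced; group means and spreads below are invented):

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical fat-mass values (kg) for two groups of 40 schoolchildren.
obese = rng.normal(18.0, 4.0, 40)
non_obese = rng.normal(9.0, 3.0, 40)

def student_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))

t = student_t(obese, non_obese)
```

The p-value then follows from the t distribution with na + nb - 2 = 78 degrees of freedom.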


Purpose: To determine the heavy metal and trace element composition of the powdered aerial parts of Origanum sipyleum L. and its water extract. Methods: The heavy metal and trace element content of the powdered plant material and of a 2 % aqueous extract were evaluated by X-ray fluorescence spectroscopy with a silicon drift detector (SDD) at a resolution of 145 eV and 10,000 pulses. The process conditions were 0.1 g sample weight and a process time of 300 s, at voltages of 25 kV and 50 kV and currents of 0.5 and 1.0 mA, under a helium atmosphere. Results: The major elements K, Ca and Na, known as macronutrients, constituted 11990, 10490 and 970 ppm of the powdered drug and 8910, 2991 and 810 ppm of the water extract, respectively. Among the other constituents, arsenic, lead and uranium levels were < 1, 2.1 and < 3 ppm, respectively, in the powdered material, while in the aqueous extract the levels were < 1, < 2 and 200 ppm, respectively. Conclusion: O. sipyleum is a potential source of macro- and micronutrients from which useful food additives and health supplements can be derived.


The Body Mass Index (BMI) has been used worldwide as an indicator of fatness. However, the universal cut-off points of the World Health Organisation (WHO) classification may not be appropriate for every ethnic group when considering the relationship with actual total body fatness (%BF). The application of population-specific classifications to assess BMI may be more relevant to public health. Ethnic differences in the BMI-%BF relationship between 45 Japanese and 42 Australian-Caucasian males were assessed using whole-body dual-energy X-ray absorptiometry (DXA) scans and anthropometry using a standard protocol. Japanese males had significantly (p<0.05) greater %BF at given BMI values than Australian males. When this is taken into account, the newly proposed Asia-Pacific classification of BMI 23 as overweight and BMI 25 as obese may better assess the level of obesity associated with increased health risks for this population. To clarify the current findings, further studies that compare the relationships across other Japanese populations are recommended.


By using the Rasch model, much detailed diagnostic information is available to developers of survey and assessment instruments and to the researchers who use them. We outline an approach to the analysis of data obtained from the administration of survey instruments that can enable researchers to recognise and diagnose difficulties with those instruments and then to suggest remedial actions that can improve the measurement properties of the scales included in questionnaires. We illustrate the approach using examples drawn from recent research and demonstrate how the approach can be used to generate figures that make the results of Rasch analyses accessible to non-specialists.
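The dichotomous Rasch model underlying such analyses gives the probability of endorsing an item as a logistic function of the difference between person ability θ and item difficulty b, sketched below (the parameter values are illustrative):

```python
import numpy as np

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability of a correct or endorsed
    response for a person of ability theta on an item of difficulty b,
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# When ability equals difficulty, the probability is exactly 0.5;
# an able person on an easy item endorses with high probability.
p_equal = rasch_probability(1.0, 1.0)
p_easy = rasch_probability(2.0, -1.0)
```

Diagnostic statistics such as item fit compare the observed response patterns against the probabilities this model predicts, which is the basis of the remedial actions described above.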