Abstract:
Air pollution is a widespread health problem associated with respiratory symptoms. Continuous exposure monitoring was performed to estimate the alveolar and tracheobronchial dose, measured as deposited surface area, for 103 children, and to evaluate the long-term effects of exposure to airborne particles through spirometry, skin prick tests and measurement of exhaled nitric oxide (eNO). The mean daily alveolar deposited surface area dose received by the children was 1.35×10³ mm². The lowest and highest particle number concentrations were found during sleeping and eating time, respectively. A significant negative association was found between changes in pulmonary function tests and individual dose estimates. For eNO, significant differences were found between healthy subjects and asthmatic children, children with allergic rhinitis and children sensitised to allergens. Variation in a child's activity over time appeared to have a strong impact on respiratory outcomes, which indicates that personal monitoring is vital for assessing the expected health effects of exposure to particles.
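As a rough illustration of the kind of dose estimate described above, the sketch below sums activity-specific contributions to a daily alveolar deposited surface area dose. The activity names, concentrations and inhalation rates are hypothetical and are not taken from the study; it is a minimal sketch of the calculation, not the authors' method.

# Illustrative sketch (hypothetical values): daily alveolar deposited
# surface-area dose as the sum of activity-specific contributions.
# Dose = deposited surface area concentration (um^2/cm^3)
#        x inhalation rate (cm^3/min) x duration (min).

activities = [
    # (name, DSA concentration [um^2/cm^3], inhalation rate [cm^3/min], minutes)
    ("sleeping",   18.0,  5000.0, 540),
    ("school",     45.0,  8000.0, 360),
    ("eating",     90.0,  8000.0, 120),
    ("commuting",  70.0, 12000.0,  60),
    ("other",      40.0,  8000.0, 360),
]

daily_dose_um2 = sum(c * q * t for _, c, q, t in activities)
daily_dose_mm2 = daily_dose_um2 / 1e6   # 1 mm^2 = 1e6 um^2
print(f"Estimated daily alveolar deposited surface area dose: {daily_dose_mm2:.1f} mm^2")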
Abstract:
Introduction: The accurate identification of tissue electron densities is of great importance for Monte Carlo (MC) dose calculations. When converting patient CT data into a voxelised format suitable for MC simulations, however, it is common to simplify the assignment of electron densities so that the complex tissues existing in the human body are categorised into a few basic types. This study examines the effects that the assignment of tissue types and the calculation of densities can have on the results of MC simulations, for the particular case of a Siemens Sensation 4 CT scanner located in a radiotherapy centre where QA measurements are routinely made using 11 tissue types (plus air). Methods: DOSXYZnrc phantoms are generated from CT data, using the CTCREATE user code, with the relationship between Hounsfield units (HU) and density determined via linear interpolation between a series of specified points on the ‘CT-density ramp’ (see Figure 1(a)). Tissue types are assigned according to HU ranges. Each voxel in the DOSXYZnrc phantom therefore has an electron density (electrons/cm³) defined by the product of the mass density (from the HU conversion) and the intrinsic electron density (electrons/gram) (from the material assignment) in that voxel. In this study, we consider the problems of density conversion and material identification separately: the CT-density ramp is simplified by decreasing the number of points which define it from 12 down to 8, 3 and 2; and the material-type assignment is varied by defining the materials which comprise our test phantom (a Supertech head) as two tissues and bone, two plastics and bone, water only and (as an extreme case) lead only. The effect of these parameters on radiological thickness maps derived from simulated portal images is investigated. Results & Discussion: Increasing the degree of simplification of the CT-density ramp has an increasing effect on the radiological thickness calculated for the Supertech head phantom. For instance, defining the CT-density ramp using 8 points, instead of 12, results in a maximum radiological thickness change of 0.2 cm, whereas defining the CT-density ramp using only 2 points results in a maximum radiological thickness change of 11.2 cm. Changing the definition of the materials comprising the phantom between water, plastic and tissue results in millimetre-scale changes to the resulting radiological thickness. When the entire phantom is defined as lead, this alteration changes the calculated radiological thickness by a maximum of 9.7 cm. Evidently, the simplification of the CT-density ramp has a greater effect on the resulting radiological thickness map than does the alteration of the assignment of tissue types. Conclusions: It is possible to alter the definitions of the tissue types comprising the phantom (or patient) without substantially altering the results of simulated portal images. However, these images are very sensitive to the accurate identification of the HU-density relationship. When converting data from a patient’s CT into an MC simulation phantom, therefore, all possible care should be taken to accurately reproduce the conversion between HU and mass density, for the specific CT scanner used. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital (RBWH), Brisbane, Australia.
The authors are grateful to the staff of the RBWH, especially Darren Cassidy, for assistance in obtaining the phantom CT data used in this study. The authors also wish to thank Cathy Hargrave, of QUT, for assistance in formatting the CT data, using the Pinnacle TPS. Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.
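To illustrate the conversion scheme described in the Methods above (a piecewise-linear HU-to-density ramp combined with HU-range material assignment), the sketch below shows how a single voxel's mass and electron density could be derived. The ramp points, HU ranges and electrons-per-gram values are hypothetical and do not reproduce the CTCREATE/DOSXYZnrc data used in the study.

# Illustrative sketch (hypothetical values): HU -> mass density via a
# piecewise-linear CT-density ramp, plus material assignment by HU range.
import numpy as np

# Hypothetical CT-density ramp points: (HU, mass density g/cm^3)
ramp_hu      = np.array([-1000, -100,    0,  100, 1000, 3000])
ramp_density = np.array([0.001, 0.93, 1.00, 1.09, 1.60, 2.80])

# Hypothetical material table: (upper HU bound, name, electrons per gram)
materials = [
    (-950, "AIR",    3.01e23),
    (-100, "LUNG",   3.31e23),
    ( 100, "TISSUE", 3.31e23),
    (3000, "BONE",   3.19e23),
]

def voxel_properties(hu):
    """Return (material name, mass density g/cm^3, electron density e-/cm^3) for one voxel."""
    rho = np.interp(hu, ramp_hu, ramp_density)        # linear interpolation on the ramp
    for upper, name, e_per_g in materials:
        if hu <= upper:
            return name, rho, rho * e_per_g           # e-/cm^3 = (g/cm^3) x (e-/g)
    return materials[-1][1], rho, rho * materials[-1][2]

print(voxel_properties(250.0))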
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to an MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of the spatial resolution and are able to interpolate between dose grids for the comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
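As an illustration of the dose-comparison step described above, the sketch below implements a simple global gamma evaluation between a TPS dose profile and an MC dose profile, reduced to 1D for clarity. The 3%/3 mm criteria and the test profiles are hypothetical; the authors' resolution-independent implementation is not reproduced here.

# Illustrative sketch (hypothetical data): global gamma evaluation in 1D.
import numpy as np

def gamma_1d(dose_ref, dose_eval, positions, dose_crit=0.03, dist_crit=3.0):
    """Return the gamma index at each reference point (global normalisation)."""
    norm = dose_crit * dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (d_r, x_r) in enumerate(zip(dose_ref, positions)):
        dd = (dose_eval - d_r) / norm             # dose difference term
        dx = (positions - x_r) / dist_crit        # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()  # minimum over all evaluated points
    return gammas

x   = np.arange(0.0, 100.0, 1.0)                  # positions in mm
tps = np.exp(-((x - 50.0) / 20.0) ** 2)           # hypothetical TPS profile
mc  = np.exp(-((x - 50.5) / 20.0) ** 2) * 1.01    # hypothetical MC profile
g = gamma_1d(tps, mc, x)
print(f"Gamma pass rate (gamma <= 1): {100.0 * (g <= 1.0).mean():.1f}%")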
Abstract:
Nonthermal plasma (NTP) treatment of exhaust gas is a promising technology for the reduction of both nitrogen oxides (NOX) and particulate matter (PM) by introducing plasma into the exhaust gases. This paper considers the effect of NTP on PM mass reduction, PM size distribution and PM removal efficiency. The experiments are performed on real exhaust gases from a diesel engine. The NTP is generated by applying high-voltage pulses from a pulsed power supply across a dielectric barrier discharge (DBD) reactor. The effects of applied high-voltage pulses of up to 19.44 kVpp with a repetition rate of 10 kHz are investigated. It is shown that the PM removal efficiency and the PM size distribution need to be considered together, as it is possible to achieve a high PM removal efficiency with an undesirable increase in the number of small particles. Taking both of these factors into account, a voltage level of 17 kVpp is determined to be the optimum point for the given configuration. Moreover, particle deposition on the surface of the DBD reactor is found to be a significant phenomenon, which should be considered in all plasma PM removal tests.
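The sketch below illustrates why removal efficiency and size distribution must be read together: a size-resolved efficiency calculated from upstream and downstream number concentrations can be strongly positive overall while the smallest size bins show an increase. The bin edges and concentrations are hypothetical, not measured values from this study.

# Illustrative sketch (hypothetical data): size-resolved PM removal efficiency.
size_bins_nm = [30, 60, 120, 250, 500]
upstream     = [1.0e6, 2.5e6, 1.8e6, 9.0e5, 3.0e5]   # particles/cm^3 before the DBD reactor
downstream   = [1.6e6, 1.5e6, 6.0e5, 2.0e5, 5.0e4]   # small particles may increase after treatment

for d, c_in, c_out in zip(size_bins_nm, upstream, downstream):
    eff = 100.0 * (c_in - c_out) / c_in
    print(f"{d:4d} nm: removal efficiency {eff:6.1f}%")

total_eff = 100.0 * (1.0 - sum(downstream) / sum(upstream))
print(f"Total number-based removal efficiency: {total_eff:.1f}%")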
Abstract:
Accuracy of dose delivery in external beam radiotherapy is usually verified with electronic portal imaging (EPI), in which the treatment beam is used to check the positioning of the patient. However, the resulting megavoltage x-ray images suffer from poor quality. The image quality can be improved by developing a special operating mode in the linear accelerator, in which the existing treatment beam is modified so that it produces enough low-energy photons for imaging. In this work the problem of optimizing the beam/detector combination to achieve optimal electronic portal image quality is addressed. The linac used for this study was modified to produce two experimental photon beams. These beams, named Al6 and Al10, were non-flat and were produced by 4 MeV electrons hitting aluminum targets 6 and 10 mm thick, respectively. The images produced by a conventional EPI system (a 6 MV treatment beam and a camera-based EPID with a Cu plate and Gd2O2S screen) were compared with the images produced by the experimental beams and various screens (with the same camera). The contrast of 0.8 cm of bone-equivalent material in 5 cm of water increased from 1.5% for the conventional system to 11% for the combination of the Al6 beam with a 200 mg/cm² Gd2O2S screen. The signal-to-noise ratio calculated for 1 cGy flood-field images increased by about a factor of two for the same EPI systems. The spatial resolution of the two imaging systems was comparable. This work demonstrates that significant improvements in portal image contrast can be obtained by simultaneous optimization of the linac spectrum and the EPI detector.
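The sketch below shows how the two image-quality figures quoted above (object contrast and flood-field signal-to-noise ratio) can be computed from image regions. The synthetic image, region positions and noise level are hypothetical and only illustrate the definitions, not the study's measurement procedure.

# Illustrative sketch (hypothetical synthetic image): contrast and SNR metrics.
import numpy as np

rng = np.random.default_rng(0)
flood = rng.normal(1000.0, 15.0, size=(256, 256))   # hypothetical flood-field image
image = flood.copy()
image[120:136, 120:136] *= 0.90                      # hypothetical bone-equivalent insert

signal_obj = image[120:136, 120:136].mean()
signal_bg  = image[20:60, 20:60].mean()
contrast = 100.0 * (signal_bg - signal_obj) / signal_bg
snr = flood.mean() / flood.std()

print(f"Contrast: {contrast:.1f}%   SNR (flood field): {snr:.1f}")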
Abstract:
We have developed a new method of calibrating portal images of IMRT beams and used it to measure patient set-up accuracy and delivery errors, such as leaf errors and segment intensity errors, during treatment. A calibration technique was used to remove the intensity modulations from the images, leaving equivalent open-field images that show the patient anatomy and can be used to verify the patient position. The images of the treatment beam can also be used to verify the delivery of the beam in terms of multileaf collimator leaf position and dosimetric errors. A series of controlled experiments delivering an IMRT anterior beam to the head and neck of a humanoid phantom was undertaken. A 2 mm translation in the position of the phantom could be detected. With the intentional introduction of delivery errors into the beam, this method allowed us to detect leaf positioning errors of 2 mm and variations in monitor units of 1%. The method was then applied to the case of a patient who received IMRT treatment to the larynx and cervical nodes. The anterior IMRT beam was imaged during four fractions and the images were calibrated and investigated for the characteristic signs of patient position error and delivery error that were shown in the control experiments. No significant errors were seen. The method of imaging the IMRT beam and calibrating the images to remove the intensity modulations can be a useful tool in verifying both the patient position and the delivery of the beam.
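The sketch below illustrates the general idea of "calibrating out" the intensity modulation: dividing the acquired portal image by a predicted modulation (fluence) map leaves an equivalent open-field image of the anatomy. The arrays and the modulation map are hypothetical; the authors' calibration technique is not reproduced here.

# Illustrative sketch (hypothetical data): removing IMRT intensity modulation.
import numpy as np

acquired   = np.random.default_rng(1).normal(1.0, 0.02, (64, 64))   # stand-in open-field anatomy image
modulation = np.linspace(0.3, 1.0, 64)[None, :]                     # hypothetical fluence/modulation map
acquired_modulated = acquired * modulation                          # what the EPID would record

eps = 1e-6
open_field_equivalent = acquired_modulated / np.maximum(modulation, eps)
print(np.allclose(open_field_equivalent, acquired))                 # modulation removed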
Abstract:
The electron Volt Spectrometer (eVS) is an inverse geometry filter difference spectrometer that has been optimised to measure the single-atom properties of condensed matter systems using a technique known as Neutron Compton Scattering (NCS) or Deep Inelastic Neutron Scattering (DINS). The spectrometer utilises the high flux of epithermal neutrons produced by the ISIS neutron spallation source, enabling the direct measurement of atomic momentum distributions and ground state kinetic energies. In this paper the procedure used to calibrate the spectrometer is described. This includes details of the method used to determine detector positions and neutron flight path lengths, as well as the determination of the instrument resolution. Examples of measurements on three different samples (ZrH2, 4He and Sn) are shown, which demonstrate the self-consistency of the calibration procedure.
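The flight-path calibration described above rests on the basic time-of-flight relation for an inverse-geometry instrument, t = t0 + L0/v0 + L1/v1, where the final neutron energy is fixed by the analyser foil. The sketch below evaluates this relation; the path lengths, delay and energies are hypothetical and are not eVS calibration constants.

# Illustrative sketch (hypothetical values): total neutron time of flight.
import math

M_N = 1.674927e-27      # neutron mass, kg
EV  = 1.602177e-19      # joules per eV

def tof_us(E0_eV, E1_eV, L0_m, L1_m, t0_us=0.0):
    """Total time of flight in microseconds for incident energy E0 and final energy E1."""
    v0 = math.sqrt(2.0 * E0_eV * EV / M_N)   # incident speed, m/s
    v1 = math.sqrt(2.0 * E1_eV * EV / M_N)   # final speed fixed by the analyser foil
    return t0_us + 1e6 * (L0_m / v0 + L1_m / v1)

# Hypothetical geometry: 11 m incident path, 0.7 m final path, 5 eV analysed at 4.9 eV.
print(f"{tof_us(5.0, 4.9, 11.0, 0.7):.1f} us")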
Abstract:
The sheep (Ovis aries) is commonly used as a large animal model in skeletal research. Although the sheep genome has been sequenced, there are still only a limited number of annotated mRNA sequences in public databases. A complementary DNA (cDNA) library was constructed to provide a generic resource for further exploration of genes that are actively expressed in bone cells in sheep. It was anticipated that the cDNA library would provide molecular tools for further research into the process of fracture repair and bone homeostasis, and add to the existing body of knowledge. One of the hallmarks of cDNA libraries has been the identification of novel genes, and in this library the full open reading frame of the gene C12orf29 was cloned and characterised. This gene codes for a protein of unknown function with a molecular weight of 37 kDa. A literature search showed that no previous studies had been conducted into the biological role of C12orf29, except for some bioinformatics studies that suggested a possible link with cancer. Phylogenetic analyses revealed that C12orf29 had an ancient pedigree, with a homologous gene found in some bacterial taxa. This implied that the gene was present in the last common eukaryotic ancestor, thought to have existed more than 2 billion years ago. This notion was further supported by the fact that the gene is found in taxa belonging to the two major eukaryotic branches, bikonts and unikonts. In the bikont supergroup, a C12orf29-like gene was found in the single-celled protist Naegleria gruberi, whereas in the unikont supergroup, encompassing the metazoa, the gene is universal to all chordate and, therefore, vertebrate species. It appears to have been lost in the majority of cnidarian and protostome taxa; however, C12orf29-like genes have been found in the cnidarian freshwater hydra and the protostome Pacific oyster. The experimental data indicate that C12orf29 has a structural role in skeletal development and tissue homeostasis, whereas in silico analysis of the human C12orf29 promoter region suggests that its expression is potentially under the control of the NOTCH, WNT and TGF-β developmental pathways, as well as SOX9 and BAPX1; pathways that are all heavily involved in skeletogenesis. Taken together, this investigation provides strong evidence that C12orf29 has a very important role in the chordate body plan, in early skeletal development and cartilage homeostasis, and also a possible link with spina bifida in humans.
Abstract:
This thesis is about the use of different cells for bone tissue engineering. The cells were used in combination with a novel biomaterial in a large tibial bone defect in a sheep model. Furthermore, this study developed a novel cell delivery procedure for bone tissue engineering. This novel procedure of cell delivery could overcome the current problems of cell-based tissue engineering and serve as a baseline for the translation of novel concepts into clinical application.
Abstract:
A synthetic reevesite-like material has been shown to decolorize selected dyes and degrade phenolic contaminants photocatalytically in water when irradiated with visible light. This material can photoactively decolorize dyes such as bromophenol blue, bromocresol green, bromothymol blue, thymol blue and methyl orange in less than 15 min under visible light radiation in the absence of additional oxidizing agents. Conversely, phenolic compounds such as phenol, p-chlorophenol and p-nitrophenol are photocatalytically degraded in approximately 3 h with additional H2O2 when irradiated with visible light. These reactions offer potentially energy-effective pathways for the removal of recalcitrant organic waste contaminants.
Abstract:
This study aimed to quantify the efficiency of deep bag and electrostatic filters, and to assess the influence of ventilation systems using these filters on indoor fine (<2.5 µm) and ultrafine particle concentrations in commercial office buildings. Measurements and modelling were conducted for different indoor and outdoor particle source scenarios at three office buildings in Brisbane, Australia. Overall, the in-situ efficiency of the deep bag filters, measured for particles in the 6 to 3000 nm size range, ranged from 26.3 to 46.9% across the three buildings, while the in-situ efficiency of the electrostatic filter in one building was 60.2%. The highest PN and PM2.5 concentrations in one of the office buildings (up to 131% and 31% higher than the other two buildings, respectively) were due to the proximity of the building’s HVAC air intakes to a nearby bus-only roadway, as well as its higher outdoor ventilation rate. The lowest PN and PM2.5 concentrations (up to 57% and 24% lower than the other two buildings, respectively) were measured in a building that utilised both outdoor and mixing air filters in its HVAC system. Indoor PN concentrations were strongly influenced by outdoor levels and were significantly higher during rush hours (up to 41%) and nucleation events (up to 57%), compared to working hours, for all three buildings. This is the first time that the influence of new particle formation on indoor particle concentrations has been identified and quantified. A dynamic model for indoor PN concentration, which performed adequately in this study, also revealed that using mixing/outdoor air filters can significantly reduce indoor particle concentrations in buildings where indoor air is strongly influenced by outdoor particle levels. This work provides a scientific basis for the selection and location of appropriate filters and outdoor air intakes during the design of new, or the upgrade of existing, building HVAC systems. The results also serve to provide a better understanding of indoor particle dynamics and behaviour under different ventilation and particle source scenarios, and highlight effective methods to reduce exposure to particles in commercial office buildings.
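The sketch below shows the general form of a dynamic indoor particle number balance with an HVAC filter, dC_in/dt = λ(1 − η)C_out − (λ + k)C_in, where λ is the air exchange rate, η the filter efficiency and k the deposition rate. It is a minimal sketch with hypothetical parameter values; the authors' model is not reproduced here.

# Illustrative sketch (hypothetical parameters): indoor PN mass-balance model.
import numpy as np

def indoor_pn(c_out, air_exchange_per_h, filter_eff, deposition_per_h, dt_h, c0=0.0):
    """Integrate indoor concentration over time with a simple Euler step."""
    c_in = np.empty_like(c_out)
    c = c0
    for i, co in enumerate(c_out):
        dc = air_exchange_per_h * (1.0 - filter_eff) * co - \
             (air_exchange_per_h + deposition_per_h) * c
        c += dc * dt_h
        c_in[i] = c
    return c_in

hours   = np.arange(0.0, 24.0, 0.1)
outdoor = 8e3 + 6e3 * np.exp(-((hours - 8.0) / 1.5) ** 2)   # hypothetical rush-hour peak, #/cm^3
indoor  = indoor_pn(outdoor, air_exchange_per_h=2.0, filter_eff=0.45,
                    deposition_per_h=0.2, dt_h=0.1)
print(f"Peak outdoor: {outdoor.max():.0f} /cm^3, peak indoor: {indoor.max():.0f} /cm^3")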