971 results for Kinetic Monte Carlo code (kMC)


Relevance: 100.00%

Abstract:

Introduction: The accurate identification of tissue electron densities is of great importance for Monte Carlo (MC) dose calculations. When converting patient CT data into a voxelised format suitable for MC simulations, however, it is common to simplify the assignment of electron densities so that the complex tissues existing in the human body are categorized into a few basic types. This study examines the effects that the assignment of tissue types and the calculation of densities can have on the results of MC simulations, for the particular case of a Siemens Sensation 4 CT scanner located in a radiotherapy centre where QA measurements are routinely made using 11 tissue types (plus air). Methods: DOSXYZnrc phantoms are generated from CT data, using the CTCREATE user code, with the relationship between Hounsfield units (HU) and density determined via linear interpolation between a series of specified points on the ‘CT-density ramp’ (see Figure 1(a)). Tissue types are assigned according to HU ranges. Each voxel in the DOSXYZnrc phantom therefore has an electron density (electrons/cm3) defined by the product of the mass density (from the HU conversion) and the intrinsic electron density (electrons/gram, from the material assignment) in that voxel. In this study, we consider the problems of density conversion and material identification separately: the CT-density ramp is simplified by decreasing the number of points which define it from 12 down to 8, 3 and 2; and the material-type assignment is varied by defining the materials which comprise our test phantom (a Supertech head) as two tissues and bone, two plastics and bone, water only and (as an extreme case) lead only. The effect of these parameters on radiological thickness maps derived from simulated portal images is investigated. Results & Discussion: Increasing the degree of simplification of the CT-density ramp has an increasing effect on the radiological thickness calculated for the Supertech head phantom. For instance, defining the CT-density ramp using 8 points, instead of 12, results in a maximum radiological thickness change of 0.2 cm, whereas defining the CT-density ramp using only 2 points results in a maximum radiological thickness change of 11.2 cm. Changing the definition of the materials comprising the phantom between water, plastic and tissue results in millimetre-scale changes to the resulting radiological thickness. When the entire phantom is defined as lead, this alteration changes the calculated radiological thickness by a maximum of 9.7 cm. Evidently, the simplification of the CT-density ramp has a greater effect on the resulting radiological thickness map than does the alteration of the assignment of tissue types. Conclusions: It is possible to alter the definitions of the tissue types comprising the phantom (or patient) without substantially altering the results of simulated portal images. However, these images are very sensitive to the accurate identification of the HU-density relationship. When converting data from a patient’s CT into a MC simulation phantom, therefore, all possible care should be taken to accurately reproduce the conversion between HU and mass density, for the specific CT scanner used. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital (RBWH), Brisbane, Australia.
The authors are grateful to the staff of the RBWH, especially Darren Cassidy, for assistance in obtaining the phantom CT data used in this study. The authors also wish to thank Cathy Hargrave, of QUT, for assistance in formatting the CT data, using the Pinnacle TPS. Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.
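
A minimal sketch of the voxel-level conversion described in the abstract above: mass density obtained by linear interpolation on a piecewise 'CT-density ramp', multiplied by an intrinsic electron density (electrons/gram) selected by HU range. The ramp points, HU boundaries and intrinsic electron densities below are illustrative placeholders rather than the values used in the study; a real ramp must be measured for the specific CT scanner.

```python
import numpy as np

# Hypothetical (HU, mass density in g/cm3) pairs defining a CT-density ramp;
# the study's ramps used 12, 8, 3 or 2 such points measured for the scanner.
ramp_hu = np.array([-1000.0, -100.0, 0.0, 100.0, 1000.0, 3000.0])
ramp_density = np.array([0.001, 0.93, 1.00, 1.10, 1.60, 2.80])

def hu_to_mass_density(hu):
    """Mass density (g/cm3) by linear interpolation on the CT-density ramp."""
    return float(np.interp(hu, ramp_hu, ramp_density))

# Hypothetical material assignment by HU range:
# (upper HU bound, material name, intrinsic electron density in electrons/gram).
materials = [
    (-950.0, "air", 3.01e23),
    (100.0, "soft tissue", 3.31e23),
    (3000.0, "bone", 3.19e23),
]

def voxel_electron_density(hu):
    """Electron density (electrons/cm3) = mass density * intrinsic electron density."""
    rho = hu_to_mass_density(hu)
    for upper_hu, name, electrons_per_gram in materials:
        if hu <= upper_hu:
            return name, rho * electrons_per_gram
    name, electrons_per_gram = materials[-1][1], materials[-1][2]
    return name, rho * electrons_per_gram

print(voxel_electron_density(40.0))   # a soft-tissue-like voxel
```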

Relevance: 100.00%

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate between dose grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
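
The plan-import step extracts beam parameters (monitor units, jaw positions, gantry orientation) from the DICOM RT Plan exported by the TPS; the study did this with the PixelMed Java DICOM Toolkit. Purely as an illustration of the kind of parsing involved, the same fields can be read in Python with pydicom; the file name is a placeholder and asymmetric-jaw/MLC handling is omitted.

```python
import pydicom

# Illustration only: read a few beam parameters from an exported DICOM RT Plan.
# The study used the PixelMed Java DICOM Toolkit; pydicom is a stand-in here.
ds = pydicom.dcmread("plan.dcm")   # placeholder file name

# Monitor units per beam live in the FractionGroup / ReferencedBeam entries.
mu_per_beam = {int(rb.ReferencedBeamNumber): float(rb.BeamMeterset)
               for rb in ds.FractionGroupSequence[0].ReferencedBeamSequence}

for beam in ds.BeamSequence:
    cp0 = beam.ControlPointSequence[0]      # first control point of the beam
    jaws = {bld.RTBeamLimitingDeviceType: [float(p) for p in bld.LeafJawPositions]
            for bld in cp0.BeamLimitingDevicePositionSequence}
    print(beam.BeamName,
          "| gantry:", float(cp0.GantryAngle),
          "| MU:", mu_per_beam.get(int(beam.BeamNumber)),
          "| X jaws:", jaws.get("X", jaws.get("ASYMX")),
          "| Y jaws:", jaws.get("Y", jaws.get("ASYMY")))
```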

Relevance: 100.00%

Abstract:

Stereotactic radiosurgery treatments involve the delivery of very high doses for a small number of fractions. To date, there is limited data on skin dose for the very small field sizes used in these treatments. In this work, we determine relative surface doses for the small circular collimators used in stereotactic radiosurgery treatments. Monte Carlo calculations were performed using the BEAMnrc code with a model of the Novalis 15 Trilogy linear accelerator and the BrainLab circular collimators. The surface doses were calculated at the ICRU skin dose depth of 70 µm, all using the 6 MV SRS x-ray beam. The calculated surface doses varied between 15% and 12%, decreasing as the field size increased from 4 to 30 mm. In comparison, surface doses were measured using Gafchromic EBT3 film positioned at the surface of a Virtual Water phantom. The absolute agreement between calculated and measured surface doses was better than 2.5%, which is well within the 2σ uncertainties of the Monte Carlo calculations and the film measurements. Based on these results, we have shown that Gafchromic EBT3 film is suitable for surface dose estimates in the very small fields used in SRS.

Relevance: 100.00%

Abstract:

To obtain accurate Monte Carlo simulations of small radiation fields, it is important to model the initial source parameters (electron energy and spot size) accurately. However, recent studies have shown that small field dosimetry correction factors are insensitive to these parameters. The aim of this work is to extend this concept to test whether these parameters affect dose perturbations in general, which is important for detector design and for calculating perturbation correction factors. The EGSnrc C++ user code cavity was used for all simulations. Varying amounts of air between 0 and 2 mm were deliberately introduced upstream of a diode and the dose perturbation caused by the air was quantified. These simulations were then repeated using a range of initial electron energies (5.5 to 7.0 MeV) and electron spot sizes (0.7 to 2.2 FWHM). The resultant dose perturbations were large. For example, 2 mm of air caused a dose reduction of up to 31% when simulated with a 6 mm field size. However, these values did not vary by more than 2% when simulated across the full range of source parameters tested. If a detector is modified by the introduction of air, one can therefore be confident that the response of the detector will be the same across all similar linear accelerators, and Monte Carlo modelling of each individual machine is not required.
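
The analysis amounts to a perturbation factor (dose with the upstream air gap divided by dose without it) recomputed over a grid of source parameters, with the conclusion resting on that factor varying by less than about 2% across the grid. A schematic sketch of the bookkeeping follows; run_cavity_dose() is a placeholder returning toy numbers and stands in for the actual EGSnrc cavity simulations.

```python
import itertools

def run_cavity_dose(energy_mev, spot_fwhm, air_gap_mm):
    # Placeholder for an EGSnrc 'cavity' simulation; returns a toy value so the
    # loop below executes. Real doses must come from the Monte Carlo runs.
    return 1.0 - 0.155 * air_gap_mm + 0.002 * (energy_mev - 6.0) - 0.001 * (spot_fwhm - 1.0)

energies = [5.5, 6.0, 6.5, 7.0]      # initial electron energies (MeV)
spot_sizes = [0.7, 1.2, 1.7, 2.2]    # electron spot sizes (FWHM)
air_gaps = [0.5, 1.0, 2.0]           # mm of air introduced upstream of the diode

perturbation = {}
for energy, spot in itertools.product(energies, spot_sizes):
    reference = run_cavity_dose(energy, spot, air_gap_mm=0.0)
    for gap in air_gaps:
        perturbation[(energy, spot, gap)] = run_cavity_dose(energy, spot, gap) / reference

# Spread of the perturbation factor across source parameters, per air gap:
for gap in air_gaps:
    values = [p for (e, s, g), p in perturbation.items() if g == gap]
    print(f"{gap} mm air: perturbation spread = {max(values) - min(values):.4f}")
```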

Relevance: 100.00%

Abstract:

The dynamics of low-density flows is governed by the Boltzmann equation of the kinetic theory of gases. This is a nonlinear integro-differential equation and, in general, numerical methods must be used to obtain its solution. The present paper, after a brief review of the Direct Simulation Monte Carlo (DSMC) methods due to Bird, and Belotserkovskii and Yanitskii, studies the details of the DSMC method of Deshpande for mono- as well as multicomponent gases. The present method is a statistical particle-in-cell method and is based upon the Kac-Prigogine master equation, which reduces to the Boltzmann equation under the hypothesis of molecular chaos. The proposed Markoff model simulating the collisions uses a Poisson distribution for the number of collisions allowed in the cells into which the physical space is divided. The model is then extended to a binary mixture of gases and it is shown that it is necessary to perform the collisions in a certain sequence to obtain an unbiased simulation.
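
A minimal single-species sketch of the collision step described above: in each cell and each time step the number of collision events is drawn from a Poisson distribution, and each event scatters a randomly chosen particle pair isotropically about its centre-of-mass velocity (equal masses assumed). The mean collision number uses a schematic rate coefficient; in the papers discussed it follows from the master-equation analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def dsmc_collision_step(cells, dt, rate_coeff):
    """One DSMC collision step.

    cells      -- list of (N_i, 3) arrays of particle velocities, one per spatial cell
    dt         -- time step
    rate_coeff -- schematic collision rate coefficient (in the cited work the mean
                  number of collisions per cell follows from the Kac master equation)
    """
    for vel in cells:
        n = len(vel)
        if n < 2:
            continue
        mean_collisions = rate_coeff * n * (n - 1) * dt
        for _ in range(rng.poisson(mean_collisions)):   # Poisson-distributed collision count
            i, j = rng.choice(n, size=2, replace=False)
            v_cm = 0.5 * (vel[i] + vel[j])              # centre-of-mass velocity (equal masses)
            g = np.linalg.norm(vel[i] - vel[j])         # relative speed is conserved
            cos_t = rng.uniform(-1.0, 1.0)              # isotropic post-collision direction
            sin_t = np.sqrt(1.0 - cos_t**2)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            g_new = g * np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
            vel[i] = v_cm + 0.5 * g_new
            vel[j] = v_cm - 0.5 * g_new

# Example: 10 cells of 50 thermal particles each, one collision step.
cells = [rng.normal(0.0, 300.0, size=(50, 3)) for _ in range(10)]
dsmc_collision_step(cells, dt=1e-5, rate_coeff=1.0)
```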

Relevance: 100.00%

Abstract:

The present work deals with the prediction of the stiffness of an Indian nanoclay-reinforced polypropylene composite (which can be termed a nanocomposite) using a Monte Carlo finite element analysis (FEA) technique. Nanocomposite samples are first prepared in the laboratory using a torque rheometer, to achieve desirable dispersion of nanoclay during master batch preparation, followed by extrusion for the fabrication of tensile test dog-bone specimens. SEM (scanning electron microscopy) images of the prepared nanocomposite, containing a given percentage (3–9% by weight) of the considered nanoclay, show that the nanoclay platelets tend to remain in clusters. By ascertaining the average size of these nanoclay clusters from the images, a planar finite element model is created in which the nanoclay groups and the polymer matrix are modeled as separate entities, assuming a given homogeneous distribution of the nanoclay clusters. Using a Monte Carlo simulation procedure, the distribution of nanoclay is varied randomly in an automated manner in a commercial FEA code, and virtual tensile tests are performed to compute the linear stiffness for each case. The most frequently occurring values of the computed stiffness modulus for nanocomposites with different nanoclay contents correspond well with the experimentally obtained measures of stiffness, establishing the effectiveness of the present approach for further applications.
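
The Monte Carlo part of the procedure is a loop: randomise the positions of the nanoclay clusters within the planar model, run a virtual tensile test on each realisation, and take the most frequent (modal) stiffness over all realisations. A schematic sketch under those assumptions follows; the domain size, cluster size and the placeholder tensile-test function are illustrative and stand in for the commercial FEA solves used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def place_clusters(n_clusters, radius, width, height, max_tries=20000):
    """Random, non-overlapping placement of circular nanoclay clusters in a plate."""
    centres = []
    for _ in range(max_tries):
        if len(centres) == n_clusters:
            break
        c = rng.uniform([radius, radius], [width - radius, height - radius])
        if all(np.linalg.norm(c - p) >= 2.0 * radius for p in centres):
            centres.append(c)
    return np.array(centres)

def virtual_tensile_test(centres):
    # Placeholder for the commercial FEA virtual tensile test; returns a toy
    # stiffness (Pa) so that the Monte Carlo loop below runs end to end.
    return 1.5e9 + 1.0e7 * rng.standard_normal()

stiffness = []
for _ in range(200):                                   # Monte Carlo realisations
    centres = place_clusters(n_clusters=30, radius=2.0, width=100.0, height=50.0)
    stiffness.append(virtual_tensile_test(centres))

# Report the modal (highest-frequency) stiffness, as compared with experiment in the study.
counts, edges = np.histogram(stiffness, bins=20)
modal = np.argmax(counts)
print("modal stiffness ~", 0.5 * (edges[modal] + edges[modal + 1]), "Pa")
```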

Relevance: 100.00%

Abstract:

The BGCore reactor analysis system was recently developed at Ben-Gurion University for calculating in-core fuel composition and spent fuel emissions following discharge. It couples the Monte Carlo transport code MCNP with an independently developed burnup and decay module, SARAF. Most of the existing MCNP-based depletion codes (e.g. MOCUP, Monteburns, MCODE) directly tally the one-group fluxes and reaction rates in order to prepare the one-group cross sections necessary for the fuel depletion analysis. BGCore, on the other hand, uses a multi-group (MG) approach for the generation of one-group cross sections. This coupling approach significantly reduces the code execution time without compromising the accuracy of the results. The substantial reduction in the BGCore code execution time allows consideration of problems with a much higher degree of complexity, such as the introduction of thermal hydraulic (TH) feedback into the calculation scheme. Recently, a simplified TH feedback module, THERMO, was developed and integrated into the BGCore system. To demonstrate the capabilities of the upgraded BGCore system, a coupled neutronic-TH analysis of a full PWR core was performed. The BGCore results were compared with those of the state-of-the-art 3D deterministic nodal diffusion code DYN3D (Grundmann et al., 2000). Very good agreement between the BGCore and DYN3D results was observed for the major core operational parameters, including the k-eff eigenvalue, the axial and radial power profiles, and the temperature distributions. This agreement confirms the consistency of the implementation of the TH feedback module. Although the upgraded BGCore system is capable of performing both depletion and TH analyses, the calculations in this study were performed for the beginning-of-cycle state with pre-generated fuel compositions. © 2011 Published by Elsevier B.V.
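
The neutronic-TH coupling is, in essence, a fixed-point iteration: the transport solution yields a power distribution, the TH module converts it into fuel and coolant temperatures, the temperatures update the cross-section data for the next transport solution, and the cycle repeats until the power distribution stops changing. A schematic sketch of that loop; the two solver functions are toy placeholders, not BGCore's or THERMO's actual interfaces.

```python
import numpy as np

def solve_transport(fuel_temperatures):
    # Placeholder for an MCNP run with temperature-dependent cross sections;
    # returns a normalised axial power profile (toy model so the loop executes).
    shape = np.cos(np.linspace(-1.2, 1.2, fuel_temperatures.size))
    shape = shape / (1.0 + 1.0e-4 * (fuel_temperatures - 900.0))   # crude Doppler-like feedback
    return shape / shape.sum()

def solve_thermal_hydraulics(power):
    # Placeholder for the THERMO module; maps the power profile to fuel temperatures (K).
    return 560.0 + 900.0 * np.cumsum(power)

power = np.full(20, 1.0 / 20)                 # initial flat axial power guess
for iteration in range(1, 51):
    temperatures = solve_thermal_hydraulics(power)
    new_power = solve_transport(temperatures)
    change = np.max(np.abs(new_power - power))
    power = new_power
    if change < 1.0e-6:                        # converged power distribution
        break

print(f"converged after {iteration} iterations; peak-to-average power = {power.max() * power.size:.3f}")
```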

Relevance: 100.00%

Abstract:

Coupled Monte Carlo depletion systems provide a versatile and accurate tool for analyzing advanced thermal and fast reactor designs for a variety of fuel compositions and geometries. The main drawback of Monte Carlo-based systems is the long calculation time, which imposes significant restrictions on the complexity and amount of design-oriented calculations. This paper presents an alternative approach to interfacing the Monte Carlo and depletion modules aimed at addressing this problem. The main idea is to calculate the one-group cross sections for all relevant isotopes required by the depletion module in a separate module, external to the Monte Carlo calculations. Thus, the Monte Carlo module will produce the criticality and neutron spectrum only, without tallying of the individual isotope reaction rates. The one-group cross sections for all isotopes will be generated in a separate module by collapsing a universal multigroup (MG) cross-section library using the Monte Carlo-calculated flux. Here, the term "universal" means that a single MG cross-section set will be applicable for all reactor systems and is independent of reactor characteristics such as the neutron spectrum; the fuel composition; and the fuel cell, assembly, and core geometries. This approach was originally proposed by Haeck et al. and implemented in the ALEPH code. Implementation of the proposed approach to Monte Carlo burnup interfacing was carried out through the BGCORE system. One-group cross sections generated by the BGCORE system were compared with those tallied directly by the MCNP code. Analysis of this comparison led to the conclusion that, in order to achieve the accuracy required for a reliable core and fuel cycle analysis, accounting for the background cross section (σ0) in the unresolved resonance energy region is essential. An extension of the one-group cross-section generation model, tabulating and interpolating over a simplified σ0 model, was implemented and tested. A significant improvement in the one-group cross-section accuracy was demonstrated.
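
The collapse itself is a flux weighting: for each isotope and reaction, the one-group cross section is σ_1g = Σ_g σ_g φ_g / Σ_g φ_g, with the group fluxes φ_g taken from the Monte Carlo spectrum tally. A minimal sketch with placeholder data; the σ0 (background cross-section) tabulation and interpolation discussed above is deliberately left out.

```python
import numpy as np

def collapse_to_one_group(sigma_mg, flux_mg):
    """Flux-weighted collapse: sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g)."""
    sigma_mg = np.asarray(sigma_mg, dtype=float)
    flux_mg = np.asarray(flux_mg, dtype=float)
    return float(np.sum(sigma_mg * flux_mg) / np.sum(flux_mg))

# Placeholder 4-group data for one reaction of one isotope (barns), ordered fast
# to thermal, and a placeholder MC-calculated group flux (arbitrary units). A real
# library would hold many groups and every isotope/reaction needed for depletion.
sigma_capture_mg = [0.1, 0.4, 2.0, 15.0]
flux_mg = [0.50, 0.30, 0.15, 0.05]

print("one-group capture cross section:",
      collapse_to_one_group(sigma_capture_mg, flux_mg), "b")
```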

Relevance: 100.00%

Abstract:

Previous studies have reported that different schemes for coupling Monte Carlo (MC) neutron transport with burnup and thermal-hydraulic feedbacks may potentially be numerically unstable. This issue can be resolved by the application of implicit methods, such as the stochastic implicit mid-point (SIMP) methods. In order to assure numerical stability, these new methods do require additional computational effort. The instability issue, however, is problem-dependent and does not necessarily occur in all cases. Therefore, blind application of the unconditionally stable coupling schemes, and thus incurring extra computational costs, may not always be necessary. In this paper, we attempt to develop an intelligent diagnostic mechanism which will monitor the numerical stability of the calculations and, if necessary, switch from a simple and fast coupling scheme to a more computationally expensive but unconditionally stable one. To illustrate this diagnostic mechanism, we performed a coupled burnup and TH analysis of a single BWR fuel assembly. The results indicate that the developed algorithm can be easily implemented in any MC-based code for the monitoring of numerical instabilities. The proposed monitoring method has a negligible impact on the calculation time, even for realistic 3D multi-region full-core calculations. © 2014 Elsevier Ltd. All rights reserved.
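
One simple way to realise such a diagnostic is to track a scalar indicator of the coupled solution (for example an axial power offset or peaking factor) over successive depletion steps and to flag a switch to the unconditionally stable scheme once the indicator shows sign-alternating steps of growing amplitude. The sketch below illustrates that idea only; it is not the specific algorithm developed in the paper.

```python
def needs_implicit_scheme(history, window=4, growth_tol=1.05):
    """Flag growing oscillations in a per-step scalar indicator (e.g. axial offset).

    history    -- indicator value after each burnup/TH step, oldest first
    window     -- number of most recent increments to inspect
    growth_tol -- required amplitude growth factor between successive increments
    """
    if len(history) < window + 1:
        return False
    steps = [history[-k] - history[-k - 1] for k in range(1, window + 1)]  # newest first
    alternating = all(steps[k] * steps[k + 1] < 0.0 for k in range(window - 1))
    growing = all(abs(steps[k]) > growth_tol * abs(steps[k + 1]) for k in range(window - 1))
    return alternating and growing

# A damped oscillation keeps the fast explicit coupling; a growing one triggers the switch.
print(needs_implicit_scheme([0.50, 0.30, 0.40, 0.35, 0.37]))         # False
print(needs_implicit_scheme([0.50, 0.45, 0.60, 0.38, 0.72, 0.25]))   # True
```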

Relevance: 100.00%

Abstract:

The Monte Carlo method has been applied to investigate the kinetics of the grafting reaction in free radical copolymerization. The simulation results are in good agreement with theoretical and experimental results. This demonstrates that Monte Carlo simulation is an effective method for investigating the grafting reaction in free radical copolymerization. (C) 2000 Elsevier Science Ltd. All rights reserved.
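
Reaction kinetics of this kind are commonly simulated with a kinetic Monte Carlo (Gillespie-type) algorithm: reaction propensities are computed from rate constants and current species counts, the waiting time to the next event is drawn from an exponential distribution, and the event type is chosen with probability proportional to its propensity. The sketch below uses a made-up two-reaction network purely to show the loop structure; it is not the copolymerization model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy species counts: radicals R, monomer M, grafted product G (made-up network).
state = {"R": 100, "M": 10_000, "G": 0}

def propensities(s):
    k_prop, k_graft = 1.0e-4, 2.0e-5           # made-up rate constants
    return np.array([
        k_prop * s["R"] * s["M"],               # reaction 0: R + M -> R   (chain propagation)
        k_graft * s["R"] * s["M"],              # reaction 1: R + M -> G   (grafting)
    ])

t, t_end = 0.0, 50.0
while t < t_end:
    a = propensities(state)
    a_total = a.sum()
    if a_total == 0.0:
        break
    t += rng.exponential(1.0 / a_total)         # exponentially distributed waiting time
    reaction = rng.choice(len(a), p=a / a_total)  # pick the reaction proportionally to its propensity
    state["M"] -= 1
    if reaction == 1:
        state["R"] -= 1
        state["G"] += 1

print("final species counts:", state)
```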

Relevance: 100.00%

Abstract:

The outcomes of both (i) radiation therapy and (ii) preclinical small animal radiobiology studies are dependent on the delivery of a known quantity of radiation to a specific and intentional location. Adverse effects can result from these procedures if the dose to the target is too high or too low, and can also result from an incorrect spatial distribution, in which nearby normal healthy tissue is undesirably damaged by poor radiation delivery techniques. Thus, in mice and humans alike, the spatial dose distributions from radiation sources should be well characterized in terms of absolute dose, and with pinpoint spatial accuracy. When dealing with the steep spatial dose gradients that arise in (i) high dose rate (HDR) brachytherapy or (ii) the small organs and tissue inhomogeneities of mice, obtaining accurate and highly precise dose results can be very challenging, since commercially available radiation detection tools, such as ion chambers, are often too large for in-vivo use.

In this dissertation two tools are developed and applied for both clinical and preclinical radiation measurement. The first tool is a novel radiation detector for acquiring physical measurements, fabricated from an inorganic nano-crystalline scintillator that has been fixed on an optical fiber terminus. This dosimeter allows for the measurement of point doses to sub-millimeter resolution, and has the ability to be placed in-vivo in humans and small animals. Real-time data is displayed to the user to provide instant quality assurance and dose-rate information. The second tool utilizes an open source Monte Carlo particle transport code, and was applied for small animal dosimetry studies to calculate organ doses and recommend new techniques of dose prescription in mice, as well as to characterize dose to the murine bone marrow compartment with micron-scale resolution.

Hardware design changes were implemented to reduce the overall fiber diameter to <0.9 mm for the nano-crystalline scintillator based fiber optic detector (NanoFOD) system. The lower limit of device sensitivity was found to be approximately 0.05 cGy/s. Herein, this detector was demonstrated to perform quality assurance of clinical 192Ir HDR brachytherapy procedures, providing dose measurements comparable to thermoluminescent dosimeters and accuracy within 20% of the treatment planning software (TPS) for the 27 treatments conducted, with an inter-quartile range of the measured-to-TPS dose ratio of 0.94-1.02 (width 0.08). After removing contaminant signals (Cerenkov and diode background), calibration of the detector enabled accurate dose measurements for vaginal applicator brachytherapy procedures. For 192Ir use, the energy response changed by a factor of 2.25 over source-to-detector distance (SDD) values of 3 to 9 cm; however, a cap made of 0.2 mm thick silver reduced the energy dependence to a factor of 1.25 over the same SDD range, at the cost of reducing overall sensitivity by 33%.

For preclinical measurements, the dose accuracy of the NanoFOD was within 1.3% of MOSFET-measured dose values in a cylindrical mouse phantom at 225 kV, for x-ray irradiation at angles of 0, 90, 180, and 270°. The NanoFOD exhibited small changes in angular sensitivity, with a coefficient of variation (COV) of 3.6% at 120 kV and 1% at 225 kV. When the NanoFOD was placed alongside a MOSFET in the liver of a sacrificed mouse and treatment was delivered at 225 kV with a 0.3 mm Cu filter, the dose difference was only 1.09% with the 4x4 cm collimator, and -0.03% with no collimation. Additionally, the NanoFOD utilized a scintillator of 11 µm thickness to measure small x-ray fields for microbeam radiation therapy (MRT) applications, and achieved 2.7% dose accuracy at the microbeam peak in comparison to radiochromic film. Modest differences in the measured full-width at half maximum lateral dimension of the MRT beam were observed between the NanoFOD (420 µm) and radiochromic film (320 µm), but these differences have been explained mostly as an artifact of the geometry used and of volumetric effects in the scintillator material. Characterization of the energy dependence of the yttrium-oxide based scintillator material was performed in the range of 40-320 kV (2 mm Al filtration), and the maximum device sensitivity was achieved at 100 kV. Tissue maximum ratio measurements were carried out on a small animal x-ray irradiator system at 320 kV and demonstrated an average difference of 0.9% compared to a MOSFET dosimeter over the range of 2.5 to 33 cm depth in tissue-equivalent plastic blocks. Irradiation of the NanoFOD fiber and scintillator material on a 137Cs gamma irradiator to 1600 Gy did not produce any measurable change in light output, suggesting that the NanoFOD system may be re-used without the need for replacement or recalibration over its lifetime.

For small animal irradiator systems, researchers can deliver a given dose to a target organ by controlling the exposure time. Currently, researchers calculate this exposure time by dividing the total dose that they wish to deliver by a single provided dose rate value. This method is independent of the target organ. Studies conducted here used Monte Carlo particle transport codes to justify a new method of dose prescription in mice that considers organ-specific doses. Monte Carlo simulations were performed in the Geant4 Application for Tomographic Emission (GATE) toolkit using a MOBY mouse whole-body phantom. The non-homogeneous phantom was composed of 256x256x800 voxels of size 0.145x0.145x0.145 mm3. Differences of up to 20-30% in dose to soft-tissue target organs were demonstrated, and methods for alleviating these errors during whole-body irradiation of mice were suggested, based on the use of organ-specific and x-ray-tube-filter-specific dose rates for all irradiations.
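
The proposed prescription change is arithmetically simple: instead of dividing the prescribed dose by a single dose rate, the exposure time is computed from a dose-rate value specific to the target organ and to the x-ray tube filtration in use. A small sketch follows; the dose-rate table is purely illustrative, with the real values coming from the GATE/MOBY organ-dose calculations described above.

```python
# Exposure time from organ- and filter-specific dose rates: time = prescribed dose / dose rate.
# The numbers below are illustrative placeholders, not the study's calculated values.
dose_rate_cgy_per_min = {
    ("liver", "0.3 mm Cu"): 220.0,
    ("lung", "0.3 mm Cu"): 245.0,
    ("liver", "2.0 mm Al"): 260.0,
}

def exposure_time_min(prescribed_dose_cgy, organ, tube_filter):
    return prescribed_dose_cgy / dose_rate_cgy_per_min[(organ, tube_filter)]

# Example: 600 cGy (6 Gy) prescribed to the liver with 0.3 mm Cu filtration.
print(round(exposure_time_min(600.0, "liver", "0.3 mm Cu"), 2), "min")
```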

Monte Carlo analysis was used on 1 µm resolution CT images of a mouse femur and a mouse vertebra to calculate the dose gradients within the bone marrow (BM) compartment of mice for different radiation beam qualities relevant to x-ray and isotope-type irradiators. The results indicated that soft x-ray beams (160 kV at 0.62 mm Cu HVL and 320 kV at 1 mm Cu HVL) lead to a substantially higher dose to BM in close proximity to mineral bone (within about 60 µm) as compared to hard x-ray beams (320 kV at 4 mm Cu HVL) and isotope-based gamma irradiators (137Cs). The average dose increases to the BM in the vertebra for these four radiation beam qualities were found to be 31%, 17%, 8%, and 1%, respectively. Both in-vitro and in-vivo experimental studies confirmed these simulation results, demonstrating that the 320 kV, 1 mm Cu HVL beam caused statistically significant increased killing of BM cells at 6 Gy dose levels in comparison to both the 320 kV, 4 mm Cu HVL and the 662 keV 137Cs beams.

Relevance: 100.00%

Abstract:

We present results for a variety of Monte Carlo annealing approaches, both classical and quantum, benchmarked against one another for the textbook optimization exercise of a simple one-dimensional double well. In classical (thermal) annealing, the dependence upon the move chosen in a Metropolis scheme is studied and correlated with the spectrum of the associated Markov transition matrix. In quantum annealing, the path integral Monte Carlo approach is found to yield nontrivial sampling difficulties associated with the tunneling between the two wells. The choice of fictitious quantum kinetic energy is also addressed. We find that a "relativistic" kinetic energy form, leading to a higher probability of long real-space jumps, can be considerably more effective than the standard nonrelativistic one.
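
As a concrete reference point for the classical case, thermal annealing of a one-dimensional double well is a few lines of Metropolis sampling with a decreasing temperature, and the move size is exactly the kind of choice whose effect the paper correlates with the spectrum of the Markov transition matrix. A minimal sketch; the potential, move size and annealing schedule are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

def potential(x):
    # Asymmetric 1D double well: minima near x = -1 and x = +1, the right one lower.
    return (x**2 - 1.0)**2 - 0.2 * x

x = -1.0                    # start in the higher (metastable) well
step = 0.3                  # Metropolis move size: the 'move choice' studied in the paper
for temperature in np.geomspace(1.0, 1.0e-3, 20000):   # slowly decreasing temperature
    x_trial = x + rng.uniform(-step, step)
    delta_e = potential(x_trial) - potential(x)
    # Metropolis rule: always accept downhill moves, accept uphill with Boltzmann probability.
    if delta_e <= 0.0 or rng.random() < np.exp(-delta_e / temperature):
        x = x_trial

print(f"final position: {x:.3f}, final energy: {potential(x):.4f}")
```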

Relevance: 100.00%

Abstract:

The modelling work was carried out using EGSnrc, a software package developed by the National Research Council of Canada.

Relevance: 100.00%

Abstract:

The triple- and quadruple-escape peaks of 6.128 MeV photons from the 19F(p,αγ)16O nuclear reaction were observed in an HPGe detector. The experimental peak areas, measured in spectra projected with a restriction function that allows quantitative comparison of data from different multiplicities, are in reasonably good agreement with those predicted by Monte Carlo simulations done with the general-purpose radiation-transport code PENELOPE. The behaviour of the escape intensities was simulated for some gamma-ray energies and detector dimensions; the results obtained can be extended to other energies using an empirical function and statistical properties related to the phenomenon. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

X-ray Fluorescence (XRF) analysis is a technique for the qualitative and quantitative determination of the chemical constituents of a sample. The method is based on the detection of the characteristic radiation intensities emitted by the elements of the sample when properly excited. A variant of this technique is Total Reflection X-ray Fluorescence (TXRF), which utilizes electromagnetic radiation as the excitation source. In the total reflection of X-rays, the angle of refraction of the incident beam tends to zero and the refracted beam is tangent to the sample-support interface. Thus, there is a minimum angle of incidence at which no refracted beam exists and all incident radiation undergoes total reflection. In this study, we evaluated the influence of varying the energy of the incident x-ray beam, using the MCNPX (Monte Carlo N-Particle eXtended) code, which is based on the Monte Carlo method. © 2013 AIP Publishing LLC.