973 results for Monte-carlo Calculations


Relevance: 100.00%

Abstract:

Purpose: Electronic Portal Imaging Devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat panel imagers. EPIDs are currently used routinely in patient positioning before radiotherapy treatments. There has been an increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves the EPID panel being used to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, there are a number of significant limitations in this method, resulting in the limited proliferation of this technique in a clinical environment. In this paper, we aim to present a technique of simulating IMRT fields using Monte Carlo to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat-panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written using the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data.
Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys, 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, 92(Suppl. 1), August 2009, pp. S71.
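The gamma analysis used for the EPID comparison combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1-D sketch of the gamma index of Low et al. (1998) might look as follows (the function name and the criteria values are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def gamma_1d(ref, meas, dx, dose_tol=0.03, dta_tol=0.3):
    """Simplified 1-D gamma index (after Low et al., 1998).

    ref, meas : dose profiles sampled on a common grid with spacing dx (cm).
    dose_tol  : dose-difference criterion, as a fraction of the max reference dose.
    dta_tol   : distance-to-agreement criterion (cm).
    """
    dd = dose_tol * ref.max()
    x = np.arange(len(ref)) * dx
    gamma = np.empty(len(meas))
    for i in range(len(meas)):
        # Gamma is the minimum generalized distance over all reference points.
        gamma[i] = np.sqrt(((x - x[i]) / dta_tol) ** 2
                           + ((ref - meas[i]) / dd) ** 2).min()
    return gamma
```

Points with gamma <= 1 pass the combined criteria; a full 2-D implementation performs the same minimization over the image plane.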

Abstract:

Introduction: The accurate identification of tissue electron densities is of great importance for Monte Carlo (MC) dose calculations. When converting patient CT data into a voxelised format suitable for MC simulations, however, it is common to simplify the assignment of electron densities so that the complex tissues existing in the human body are categorized into a few basic types. This study examines the effects that the assignment of tissue types and the calculation of densities can have on the results of MC simulations, for the particular case of a Siemens Sensation 4 CT scanner located in a radiotherapy centre where QA measurements are routinely made using 11 tissue types (plus air). Methods: DOSXYZnrc phantoms are generated from CT data, using the CTCREATE user code, with the relationship between Hounsfield units (HU) and density determined via linear interpolation between a series of specified points on the ‘CT-density ramp’ (see Figure 1(a)). Tissue types are assigned according to HU ranges. Each voxel in the DOSXYZnrc phantom therefore has an electron density (electrons/cm3) defined by the product of the mass density (from the HU conversion) and the intrinsic electron density (electrons/gram) (from the material assignment). In this study, we consider the problems of density conversion and material identification separately: the CT-density ramp is simplified by decreasing the number of points which define it from 12 down to 8, 3 and 2; and the material-type assignment is varied by defining the materials which comprise our test phantom (a Supertech head) as two tissues and bone, two plastics and bone, water only and (as an extreme case) lead only. The effect of these parameters on radiological thickness maps derived from simulated portal images is investigated.
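The HU-to-density conversion described above is a piecewise-linear interpolation between control points on the CT-density ramp, followed by a material assignment from HU ranges. A minimal sketch (the control points and thresholds below are illustrative placeholders, not the commissioned data for the Siemens Sensation 4):

```python
import numpy as np

# Hypothetical CT-density ramp: (HU, mass density in g/cm^3) control points.
# Real commissioning data for the specific scanner must be substituted.
RAMP_HU      = [-1000, -100,  0,   100,  1000, 3000]
RAMP_DENSITY = [0.001, 0.93,  1.0, 1.09, 1.6,  2.8]

def hu_to_density(hu):
    """Piecewise-linear interpolation on the CT-density ramp."""
    return np.interp(hu, RAMP_HU, RAMP_DENSITY)

def assign_material(hu):
    """Simplified material assignment by HU range (illustrative thresholds)."""
    if hu < -400:
        return "AIR"
    if hu < 100:
        return "TISSUE"
    return "BONE"
```

Simplifying the ramp amounts to deleting control points from `RAMP_HU`/`RAMP_DENSITY`; simplifying the material model amounts to coarsening the thresholds in `assign_material`.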
Results & Discussion: Increasing the degree of simplification of the CT-density ramp results in an increasing effect on the resulting radiological thickness calculated for the Supertech head phantom. For instance, defining the CT-density ramp using 8 points, instead of 12, results in a maximum radiological thickness change of 0.2 cm, whereas defining the CT-density ramp using only 2 points results in a maximum radiological thickness change of 11.2 cm. Changing the definition of the materials comprising the phantom between water, plastic and tissue results in millimetre-scale changes to the resulting radiological thickness. When the entire phantom is defined as lead, this alteration changes the calculated radiological thickness by a maximum of 9.7 cm. Evidently, the simplification of the CT-density ramp has a greater effect on the resulting radiological thickness map than does the alteration of the assignment of tissue types. Conclusions: It is possible to alter the definitions of the tissue types comprising the phantom (or patient) without substantially altering the results of simulated portal images. However, these images are very sensitive to the accurate identification of the HU-density relationship. When converting data from a patient’s CT into a MC simulation phantom, therefore, all possible care should be taken to accurately reproduce the conversion between HU and mass density, for the specific CT scanner used. Acknowledgements: This work is funded by the NHMRC, through a project grant, and supported by the Queensland University of Technology (QUT) and the Royal Brisbane and Women's Hospital (RBWH), Brisbane, Australia. The authors are grateful to the staff of the RBWH, especially Darren Cassidy, for assistance in obtaining the phantom CT data used in this study. The authors also wish to thank Cathy Hargrave, of QUT, for assistance in formatting the CT data, using the Pinnacle TPS.
Computational resources and services used in this work were provided by the HPC and Research Support Group, QUT, Brisbane, Australia.

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate between grids for comparison. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems: ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant.
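The resolution-independent comparison described in (2) can be sketched for the 1-D case: the evaluated distribution is interpolated onto the reference grid before the dose difference is taken (a simplified illustration, not the project's actual implementation):

```python
import numpy as np

def dose_difference(ref_dose, ref_x, eval_dose, eval_x):
    """Dose difference evaluated on the reference grid. The evaluated
    distribution is linearly interpolated, so the two dose grids do not
    need to share a common spatial resolution."""
    eval_on_ref = np.interp(ref_x, eval_x, eval_dose)
    return eval_on_ref - ref_dose
```

The same interpolate-then-compare pattern underlies the gamma and normalised dose difference evaluations, extended to 3-D grids.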
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.

Abstract:

Background: Plotless density estimators are those that are based on distance measures rather than counts per unit area (quadrats or plots) to estimate the density of some usually stationary event, e.g. burrow openings, damage to plant stems, etc. These estimators typically use distance measures between events and from random points to events to derive an estimate of density. The error and bias of these estimators for the various spatial patterns found in nature have been examined using simulated populations only. In this study we investigated eight plotless density estimators to determine which were robust across a wide range of data sets from fully mapped field sites. They covered a wide range of situations including animal damage to rice and corn, nest locations, active rodent burrows and distribution of plants. Monte Carlo simulations were applied to sample the data sets, and in all cases the error of the estimate (measured as relative root mean square error) was reduced with increasing sample size. The method of calculation and ease of use in the field were also used to judge the usefulness of the estimator. Estimators were evaluated in their original published forms, although the variable area transect (VAT) and ordered distance methods have been the subjects of optimization studies. Results: An estimator that was a compound of three basic distance estimators was found to be robust across all spatial patterns for sample sizes of 25 or greater. The same field methodology can be used either with the basic distance formula or the formula used with the Kendall-Moran estimator, in which case a reduction in error may be gained for sample sizes less than 25; however, there is no improvement for larger sample sizes. The variable area transect (VAT) method performed moderately well, is easy to use in the field, and its calculations are easy to undertake.
Conclusion: Plotless density estimators can provide an estimate of density in situations where it would not be practical to lay out a plot or quadrat and can in many cases reduce the workload in the field.
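As a concrete illustration of a basic distance estimator, the maximum-likelihood estimator from point-to-nearest-event distances under complete spatial randomness is D = n / (pi * sum r_i^2). The Monte Carlo check below samples a simulated random population, in the spirit of the study's simulations (a sketch under a homogeneous Poisson assumption, not one of the eight estimators as published):

```python
import math, random

def nearest_event_density(points, events):
    """ML density estimate from point-to-nearest-event distances, assuming
    events follow a homogeneous Poisson process: D = n / (pi * sum r_i^2)."""
    r2 = []
    for px, py in points:
        r2.append(min((px - ex) ** 2 + (py - ey) ** 2 for ex, ey in events))
    return len(points) / (math.pi * sum(r2))

# Monte Carlo check against a simulated random population.
random.seed(1)
true_density = 50.0                      # events per unit area
side = 10.0
events = [(random.uniform(0, side), random.uniform(0, side))
          for _ in range(int(true_density * side * side))]
# Sample points kept away from the border to limit edge effects.
points = [(random.uniform(1, side - 1), random.uniform(1, side - 1))
          for _ in range(200)]
est = nearest_event_density(points, events)
```

With 200 sample points the relative error of the estimate is of the order of 1/sqrt(200), consistent with the observed decrease of RMSE with sample size.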

Abstract:

The permeability of fractal porous media is simulated by a Monte Carlo technique in this work. Based on the fractal character of the pore size distribution in porous media, the probability models for pore diameter and for permeability are derived. Taking bi-dispersed fractal porous media as examples, the permeability calculations are performed by the present Monte Carlo method. The results show that the present simulations are in good agreement with the existing fractal analytical solution over the porosity range of general interest. The proposed simulation method may have potential for predicting other transport properties (such as thermal conductivity, dispersion conductivity and electrical conductivity) in fractal porous media, both saturated and unsaturated.
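A sketch of the two probability models involved: pore diameters sampled by inverse transform from a fractal (power-law) size distribution F(d) = 1 - (d_min/d)^Df, and the permeability of the resulting bundle of capillaries from the Hagen-Poiseuille law. This is an illustrative simplification (straight capillaries of unit length), not the paper's bi-dispersed model:

```python
import math, random

def sample_pore_diameter(d_min, df):
    """Inverse-transform sample from the fractal pore-size distribution
    F(d) = 1 - (d_min/d)^Df for d >= d_min (Df is the pore fractal dimension)."""
    u = random.random()
    return d_min * (1.0 - u) ** (-1.0 / df)

def capillary_permeability(diameters, area):
    """Permeability of a bundle of straight parallel capillaries of unit
    length crossing cross-section `area` (Hagen-Poiseuille):
    k = pi / (128 * A) * sum d_i^4."""
    return sum(d ** 4 for d in diameters) * math.pi / (128.0 * area)
```

A Monte Carlo permeability estimate then repeats the sampling over many realizations and averages `capillary_permeability` over them.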

Abstract:

This paper investigates the effect of the burnup coupling scheme on the numerical stability and accuracy of coupled Monte-Carlo depletion calculations. We show that, in some cases, even the predictor-corrector method with relatively short time steps can be numerically unstable. In addition, we present two possible extensions to the Euler predictor-corrector (PC) method, which is typically used in coupled burnup calculations. These modifications allow using longer time steps while maintaining numerical stability and accuracy. © 2013 Elsevier Ltd. All rights reserved.
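The Euler predictor-corrector scheme referred to above can be sketched on a toy one-nuclide depletion problem with a flux feedback. The feedback model below is hypothetical, chosen only so that the exact solution is known (n + ln n = n0 + ln n0 - t):

```python
import math

def flux(n):
    """Toy feedback: flux rises as the absorber density n burns out
    (a hypothetical model, not a physical reactor feedback)."""
    return 1.0 / (1.0 + n)

def deplete(n, phi, dt, sigma=1.0):
    """Analytic depletion over one step at fixed flux: dn/dt = -sigma*phi*n."""
    return n * math.exp(-sigma * phi * dt)

def predictor_corrector_step(n, dt):
    """Euler predictor-corrector as commonly used in coupled burnup codes:
    deplete with the beginning-of-step flux (predictor), re-evaluate the flux
    at the predicted end-of-step state, deplete again (corrector), and
    average the two end states."""
    n_pred = deplete(n, flux(n), dt)        # predictor: BOS flux
    n_corr = deplete(n, flux(n_pred), dt)   # corrector: EOS flux
    return 0.5 * (n_pred + n_corr)
```

For n0 = 1 and a full step dt = 1, the PC step lands within about 0.005 of the exact value 0.5671, roughly an order of magnitude closer than the predictor alone; instability appears when the feedback is strong enough that this averaging no longer damps the step-to-step error.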

Abstract:

The BGCore reactor analysis system was recently developed at Ben-Gurion University for calculating in-core fuel composition and spent fuel emissions following discharge. It couples the Monte Carlo transport code MCNP with an independently developed burnup and decay module, SARAF. Most of the existing MCNP-based depletion codes (e.g. MOCUP, Monteburns, MCODE) tally directly the one-group fluxes and reaction rates in order to prepare the one-group cross sections necessary for the fuel depletion analysis. BGCore, on the other hand, uses a multi-group (MG) approach for generation of one-group cross sections. This coupling approach significantly reduces the code execution time without compromising the accuracy of the results. The substantial reduction in the BGCore code execution time allows consideration of problems with a much higher degree of complexity, such as the introduction of thermal hydraulic (TH) feedback into the calculation scheme. Recently, a simplified TH feedback module, THERMO, was developed and integrated into the BGCore system. To demonstrate the capabilities of the upgraded BGCore system, a coupled neutronic TH analysis of a full PWR core was performed. The BGCore results were compared with those of the state-of-the-art 3D deterministic nodal diffusion code DYN3D (Grundmann et al., 2000). Very good agreement between the BGCore and DYN3D results was observed in major core operational parameters, including the k-eff eigenvalue, axial and radial power profiles, and temperature distributions. This agreement confirms the consistency of the implementation of the TH feedback module. Although the upgraded BGCore system is capable of performing both depletion and TH analyses, the calculations in this study were performed for the beginning-of-cycle state with pre-generated fuel compositions. © 2011 Published by Elsevier B.V.
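The multi-group (MG) approach prepares one-group constants by flux-weighted condensation of group-wise cross sections; the core operation is compact enough to sketch (a generic illustration of flux weighting, not BGCore's implementation):

```python
def collapse_one_group(sigma_g, flux_g):
    """Flux-weighted collapse of multi-group cross sections to one group:
    sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g)."""
    weighted = sum(s * f for s, f in zip(sigma_g, flux_g))
    return weighted / sum(flux_g)
```

The efficiency gain comes from tallying the MG flux spectrum once in MCNP and reusing pre-tabulated MG cross sections for every nuclide, instead of tallying a separate one-group reaction rate per nuclide and reaction.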

Abstract:

We show the feasibility of using quantum Monte Carlo (QMC) to compute benchmark energies for configuration samples of thermal-equilibrium water clusters and the bulk liquid containing up to 64 molecules. Evidence that the accuracy of these benchmarks approaches that of basis-set converged coupled-cluster calculations is noted. We illustrate the usefulness of the benchmarks by using them to analyze the errors of the popular BLYP approximation of density functional theory (DFT). The results indicate the possibility of using QMC as a routine tool for analyzing DFT errors for non-covalent bonding in many types of condensed-phase molecular system.
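To illustrate the basic QMC machinery on a system with a known answer, the sketch below runs variational Monte Carlo for the hydrogen atom with the trial wavefunction psi = exp(-alpha*r) in Hartree units. This is a pedagogical stand-in, not the water-cluster calculation of the paper:

```python
import math, random

def local_energy(r, alpha):
    """Local energy of psi = exp(-alpha*r) for hydrogen (Hartree units):
    E_L = -alpha^2/2 + (alpha - 1)/r; it is exactly -1/2 when alpha = 1."""
    return -0.5 * alpha ** 2 + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=20000, step=0.5, seed=2):
    """Metropolis sampling of |psi|^2 and averaging of the local energy."""
    rng = random.Random(seed)
    pos = [1.0, 0.0, 0.0]
    r = 1.0
    e_sum, n_kept = 0.0, 0
    for i in range(n_steps):
        trial = [x + rng.uniform(-step, step) for x in pos]
        r_trial = math.sqrt(sum(x * x for x in trial))
        # Accept with probability |psi(trial)/psi(pos)|^2.
        if rng.random() < math.exp(-2.0 * alpha * (r_trial - r)):
            pos, r = trial, r_trial
        if i >= n_steps // 10:          # discard equilibration steps
            e_sum += local_energy(r, alpha)
            n_kept += 1
    return e_sum / n_kept
```

For alpha = 1 the trial function is exact and the sampled energy is -0.5 hartree with zero variance; production QMC codes apply the same estimator to far richer trial wavefunctions.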

Abstract:

This paper presents a stochastic implicit coupling method intended for use in Monte-Carlo (MC) based reactor analysis systems that include burnup and thermal hydraulic (TH) feedbacks. Both feedbacks are essential for accurate modeling of advanced reactor designs and analyses of associated fuel cycles. In particular, we investigate the effect of different burnup-TH coupling schemes on the numerical stability and accuracy of coupled MC calculations. First, we present the beginning-of-time-step method, which is the most commonly used. The accuracy of this method depends on the time step length, and it is only conditionally stable. This work demonstrates that even for relatively short time steps, this method can be numerically unstable. Namely, the spatial distributions of neutronic and thermal hydraulic parameters, such as nuclide densities and temperatures, exhibit oscillatory behavior. To address the numerical stability issue, new implicit stochastic methods are proposed. The methods solve the depletion and TH problems simultaneously and use under-relaxation to speed up convergence. These methods are numerically stable and accurate even for relatively large time steps and require less computation time than the existing methods. © 2013 Elsevier Ltd. All rights reserved.
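The under-relaxed fixed-point iteration at the heart of such implicit schemes can be sketched on a toy neutronics/TH pair. Both feedback models below are hypothetical, chosen so that the exact coupled solution is 500 (power units), which lets the iteration be checked directly:

```python
def th_temperature(power):
    """Toy TH model: coolant temperature rises linearly with power."""
    return 300.0 + 0.5 * power

def neutronic_power(temp):
    """Toy neutronics: negative temperature feedback reduces power."""
    return 1000.0 / (1.0 + 0.004 * (temp - 300.0))

def solve_coupled(omega=0.5, tol=1e-10, max_iter=1000):
    """Fixed-point iteration with under-relaxation, as used by stochastic
    implicit coupling schemes to damp oscillations between the neutronics
    and TH solutions."""
    p = 1000.0
    for _ in range(max_iter):
        p_new = neutronic_power(th_temperature(p))
        p_next = (1.0 - omega) * p + omega * p_new   # under-relaxation
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    raise RuntimeError("coupled iteration did not converge")
```

The relaxation factor omega trades convergence speed against damping: omega near 1 reproduces the oscillation-prone explicit update, while smaller omega suppresses the oscillatory error mode at the cost of more iterations.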

Abstract:

In this study, the Serpent Monte Carlo code was used as a tool for the preparation of homogenized few-group cross sections for the nodal diffusion analysis of Sodium-cooled Fast Reactor (SFR) cores. Few-group constants for two reference SFR cores were generated by Serpent and then employed by the nodal diffusion code DYN3D in 2D full-core calculations. The DYN3D results were verified against the reference full-core Serpent Monte Carlo solutions. A good agreement between the reference Monte Carlo and nodal diffusion results was observed, demonstrating the feasibility of using Serpent for the generation of few-group constants for deterministic SFR analysis.
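The generated few-group constants feed an eigenvalue calculation in the diffusion code. An infinite-medium two-group version (no upscatter, all fission neutrons born in the fast group) is compact enough to sketch and can be checked against the analytic k_inf = [nuSf1 + nuSf2*(S12/Sa2)] / (Sa1 + S12). This is an illustration of how few-group constants are consumed, not DYN3D itself:

```python
def k_inf_two_group(nsf, sig_a, sig_12, n_iter=50):
    """Infinite-medium two-group k-eigenvalue by power iteration.
    nsf    : (nu*Sigma_f1, nu*Sigma_f2), fission production cross sections
    sig_a  : (Sigma_a1, Sigma_a2), absorption cross sections
    sig_12 : group 1 -> 2 downscatter cross section (no upscatter)."""
    source, k = 1.0, 1.0
    for _ in range(n_iter):
        phi1 = source / (sig_a[0] + sig_12)   # fast-group balance
        phi2 = sig_12 * phi1 / sig_a[1]       # thermal-group balance
        new_source = nsf[0] * phi1 + nsf[1] * phi2
        k = new_source / source               # eigenvalue estimate
        source = new_source / k               # renormalize the source
    return k
```

The illustrative constants in the test are chosen to make k_inf exactly 1; in a full-core nodal solve the same balance equations acquire leakage terms coupling neighbouring nodes.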

Abstract:

Previous studies have reported that different schemes for coupling Monte Carlo (MC) neutron transport with burnup and thermal hydraulic feedbacks may potentially be numerically unstable. This issue can be resolved by the application of implicit methods, such as the stochastic implicit mid-point (SIMP) methods. In order to assure numerical stability, the new methods do require additional computational effort. The instability issue, however, is problem-dependent and does not necessarily occur in all cases. Therefore, blind application of the unconditionally stable coupling schemes, and thus incurring extra computational costs, may not always be necessary. In this paper, we attempt to develop an intelligent diagnostic mechanism, which will monitor the numerical stability of the calculations and, if necessary, switch from a simple and fast coupling scheme to a more computationally expensive but unconditionally stable one. To illustrate this diagnostic mechanism, we performed a coupled burnup and TH analysis of a single BWR fuel assembly. The results indicate that the developed algorithm can be easily implemented in any MC based code for monitoring of numerical instabilities. The proposed monitoring method has negligible impact on the calculation time even for realistic 3D multi-region full core calculations. © 2014 Elsevier Ltd. All rights reserved.
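A minimal sketch of such a diagnostic: monitor a scalar parameter across coupling steps and flag instability when the step-to-step changes strictly alternate in sign, which is the oscillatory signature described above. The window length and the trigger policy are illustrative assumptions, not the paper's algorithm:

```python
def oscillation_detected(history, window=4):
    """Flag numerical instability from a history of a monitored parameter
    (e.g. a nodal temperature across coupling steps): returns True when the
    last `window` successive changes strictly alternate in sign."""
    if len(history) < window + 1:
        return False
    deltas = [history[-window + i] - history[-window + i - 1]
              for i in range(window)]
    return all(deltas[i] * deltas[i + 1] < 0 for i in range(window - 1))
```

A coupling driver would call this after each step and, on a True result, switch from the fast explicit scheme to the unconditionally stable implicit one; the per-step cost is a handful of subtractions, consistent with the negligible overhead reported.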

Abstract:

We studied the self-assembly of polydisperse diblock copolymers under various confined states by Monte Carlo simulation. When the copolymers were confined within two parallel walls, it was found that the ordered strip structures appeared alternately with the increase in wall width. Moreover, the wall width at which the ordered structure appeared tended to increase with an increase in the polydispersity index (PDI). On the other hand, the simulation results showed that the copolymers were likely to form ordered concentric strip structures when they were confined within a circular wall.
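Polydispersity in such simulations is commonly introduced by sampling chain lengths from a Schulz-Zimm (gamma) distribution, for which PDI = 1 + 1/k with k the gamma shape parameter. A sketch of the sampling step (parameter choices are illustrative; the paper does not state its length distribution):

```python
import random

def sample_chain_lengths(n_chains, mean_length, pdi, seed=3):
    """Sample chain lengths from a Schulz-Zimm (gamma) distribution with
    the requested number-average length and polydispersity index.
    For a gamma distribution, PDI = Mw/Mn = 1 + 1/k."""
    k = 1.0 / (pdi - 1.0)                 # gamma shape parameter
    rng = random.Random(seed)
    return [rng.gammavariate(k, mean_length / k) for _ in range(n_chains)]
```

The sampled (real-valued) lengths would then be rounded to lattice chain lengths before building the Monte Carlo configuration.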

Abstract:

The effects of blend composition on the morphology, order-disorder transition (ODT), and chain conformation of symmetric ABA/AB copolymer blends confined between two neutral hard walls have been investigated by lattice Monte Carlo simulation. Only the lamellar structure is observed in all the simulated morphologies at thermodynamic equilibrium, which is supported by theoretical prediction. When the composition of AB diblock copolymer (phi) increases, both the lamellar spacing and the corresponding ODT temperature increase, which can be attributed to the variation of the conformation distribution of the diblock and triblock copolymer chains. In addition, both diblock and triblock copolymer chains with bridge conformations extend dramatically in the direction parallel to the surface when the system is in the ordered state. Finally, the copolymer chain conformation depends strongly on both the blend composition and the incompatibility parameter chi N.

Abstract:

The self-assembly of diblock copolymer mixtures (A-b-B/A-b-C or A-b-B/B-b-C mixtures) subjected to cylindrical confinement (two-dimensional confinement) was investigated using a Monte Carlo method. In this study, the boundary surfaces were configured to attract blocks A but repel blocks B and C. Relative to the structures of the individual components, the self-assembled structures of mixtures of the diblock copolymers were more complex and interesting. Under cylindrical confinement, with varying cylinder diameters and interaction energies between the boundary surfaces and the blocks, we observed a variety of interesting morphologies. Upon decreasing the cylinder's diameter, the self-assembled structures of the A(15)B(15)/A(15)C(15) mixtures changed from double-helix/cylinder structures (blocks B and C formed double helices, whereas blocks A formed the outer barrel and inner core) to stacked disk/cylinder structures (blocks B and C formed the stacked disk core, blocks A formed the outer cylindrical barrel), whereas the self-assembled structures of the A(15)B(7)/B(7)C(15) mixtures changed from concentric cylindrical barrel structures to screw/cylinder structures (blocks C formed an inside core winding with helical stripes, whereas blocks A and B formed the outer cylindrical barrels) and then finally to the stacked disk/cylinder structures.

Abstract:

The dynamic structure factor of neutron quasi-elastic scattering has been calculated by Monte Carlo methods for atoms diffusing on a disordered lattice. The disorder includes not only variation in the distances between neighbouring atomic sites but also variation in the hopping rate associated with each site. The presence of the disorder, particularly the hopping rate disorder, causes changes in the time-dependent intermediate scattering function which translate into a significant increase in the intensity in the wings of the quasi-elastic spectrum as compared with the Lorentzian form. The effect is particularly marked at high values of the momentum transfer and at site occupancies of the order of unity. The MC calculations demonstrate how the degree of disorder may be derived from experimental measurements of the quasi-elastic scattering. The model structure factors are compared with the experimental quasi-elastic spectrum of an amorphous metal-hydrogen alloy.
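The intermediate scattering function for hopping on a disordered lattice can be estimated directly by kinetic Monte Carlo; the sketch below uses a 1-D ring with site-dependent hopping rates and F(Q,t) = <cos(Q*(x(t)-x(0)))>. This is an illustrative reduction of the model (the published calculations also include distance disorder and higher dimensionality):

```python
import math, random

def intermediate_scattering(q, times, rates, spacing=1.0, n_traj=2000, seed=4):
    """Estimate F(Q,t) = <cos(Q*(x(t)-x(0)))> for a single particle hopping
    on a 1-D ring of sites with site-dependent hopping rates, using
    continuous-time (kinetic) Monte Carlo. `times` must be sorted ascending."""
    rng = random.Random(seed)
    n_sites = len(rates)
    f = [0.0] * len(times)
    for _ in range(n_traj):
        site = rng.randrange(n_sites)       # random starting site
        x, t, it = 0.0, 0.0, 0
        while it < len(times):
            dt = rng.expovariate(rates[site])       # waiting time at this site
            while it < len(times) and times[it] < t + dt:
                f[it] += math.cos(q * x)            # particle sits at x on [t, t+dt)
                it += 1
            t += dt
            if rng.random() < 0.5:                  # unbiased hop to a neighbour
                x += spacing
                site = (site + 1) % n_sites
            else:
                x -= spacing
                site = (site - 1) % n_sites
    return [v / n_traj for v in f]
```

For uniform rates this reproduces the Lorentzian-generating decay F(Q,t) = exp(-Gamma*t*(1 - cos(Q*a))); introducing a spread in `rates` produces the excess intensity in the wings of the quasi-elastic spectrum described above.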