982 results for Direct Simulation Monte Carlo Method
Abstract:
A crystal nucleus in a finite volume may exhibit phase coexistence with a surrounding fluid. The thermodynamic properties of the coexisting fluid (pressure and chemical potential) are enhanced relative to their coexistence values, and this enhancement is uniquely related to the surface excess free energy. A model for weakly attractive soft colloidal particles, the so-called Asakura-Oosawa model, is investigated. In simulations, this model allows the pressure in the liquid to be calculated directly from the virial formula. The phase coexistence pressure in the thermodynamic limit is obtained from the interface velocity method. We introduce a method by which the chemical potential in dense liquids can be measured. There is no need either to locate the interface or to compute the anisotropic interfacial tension to obtain nucleation barriers; our analysis is therefore appropriate for nuclei of arbitrary shape. Monte Carlo simulations over a wide range of nucleus volumes yield nucleation barriers that are independent of the total system volume. The interfacial tension is determined via the ensemble-switch method, so a detailed test of classical nucleation theory is possible. The anisotropy of the interfacial tension and the resulting non-spherical shape have only a minor effect on the barrier for the Asakura-Oosawa model.
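The abstract does not spell out its chemical-potential method. For orientation, a standard baseline for measuring chemical potentials in particle simulations is Widom test-particle insertion, sketched minimally below with a Lennard-Jones stand-in potential and illustrative parameters (this is not the authors' method):

    # Minimal sketch: Widom test-particle insertion for the excess chemical
    # potential, mu_ex = -kT * ln < exp(-dU / kT) >. The potential and all
    # parameters are illustrative stand-ins, not from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def pair_energy(r2, eps=1.0, sigma=1.0):
        """Lennard-Jones pair energy as a stand-in potential."""
        sr6 = (sigma**2 / r2) ** 3
        return 4.0 * eps * (sr6**2 - sr6)

    def widom_mu_ex(positions, box, kT=1.0, n_insert=10000):
        """Average Boltzmann factor of inserting a ghost particle."""
        boltz = np.empty(n_insert)
        for k in range(n_insert):
            trial = rng.uniform(0.0, box, size=3)
            d = positions - trial
            d -= box * np.round(d / box)      # minimum-image convention
            r2 = np.sum(d * d, axis=1)
            boltz[k] = np.exp(-pair_energy(r2).sum() / kT)
        return -kT * np.log(boltz.mean())

In a dense liquid almost every trial insertion overlaps an existing particle, so this estimator becomes extremely noisy; that well-known failure is precisely what motivates alternative chemical-potential methods such as the one introduced in the abstract.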
Abstract:
Although the Monte Carlo (MC) method allows accurate dose calculation for proton radiotherapy, its use is limited by long computing times. In order to gain efficiency, a new macro MC (MMC) technique for proton dose calculations has been developed. The basic principle of MMC transport is a local-to-global MC approach. The local simulations, performed with GEANT4, consist of mono-energetic proton pencil beams impinging perpendicularly on slabs of different thicknesses and materials (water, air, lung, adipose, muscle, spongiosa, cortical bone). During the local simulations, multiple scattering, ionization, and elastic and inelastic interactions are taken into account, and physical characteristics such as lateral displacement, direction distributions and energy loss are scored for primary and secondary particles. The scored data from appropriate slabs are then used for the stepwise transport of protons in the MMC simulation, calculating the energy loss along the path between entrance and exit positions. Additionally, based on the local simulations, the radiation transport of neutrons and generated ions is included in the MMC dose calculations. To validate MMC transport, dose distributions calculated with MMC and with GEANT4 have been compared for different mono-energetic proton pencil beams impinging on different phantoms, including homogeneous and inhomogeneous situations, as well as on a patient CT scan. The agreement of calculated integral depth dose curves is better than 1% or 1 mm for all pencil beams and phantoms considered. For the dose profiles, the agreement is within 1% or 1 mm in all phantoms for all energies and depths. The comparison of the dose distributions calculated with either GEANT4 or MMC in the patient also shows agreement within 1% or 1 mm. The efficiency of MMC is up to 200 times higher than that of GEANT4. The very good level of agreement in the dose comparisons demonstrates that the newly developed MMC transport results in very accurate and efficient dose calculations for proton beams.
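To make the local-to-global idea concrete, here is a minimal sketch of one macro-MC transport step: the proton crossing a slab consumes precomputed statistics (energy loss, lateral displacement, scattering angle) of the kind that local GEANT4 runs would score. All table contents and names below are hypothetical placeholders, not the paper's data:

    # Illustrative macro-MC step using hypothetical scored data for a
    # 1 mm water slab; distributions are placeholders, not GEANT4 output.
    import numpy as np

    rng = np.random.default_rng(1)

    slab_table = {
        "thickness_cm": 0.1,
        "energy_loss_MeV": rng.normal(0.55, 0.02, 5000),    # placeholder
        "lateral_disp_cm": rng.normal(0.0, 0.002, 5000),    # placeholder
        "scatter_angle_rad": rng.normal(0.0, 0.003, 5000),  # placeholder
    }

    def macro_step(pos, direction, energy, table):
        """Advance one proton across one slab by sampling scored data."""
        energy -= rng.choice(table["energy_loss_MeV"])
        # Move along the current direction by the slab thickness.
        pos = pos + direction * table["thickness_cm"]
        # Apply lateral displacement in a perpendicular direction.
        perp = np.cross(direction, [0.0, 0.0, 1.0])
        if np.linalg.norm(perp) < 1e-9:      # direction is ~ the z-axis
            perp = np.array([1.0, 0.0, 0.0])
        perp /= np.linalg.norm(perp)
        pos = pos + perp * rng.choice(table["lateral_disp_cm"])
        # Deflect the direction by a sampled scattering angle.
        theta = rng.choice(table["scatter_angle_rad"])
        direction = direction * np.cos(theta) + perp * np.sin(theta)
        return pos, direction / np.linalg.norm(direction), energy

Each such step replaces many individual interaction simulations, which is where the reported factor of up to 200 in efficiency comes from.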
Abstract:
Markov chain Monte Carlo is a method of producing a correlated sample in order to estimate features of a complicated target distribution via simple ergodic averages. A fundamental question in MCMC applications is: when should the sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the MCMC sampling the first time the width of a confidence interval based on the ergodic averages is less than a user-specified value. Hence, calculating Monte Carlo standard errors is a critical step in assessing the output of the simulation. In particular, we consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We describe sufficient conditions for the strong consistency and asymptotic normality of both methods and investigate their finite sample properties in a variety of examples.
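A minimal sketch of the fixed-width stopping rule with batch-means standard errors follows; the target, proposal scale and tuning constants are illustrative, not the paper's examples:

    # Keep sampling until the confidence-interval half-width, computed from
    # batch-means standard errors, drops below a user-specified epsilon.
    import numpy as np

    rng = np.random.default_rng(2)

    def batch_means_se(chain):
        """Monte Carlo standard error via non-overlapping batch means."""
        n = len(chain)
        b = int(np.floor(np.sqrt(n)))        # batch size ~ sqrt(n)
        a = n // b                           # number of batches
        means = chain[: a * b].reshape(a, b).mean(axis=1)
        var_hat = b * np.var(means, ddof=1)  # asymptotic variance estimate
        return np.sqrt(var_hat / n)

    def sample_until_width(logpdf, x0, eps=0.01, z=1.96, block=5000,
                           max_n=10**6):
        """Run a Metropolis chain until the CI half-width is below eps."""
        chain = [x0]
        while len(chain) < max_n:
            x = chain[-1]
            for _ in range(block):
                prop = x + rng.normal(scale=1.0)
                if np.log(rng.uniform()) < logpdf(prop) - logpdf(x):
                    x = prop
                chain.append(x)
            if z * batch_means_se(np.asarray(chain)) < eps:
                break
        return np.asarray(chain)

    # Example: standard normal target via its log-density.
    out = sample_until_width(lambda x: -0.5 * x * x, x0=0.0)
    print(len(out), out.mean(), batch_means_se(out))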
Abstract:
Detailed knowledge of the characteristics of the radiation field shaped by a multileaf collimator (MLC) is essential in intensity modulated radiotherapy (IMRT). A previously developed multiple source model (MSM) for a 6 MV beam was extended to a 15 MV beam and supplemented with an accurate model of an 80-leaf dynamic MLC. Using the supplemented MSM and the MC code GEANT, lateral dose distributions were calculated in a water phantom and a portal water phantom. A field normally used for the validation of the step-and-shoot technique and a field from a realistic IMRT treatment plan delivered with a dynamic MLC were investigated. To assess possible spectral changes caused by the modulation of beam intensity by an MLC, the energy spectra in five portal planes were calculated for moving slits of different widths. The extension of the MSM to 15 MV was validated by analysing energy fluences, depth doses and dose profiles. In addition, the MC-calculated primary energy spectrum was verified against an energy spectrum reconstructed from transmission measurements. MC-calculated dose profiles using the MSM for the step-and-shoot case and for the dynamic MLC case are in very good agreement with the measured data from film dosimetry. The investigation of a 13 cm wide field shows an increase in mean photon energy, relative to the open beam, of up to 16% for the 0.25 cm slit at 6 MV and of up to 6% at 15 MV. In conclusion, the MSM supplemented with the dynamic MLC has proven to be a powerful tool for investigational and benchmarking purposes, or even for dose calculations in IMRT.
Abstract:
Monte Carlo simulation is a powerful method in many natural and social sciences. But what sort of method is it? And where does its power come from? Are Monte Carlo simulations experiments, theories or something else? The aim of this talk is to answer these questions and to explain the power of Monte Carlo simulations. I provide a classification of Monte Carlo techniques and defend the claim that Monte Carlo simulation is a sort of inference.
Abstract:
This article proposes computing sensitivities of upper tail probabilities of random sums by the saddlepoint approximation. The considered sensitivity is the derivative of the upper tail probability with respect to the parameter of the summation index distribution. Random sums with Poisson or Geometric distributed summation indices and Gamma or Weibull distributed summands are considered. The score method with importance sampling is considered as an alternative approximation. Numerical studies show that both the saddlepoint approximation and the score method with importance sampling are very accurate. However, the saddlepoint approximation is substantially faster than the score method with importance sampling. Thus, the suggested saddlepoint approximation can be conveniently used in various scientific problems.
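For concreteness, here is a minimal sketch of the score (likelihood-ratio) estimator for the Poisson-Gamma case, without the importance-sampling refinement discussed in the abstract; it uses d/d-lambda log p(N; lambda) = N/lambda - 1 and Gamma additivity, with illustrative parameter values:

    # Plain score-method estimate of d/d_lambda P(S > s) for a random sum
    # S = X_1 + ... + X_N, N ~ Poisson(lambda), X_i ~ Gamma(shape, scale).
    import numpy as np

    rng = np.random.default_rng(3)

    def score_sensitivity(lam, shape, scale, s, n_sim=200000):
        n = rng.poisson(lam, n_sim)
        # Gamma additivity: sum of n iid Gamma(shape) is Gamma(n * shape).
        sums = np.where(n > 0, rng.gamma(shape * np.maximum(n, 1), scale),
                        0.0)
        indicator = sums > s
        score = n / lam - 1.0          # d/d_lambda log Poisson pmf
        est = (indicator * score).mean()
        se = (indicator * score).std(ddof=1) / np.sqrt(n_sim)
        return est, se

    print(score_sensitivity(lam=5.0, shape=2.0, scale=1.0, s=15.0))

The estimator is unbiased but its variance grows for rarer tail events, which is where the importance sampling (and the faster saddlepoint approximation) earn their keep.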
Abstract:
We model Callisto's exosphere arising from both its ice and non-ice surface using a Monte Carlo exosphere model. For the ice component we implement two putative compositions computed from two possible extreme formation scenarios of the satellite. One composition represents the oxidizing state of the gas and is based on the assumption that the building blocks of Callisto formed in the protosolar nebula; the other represents the reducing state, based on the assumption that the satellite accreted from solids condensed in the jovian sub-nebula. For the non-ice component we implement the compositions of typical CI and L type chondrites; both chondrite types have been suggested to best represent Callisto's non-ice composition. As release processes we consider surface sublimation, ion sputtering and photon-stimulated desorption. Particles are followed on their individual trajectories until they either escape Callisto's gravitational attraction, return to the surface, are ionized, or are fragmented. Our density profiles show that whereas the sublimated species dominate close to the surface on the sunlit side, their density profiles (with the exception of H and H2) decrease much more rapidly with altitude than those of the sputtered particles. The Neutral gas and Ion Mass (NIM) spectrometer, part of the Particle Environment Package (PEP), will investigate Callisto's exosphere during the JUICE mission. Our simulations show that NIM will be able to detect sublimated and sputtered particles from both the ice and non-ice surface, and the chemical composition measured by NIM will allow us to distinguish between the different formation scenarios.
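A highly simplified test-particle sketch of the sublimation channel is given below: H2O molecules leave the surface with Maxwell-Boltzmann speeds and either escape Callisto's gravity or execute a ballistic hop. The constants are approximate, the single-temperature dayside and purely radial launch are simplifying assumptions, and none of this is the authors' full model:

    # Thermal launch of H2O test particles from Callisto's surface.
    import numpy as np

    rng = np.random.default_rng(4)

    K_B   = 1.380649e-23    # J/K
    M_H2O = 2.99e-26        # kg
    GM    = 7.179e12        # m^3/s^2, Callisto (approximate)
    R     = 2.410e6         # m, Callisto radius (approximate)
    T     = 150.0           # K, assumed dayside surface temperature

    sigma = np.sqrt(K_B * T / M_H2O)         # per-axis thermal speed
    v = np.linalg.norm(rng.normal(0.0, sigma, (1_000_000, 3)), axis=1)
    v_esc = np.sqrt(2.0 * GM / R)
    escaping = v >= v_esc
    # Apex of a radial ballistic hop, from energy conservation.
    r_max = 1.0 / (1.0 / R - v[~escaping] ** 2 / (2.0 * GM))
    print(f"escape fraction: {escaping.mean():.2e}")
    print(f"median hop apex altitude: {np.median(r_max - R) / 1e3:.1f} km")

Thermal H2O essentially never reaches escape speed here, consistent with the abstract's picture that sublimated species stay confined near the sunlit surface while sputtered particles, launched with much more energetic distributions, populate higher altitudes.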
Abstract:
Monte Carlo simulations have been carried out to study the effect of temperature on the growth kinetics of a circular grain. This work demonstrates the importance of roughening fluctuations for the growth dynamics. As predicted by theories of domain kinetics, the circular domain shrinks linearly with time, A(t) = A(0) - αt, where A(0) and A(t) are the initial and instantaneous areas, respectively. However, in contrast to d = 3, the slope α is strongly temperature dependent for T ≥ 0.6 T_C, since the effect of thermal fluctuations is stronger in d = 2 than in d = 3. An analytical theory which takes the thermal fluctuations into account agrees with the T dependence of the Monte Carlo data in this regime, and this model shows that these fluctuations are responsible for the strong temperature dependence of the growth rate in d = 2. Our results are particularly relevant to the problem of domain growth in surface science.
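The standard setup behind such studies can be sketched in a few lines: a circular minority domain in a 2D Ising lattice shrinking under Metropolis single-spin-flip dynamics, with its area tracked against Monte Carlo steps. Lattice size, temperature and initial radius below are illustrative, not the paper's values:

    # Shrinking circular Ising domain under Metropolis dynamics (J = k_B = 1).
    import numpy as np

    rng = np.random.default_rng(5)

    L, R0, T = 128, 40, 1.5
    spins = -np.ones((L, L), dtype=int)
    y, x = np.ogrid[:L, :L]
    spins[(x - L // 2) ** 2 + (y - L // 2) ** 2 <= R0**2] = 1  # circular grain

    def sweep(s, temp):
        """One Monte Carlo step: L*L random single-spin-flip attempts."""
        for _ in range(s.size):
            i, j = rng.integers(L), rng.integers(L)
            nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] \
               + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2 * s[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / temp):
                s[i, j] *= -1

    for t in range(0, 60, 10):
        print(t, int((spins == 1).sum()))   # area A(t), roughly A(0) - a*t
        for _ in range(10):
            sweep(spins, T)

Repeating the run at several temperatures below T_C ≈ 2.269 exposes the temperature dependence of the slope discussed in the abstract.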
Abstract:
The simulation of interest rate derivatives is a powerful tool for coping with current market fluctuations. However, the complexity of the financial models and the way they are processed demand exorbitant computation times, which is in clear conflict with the need for processing times short enough to operate in the financial market. To shorten the computation time of financial derivative pricing, the use of hardware accelerators becomes a must.
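A minimal sketch of the kind of Monte Carlo kernel that gets offloaded to such accelerators: pricing a zero-coupon bond under a Vasicek short-rate model by path simulation, checked against the closed form. All model parameters are illustrative:

    # Monte Carlo vs. closed-form zero-coupon bond price under Vasicek:
    # dr = a*(b - r)*dt + sigma*dW, price P = E[exp(-integral of r dt)].
    import numpy as np

    rng = np.random.default_rng(6)

    a, b, sigma, r0, T = 0.5, 0.04, 0.01, 0.03, 5.0
    n_paths, n_steps = 100_000, 500
    dt = T / n_steps

    r = np.full(n_paths, r0)
    integral = np.zeros(n_paths)
    for _ in range(n_steps):                 # Euler-Maruyama time stepping
        integral += r * dt
        r += a * (b - r) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

    mc_price = np.exp(-integral).mean()

    # Closed-form Vasicek bond price for comparison.
    B = (1.0 - np.exp(-a * T)) / a
    A = np.exp((B - T) * (a * a * b - 0.5 * sigma * sigma) / (a * a)
               - sigma * sigma * B * B / (4.0 * a))
    print(mc_price, A * np.exp(-B * r0))

The inner loop over paths and time steps is embarrassingly parallel, which is exactly what makes GPUs and FPGAs attractive for this workload.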
Abstract:
A Monte Carlo computer simulation technique, in which a continuum system is modeled employing a discrete lattice, has been applied to the problem of recrystallization. Primary recrystallization is modeled under conditions where the degree of stored energy is varied and nucleation occurs homogeneously (without regard for position in the microstructure). The nucleation rate is chosen as site saturated. Temporal evolution of the simulated microstructures is analyzed to provide the time dependence of the recrystallized volume fraction and grain sizes. The recrystallized volume fraction shows sigmoidal variation with time. The data are approximately fit by the Johnson-Mehl-Avrami equation with the expected exponents; however, significant deviations are observed for both small and large recrystallized volume fractions. Under constant rate nucleation conditions, the propensity for irregular grain shapes is decreased and the density of two-sided grains increases.
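For orientation, the Johnson-Mehl-Avrami (JMAK) form referred to above, with the textbook exponents for interface-controlled growth (standard results; the abstract itself does not quote the dimensionality or fitted values):

    X(t) = 1 - \exp\left(-K t^{n}\right)

where X(t) is the recrystallized volume fraction, with n = d for site-saturated nucleation in d dimensions and n = d + 1 for nucleation at a constant rate.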
Abstract:
The aim of this work is to optimize a Monte Carlo (MC) kernel for intraoperative electron radiation therapy (IOERT) so that it is compatible with intraoperative usage, and to integrate it within an existing IOERT-dedicated treatment planning system (TPS).
Abstract:
Meta-analysis of erythrocyte volume at altitude
Abstract:
Purpose: A fully three-dimensional (3D) massively parallelizable list-mode ordered-subsets expectation-maximization (LM-OSEM) reconstruction algorithm has been developed for high-resolution PET cameras. System response probabilities are calculated online from a set of parameters derived from Monte Carlo simulations. The shape of the system response for a given line of response (LOR) has been shown to be asymmetrical around the LOR. This work has focused on the development of efficient region-search techniques to sample the system response probabilities, which are suitable for asymmetric kernel models, including elliptical Gaussian models that allow for high accuracy and high parallelization efficiency. The novel region-search scheme using variable kernel models is applied in the proposed PET reconstruction algorithm. Methods: A novel region-search technique has been used to sample the probability density function in correspondence with a small dynamic subset of the field of view that constitutes the region of response (ROR). The ROR is identified around the LOR by searching for any voxel within a dynamically calculated contour. The contour condition is currently defined as a fixed threshold over the posterior probability, and arbitrary kernel models can be applied using a numerical approach. The processing of the LORs is distributed in batches among the available computing devices, and individual LORs are then processed by different processing units. In this way, both multicore and multiple many-core processing units can be efficiently exploited. Tests have been conducted with probability models that take into account the noncolinearity, positron range, and crystal penetration effects, which produce tubes of response with varying elliptical sections whose axes are a function of the crystal thickness and the angle of incidence of the given LOR. The algorithm treats the probability model as a 3D scalar field defined within a reference system aligned with the ideal LOR. Results: This new technique provides superior image quality in terms of signal-to-noise ratio as compared with the histogram-mode method based on precomputed system matrices available for a commercial small animal scanner. Reconstruction times can be kept low with the use of multicore and many-core architectures, including multiple graphic processing units. Conclusions: A highly parallelizable LM reconstruction method has been proposed based on Monte Carlo simulations and new parallelization techniques aimed at improving the reconstruction speed and the image signal-to-noise ratio of a given OSEM algorithm. The method has been validated using simulated and real phantoms. A special advantage of the new method is the possibility of defining the cut-off threshold over the calculated probabilities dynamically, allowing direct control of the trade-off between speed and quality during the reconstruction.
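To illustrate the region-of-response idea, the sketch below evaluates an elliptical Gaussian kernel in a frame aligned with the ideal LOR and keeps only voxels above a probability cut-off. The kernel widths, threshold and brute-force voxel scan are placeholders; the paper's contribution is precisely a search scheme that avoids scanning the whole field of view:

    # Thresholded elliptical-Gaussian tube of response around a LOR.
    import numpy as np

    def ror_voxels(grid_shape, voxel_size, p0, p1, sig_u, sig_v, threshold):
        """Return voxel indices and weights inside the cut-off contour.

        p0, p1: LOR endpoints (mm); sig_u, sig_v: kernel widths (mm).
        """
        axis = (p1 - p0) / np.linalg.norm(p1 - p0)
        # Two unit vectors spanning the plane transverse to the LOR.
        u = np.cross(axis, [0.0, 0.0, 1.0])
        if np.linalg.norm(u) < 1e-9:         # LOR is ~ along the z-axis
            u = np.array([1.0, 0.0, 0.0])
        u /= np.linalg.norm(u)
        v = np.cross(axis, u)
        idx = np.indices(grid_shape).reshape(3, -1).T
        centers = (idx + 0.5) * voxel_size   # voxel centers in mm
        rel = centers - p0
        du, dv = rel @ u, rel @ v            # transverse coordinates
        prob = np.exp(-0.5 * ((du / sig_u) ** 2 + (dv / sig_v) ** 2))
        keep = prob > threshold              # dynamic cut-off contour
        return idx[keep], prob[keep]

    vox, w = ror_voxels((64, 64, 64), 2.0, np.array([0.0, 64.0, 64.0]),
                        np.array([128.0, 64.0, 64.0]), 2.5, 1.5, 0.05)
    print(vox.shape, w.min(), w.max())

Raising the threshold shrinks the ROR and speeds up reconstruction at some cost in accuracy, which is the speed/quality trade-off the conclusions highlight.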