99 results for Direct Simulation Monte Carlo Method
Abstract:
Because protons and photons interact with matter through different physical processes, proton computed tomography (pCT) has some advantages over conventional tomography: it generally delivers a better dose distribution and higher contrast resolution. pCT allows not only viewing the internal structure of an object without destroying it, but also directly measuring the volume density of electrons. At the same time, many scientific and technical aspects still require detailed study, since the capabilities and limitations of pCT methods are not well established. Using computations based on the Monte Carlo method, a detailed study of the contribution of non-elastic nuclear scattering was carried out and compared with an analytical model for the deflection angle and the lateral deflection of protons in the target volume. The programs used were the SRIM 2006 and MCNPX v2.50 codes.
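The abstract does not name the analytical deflection model it compares against, so the short Python sketch below is offered only as a generic illustration: it evaluates the well-known Highland parameterization of multiple Coulomb scattering to estimate the RMS deflection angle, plus a crude small-angle estimate of the lateral spread, for protons in water. The beam energy, water radiation length and the 1/sqrt(3) lateral-spread factor are assumptions for this example, not values taken from the work above.

    import math

    def highland_theta0(kinetic_mev, thickness_cm, rad_length_cm=36.08):
        """RMS projected multiple-scattering angle (rad) of a proton after
        crossing thickness_cm of material (Highland formula, charge z = 1).
        rad_length_cm defaults to roughly the radiation length of water."""
        m_p = 938.272                       # proton rest mass, MeV/c^2
        e_tot = kinetic_mev + m_p           # total energy, MeV
        p = math.sqrt(e_tot**2 - m_p**2)    # momentum, MeV/c
        beta = p / e_tot
        t = thickness_cm / rad_length_cm
        return (13.6 / (beta * p)) * math.sqrt(t) * (1.0 + 0.038 * math.log(t))

    # Example: 200 MeV protons after 10 cm of water
    depth = 10.0
    theta0 = highland_theta0(200.0, depth)
    lateral_cm = theta0 * depth / math.sqrt(3.0)   # crude small-angle lateral spread
    print(f"theta0 ~ {theta0 * 1e3:.1f} mrad, lateral spread ~ {lateral_cm * 10:.2f} mm")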
Abstract:
In proton therapy, the energy deposited by secondary particles originating from nuclear inelastic processes (n, 2H, 3H, 3He and α) makes a contribution to the total dose that deserves discussion. In the plan calculations used for routine treatment, the delivered dose is computed assuming that the proton loses energy only by ionization and Coulomb excitation; the contribution of inelastic processes associated with nuclear reactions is not considered. Estimates exist only for pure materials or simple compositions (water, for example), because of the difficulty of handling targets made of different materials. In this project we use the Monte Carlo method, employing the MCNPX v2.50 (Monte Carlo N-Particle eXtended) code, to present results on the contribution of secondary particles to the total dose. A cylindrical phantom composed of cortical bone was implemented, with proton beams between 100 and 200 MeV. From the results it was possible to generate plots analyzing the dose deposition with and without nuclear interactions, the multiplicity and percentage of deposited dose for each secondary particle, and the radial dispersion of neutrons in the material.
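As a rough cross-check on where the Bragg peak of such simulations should fall, the Bragg-Kleeman rule R = a * E^p gives an analytical range estimate. The sketch below uses the commonly quoted water constants; the cortical-bone phantom of the work above would additionally require scaling by its relative stopping power, which is deliberately left out of this illustration.

    # Bragg-Kleeman rule: quick analytical estimate of the proton range in water,
    # as a sanity check on the Bragg-peak depth of a simulated depth-dose curve.
    # The constants a and p are commonly quoted water values (assumption).

    def bragg_kleeman_range_cm(energy_mev, a=0.0022, p=1.77):
        """Approximate proton range in water (cm) for the given energy (MeV)."""
        return a * energy_mev ** p

    for energy in (100, 150, 200):
        print(f"{energy} MeV proton: range in water ~ {bragg_kleeman_range_cm(energy):.1f} cm")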
Abstract:
In radiotherapy, computational systems are used to determine the radiation dose in the treatment volume and to analyze the quality of the radiometric parameters of the equipment and the irradiated field. With increasing technological advances, considerable research has been carried out in brachytherapy on the development of computational algorithms that can be incorporated into treatment planning systems, providing greater accuracy and confidence in the dose calculation. The fields of informatics and information technology undergo constant updating and refinement, allowing the Monte Carlo method to be used to simulate the dose distribution of brachytherapy sources. The methodology adopted for the dosimetric analysis is based mainly on the studies of the American Association of Physicists in Medicine (AAPM) Task Group No. 43 (TG-43) and on protocols aimed at the dosimetry of these types of radiation sources. This work analyzes the feasibility of using the MCNP-5C (Monte Carlo N-Particle) code to obtain radiometric parameters of brachytherapy sources and thus to study the variation of the radiation dose in treatment planning. Simulations were performed of the dose variation in the source plane, and the dosimetric parameters required by the TG-43 formalism were determined for the characterization of two high-dose-rate iridium-192 sources. The calculated values were compared with those in the literature, which were obtained with different Monte Carlo simulation codes. The results showed excellent agreement with the compared codes, reinforcing the capacity and viability of the MCNP-5C code for the dosimetry of sources employed in HDR brachytherapy. The method may suggest a possible incorporation of this code into the treatment planning systems provided by manufacturers together with the equipment, since besides reducing acquisition cost it can also make the computational routines used more comprehensive, facilitating the brachytherapy ...
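For context, the TG-43 formalism referred to above expresses the dose rate around a source as D(r, theta) = S_k * Lambda * [G(r, theta)/G(r0, theta0)] * g(r) * F(r, theta). The Python sketch below implements this in the point-source approximation only; the dose-rate constant, radial dose function values and air-kerma strength are illustrative placeholders, not consensus data for any specific 192Ir source.

    import numpy as np

    # TG-43 dose rate in the point-source approximation:
    #   D(r) = S_k * Lambda * [G(r)/G(r0)] * g(r),  with G(r) = 1/r^2 and r0 = 1 cm.

    R0 = 1.0  # TG-43 reference distance, cm

    def geometry_point(r_cm):
        return 1.0 / r_cm**2

    def dose_rate_cgy_per_h(r_cm, s_k, dose_rate_constant, g_of_r):
        """Dose rate (cGy/h) at r_cm from a point-like HDR source."""
        g_factor = geometry_point(r_cm) / geometry_point(R0)
        return s_k * dose_rate_constant * g_factor * g_of_r(r_cm)

    # Placeholder radial dose function on a coarse grid (not consensus data)
    r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
    g_tab = np.array([1.00, 1.00, 0.99, 0.98, 0.94])
    g_interp = lambda r: np.interp(r, r_tab, g_tab)

    # Example: S_k = 40000 U, Lambda of the order of 1.11 cGy h^-1 U^-1 for 192Ir
    print(dose_rate_cgy_per_h(2.0, s_k=40000.0, dose_rate_constant=1.11, g_of_r=g_interp))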
Abstract:
Proton beam therapy has proved more effective than conventional radiotherapy for oncological treatment. However, its planning uses photon-beam computed tomography, which does not take into account the fundamental differences between the interaction of X-rays and protons with matter. There is currently a great effort to develop proton-beam tomography, and for image reconstruction it is necessary to know the most likely trajectory of the proton beam. In this work the most likely trajectory of a proton beam in a homogeneous water target was calculated, taking inelastic nuclear interactions into account, together with an analytical calculation of the lateral deflection of the proton beam. The calculations used programs based on the Monte Carlo method, SRIM 2006 (Stopping and Range of Ions in Matter) and MCNPX (Monte Carlo N-Particle eXtended) v2.50, while the analytical calculation employed the software Wolfram Mathematica v7.0. We show how different nuclear reaction models modify the trajectory of the proton beam and compare the analytical and Monte Carlo results.
Abstract:
Cosmic radiation has been identified as one of the main hazards to crews, aircraft and sensitive equipment involved in long-term missions and even high-altitude commercial flights. Shields are generally used in space vehicles to avoid excessive exposure by stopping the incident radiation. Unfortunately, shielding in space is problematic, especially for high-energy cosmic particles, because spallation reactions and quasi-elastic processes of the corpuscular radiation with the shield produce a large number of secondary particles, mainly neutrons, protons and alpha particles. Useful parameters for assessing secondary particle production in the target material are the differential cross section and the energy deposited in the shield. In addition to experiments, computer codes based on the Monte Carlo method are a suitable tool for calculating shielding parameters, since evaluated nuclear data libraries are implemented in their algorithms. The aim of this work is therefore to determine these parameters for shielding materials using the MCNPX code, which shows good agreement with experimental data from the literature. Among the materials studied, aluminium had the lowest emission and production of secondary particles.
Abstract:
The contribution to the total dose from the energy deposited by secondary particles produced in nuclear inelastic processes (n, 2H, 3H, 3He and α) in proton therapy is an open problem still under discussion. In the plan calculations used for routine treatment, the delivered dose is computed assuming that the proton loses energy only by ionization and Coulomb excitation; the contribution of inelastic processes associated with nuclear reactions is not considered, mainly because of the difficulty of handling targets made of various materials. For this reason, estimates exist only for pure materials or simple compositions (water, for example). This work presents the results of Monte Carlo simulations, performed with the MCNPX v2.50 (Monte Carlo N-Particle eXtended) code, of the contribution of secondary particles to the total dose. The study used a cylindrical phantom composed of compact bone and monochromatic pencil beams of protons between 100 and 200 MeV.
Abstract:
The betatherapy sources in clinical use in Brazil are, in the vast majority, strontium-90, a radioactive element that is not produced in the country and therefore has to be imported from international laboratories accredited by the International Atomic Energy Agency (IAEA). The use of these sources is limited by reliance on the characteristic values supplied in the manufacturer's tables, which provide the nominal activity and dose distribution used to determine the irradiation time for the lesion. The Institute of Energy and Nuclear Research (IPEN/CNEN-SP) has recently been investigating the emission profile of these types of radiation sources, and work is being carried out with extrapolation ionization chambers with the aim of standardizing a systematic calibration of betatherapy sources. Other studies use measurements with thermoluminescent dosimeters (TLDs) and simulations with the Monte Carlo method in parallel; radiological films have also been used in dosimetric studies of strontium-90 applicators. This paper analyzes the different methods already established for the calibration of betatherapy applicators, examining the advantages and disadvantages of each procedure.
Abstract:
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures, the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.
Program summary
Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: Micro-computer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers such as g77, f77, ifort and NAG95 are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 Mbytes of RAM and 100 MB of hard disk
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Of the pathways by which radionuclides enter the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Radionuclides that enter living cells by metabolic or other processes give rise to localized doses which can be very high. The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and radiological protection. The time behavior of trace concentration in organs is the principal input for the prediction of internal doses after acute or chronic exposure. The general multiple-compartment model (GMCM) is the most powerful and widely accepted method for biokinetic studies, allowing the calculation of the concentration of trace elements in organs as a function of time when the flow parameters of the model are known. However, few biokinetics data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restrictions on the complexity of the problem: This version of the code works in the constant-volume approximation, which is valid in many situations where the biological half-life of a trace is shorter than the volume rise time. Another restriction is related to the central-flux model: the code assumes that there is one central compartment (e.g., blood) through which all flow between compartments passes, and flow directly between the other compartments is not included.
Typical running time: Depends on the choice of calculation. Using the Derivative method the time is very short (a few minutes) for any number of compartments. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when around 15 compartments are considered. (C) 2006 Elsevier B.V. All rights reserved.
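As an illustration of the central-flux compartment model and of solving the inverse problem by Monte Carlo sampling, the Python sketch below builds a toy system with one central and two peripheral compartments, generates synthetic concentration curves, and recovers the flow parameters by a crude random search. The layout, rate constants and search strategy are assumptions for this example; this is not the STATFLUX code.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, c, k_out, k_back):
        """Central-flux model: c[0] is the central compartment (e.g. blood),
        c[1:] are peripheral compartments exchanging only with c[0]."""
        dc = np.empty_like(c)
        dc[1:] = k_out * c[0] - k_back * c[1:]
        dc[0] = -np.sum(k_out) * c[0] + np.sum(k_back * c[1:])
        return dc

    def simulate(params, t_eval, c0):
        k_out, k_back = params[:2], params[2:]
        sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), c0, t_eval=t_eval,
                        args=(k_out, k_back))
        return sol.y

    # Synthetic "measured" concentration curves from known rate constants
    t = np.linspace(0.0, 10.0, 25)
    true_params = np.array([0.4, 0.2, 0.1, 0.05])
    c0 = np.array([1.0, 0.0, 0.0])
    data = simulate(true_params, t, c0)

    # Inverse problem by brute-force Monte Carlo: sample rate constants at random
    # and keep the set with the smallest sum of squared residuals.
    rng = np.random.default_rng(0)
    best, best_ssr = None, np.inf
    for _ in range(2000):
        trial = rng.uniform(0.01, 1.0, size=4)
        ssr = np.sum((simulate(trial, t, c0) - data) ** 2)
        if ssr < best_ssr:
            best, best_ssr = trial, ssr

    print("recovered rate constants:", np.round(best, 3))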
Genetic and environmental heterogeneity of residual variance of weight traits in Nellore beef cattle
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper presents a method for calculating the power flow in distribution networks considering uncertainties in the distribution system. Active and reactive power are used as uncertain variables and probabilistically modeled through probability distribution functions. Uncertainty about the connection of the users with the different feeders is also considered. A Monte Carlo simulation is used to generate the possible load scenarios of the users. The results of the power flow considering uncertainty are the mean values and standard deviations of the variables of interest (voltages in all nodes, active and reactive power flows, etc.), giving the user valuable information about how the network will behave under uncertainty rather than the traditional fixed values at one point in time. The method is tested using real data from a primary feeder system, and results are presented considering uncertainty in demand and also in the connection. To demonstrate the usefulness of the approach, the results are then used in a probabilistic risk analysis to identify potential problems of undervoltage in distribution systems. (C) 2012 Elsevier Ltd. All rights reserved.
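A minimal sketch of that Monte Carlo probabilistic power-flow idea is given below: node loads are drawn from probability distributions, a deliberately simplified approximate voltage-drop calculation stands in for the full power-flow solver, and the mean, standard deviation and an undervoltage probability of the bus voltages are reported. The feeder data and load statistics are invented for illustration and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(42)

    V_SLACK = 13.8e3 / np.sqrt(3)              # phase voltage at the substation, V
    R = np.array([0.30, 0.25, 0.20])           # section resistances, ohm
    X = np.array([0.40, 0.35, 0.30])           # section reactances, ohm
    P_MEAN = np.array([300e3, 250e3, 200e3])   # mean three-phase active load per bus, W
    Q_MEAN = 0.4 * P_MEAN                      # mean reactive load per bus, var
    SIGMA = 0.15                               # relative standard deviation of the loads

    def approx_voltages(p, q):
        """Approximate per-phase voltage profile of a 3-bus radial feeder."""
        v = np.empty(3)
        v_prev = V_SLACK
        for i in range(3):
            p_down = p[i:].sum() / 3.0          # per-phase active power downstream
            q_down = q[i:].sum() / 3.0
            v_prev = v_prev - (R[i] * p_down + X[i] * q_down) / v_prev
            v[i] = v_prev
        return v

    # Monte Carlo: one approximate power flow per sampled load scenario
    samples = np.array([
        approx_voltages(rng.normal(P_MEAN, SIGMA * P_MEAN),
                        rng.normal(Q_MEAN, SIGMA * Q_MEAN))
        for _ in range(5000)
    ])

    print("mean bus voltages (V):", samples.mean(axis=0).round(1))
    print("std of bus voltages (V):", samples.std(axis=0).round(1))
    print("P(undervoltage at last bus):", np.mean(samples[:, -1] < 0.95 * V_SLACK))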
Abstract:
In this work we compare the parameter estimates of ARCH models obtained with a complete Bayesian method and with an empirical Bayesian method, in which we adopted a non-informative prior distribution and an informative prior distribution, respectively. We also considered a reparameterization of these models in order to map the parameter space onto the real line, which permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated on the Telebras series from the Brazilian financial market. The results show that the two methods are able to fit ARCH models with different numbers of parameters. The empirical Bayesian method provided a more parsimonious model and a better fit to the data than the complete Bayesian method.
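The sketch below illustrates the reparameterization idea for an ARCH(1) model: alpha0 > 0 and 0 < alpha1 < 1 are mapped to the real line with log and logit transforms, normal priors are placed on the transformed parameters, and a random-walk Metropolis sampler draws from the posterior. The simulated series, prior widths and tuning constants are illustrative choices, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    def log_likelihood(y, alpha0, alpha1):
        """Gaussian ARCH(1) log-likelihood: sigma_t^2 = alpha0 + alpha1 * y_{t-1}^2."""
        sigma2 = alpha0 + alpha1 * y[:-1] ** 2
        return -0.5 * np.sum(np.log(2 * np.pi * sigma2) + y[1:] ** 2 / sigma2)

    def log_posterior(y, phi):
        alpha0 = np.exp(phi[0])                   # log transform keeps alpha0 > 0
        alpha1 = 1.0 / (1.0 + np.exp(-phi[1]))    # logit transform keeps 0 < alpha1 < 1
        log_prior = -0.5 * np.sum(phi ** 2 / 10.0 ** 2)   # N(0, 10^2) priors on phi
        return log_likelihood(y, alpha0, alpha1) + log_prior

    # Simulate an ARCH(1) series with known parameters
    T, a0_true, a1_true = 1500, 0.5, 0.4
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rng.normal(0.0, np.sqrt(a0_true + a1_true * y[t - 1] ** 2))

    # Random-walk Metropolis on the transformed parameters
    phi = np.zeros(2)
    lp = log_posterior(y, phi)
    chain = []
    for _ in range(20000):
        prop = phi + rng.normal(0.0, 0.05, size=2)
        lp_prop = log_posterior(y, prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            phi, lp = prop, lp_prop
        chain.append(phi.copy())

    chain = np.array(chain[5000:])                 # discard burn-in
    alpha0_draws = np.exp(chain[:, 0])
    alpha1_draws = 1.0 / (1.0 + np.exp(-chain[:, 1]))
    print("posterior means:", alpha0_draws.mean().round(3), alpha1_draws.mean().round(3))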
Abstract:
We have been developing a computational code to design optical lenses with low aberration. Our main interest is modeling the human eye and, in particular, designing special corrective lenses. Since the lens shape is the focus of the optimization, we have coupled a ray-tracing method with Monte Carlo techniques. Initial results indicate that the algorithm must still be improved in terms of resolution and reliability.
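A minimal sketch of coupling exact ray tracing with a Monte Carlo search is shown below: meridional rays are refracted (vector form of Snell's law) through a singlet whose two spherical curvatures are sampled at random, and the pair giving the smallest RMS spot at a fixed image plane is kept. The glass index, aperture, thickness and image-plane distance are assumptions for this example, not the authors' eye model.

    import numpy as np

    N_GLASS = 1.5168     # BK7-like refractive index (assumption)
    APERTURE = 8.0       # semi-diameter of the incoming ray fan, mm
    THICKNESS = 5.0      # central lens thickness, mm
    IMAGE_Z = 100.0      # fixed image-plane position, mm

    def refract(d, n, n1, n2):
        """Vector Snell refraction of unit direction d at unit normal n (n opposes d)."""
        cos_i = -np.dot(n, d)
        eta = n1 / n2
        cos_t = np.sqrt(1.0 - eta**2 * (1.0 - cos_i**2))
        return eta * d + (eta * cos_i - cos_t) * n

    def intersect_sphere(p, d, vertex_z, R):
        """Intersection of ray (p, d) with the spherical cap of radius R at vertex_z."""
        center = np.array([vertex_z + R, 0.0])
        oc = p - center
        b = np.dot(oc, d)
        t = -b - np.sign(R) * np.sqrt(b**2 - (np.dot(oc, oc) - R**2))
        hit = p + t * d
        normal = (hit - center) / R
        if np.dot(normal, d) > 0:          # make the normal oppose the incident ray
            normal = -normal
        return hit, normal

    def spot_rms(c1, c2):
        """RMS spot radius at IMAGE_Z for parallel rays through the singlet (c = 1/R)."""
        hits = []
        for h in np.linspace(-APERTURE, APERTURE, 21):
            p, d = np.array([-10.0, h]), np.array([1.0, 0.0])
            p, n = intersect_sphere(p, d, 0.0, 1.0 / c1)
            d = refract(d, n, 1.0, N_GLASS)
            p, n = intersect_sphere(p, d, THICKNESS, 1.0 / c2)
            d = refract(d, n, N_GLASS, 1.0)
            hits.append(p[1] + (IMAGE_Z - p[0]) * d[1] / d[0])
        return np.sqrt(np.mean(np.square(hits)))

    # Monte Carlo search over the two curvatures (1/mm)
    rng = np.random.default_rng(7)
    best, best_rms = None, np.inf
    for _ in range(3000):
        c1, c2 = rng.uniform(0.001, 0.02), rng.uniform(-0.02, -0.001)
        rms = spot_rms(c1, c2)
        if rms < best_rms:
            best, best_rms = (c1, c2), rms
    print("best curvatures (1/mm):", best, "RMS spot (mm):", round(best_rms, 4))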
Abstract:
Recent studies have demonstrated that the sheath dynamics in plasma immersion ion implantation (PIII) is significantly affected by an external magnetic field. In this paper, a two-dimensional computer simulation of a magnetic-field-enhanced PIII system is described. A negative bias voltage is applied to a cylindrical target located on the axis of a grounded vacuum chamber filled with uniform molecular nitrogen plasma. A static magnetic field is created by a small coil installed inside the target holder. The vacuum chamber is filled with background nitrogen gas to form a plasma in which collisions of electrons and neutrals are simulated by a Monte Carlo algorithm. It is found that a high-density plasma is formed around the target due to intense ionization of the background gas by the magnetized electrons drifting in the crossed E x B fields. The effect of the magnetic field intensity, the target bias, and the gas pressure on the sheath dynamics and implantation current of the PIII system is investigated.
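As an illustration of the Monte Carlo treatment of electron-neutral collisions mentioned above, the sketch below implements a null-collision step of the kind used in particle-in-cell / Monte Carlo collision (PIC/MCC) simulations: each electron is tested against a constant trial collision frequency, and real elastic or ionizing collisions are then selected in proportion to their cross sections. The cross-section shapes, gas density and energy-sharing rule are crude placeholders, not the data used in the paper.

    import numpy as np

    M_E = 9.109e-31          # electron mass, kg
    Q_E = 1.602e-19          # elementary charge, C
    N_GAS = 3.3e20           # neutral density, m^-3 (low-pressure nitrogen, illustrative)
    DT = 1e-10               # time step, s

    def sigma_elastic(energy_ev):
        return 1e-19 * np.ones_like(energy_ev)            # flat placeholder, m^2

    def sigma_ionization(energy_ev):
        return np.where(energy_ev > 15.6, 2e-20, 0.0)     # threshold-step placeholder, m^2

    def random_unit_vector(rng):
        v = rng.normal(size=3)
        return v / np.linalg.norm(v)

    def mcc_step(velocities, rng):
        """One null-collision MCC step; returns updated velocities and ionization count."""
        speed = np.linalg.norm(velocities, axis=1)
        energy_ev = 0.5 * M_E * speed**2 / Q_E
        sig_el, sig_io = sigma_elastic(energy_ev), sigma_ionization(energy_ev)
        nu_max = (N_GAS * (sig_el + sig_io) * speed).max()    # constant trial frequency
        p_collide = 1.0 - np.exp(-nu_max * DT)

        candidates = np.where(rng.random(len(velocities)) < p_collide)[0]
        ionizations = 0
        for i in candidates:
            nu_el = N_GAS * sig_el[i] * speed[i]
            nu_io = N_GAS * sig_io[i] * speed[i]
            u = rng.random() * nu_max
            if u < nu_el:                        # elastic: isotropic redirection
                velocities[i] = speed[i] * random_unit_vector(rng)
            elif u < nu_el + nu_io:              # ionization: crude energy split (placeholder)
                velocities[i] *= np.sqrt(0.5)
                ionizations += 1
            # else: null collision, nothing happens
        return velocities, ionizations

    # Example: 10000 electrons with 20 eV isotropic velocities
    rng = np.random.default_rng(3)
    speed0 = np.sqrt(2 * 20.0 * Q_E / M_E)
    v = speed0 * np.array([random_unit_vector(rng) for _ in range(10000)])
    v, n_ion = mcc_step(v, rng)
    print("ionization events in one step:", n_ion)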