974 results for radiotherapy treatments, Monte Carlo techniques


Relevância:

100.00%

Publicador:

Resumo:

The application of Bayes' Theorem to signal processing provides a consistent framework for proceeding from prior knowledge to a posterior inference conditioned on both the prior knowledge and the observed signal data. The first part of the lecture will illustrate how the Bayesian methodology can be applied to a variety of signal processing problems. The second part of the lecture will introduce the concept of Markov chain Monte Carlo (MCMC) methods, which are an effective approach to overcoming many of the analytical and computational problems inherent in statistical inference. Such techniques are at the centre of the rapidly developing area of Bayesian signal processing which, with the continual increase in available computational power, is likely to provide the underlying framework for most signal processing applications.
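
As a minimal sketch of the MCMC idea mentioned above (not drawn from the lecture itself), the following random-walk Metropolis sampler infers a constant signal amplitude from noisy observations under an assumed Gaussian likelihood and prior; all names and values are illustrative.

```python
# Minimal random-walk Metropolis sampler for a toy signal-processing posterior:
# estimate a constant signal amplitude a from noisy samples y = a + noise.
# Illustrative only; the lecture's examples are not specified in the abstract.
import numpy as np

rng = np.random.default_rng(0)
y = 2.0 + 0.5 * rng.standard_normal(200)      # synthetic observations, true a = 2.0
sigma, prior_mu, prior_sd = 0.5, 0.0, 10.0    # assumed noise level and Gaussian prior

def log_post(a):
    log_lik = -0.5 * np.sum((y - a) ** 2) / sigma ** 2
    log_prior = -0.5 * (a - prior_mu) ** 2 / prior_sd ** 2
    return log_lik + log_prior

samples, a = [], 0.0
for _ in range(5000):
    prop = a + 0.1 * rng.standard_normal()    # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop                               # accept the proposed value
    samples.append(a)

print("posterior mean ~", np.mean(samples[1000:]))  # discard burn-in
```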

Relevância:

100.00%

Publicador:

Resumo:

We present the Gaussian Process Density Sampler (GPDS), an exchangeable generative model for use in nonparametric Bayesian density estimation. Samples drawn from the GPDS are consistent with exact, independent samples from a fixed density function that is a transformation of a function drawn from a Gaussian process prior. Our formulation allows us to infer an unknown density from data using Markov chain Monte Carlo, which gives samples from the posterior distribution over density functions and from the predictive distribution on data space. We can also infer the hyperparameters of the Gaussian process. We compare this density modeling technique to several existing techniques on a toy problem and a skull-reconstruction task.
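
A toy, discretized illustration of the core idea (a density obtained by transforming a draw from a Gaussian process and modulating a fixed base density) is sketched below. The grid approximation, kernel parameters, and base density are assumptions for illustration; the actual GPDS draws exact samples without a grid.

```python
# Toy illustration of a density defined by transforming a Gaussian-process draw,
# evaluated on a 1-D grid (the GPDS itself uses exact sampling, not a grid).
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)

# Squared-exponential GP covariance (lengthscale and variance are assumed values).
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.5 ** 2) + 1e-8 * np.eye(x.size)
g = np.linalg.cholesky(K) @ rng.standard_normal(x.size)    # draw g ~ GP(0, K)

base = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)          # fixed base density
unnorm = base / (1.0 + np.exp(-g))                          # modulate by a sigmoid of g
density = unnorm / np.trapz(unnorm, x)                      # normalize on the grid

samples = rng.choice(x, size=1000, p=density / density.sum())  # approximate draws
print(samples[:5])
```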

Relevância:

100.00%

Publicador:

Resumo:

This paper provides a direct comparison of two stochastic optimisation techniques (Markov Chain Monte Carlo and Sequential Monte Carlo) when applied to the problem of conflict resolution and aircraft trajectory control in air traffic management. The two methods are then also compared with a third established technique, Mixed-Integer Linear Programming, which is also popular in distributed control. © 2011 IFAC.

Relevância:

100.00%

Publicador:

Resumo:

Treatment planning of heavy-ion radiotherapy involves predictive calculation of not only the physical dose but also the biological dose in the patient body. The goal in designing beam-modulating devices for heavy-ion therapy is to achieve uniform biological effects across the spread-out Bragg peak (SOBP). To achieve this, a mathematical model of Bragg peak movement is presented. The parameters of this model were determined with a Monte Carlo method, and a rotating wheel filter was designed based on the velocity of the Bragg peak movement.
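
The abstract does not give the model's equations. As a simplified, physical-dose-only illustration of shaping a flat SOBP from range-shifted pristine Bragg peaks (the paper targets uniform biological effect and uses its own beam model), one might solve for peak weights with non-negative least squares; the peak shape below is a crude analytic stand-in, not the paper's model.

```python
# Toy SOBP flattening: choose weights of range-shifted pristine Bragg peaks so the
# summed physical dose is flat over a target depth interval. Simplified sketch;
# the paper optimises biological dose and uses its own beam model.
import numpy as np
from scipy.optimize import nnls

depth = np.linspace(0, 20, 400)                       # depth in water (cm), assumed grid

def pristine_peak(z, r):
    # crude stand-in for a Bragg curve with range r: low plateau plus a sharp peak
    plateau = 0.3 * (z <= r)
    peak = np.exp(-0.5 * ((z - r) / 0.15) ** 2)
    return plateau + peak

ranges = np.linspace(10, 15, 25)                       # peaks spread over the target
A = np.stack([pristine_peak(depth, r) for r in ranges], axis=1)

target = ((depth >= 10) & (depth <= 15)).astype(float)  # flat unit dose in the SOBP
weights, _ = nnls(A, target)                            # non-negative peak weights

sobp = A @ weights
print("dose ripple in SOBP: %.3f" % np.ptp(sobp[(depth >= 10.5) & (depth <= 14.5)]))
```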

Relevância:

100.00%

Publicador:

Resumo:

In order to study the gas-phase chemical behavior of transactinides, an on-line isothermal chromatography apparatus has been developed and applied to separate short-lived technetium isotopes, in the form of TcO3, from fission products. The fission products from a Cf-252 source were continuously and rapidly transported through a capillary to the isothermal chromatography apparatus using the N-2/KBr gas-jet technique. Volatile oxide molecules were formed at the reaction zone kept at 900 degrees C, since a trace amount of oxygen was present in the N-2 carrier gas. With the newly developed isothermal chromatography apparatus, a selective separation of Tc from fission products was achieved. After isothermal chromatographic separation, Tc-101, Tc-103, Tc-104, Tc-105, Tc-106, Tc-107 and Tc-108 were dominantly observed, together with their Ru daughters, in the gamma spectrum. The chemical yields of the longer-lived isotopes Tc-101, Tc-104 and Tc-105 were about 55-57%, while those of the shorter-lived isotopes Tc-103, Tc-106 and Tc-108 dropped to 25-28%. The adsorption enthalpy of the investigated compounds on quartz surfaces was determined to be -150 +/- 5 kJ/mol by fitting the measured retention curves with a Monte Carlo model. The observed species of technetium oxide is attributed to TcO3, in good agreement with previous experimental results. This demonstrates that the system works properly and can be used to investigate the gas-phase chemical behavior of transactinides.

Relevância:

100.00%

Publicador:

Resumo:

We studied the transport of X-rays in media during radiotherapy and implemented a Monte Carlo based dose calculation in code. The photon transport results were visualised in Matlab, which is convenient for graphics processing. Monte Carlo simulation results for X-rays in homogeneous and inhomogeneous media were compared with measured data and with the results of other Monte Carlo codes, and good agreement was found. The experimental results show that the method achieves fast simulation speed while yielding accurate and intuitive dose calculations, and it has significant guiding value and practical applicability for improving the quality of radiotherapy.
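
A heavily simplified sketch of the Monte Carlo idea for photon dose deposition in a one-dimensional slab is shown below (exponential path-length sampling with a single assumed attenuation coefficient and purely local energy deposition); the paper's code handles the full photon interaction physics, which is not reproduced here.

```python
# Minimal 1-D Monte Carlo photon transport sketch: sample free path lengths from an
# exponential distribution and deposit energy locally. Grossly simplified compared
# with a real radiotherapy dose engine (no Compton/photoelectric physics, no scatter).
import numpy as np

rng = np.random.default_rng(2)
mu = 0.2                      # assumed total attenuation coefficient (1/cm)
depth_edges = np.linspace(0, 30, 61)
dose = np.zeros(depth_edges.size - 1)

for _ in range(100_000):      # photon histories
    z, energy = 0.0, 1.0
    while energy > 0.01:
        z += rng.exponential(1.0 / mu)          # distance to next interaction
        if z >= depth_edges[-1]:
            break
        deposited = 0.3 * energy                # deposit a fixed fraction per interaction
        bin_idx = np.searchsorted(depth_edges, z) - 1
        dose[bin_idx] += deposited
        energy -= deposited

print("depth of maximum deposition:", depth_edges[np.argmax(dose)])
```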

Relevância:

100.00%

Publicador:

Resumo:

Geological fluids exist in every geosphere of the Earth and play important roles in many processes of material transformation, energy exchange and geochemical interaction. The study of the physicochemical properties and geochemical behaviors of geological fluids has turned out to be one of the challenging issues in the geosciences. Compared with conventional approaches of experiments and semi-theoretical modeling, computer simulation at the molecular level shows its advantages for quantitative prediction of the physicochemical properties of geological fluids under extreme conditions and has emerged as a promising approach to characterise geological fluids and their interactions in the different geospheres of the Earth's interior. This dissertation systematically discusses the physicochemical properties of typical geological fluids with state-of-the-art computer simulation techniques. The main results can be summarized as follows: (1) The experimental phase behaviors of the CH4-C2H6 and CO2 systems have been successfully reproduced with Monte Carlo simulations. (2) Through comprehensive isothermal-isobaric molecular dynamics simulations, the PVT data of water have been extended beyond the experimental range to about 2000 K and 20 GPa, and an improved equation of state for water has been established. (3) Based on extensive computer simulations, an optimized molecular potential for carbon dioxide has been proposed; this model is expected to predict different properties of carbon dioxide (volumetric properties, phase equilibria, heat of vaporization, structural and dynamical properties) with improved accuracy. (4) On the basis of the above work on the end-members, a set of parameters for unlike interactions has been proposed by non-linear fitting to the ab initio potential surface of CO2-H2O; it is superior to the commonly used mixing rules and the results of prior workers, and a number of simulations of the mixture have been carried out to generate data at high temperatures and pressures as an important complement to the limited experiments. (5) With molecular dynamics simulations, various structural, dynamical and thermodynamic properties of ionic solvation and association have been comprehensively analysed; these results not only agree well with experimental data and first-principles calculation results, but also reveal new insights into the microscopic ionic solvation and association processes.
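
None of the dissertation's potentials or ensembles are given in the abstract; as a generic sketch of the Monte Carlo machinery involved, the following canonical (NVT) Metropolis simulation of a small Lennard-Jones fluid in reduced units illustrates the move/accept loop only, not the dedicated H2O and CO2 models or the isothermal-isobaric simulations described above.

```python
# Minimal NVT Metropolis Monte Carlo for a small Lennard-Jones fluid (reduced units).
# Generic sketch of the simulation machinery; the dissertation uses dedicated
# potentials for H2O and CO2 and isothermal-isobaric ensembles, not this toy model.
import numpy as np

rng = np.random.default_rng(3)
n, rho, T = 64, 0.5, 1.5                    # particles, reduced density and temperature
L = (n / rho) ** (1 / 3)                    # cubic box length
pos = rng.uniform(0, L, size=(n, 3))

def total_energy(p):
    e = 0.0
    for i in range(n - 1):
        d = p[i + 1:] - p[i]
        d -= L * np.round(d / L)            # minimum-image convention
        r2 = np.sum(d * d, axis=1)
        inv6 = 1.0 / r2 ** 3
        e += np.sum(4.0 * (inv6 ** 2 - inv6))
    return e

E = total_energy(pos)
for step in range(2000):
    i = rng.integers(n)
    trial = pos.copy()
    trial[i] = (trial[i] + rng.uniform(-0.1, 0.1, 3)) % L
    dE = total_energy(trial) - E
    if dE < 0 or rng.uniform() < np.exp(-dE / T):
        pos, E = trial, E + dE              # accept the trial move
print("final energy per particle:", E / n)
```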

Relevância:

100.00%

Publicador:

Resumo:

C.R. Bull and R. Zwiggelaar, 'Discrimination between low atomic number materials from their characteristic scattering of X-ray radiation', Journal of Agricultural Engineering Research 68 (2), 77-87 (1997)

Relevância:

100.00%

Publicador:

Resumo:

High-permittivity ("high-k") dielectric materials are used in the transistor gate stack in integrated circuits. As the thickness of the silicon oxide dielectric is reduced below 2 nm with continued downscaling, the leakage current due to tunnelling increases, leading to high power consumption and reduced device reliability. Hence, research concentrates on finding materials with a high dielectric constant that can be easily integrated into a manufacturing process and show the desired properties as a thin film. Atomic layer deposition (ALD) is used in practice to deposit high-k materials like HfO2, ZrO2, and Al2O3 as gate oxides. ALD is a technique for producing conformal layers of material with nanometer-scale thickness, used commercially in non-planar electronics and increasingly in other areas of science and technology. ALD is a type of chemical vapor deposition that depends on self-limiting surface chemistry. In ALD, gaseous precursors are admitted individually into the reactor chamber in alternating pulses. Between each pulse, inert gas is admitted to prevent gas-phase reactions. This thesis provides a profound understanding of the ALD of oxides such as HfO2, showing how the chemistry affects the properties of the deposited film. Using multi-scale modelling of ALD, the kinetics of reactions at the growing surface is connected to experimental data. In this thesis, we use the density functional theory (DFT) method to simulate more realistic models for the growth of HfO2 from Hf(N(CH3)2)4/H2O and HfCl4/H2O and of Al2O3 from Al(CH3)3/H2O. Three major breakthroughs are reported. First, a new reaction pathway, 'multiple proton diffusion', is proposed for the growth of HfO2 from Hf(N(CH3)2)4/H2O [1]. As a second major breakthrough, a 'cooperative' action between adsorbed precursors is shown to play an important role in ALD. By this we mean that previously inert fragments can become reactive once sufficient molecules adsorb in their neighbourhood during either precursor pulse. As a third breakthrough, the ALD of HfO2 from Hf(N(CH3)2)4 and H2O is implemented for the first time in a 3D on-lattice kinetic Monte Carlo (KMC) model [2]. In this integrated approach (DFT+KMC), retaining the accuracy of the atomistic model in the higher-scale model leads to remarkable advances in our understanding. The resulting atomistic model allows direct comparison with experimental techniques such as X-ray photoelectron spectroscopy and quartz crystal microbalance.
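
The abstract does not give the KMC event catalogue or rates. As a generic illustration of an on-lattice kinetic Monte Carlo loop (rate-weighted event selection and exponential waiting times for adsorption and desorption on a 2-D lattice with assumed rates), the following sketch shows the time-stepping machinery only, not the thesis's DFT-derived HfO2 chemistry.

```python
# Generic on-lattice kinetic Monte Carlo sketch: adsorption and desorption on a 2-D
# lattice with assumed per-site rates. Illustrates the KMC event loop only; the
# thesis's HfO2 ALD model uses DFT-derived rates and a far richer event catalogue.
import numpy as np

rng = np.random.default_rng(4)
N = 20
occupied = np.zeros((N, N), dtype=bool)
k_ads, k_des = 1.0, 0.2                      # assumed per-site rates (arbitrary units)
t = 0.0

for _ in range(5000):
    empty = np.argwhere(~occupied)
    filled = np.argwhere(occupied)
    rates = np.array([k_ads * len(empty), k_des * len(filled)])
    R = rates.sum()
    if R == 0:
        break
    t += rng.exponential(1.0 / R)            # advance time by an exponential waiting time
    if rng.uniform() < rates[0] / R:         # pick an event class in proportion to its rate
        i, j = empty[rng.integers(len(empty))]
        occupied[i, j] = True                # adsorption event
    else:
        i, j = filled[rng.integers(len(filled))]
        occupied[i, j] = False               # desorption event

print(f"coverage {occupied.mean():.2f} after t = {t:.2f}")
```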

Relevância:

100.00%

Publicador:

Resumo:

Transcriptional regulation has been studied intensively in recent decades. One important aspect of this regulation is the interaction between regulatory proteins, such as transcription factors (TF) and nucleosomes, and the genome. Different high-throughput techniques have been invented to map these interactions genome-wide, including ChIP-based methods (ChIP-chip, ChIP-seq, etc.), nuclease digestion methods (DNase-seq, MNase-seq, etc.), and others. However, a single experimental technique often only provides partial and noisy information about the whole picture of protein-DNA interactions. Therefore, the overarching goal of this dissertation is to provide computational developments for jointly modeling different experimental datasets to achieve a holistic inference on the protein-DNA interaction landscape.

We first present a computational framework that can incorporate the protein binding information in MNase-seq data into a thermodynamic model of protein-DNA interaction. We use a correlation-based objective function to model the MNase-seq data and a Markov chain Monte Carlo method to maximize the function. Our results show that the inferred protein-DNA interaction landscape is concordant with the MNase-seq data and provides a mechanistic explanation for the experimentally collected MNase-seq fragments. Our framework is flexible and can easily incorporate other data sources. To demonstrate this flexibility, we use prior distributions to integrate experimentally measured protein concentrations.
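
A toy sketch of the second step described above, maximizing a correlation-based objective with a Metropolis-style Markov chain Monte Carlo search, is given below. The real framework optimises a thermodynamic protein-DNA binding model against MNase-seq coverage; here both the "observed" profile and the parametric model are synthetic placeholders.

```python
# Toy sketch: maximize a correlation-based objective with a Metropolis-style MCMC
# search. Both the observed profile and the two-parameter model are placeholders
# standing in for MNase-seq data and the thermodynamic binding model.
import numpy as np

rng = np.random.default_rng(5)
x = np.arange(500)
observed = np.exp(-0.5 * ((x - 200) / 30.0) ** 2) + 0.05 * rng.standard_normal(500)

def predicted(params):
    center, width = params
    return np.exp(-0.5 * ((x - center) / width) ** 2)

def objective(params):
    return np.corrcoef(observed, predicted(params))[0, 1]   # Pearson correlation

params = np.array([250.0, 50.0])
score, temperature = objective(params), 0.05
for _ in range(3000):
    trial = params + rng.normal(0, [5.0, 2.0])
    if trial[1] <= 1.0:                                      # keep the width sensible
        continue
    s = objective(trial)
    if s > score or rng.uniform() < np.exp((s - score) / temperature):
        params, score = trial, s                             # accept the move

print("fitted center/width:", params, "correlation:", round(score, 3))
```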

We also study the ability of DNase-seq data to position nucleosomes. Traditionally, DNase-seq has only been widely used to identify DNase hypersensitive sites, which tend to be open chromatin regulatory regions devoid of nucleosomes. We reveal for the first time that DNase-seq datasets also contain substantial information about nucleosome translational positioning, and that existing DNase-seq data can be used to infer nucleosome positions with high accuracy. We develop a Bayes-factor-based nucleosome scoring method to position nucleosomes using DNase-seq data. Our approach utilizes several effective strategies to extract nucleosome positioning signals from the noisy DNase-seq data, including jointly modeling data points across the nucleosome body and explicitly modeling the quadratic and oscillatory DNase I digestion pattern on nucleosomes. We show that our DNase-seq-based nucleosome map is highly consistent with previous high-resolution maps. We also show that the oscillatory DNase I digestion pattern is useful in revealing the nucleosome rotational context around TF binding sites.

Finally, we present a state-space model (SSM) for jointly modeling different kinds of genomic data to provide an accurate view of the protein-DNA interaction landscape. We also provide an efficient expectation-maximization algorithm to learn model parameters from data. We first show in simulation studies that the SSM can effectively recover underlying true protein binding configurations. We then apply the SSM to model real genomic data (both DNase-seq and MNase-seq data). Through incrementally increasing the types of genomic data in the SSM, we show that different data types can contribute complementary information for the inference of protein binding landscape and that the most accurate inference comes from modeling all available datasets.

This dissertation provides a foundation for future research by taking a step toward the genome-wide inference of protein-DNA interaction landscape through data integration.

Relevância:

100.00%

Publicador:

Resumo:

This paper discusses the reliability of power electronics modules. The approach taken combines numerical modeling techniques with experimentation and accelerated testing to identify failure modes and mechanisms for the power module structure and, most importantly, the root cause of a potential failure. The paper details results for two types of failure: (i) wire bond fatigue and (ii) substrate delamination. Finite element method modeling techniques have been used to predict the stress distribution within the module structures. A response surface optimisation approach has been employed to enable the optimal design and parameter sensitivity to be determined. The response surface is used by a Monte Carlo method to determine the effects of uncertainty in the design.
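
A sketch of the final step, propagating parameter uncertainty through a response surface with Monte Carlo sampling, is given below. The quadratic surrogate and the parameter distributions are placeholders; in the paper the surface is fitted to finite element results for the module.

```python
# Sketch of Monte Carlo uncertainty propagation through a response-surface surrogate.
# The quadratic surface and parameter distributions below are assumed placeholders,
# not the fitted finite-element response surface from the paper.
import numpy as np

rng = np.random.default_rng(6)

def response_surface(thickness, cte_mismatch):
    # assumed quadratic surrogate for a peak stress measure (arbitrary units)
    return 100 + 40 * cte_mismatch - 25 * thickness + 8 * thickness ** 2

n = 100_000
thickness = rng.normal(1.0, 0.05, n)         # design parameter with manufacturing scatter
cte_mismatch = rng.normal(1.0, 0.10, n)      # material-property uncertainty

stress = response_surface(thickness, cte_mismatch)
print("mean stress: %.1f, std: %.1f, P(stress > 140): %.3f"
      % (stress.mean(), stress.std(), (stress > 140).mean()))
```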

Relevância:

100.00%

Publicador:

Resumo:

We present nine newly observed transits of TrES-3, taken as part of a transit timing program using the RISE instrument on the Liverpool Telescope. A Markov-Chain Monte Carlo analysis was used to determine the planet-star radius ratio and inclination of the system, which were found to be R_p/R_star = 0.1664 (+0.0011/-0.0018) and i = 81.73 (+0.13/-0.04) degrees, respectively, consistent with previous results. The central transit times and uncertainties were also calculated, using a residual-permutation algorithm as an independent check on the errors. A re-analysis of eight previously published TrES-3 light curves was conducted to determine the transit times and uncertainties using consistent techniques. Whilst the transit times were not found to be in agreement with a linear ephemeris, giving chi-squared = 35.07 for 15 degrees of freedom, we interpret this to be the result of systematics in the light curves rather than a real transit timing variation. This is because the light curves that show the largest deviation from a constant period either have relatively little out-of-transit coverage or have clear systematics. A new ephemeris was calculated using the transit times and was found to be T_c(0) = 2454632.62610 +/- 0.00006 HJD and P = 1.3061864 +/- 0.0000005 days. The transit times were then used to place upper mass limits as a function of the period ratio of a potential perturbing planet, showing that our data are sufficiently sensitive to have probed sub-Earth-mass planets in both interior and exterior 2:1 resonances, assuming that the additional planet is in an initially circular orbit.
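
To make the ephemeris-fitting step concrete, a sketch of a weighted linear fit T_c(E) = T_0 + P*E and the associated chi-squared is given below, using synthetic placeholder transit times rather than the TrES-3 measurements.

```python
# Sketch of fitting a linear ephemeris T_c(E) = T0 + P*E to transit mid-times and
# computing chi-squared, as done when testing for transit timing variations.
# The times below are synthetic placeholders, not the TrES-3 measurements.
import numpy as np

epochs = np.arange(0, 17)
period_true, t0_true = 1.3061864, 2454632.62610
times = t0_true + period_true * epochs + np.random.default_rng(7).normal(0, 0.0004, epochs.size)
errors = np.full(epochs.size, 0.0004)

# Weighted least-squares fit of T0 and P.
A = np.vstack([np.ones_like(epochs), epochs]).T
w = 1.0 / errors ** 2
coeffs = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * times))
t0_fit, p_fit = coeffs

residuals = times - (t0_fit + p_fit * epochs)
chi2 = np.sum((residuals / errors) ** 2)
print(f"T0 = {t0_fit:.5f} HJD, P = {p_fit:.7f} d, chi2 = {chi2:.1f} for {epochs.size - 2} dof")
```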

Relevância:

100.00%

Publicador:

Resumo:

Margins are used in radiotherapy to assist in the calculation of planning target volumes. These margins can be determined by analysing the geometric uncertainties inherent to the radiotherapy planning and delivery process. An important part of this process is the study of electronic portal images collected throughout the course of treatment. Set-up uncertainties were determined for prostate radiotherapy treatments at our previous site and at the new purpose-built centre, with margins determined using a number of different methods. In addition, the potential effect of reducing the action level for changing a patient set-up from 5 mm to 3 mm, based on off-line bony anatomy-based portal image analysis, was studied. Margins generated using different methodologies were comparable. It was found that set-up errors were reduced following relocation to the new centre. Although a significant increase in the number of corrections to a patient's set-up was predicted if the action level was reduced from 5 mm to 3 mm, only a minimal reduction in patient set-up uncertainties would be seen as a consequence. Prescriptive geometric uncertainty analysis not only supports the calculation and justification of the margins used clinically to generate planning target volumes, but may also be used to monitor trends in clinical practice or to audit changes introduced by new equipment, technology or practice. Simulations on existing data showed that a 3 mm rather than a 5 mm action level during off-line, bony anatomy-based portal imaging would have had minimal benefit for the patients studied in this work.

Relevância:

100.00%

Publicador:

Resumo:

The validity of load estimates from intermittent, instantaneous grab sampling depends on adequate spatial coverage by monitoring networks and a sampling frequency that reflects the variability in the system under study. Catchments with a flashy hydrology due to surface runoff pose a particular challenge, as intense short-duration rainfall events may account for a significant portion of the total diffuse transfer of pollution from soil to water in any hydrological year. This can also be exacerbated by the presence of strong background pollution signals from point sources during low flows. In this paper, a range of sampling methodologies and load estimation techniques are applied to phosphorus data from such a surface water dominated river system, instrumented at three sub-catchments (ranging from 3 to 5 km2 in area) with near-continuous monitoring stations. Systematic and Monte Carlo approaches were applied to simulate grab sampling using multiple strategies and to calculate an estimated load, L_e, based on established load estimation methods. Comparison with the actual load, L_t, revealed significant average underestimation, of up to 60%, and high variability for all feasible sampling approaches. Further analysis of the time series provides an insight into these observations, revealing peak frequencies and power-law scaling in the distributions of P concentration, discharge and load associated with surface runoff and background transfers. Results indicate that only near-continuous monitoring that reflects the rapid temporal changes in these river systems is adequate for comparative monitoring and evaluation purposes. While the implications of this analysis may be most applicable to small-scale flashy systems, this represents an appropriate scale for evaluating catchment mitigation strategies such as agri-environmental policies for managing diffuse P transfers in complex landscapes.
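
A sketch of the Monte Carlo sub-sampling experiment described above, repeatedly drawing grab samples from a synthetic near-continuous record and comparing a simple estimated load L_e with the true load L_t, is given below; the synthetic record and the averaging estimator are placeholders for the monitored data and the established estimation methods used in the paper.

```python
# Sketch of a Monte Carlo assessment of grab-sampling strategies: build a synthetic
# near-continuous concentration/discharge record with flashy runoff events, then
# repeatedly sub-sample it and compare an estimated load (L_e) with the true load (L_t).
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(0, 365 * 24)                      # hourly record for one year
baseflow = 0.5 + 0.1 * np.sin(2 * np.pi * t / (365 * 24))
events = np.zeros_like(baseflow)
for start in rng.choice(t[:-48], size=40, replace=False):   # 40 short runoff events
    events[start:start + 24] += rng.uniform(2, 10) * np.exp(-np.arange(24) / 6.0)
discharge = baseflow + events
concentration = 0.02 + 0.05 * events            # P concentration rises with runoff

true_load = np.sum(discharge * concentration)   # L_t over the whole record

n_trials, n_samples = 1000, 52                  # e.g. weekly grab sampling
ratios = []
for _ in range(n_trials):
    idx = np.sort(rng.choice(t.size, size=n_samples, replace=False))
    mean_cq = np.mean(concentration[idx] * discharge[idx])
    estimate = mean_cq * t.size                 # a simple averaging estimator for L_e
    ratios.append(estimate / true_load)

print("median L_e/L_t = %.2f" % np.median(ratios))
```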

Relevância:

100.00%

Publicador:

Resumo:

Laser-driven proton and ion acceleration is an area of increasing research interest given the recent development of short-pulse, high-intensity lasers. Several groups have reported experiments aimed at understanding whether a laser-driven beam can be applied for radiobiological purposes, and in each of these the method used to obtain dose and spectral analysis was slightly different. The difficulty with these studies is that the very large instantaneous dose rate is a challenge for commonly used dosimetry techniques, so that other, more sophisticated procedures need to be explored. This paper aims to explain a method for obtaining the energy spectrum and the dose of a laser-driven proton beam irradiating a cell dish used for radiobiology studies. The procedure includes the use of a magnet to achieve charge and energy separation of the laser-driven beam, Gafchromic films to obtain information on dose and, partially, on energy, and a Monte Carlo code to expand the measured data in order to obtain specific details of the proton spectrum at the cells. Two specific correction factors have to be calculated: one to take into account the variation of the dose response of the films as a function of the proton energy, and the other to obtain the dose to the cell layer starting from the dose measured on the films. This method, particularly suited to irradiation delivered in a single laser shot, can be applied in any other radiobiological experiment performed with laser-driven proton beams, with the only condition that the initial proton spectrum has to be at least roughly known. The method was tested in an experiment conducted at Queen's University of Belfast using the TARANIS laser, where the mean energy of the protons crossing the cells was between 0.9 and 5 MeV, the instantaneous dose rate was estimated to be close to 10^9 Gy/s, and doses between 0.8 and 5 Gy were delivered to the cells in a single laser shot. The combination of the applied corrections modified the initial estimate of dose by up to 40%.
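
As a hedged illustration of the two correction factors described above, the sketch below applies an assumed energy-dependent film response and an assumed film-to-cell dose conversion, both averaged over an assumed proton spectrum; the real factors in the paper are derived from Monte Carlo transport of the measured spectrum and are not reproduced here.

```python
# Toy illustration of spectrum-weighted correction factors for radiochromic-film
# proton dosimetry: correct the film reading for an energy-dependent response and
# scale the film dose to dose at the cell layer. The spectrum, response curve, and
# conversion ratio are assumed placeholders, not the paper's Monte Carlo values.
import numpy as np

energy = np.linspace(0.9, 5.0, 50)                        # MeV, range quoted in the abstract
spectrum = np.exp(-energy / 2.0)                           # assumed relative fluence spectrum
spectrum /= np.trapz(spectrum, energy)

film_response = 1.0 - 0.15 * np.exp(-(energy - 0.9))       # assumed relative film efficiency
cell_to_film_dose_ratio = 1.0 + 0.10 * np.exp(-(energy - 0.9))  # assumed dose conversion

# Factor 1: correct the film's energy-dependent response averaged over the spectrum.
k_response = 1.0 / np.trapz(spectrum * film_response, energy)
# Factor 2: convert the corrected film dose to dose in the cell layer.
k_cell = np.trapz(spectrum * cell_to_film_dose_ratio, energy)

film_reading_gy = 2.0                                       # example measured film dose
dose_to_cells = film_reading_gy * k_response * k_cell
print(f"corrected dose to cells ~ {dose_to_cells:.2f} Gy")
```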