974 results for Particle Markov chain Monte Carlo


Relevance:

100.00%

Publisher:

Abstract:

The conformational transition from coil to extended coil for polygalacturonic acid has been studied by conductometric titrations and Monte Carlo simulations. The results of conductometric titrations at different polymer concentrations have been analyzed using the model proposed by Manning [1], which describes the conductivity of polyelectrolyte solutions. This experimental approach provides the transport factor and the average distance between charged groups at different degrees of ionization (α). The mean distances between charged groups have been compared with the values obtained by Monte Carlo simulations. In these simulations the polymer chain is modeled as a self-avoiding random walk on a cubic lattice, and the monomers interact through the unscreened Coulomb potential. The ratio between the end-to-end distance and the number of ionized beads provides the average distance between charged monomers. The experimental and theoretical values are in good agreement over the whole range of ionization degrees accessible to conductometric titrations. These results suggest that electrostatic interactions are the major contribution to the coil to extended coil conformational change. The small deviations for α ≤ 0.5 suggest that the stiffness of the chain, associated with local interactions, becomes increasingly significant as the fraction of charged groups decreases. © 2000 American Chemical Society.
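The lattice model described in this abstract can be sketched as a minimal Monte Carlo: a self-avoiding walk on a cubic lattice whose ionized beads repel through an unscreened Coulomb potential in reduced units. The chain length, ionization pattern, and regrowth proposal below are illustrative assumptions; in particular, the simple growth procedure ignores the Rosenbluth bias a production simulation would correct for.

```python
import random, math

def grow_saw(n, max_tries=1000):
    """Grow an n-bead self-avoiding walk on a cubic lattice (restart on dead ends)."""
    steps = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    for _ in range(max_tries):
        walk = [(0, 0, 0)]
        occupied = {(0, 0, 0)}
        ok = True
        for _ in range(n - 1):
            random.shuffle(steps)
            for dx, dy, dz in steps:
                x, y, z = walk[-1]
                nxt = (x + dx, y + dy, z + dz)
                if nxt not in occupied:
                    walk.append(nxt)
                    occupied.add(nxt)
                    break
            else:
                ok = False
                break
        if ok:
            return walk
    raise RuntimeError("could not grow SAW")

def coulomb_energy(walk, charged):
    """Unscreened Coulomb energy (reduced units) between ionized beads."""
    e = 0.0
    for i in range(len(charged)):
        for j in range(i + 1, len(charged)):
            e += 1.0 / math.dist(walk[charged[i]], walk[charged[j]])
    return e

def mc_average_charge_spacing(n=30, alpha=0.5, sweeps=200, beta=1.0, seed=1):
    """Metropolis MC over regrown chains; returns <end-to-end distance / n_charged>,
    the quantity compared against the conductometric estimate in the abstract."""
    random.seed(seed)
    charged = list(range(0, n, max(1, round(1 / alpha))))  # evenly spaced ionized beads
    walk = grow_saw(n)
    e = coulomb_energy(walk, charged)
    samples = []
    for _ in range(sweeps):
        trial = grow_saw(n)                     # independence proposal (illustrative)
        et = coulomb_energy(trial, charged)
        if et <= e or random.random() < math.exp(-beta * (et - e)):
            walk, e = trial, et
        samples.append(math.dist(walk[0], walk[-1]) / len(charged))
    return sum(samples) / len(samples)
```

At high α the Coulomb repulsion favors extended regrown conformations, qualitatively reproducing the coil-to-extended-coil trend.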


Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)


Abstract:

This work deals with structure formation in poor solvent in one- and two-component polymer brushes, in which polymer chains are anchored to a substrate by grafting. Such systems exhibit lateral structure formation, which gives rise to interesting applications. The polymers are moved by off-lattice (continuum) Monte Carlo simulations based on CBMC algorithms and local monomer displacements. A newly developed variant of the CBMC algorithm allows inner chain segments to be moved, since the existing algorithm does not relax the monomers near the grafting monomer well. Several analysis methods are developed and adapted to study the phase behavior: these include Minkowski measures for analyzing the structure of binary brushes and grafting correlations for analyzing the influence of grafting patterns. In one-component brushes, structure formation occurs only in weakly grafted systems; dense grafting leads to closed brushes without lateral structure. For the gradual transition between the closed and the ruptured brush, a temperature range in which the transition takes place is determined. The influence of the grafting pattern (a perturbation of the formation of long-range order) on the brush configuration is evaluated using the grafting correlations. With irregular grafting, the structures formed are larger than with regular grafting and also more stable against higher temperatures. In binary systems, structures form even at dense grafting. In addition to the parameters temperature, grafting density, and grafting pattern, the composition of the two components comes into play, making further structures possible: at equal fractions of the two components, striped, lamellar patterns form; at unequal fractions, the minority component forms clusters embedded in the majority component.
Even in regularly grafted systems, no long-range order develops. In binary brushes, too, the grafting pattern strongly influences structure formation. Irregular grafting patterns lead to separation of the components already at higher temperatures, but the structures formed are less regular and somewhat larger than in regularly grafted systems. In contrast to self-consistent field theory, the simulations account for fluctuations in the grafting and therefore agree better with experiment.


Abstract:

Supernovae are among the most energetic events occurring in the universe and are so far the only verified extrasolar source of neutrinos. As the explosion mechanism is still not well understood, recording a burst of neutrinos from such a stellar explosion would be an important benchmark for particle physics as well as for core collapse models. The neutrino telescope IceCube is located at the geographic South Pole and monitors the Antarctic glacier for Cherenkov photons. Even though it was conceived for the detection of high-energy neutrinos, it is capable of identifying a burst of low-energy neutrinos ejected from a supernova in the Milky Way by exploiting the low photomultiplier noise in the Antarctic ice and extracting a collective rate increase. A signal Monte Carlo simulation specifically developed for water Cherenkov telescopes is presented. With its help, we investigate how well IceCube can distinguish between core collapse models and oscillation scenarios. In the second part, nine years of data taken with the IceCube precursor AMANDA are analyzed. Intensive data cleaning methods are presented along with a background simulation. From the result, an upper limit on the expected occurrence of supernovae within the Milky Way is determined.
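The "collective rate increase" idea can be illustrated with a toy significance estimate: sum the hit counts of all optical modules in a time window and compare against the Poisson expectation from noise alone. The module count, noise rate, window, and per-module signal below are illustrative stand-ins, not the detector's actual parameters.

```python
import math, random

def detection_significance(counts, n_modules, noise_rate, window):
    """Gaussian significance of a collective rate increase: total observed hits
    versus the Poisson noise expectation summed over all modules."""
    expected = n_modules * noise_rate * window
    observed = sum(counts)
    return (observed - expected) / math.sqrt(expected)

# Illustrative numbers (assumed, not actual detector parameters)
N, RATE, WIN = 5160, 500.0, 0.5        # modules, noise rate in Hz, window in s
SIGNAL = 20                            # assumed extra hits per module from the burst

random.seed(0)
mu = RATE * WIN
# Gaussian approximation to per-module Poisson noise, plus the burst signal
counts = [random.gauss(mu, math.sqrt(mu)) + SIGNAL for _ in range(N)]
z = detection_significance(counts, N, RATE, WIN)
```

Because the signal adds coherently across thousands of modules while the noise fluctuations grow only as the square root of the total count, even a small per-module excess yields a large collective significance.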


Abstract:

This article presents the implementation and validation of a dose calculation approach for deforming anatomical objects. Deformation is represented by deformation vector fields, leading to deformed voxel grids that represent the different deformation scenarios. Particle transport in the resulting deformed voxels is handled by approximating the voxel surfaces with triangles in the geometry implementation of the Swiss Monte Carlo Plan framework. The focus lies on the validation methodology, which uses computational phantoms representing the same physical object through regular and irregular voxel grids. These phantoms are chosen such that the new implementation for a deformed voxel grid can be compared directly with an established dose calculation algorithm for regular grids. Furthermore, the two aspects of the implementation, the voxel geometry and the density changes resulting from deformation, are validated separately through suitable design of the validation phantom. We show that the proposed method yields equivalent results and that the implementation for irregular voxel geometries introduces no statistically significant errors. This enables the use of the presented and validated implementation for further investigations of dose calculation on deforming anatomy.


Abstract:

Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. Automation is therefore needed for use in clinical routine. In addition, because of the long computing times in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework in which appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. To provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen, so that different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of DICOM streams, is used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown.
For these cases, comparisons are performed between MC-calculated dose distributions and those calculated by a pencil beam algorithm or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, modules can be added, keeping the system highly flexible and efficient.


Abstract:

A major barrier to widespread clinical implementation of Monte Carlo dose calculation is the difficulty of characterizing the radiation source within a generalized source model. This work aims to develop a generalized three-component source model (target, primary collimator, flattening filter) for 6- and 18-MV photon beams that matches full phase-space data (PSD). Subsource-by-subsource comparison of dose distributions, using either the source PSD or the source model as input, allows accurate source characterization and has the potential to ease the commissioning procedure, since it indicates which subsource needs to be tuned. This source model is unique in that, compared to previous source models, it retains additional correlations among PS variables, which improves accuracy at nonstandard source-to-surface distances (SSDs). In our study, three-dimensional (3D) dose calculations were performed for SSDs ranging from 50 to 200 cm and for field sizes from 1 x 1 to 30 x 30 cm2, as well as a 10 x 10 cm2 field 5 cm off axis in each direction. The 3D dose distributions, using either the full PSD or the source model as input, were compared in terms of dose difference and distance to agreement. With this model, over 99% of the voxels agreed within +/-1% or 1 mm for the target, within 2% or 2 mm for the primary collimator, and within +/-2.5% or 2 mm for the flattening filter in all cases studied. When the combined source model, including a charged particle source, was compared against the full PSD as input, 99% of the dose voxels agreed within 1% or 1 mm. The accurate and general characterization of each photon source and knowledge of the subsource dose distributions should facilitate source model commissioning by allowing the histogram distributions representing the subsources to be scaled and tuned individually.


Abstract:

Phase space (PS) data for 6 and 15 MV photon beams, produced with the Monte Carlo code GEANT, were used to define several simple photon beam models. To create the PS data, the energy of the electrons hitting the target was tuned until the calculated depth dose data agreed with measurements. The modeling process used the full PS information within the geometrical boundaries of the beam, including all scattered radiation of the accelerator head; scattered radiation outside the boundaries was neglected. Photons and electrons were assumed to be radiated from point sources. Four different models were investigated, which differ in how the energies and locations of beam particles in the output plane are determined. Depth dose curves, profiles, and relative output factors were calculated with these models for six field sizes from 5x5 to 40x40 cm2 and compared to measurements. Model 1 uses a photon energy spectrum independent of location in the PS plane and a constant photon fluence in this plane. Model 2 takes into account the spatial particle fluence distribution in the PS plane. A constant fluence is used again in model 3, but the photon energy spectrum depends on the off-axis position. Model 4, finally, uses both the spatial particle fluence distribution and off-axis-dependent photon energy spectra in the PS plane. Depth dose curves and profiles for field sizes up to 10x10 cm2 were not model sensitive. Good agreement between measured and calculated depth dose curves and profiles for all field sizes was reached with model 4, whereas models 1-3 showed increasing deviations with increasing field size. Large deviations resulted for the profiles of models 2 and 3, because these models either overestimate or underestimate the energy fluence at large off-axis distances. Relative output factors consistent with measurements were obtained only with model 4.
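The difference between models 1 and 4 can be sketched as two photon samplers: model 1 draws every photon from one location-independent spectrum with uniform fluence, while model 4 weights positions by a fluence profile and draws the energy from an off-axis-dependent spectrum. All bins, weights, and spectra below are made-up placeholders; the spectral softening off axis is the assumed trend, not data from the paper.

```python
import random

# Illustrative placeholder data: softer spectrum farther off axis (assumed)
RADII = [0.0, 5.0, 10.0, 15.0]      # cm, off-axis bin centres
FLUENCE = [1.0, 0.9, 0.7, 0.4]      # relative photon fluence per bin
SPECTRA = [
    {"energies": [1.0, 3.0, 6.0], "weights": [0.2, 0.3, 0.5]},  # on axis
    {"energies": [1.0, 3.0, 6.0], "weights": [0.3, 0.4, 0.3]},
    {"energies": [1.0, 3.0, 6.0], "weights": [0.4, 0.4, 0.2]},
    {"energies": [1.0, 3.0, 6.0], "weights": [0.5, 0.4, 0.1]},  # far off axis
]

def sample_model1(rng):
    """Model 1: constant fluence, single location-independent spectrum."""
    r = rng.choice(RADII)
    e = rng.choices(SPECTRA[0]["energies"], weights=SPECTRA[0]["weights"])[0]
    return r, e

def sample_model4(rng):
    """Model 4: fluence-weighted position, off-axis-dependent spectrum."""
    i = rng.choices(range(len(RADII)), weights=FLUENCE)[0]
    e = rng.choices(SPECTRA[i]["energies"], weights=SPECTRA[i]["weights"])[0]
    return RADII[i], e

def mean_energy(sampler, n=20000, seed=0):
    """Average sampled photon energy, to compare the two models."""
    rng = random.Random(seed)
    return sum(sampler(rng)[1] for _ in range(n)) / n
```

With these placeholder spectra, model 4 yields a lower mean energy than model 1, since it puts weight on the softer off-axis spectra that model 1 ignores, which is exactly the kind of discrepancy that shows up in large-field profiles.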


Abstract:

Phase-sensitive X-ray imaging shows a high sensitivity to electron density variations, making it well suited for imaging soft tissue. However, there are still open questions about the details of the image formation process. Here, a framework for numerical simulations of phase-sensitive X-ray imaging is presented that takes both the particle- and the wave-like properties of X-rays into account: in a split approach, a Monte Carlo (MC) based sample part is combined with a wave-optics based propagation part. The framework can be adapted to different phase-sensitive imaging methods and has been validated through comparisons with experiments for grating interferometry and propagation-based imaging. The validation shows that the combination of wave optics and MC has been successfully implemented and yields good agreement between measurements and simulations. This demonstrates that the physical processes relevant for developing a deeper understanding of scattering in the context of phase-sensitive imaging are modelled sufficiently accurately. The framework can thus be used to simulate phase-sensitive X-ray imaging methods such as grating interferometry and propagation-based imaging.
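The wave-optics half of such a split scheme typically amounts to free-space Fresnel propagation of the complex wavefield leaving the (MC-simulated) sample. A minimal transfer-function implementation, a generic sketch rather than the paper's actual code, might look like:

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Free-space Fresnel propagation of a complex wavefield over distance z
    using the transfer-function (angular spectrum) approach.
    field: square complex array; dx: pixel size in metres; z: distance in metres.
    The constant global phase factor exp(ikz) is omitted."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                    # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))  # Fresnel kernel
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the kernel is unimodular, the propagator conserves total intensity exactly; in propagation-based imaging the phase shifts imprinted by the sample turn into measurable intensity contrast after such a propagation step.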


Abstract:

Over the last years, interest in proton radiotherapy has been rapidly increasing. Protons provide superior physical properties compared with conventional radiotherapy using photons: these properties result in depth dose curves with a large dose peak at the end of the proton track, and the finite proton range allows sparing of the distally located healthy tissue. This offers increased flexibility in proton radiotherapy, but also increases the demand for accurate dose estimation. Accurate dose calculation first requires an accurate and detailed characterization of the physical proton beam exiting the treatment head, for both currently available delivery techniques: scattered and scanned proton beams. Since Monte Carlo (MC) methods follow each particle track and simulate the interactions from first principles, this technique is well suited to modeling the treatment head accurately. Nevertheless, careful validation of these MC models is necessary. While pencil beam algorithms provide fast dose computations, they are limited in accuracy. MC dose calculation algorithms overcome these limitations and, thanks to recent improvements in efficiency, are expected to improve the accuracy of calculated dose distributions and to be introduced into clinical routine in the near future.


Abstract:

We model Callisto's exosphere arising from both its ice and its non-ice surface using a Monte Carlo exosphere model. For the ice component we implement two putative compositions that have been computed from two possible extreme formation scenarios of the satellite. One composition represents the oxidizing state and is based on the assumption that the building blocks of Callisto were formed in the protosolar nebula; the other represents the reducing state of the gas, based on the assumption that the satellite accreted from solids condensed in the Jovian sub-nebula. For the non-ice component we implemented the compositions of typical CI as well as L type chondrites; both chondrite types have been suggested as the best representatives of Callisto's non-ice composition. As release processes we consider surface sublimation, ion sputtering and photon-stimulated desorption. Particles are followed on their individual trajectories until they either escape Callisto's gravitational attraction, return to the surface, are ionized, or are fragmented. Our density profiles show that whereas the sublimated species dominate close to the surface on the sunlit side, their density profiles (with the exception of H and H2) decrease much more rapidly than those of the sputtered particles. The Neutral gas and Ion Mass (NIM) spectrometer, which is part of the Particle Environment Package (PEP), will investigate Callisto's exosphere during the JUICE mission. Our simulations show that NIM will be able to detect sublimated and sputtered particles from both the ice and the non-ice surface, and NIM's measured chemical composition will allow us to distinguish between the different formation scenarios. (C) 2015 Elsevier Inc. All rights reserved.
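A toy version of the particle-following step: launch test particles of a sublimated species with Maxwell-Boltzmann speeds and count the fraction exceeding Callisto's escape speed (about 2.4 km/s). The surface temperature is an assumed value, and the sketch omits trajectory integration, ionization, and fragmentation, which the full model handles; it does, however, show why light species like H linger while heavy ones fall back.

```python
import math, random

G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
M_CALLISTO = 1.0759e23      # kg
R_CALLISTO = 2.4103e6       # m
K_B = 1.380649e-23          # J/K
AMU = 1.66054e-27           # kg

def escape_fraction(mass_amu, temperature_k, n=20000, seed=0):
    """Fraction of test particles launched with Maxwell-Boltzmann speeds at the
    given temperature whose speed exceeds Callisto's surface escape speed."""
    rng = random.Random(seed)
    v_esc = math.sqrt(2 * G * M_CALLISTO / R_CALLISTO)   # ~2.4 km/s
    sigma = math.sqrt(K_B * temperature_k / (mass_amu * AMU))
    escaped = 0
    for _ in range(n):
        # Speed from three independent Gaussian velocity components
        v = math.sqrt(sum(rng.gauss(0, sigma) ** 2 for _ in range(3)))
        if v > v_esc:
            escaped += 1
    return escaped / n
```

At an assumed 150 K, a sizeable fraction of atomic hydrogen exceeds the escape speed while water molecules essentially never do, consistent with the abstract's remark that the H and H2 profiles fall off differently from the other sublimated species.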


Abstract:

The energy and specific energy absorbed in the main cell compartments (nucleus and cytoplasm) in typical radiobiology experiments are usually estimated by calculations, as they are not accessible to direct measurement. In most previous work, the cell geometry is modelled using a combination of simple mathematical volumes. We propose a method based on high resolution confocal imaging and ion beam analysis (IBA) to import realistic cell nucleus geometries into Monte Carlo simulations and thus take into account the variety of geometries encountered in a typical cell population. Seventy-six cell nuclei were imaged using confocal microscopy and their chemical composition was measured using IBA. A cellular phantom was created from these data using the ImageJ image analysis software and imported into the Geant4 Monte Carlo simulation toolkit. Total energy and specific energy distributions in the 76 cell nuclei were calculated for two irradiation protocols: a 3 MeV alpha particle microbeam used for targeted irradiation and a 239Pu alpha source used for large angle random irradiation. Qualitative images of the energy deposited along the particle tracks were produced and show good agreement with images of DNA double strand break signalling proteins obtained experimentally. The methodology presented in this paper provides microdosimetric quantities calculated from realistic cellular volumes and is based on publicly available, open-source software.
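The microdosimetric quantity reported above, specific energy, is simply the energy imparted divided by the mass of the target volume. A sketch of the calculation follows; the nuclear masses and per-traversal alpha deposits are assumed placeholder ranges standing in for the 76 imaged geometries.

```python
import random

MEV_TO_J = 1.602176634e-13  # joules per MeV

def specific_energy(deposits_mev, mass_kg):
    """Specific energy z = energy imparted / mass, in gray (J/kg)."""
    return sum(deposits_mev) * MEV_TO_J / mass_kg

def population_specific_energies(n_cells=76, seed=0):
    """Illustrative population: nuclear masses and alpha-particle energy
    deposits drawn from assumed ranges (placeholders for imaged geometries)."""
    rng = random.Random(seed)
    zs = []
    for _ in range(n_cells):
        mass = rng.uniform(0.5e-13, 2.0e-13)     # kg, assumed nucleus mass
        hits = [rng.uniform(0.5, 3.0)            # MeV deposited per traversal
                for _ in range(rng.randint(1, 5))]
        zs.append(specific_energy(hits, mass))
    return zs
```

Because nuclear mass varies across the population, identical energy deposits yield a broad distribution of specific energies, which is precisely why realistic per-cell geometries matter for microdosimetry.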


Abstract:

Multi-label classification (MLC) is the supervised learning problem where an instance may be associated with multiple labels. Modeling dependencies between labels allows MLC methods to improve their performance at the expense of an increased computational cost. In this paper we focus on the classifier chains (CC) approach for modeling dependencies. On the one hand, the original CC algorithm makes a greedy approximation, and is fast but tends to propagate errors down the chain. On the other hand, a recent Bayes-optimal method improves the performance, but is computationally intractable in practice. Here we present a novel double-Monte Carlo scheme (M2CC), both for finding a good chain sequence and performing efficient inference. The M2CC algorithm remains tractable for high-dimensional data sets and obtains the best overall accuracy, as shown on several real data sets with input dimension as high as 1449 and up to 103 labels.
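The Monte Carlo inference step of a classifier chain can be sketched as follows: draw complete label vectors by sampling each binary label from its chained conditional, then return the most frequently drawn vector as an approximate joint mode. The toy conditionals below stand in for trained probabilistic classifiers; this illustrates the general inference idea, not the paper's exact M2CC algorithm (which also searches over chain orderings).

```python
import random
from collections import Counter

def mc_chain_inference(x, cond_probs, n_samples=500, seed=0):
    """Monte Carlo inference for a classifier chain: sample each binary label
    conditioned on the input and the previously sampled labels, then return
    the most frequently drawn complete label vector (approximate joint mode).
    cond_probs[j](x, prefix) plays the role of the j-th trained classifier."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_samples):
        y = []
        for j in range(len(cond_probs)):
            p = cond_probs[j](x, tuple(y))   # P(y_j = 1 | x, y_1..y_{j-1})
            y.append(1 if rng.random() < p else 0)
        counts[tuple(y)] += 1
    return counts.most_common(1)[0][0]

# Toy chain with strong label dependence: y2 tends to copy y1
probs = [
    lambda x, prev: 0.9 if x > 0 else 0.1,           # P(y1 = 1 | x)
    lambda x, prev: 0.95 if prev[0] == 1 else 0.05,  # P(y2 = 1 | x, y1)
]
```

Unlike the greedy pass of the original CC, which commits to each label in turn and propagates early mistakes, the sampled vectors explore the joint label space, so the returned mode reflects the dependence between labels.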


Abstract:

Multi-dimensional classification (MDC) is the supervised learning problem where an instance is associated with multiple classes, rather than with a single class, as in traditional classification problems. Since these classes are often strongly correlated, modeling the dependencies between them allows MDC methods to improve their performance – at the expense of an increased computational cost. In this paper we focus on the classifier chains (CC) approach for modeling dependencies, one of the most popular and highest-performing methods for multi-label classification (MLC), a particular case of MDC which involves only binary classes (i.e., labels). The original CC algorithm makes a greedy approximation, and is fast but tends to propagate errors along the chain. Here we present novel Monte Carlo schemes, both for finding a good chain sequence and performing efficient inference. Our algorithms remain tractable for high-dimensional data sets and obtain the best predictive performance across several real data sets.