982 results for Gel Dosimetry, Monte Carlo Modelling


Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion that must be evaluated for every type of structure, especially for structures built in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability; by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. Corrosion begins when a threshold chloride concentration is reached at the steel reinforcement bars. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately because of the inherent randomness of the process. Durability can therefore be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride penetration is simulated using Fick's second law of diffusion, which represents the diffusion process while accounting for time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Several examples are studied, and a simplified method is proposed to determine optimal values for the concrete cover.
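As an illustration of the general approach, the closed-form solution of Fick's second law, C(x,t) = Cs(1 − erf(x/(2√(Dt)))), can be inverted for the initiation time and sampled by Monte Carlo. The sketch below uses hypothetical statistics for the cover depth, diffusion coefficient, and surface and critical chloride concentrations; it is not the paper's calibrated model.

```python
import math, random

def erfinv(y, lo=0.0, hi=6.0, tol=1e-12):
    # Bisection inverse of math.erf on [0, 6) for y in [0, 1).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if math.erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def initiation_time(cover, D, Cs, Ccr):
    """Time for the Fick's-law profile C(x,t) = Cs*(1 - erf(x/(2*sqrt(D*t))))
    to reach the critical concentration Ccr at depth `cover` (seconds)."""
    z = erfinv(1.0 - Ccr / Cs)
    return (cover / (2.0 * z)) ** 2 / D

random.seed(1)
years = []
for _ in range(20_000):
    cover = random.gauss(0.05, 0.005)   # concrete cover [m] (hypothetical stats)
    D = random.gauss(5e-12, 1e-12)      # diffusion coefficient [m^2/s]
    Cs = random.gauss(0.9, 0.1)         # surface chloride [% binder mass]
    Ccr = random.gauss(0.45, 0.05)      # critical threshold [% binder mass]
    if min(cover, D, Ccr, Cs - Ccr) <= 0:
        continue                        # discard non-physical draws
    years.append(initiation_time(cover, D, Cs, Ccr) / (365.25 * 24 * 3600))

t_design = 50.0                         # service-life target [years]
pf = sum(t < t_design for t in years) / len(years)
print(f"P(corrosion initiates before {t_design:.0f} yr) = {pf:.3f}")
```

The failure probability is simply the fraction of sampled initiation times that fall short of the design life; FORM would instead linearise the limit state around the design point.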

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents Bayesian solutions to inference problems for three types of social network data structures: a single observation of a social network, repeated observations on the same social network, and repeated observations on a social network developing through time. A social network is conceived as a structure consisting of actors and their social interactions with each other. A common conceptualisation of social networks is to represent the actors by nodes in a graph, with edges between pairs of nodes that are relationally tied to each other according to some definition. Statistical analysis of social networks is to a large extent concerned with modelling these relational ties, which lends itself to empirical evaluation. The first paper deals with a family of statistical models for social networks called exponential random graphs, which takes various structural features of the network into account. In general, the likelihood functions of exponential random graphs are known only up to a constant of proportionality. A procedure for performing Bayesian inference using Markov chain Monte Carlo (MCMC) methods is presented. The algorithm consists of two basic steps: one in which an ordinary Metropolis-Hastings updating step is used, and another in which an importance sampling scheme is used to calculate the acceptance probability of the Metropolis-Hastings step. In the second paper, a method for modelling reports given by actors (or other informants) on their social interactions with others is investigated in a Bayesian framework. The model contains two basic ingredients: the unknown network structure and functions that link this unknown network structure to the reports given by the actors. These functions take the form of probit link functions.
An intrinsic problem is that the model is not identified, meaning that there are combinations of values of the unknown structure and the parameters in the probit link functions that are observationally equivalent. Instead of imposing restrictions to achieve identification, it is proposed that the different observationally equivalent combinations of parameters and unknown structure be investigated a posteriori. Estimation of parameters is carried out using Gibbs sampling with a switching device that enables transitions between posterior modal regions. The main goal of the procedures is to provide tools for comparing different model specifications. Papers 3 and 4 propose Bayesian methods for longitudinal social networks. The premise of the models investigated is that overall change in social networks occurs as a consequence of sequences of incremental changes. Models for the evolution of social networks using continuous-time Markov chains are meant to capture these dynamics. Paper 3 presents an MCMC algorithm for exploring the posteriors of parameters for such Markov chains. More specifically, the unobserved evolution of the network in between observations is explicitly modelled, thereby avoiding the need to deal with explicit formulas for the transition probabilities. This enables likelihood-based parameter inference in a wider class of network evolution models than was previously available. Paper 4 builds on the inference procedure proposed in Paper 3 and demonstrates how to perform model selection for a class of network evolution models.
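Because the exponential random graph likelihood is known only up to its normalising constant, updates that depend only on likelihood ratios are the natural workhorse. The sketch below is a standard Metropolis-Hastings edge-toggle sampler that draws a graph for fixed edge and triangle parameters (illustrative values, not taken from the thesis); the intractable constant cancels in the acceptance ratio.

```python
import itertools, math, random

def delta_logp_on(adj, i, j, theta_edge, theta_tri, n):
    # Change in theta . s(graph) if edge (i, j) is toggled ON:
    # one extra edge plus one triangle per common neighbour of i and j.
    common = sum(1 for k in range(n) if k not in (i, j) and adj[i][k] and adj[j][k])
    return theta_edge + theta_tri * common

def sample_ergm(n=12, theta_edge=-1.5, theta_tri=0.2, steps=20_000, seed=0):
    """Metropolis-Hastings edge-toggle sampler for an edge/triangle ERGM.
    Only likelihood ratios appear, so the normalising constant cancels."""
    rng = random.Random(seed)
    adj = [[0] * n for _ in range(n)]
    pairs = list(itertools.combinations(range(n), 2))
    for _ in range(steps):
        i, j = rng.choice(pairs)
        dlogp = delta_logp_on(adj, i, j, theta_edge, theta_tri, n)
        if adj[i][j]:
            dlogp = -dlogp                                # proposal toggles OFF
        if rng.random() < math.exp(min(0.0, dlogp)):      # MH accept, symmetric proposal
            adj[i][j] = adj[j][i] = 1 - adj[i][j]
    return adj

adj = sample_ergm()
edges = sum(adj[i][j] for i in range(12) for j in range(i + 1, 12))
print("sampled graph has", edges, "edges")
```

Bayesian inference for the parameters themselves is harder precisely because this constant reappears in the posterior ratio, which is what motivates the importance-sampling correction described above.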

Relevance:

100.00%

Publisher:

Abstract:

In this dissertation, homology modelling and molecular dynamics were used to describe the structure and behaviour of proteins in solution. Small-angle X-ray scattering (SAXS) was used to verify the predictions generated by these computational methods. For alpha-haemolysin, a toxin of Staphylococcus aureus that can form a heptameric pore, the monomeric structure of the protein in solution was described for the first time. Homology modelling based on related proteins, whose monomeric structures were known, was used to predict the monomeric structure of the toxin. The flexibility of structural elements observed in a molecular dynamics simulation could be correlated with the functionality of the protein: intrinsic flexibility enables the protein to undergo the conformational change to the pore after assembly. SAXS demonstrated the differences between the monomeric structure and the structures of the related proteins and supports the proposed structure. Moreover, work on a mutant that is arrested in a so-called prepore conformation and is unable to form a pore showed that this transition state lies with its rotational axis perpendicular to the membrane. A geometric analysis proves that it is sterically possible to reach the pore conformation starting from this conformation; an energetic and kinetic analysis of this conformational change remains to be done. A further part of this work deals with the conformational changes of haemocyanins, which were followed experimentally by SAXS. Conformational changes associated with oxygenation were described for the 24-meric haemocyanins of Eurypelma californicum and Pandinus imperator. For a number of haemocyanins it has been shown that they can develop tyrosinase activity under the influence of the agent SDS.
The conformational change of the haemocyanins of E. californicum and P. imperator upon activation to tyrosinase by SDS was confirmed experimentally, and the arrangement of the dodecamers of the haemocyanins was found to be essential for the activation. Together with other work, the relaxation of the structure under the influence of SDS and the steric effect on the linking subunits b & c are therefore considered the probable cause of the activation to tyrosinase. Custom software for so-called rigid-body modelling based on small-angle X-ray scattering data was written in order to structurally interpret the scattering data of the hexameric haemocyanin of Palinurus elephas and Palinurus argus under the influence of the effectors urate and caffeine. The software is the first implementation of a Monte Carlo algorithm for rigid-body modelling. It supports two variants of the algorithm: in combination with simulated annealing, probable conformations can be filtered out, and a subsequent systematic analysis can describe a conformation geometrically. Alternatively, a further, pure Monte Carlo algorithm is able to describe the conformation as a density distribution.
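The rigid-body search combines Monte Carlo moves with simulated annealing. The following is a minimal sketch of such an annealing loop, applied to a toy one-dimensional objective rather than to a real discrepancy between model and scattering data; in practice the state would be the rigid-body positions and orientations.

```python
import math, random

def anneal(energy, x0, step=0.5, t0=2.0, t_min=1e-3, cooling=0.995, seed=0):
    """Generic Metropolis simulated-annealing loop, as used (in far higher
    dimension) to search rigid-body placements against scattering data."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    while t > t_min:
        x_new = x + rng.gauss(0.0, step)
        e_new = energy(x_new)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new < e or rng.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x, best_e

# Toy "discrepancy" with local minima; the global minimum is near x = 3.
energy = lambda x: (x - 3.0) ** 2 + 1.5 * math.sin(5.0 * x)
x_opt, e_opt = anneal(energy, x0=-4.0)
print(f"best x = {x_opt:.2f}, energy {e_opt:.2f}")
```

Early, at high temperature, the walker escapes local minima; as the temperature decays it settles into one basin, which is what lets the annealing variant "filter out" probable conformations.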

Relevance:

100.00%

Publisher:

Abstract:

Core-collapse supernovae are accompanied by a massive burst of low-energy neutrinos. They are among the most energetic phenomena in the universe and are currently the only known source of extrasolar neutrinos. The detection of such a neutrino signature would lead to a deeper understanding of the still insufficiently understood stellar explosion mechanism. Moreover, it would provide new insights into particle physics and supernova modelling. The IceCube neutrino telescope, currently under construction at the geographic South Pole, will be completed in 2011. In its final configuration, IceCube consists of 5160 photomultipliers arranged in a lattice at depths between 1450 m and 2450 m below the ice surface. By detecting Cherenkov photons in the Antarctic glacier, it is able to detect galactic supernovae via a collective rise in the noise rates of its photomultipliers. In this work, several studies on the implementation of an artificial dead time are presented, which would suppress correlated noise and thus maximise the signal-to-background ratio. A further part of this dissertation consisted of integrating the supernova data acquisition into a new experiment-control software. For the analysis part of the work, a Monte Carlo simulation for IceCube was developed, and neutrino oscillation mechanisms and a number of signal models were integrated. A likelihood hypothesis test was used to investigate the distinguishability of different supernova and neutrino oscillation scenarios. Furthermore, it was analysed to what extent shock excitations and the QCD phase transition can be detected in the course of the explosion process.
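A likelihood hypothesis test of the kind described can be sketched with Poisson statistics on binned counting rates. All numbers below (background level, burst shapes, bin count) are illustrative stand-ins, not IceCube values; the point is that the log-likelihood ratio between two signal models on the same pseudo-data discriminates between them.

```python
import math, random

def log_likelihood(counts, expected):
    # Poisson log-likelihood (dropping the data-only log k! term).
    return sum(k * math.log(mu) - mu for k, mu in zip(counts, expected))

rng = random.Random(42)
bins = 50
background = [300.0] * bins                  # noise counts per time bin (toy numbers)
model_a = [background[i] + 40.0 * math.exp(-i / 10.0) for i in range(bins)]  # fast decay
model_b = [background[i] + 25.0 * math.exp(-i / 25.0) for i in range(bins)]  # slower decay

def poisson(mu):
    # Normal approximation to the Poisson, adequate for mu ~ 300 here.
    return max(0, round(rng.gauss(mu, math.sqrt(mu))))

counts = [poisson(mu) for mu in model_a]     # pseudo-data drawn under model A
llr = log_likelihood(counts, model_a) - log_likelihood(counts, model_b)
print(f"log-likelihood ratio (A vs B): {llr:.1f}")
```

Repeating this over many pseudo-experiments gives the distribution of the test statistic under each hypothesis, from which the distinguishability of the scenarios can be quantified.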

Relevance:

100.00%

Publisher:

Abstract:

I present a new experimental method called Total Internal Reflection Fluorescence Cross-Correlation Spectroscopy (TIR-FCCS). It is a method that can probe hydrodynamic flows near solid surfaces on length scales of tens of nanometres. Fluorescent tracers flowing with the liquid are excited by evanescent light, produced by epi-illumination through the periphery of a high-NA oil-immersion objective. Due to the fast decay of the evanescent wave, fluorescence occurs only for tracers within ~100 nm of the surface, resulting in very high normal resolution. The time-resolved fluorescence intensity signals from two laterally shifted (in the flow direction) observation volumes, created by two confocal pinholes, are independently measured and recorded. The cross-correlation of these signals provides important information about the tracers' motion and thus their flow velocity. Due to the high sensitivity of the method, fluorescent species of different sizes, down to single dye molecules, can be used as tracers. The aim of my work was to build an experimental setup for TIR-FCCS and use it to measure the shear rate and slip length of water flowing on hydrophilic and hydrophobic surfaces. However, extracting these parameters from the measured correlation curves requires a quantitative data analysis. This is not a straightforward task, because the complexity of the problem makes it impossible to derive the analytical expressions for the correlation functions needed to fit the experimental data. Therefore, in order to process and interpret the experimental results, I also describe a new numerical method for analysing the acquired auto- and cross-correlation curves: Brownian dynamics techniques are used to produce simulated auto- and cross-correlation functions and to fit the corresponding experimental data.
I show how to combine detailed and fairly realistic theoretical modelling of the phenomena with accurate measurements of the correlation functions, in order to establish a fully quantitative method for retrieving the flow properties from the experiments. An importance-sampling Monte Carlo procedure is employed to fit the experiments. This provides the optimum parameter values together with their statistical error bars. The approach is well suited to both modern desktop PCs and massively parallel computers; the latter allow the data analysis to be completed within short computing times. I applied this method to study the flow of aqueous electrolyte solution near smooth hydrophilic and hydrophobic surfaces. Generally, no slip is expected on a hydrophilic surface, while on a hydrophobic surface some slippage may exist. Our results show that on both hydrophilic and moderately hydrophobic (contact angle ~85°) surfaces the slip length is ~10-15 nm or lower and, within the limitations of the experiments and the model, indistinguishable from zero.
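The core idea, that the cross-correlation of the two laterally shifted signals peaks at the tracers' transit time between the observation volumes, can be sketched on synthetic data. All numbers below (burst rate, noise level, focus separation, sampling interval) are invented for illustration.

```python
import random

def cross_correlate(a, b, max_lag):
    """Unnormalised cross-correlation C(tau) = <a(t) b(t+tau)> for tau >= 0."""
    n = len(a)
    return [sum(a[t] * b[t + lag] for t in range(n - max_lag)) / (n - max_lag)
            for lag in range(max_lag)]

rng = random.Random(7)
n, true_lag = 4000, 25          # samples; transit time between the two foci (in samples)
# Tracer bursts in focus 1; focus 2 sees the same bursts delayed by the transit time.
bursts = [1.0 if rng.random() < 0.01 else 0.0 for _ in range(n + true_lag)]
noise = lambda: 0.05 * rng.gauss(0.0, 1.0)
f1 = [bursts[t] + noise() for t in range(n)]
f2 = [(bursts[t - true_lag] if t >= true_lag else 0.0) + noise() for t in range(n)]

cc = cross_correlate(f1, f2, max_lag=100)
est_lag = max(range(len(cc)), key=cc.__getitem__)
focus_separation = 0.5          # micrometres (hypothetical)
dt = 1e-5                       # sampling interval in seconds (hypothetical)
print(f"transit = {est_lag} samples -> v = {focus_separation / (est_lag * dt):.0f} um/s")
```

In the real experiment the correlation shape also encodes diffusion and the near-wall velocity profile, which is why simulated correlation functions from Brownian dynamics, rather than a simple peak position, are fitted to the data.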

Relevance:

100.00%

Publisher:

Abstract:

Excess adiposity is associated with increased risks of developing adult malignancies. To inform public health policy and guide further research, the incident cancer burden attributable to excess body mass index (BMI ≥ 25 kg/m²) across 30 European countries was estimated. Population attributable risks (PARs) were calculated using European- and gender-specific risk estimates from a published meta-analysis and gender-specific mean BMI estimates from the World Health Organization Global Infobase. Country-specific numbers of new cancers were derived from Globocan 2002. A ten-year lag period between risk exposure and cancer incidence was assumed, and 95% confidence intervals (CI) were estimated in Monte Carlo simulations. In 2002, there were 2,171,351 new all-cancer diagnoses in the 30 countries of Europe. Estimated PARs were 2.5% (95% CI 1.5-3.6%) in men and 4.1% (2.3-5.9%) in women, collectively corresponding to 70,288 (95% CI 40,069-100,668) new cases. Sensitivity analyses revealed that estimates were most influenced by the assumed shape of the BMI distribution in the population and by the cancer-specific risk estimates. In a scenario analysis of a plausible contemporary (2008) population, the estimated PARs increased to 3.2% (2.1-4.3%) in men and 8.6% (5.6-11.5%) in women. Endometrial, post-menopausal breast and colorectal cancers accounted for 65% of these cancers. This analysis quantifies the burden of incident cancers attributable to excess BMI in Europe. The estimates reported here provide a baseline for future modelling and underline the need for research into interventions to control weight in the context of endometrial, breast and colorectal cancer.
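The calculation behind such figures is Levin's population attributable risk formula, PAR = p(RR − 1) / (p(RR − 1) + 1), combined with Monte Carlo sampling of the uncertain relative risk. The sketch below uses hypothetical prevalence and risk inputs, not the paper's country- and site-specific data.

```python
import math, random

def par(prevalence, rr):
    # Levin's population attributable risk for a single exposure level.
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

rng = random.Random(3)
# Hypothetical inputs: 35% prevalence of BMI >= 25, RR 1.10 (95% CI 1.05-1.15);
# a single pooled illustration, not values from the study.
p_excess = 0.35
log_rr = math.log(1.10)
se = (math.log(1.15) - math.log(1.05)) / (2 * 1.96)   # SE recovered from the CI

# Sample RR on the log scale and propagate through Levin's formula.
draws = sorted(par(p_excess, math.exp(rng.gauss(log_rr, se))) for _ in range(10_000))
lo, mid, hi = draws[249], draws[4999], draws[9749]    # 2.5%, 50%, 97.5% quantiles
print(f"PAR = {100*mid:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")
```

Multiplying the PAR by the country-specific number of new cancers then gives the attributable case count, with the CI propagated through the same simulations.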

Relevance:

100.00%

Publisher:

Abstract:

Radiocarbon production, solar activity, total solar irradiance (TSI) and solar-induced climate change are reconstructed for the Holocene (10 to 0 kyr BP), and TSI is predicted for the next centuries. The IntCal09/SHCal04 radiocarbon and ice core CO2 records, reconstructions of the geomagnetic dipole, and instrumental data of solar activity are applied in Bern3D-LPJ, a fully featured Earth system model of intermediate complexity including a 3-D dynamic ocean, ocean sediments, and a dynamic vegetation model, and in formulations linking radiocarbon production, the solar modulation potential, and TSI. Uncertainties are assessed using Monte Carlo simulations and bounding scenarios. Transient climate simulations span the past 21 thousand years, thereby considering the time lags and uncertainties associated with the last glacial termination. Our carbon-cycle-based modern estimate of radiocarbon production of 1.7 atoms cm−2 s−1 is lower than previously reported for the cosmogenic nuclide production model by Masarik and Beer (2009) and is more in line with Kovaltsov et al. (2012). In contrast to earlier studies, periods of high solar activity were quite common not only in recent millennia but throughout the Holocene. Notable deviations from earlier reconstructions are also found on decadal to centennial timescales. We show that earlier Holocene reconstructions that do not account for the interhemispheric gradients in radiocarbon are biased low. Solar activity is higher than the modern average (650 MeV) during 28% of the time, but the absolute values remain weakly constrained due to uncertainties in the normalisation of the solar modulation to instrumental data. A recently published solar activity-TSI relationship yields small changes in Holocene TSI of the order of 1 W m−2, with a Maunder Minimum irradiance reduction of 0.85 ± 0.16 W m−2. Related solar-induced variations in global mean surface air temperature are simulated to be within 0.1 K.
Autoregressive modelling suggests a declining trend of solar activity in the 21st century towards average Holocene conditions.

Relevance:

100.00%

Publisher:

Abstract:

This project assessed the effectiveness of polymer gel dosimeters as tools for measuring the dose deposited by, and the LET of, a proton beam. A total of three BANG® dosimeter formulations were evaluated: BANG®-3-Pro-2 BANGkits™ for dose measurement and two BANG®-3 variants, the LET-Baseline and LET-Meter dosimeters, for LET measurement. All dosimeters were read out using an OCT scanner. The basic characteristics of the BANGkits™ were assessed in a series of photon and electron irradiations. The dose-response relationship was found to be sigmoidal, with a response threshold of approximately 15 cGy. The active region of the dosimeter, the volume in which dosimeter response is not inhibited by oxygen, was found to make up roughly one fourth of the total dosimeter volume. Delivering a dose across multiple fractions was found to yield a greater response than delivering the same dose in a single irradiation. The dosimeter was found to accurately measure a dose distribution produced by overlapping photon fields, yielding gamma pass rates of 95.4% and 93.1% in two planar gamma analyses. Proton irradiations were performed for measurements of proton dose and LET. Initial irradiations performed through the side of a dosimeter led to OCT artifacts; gamma pass rates of 85.7% and 89.9% were observed in two planar gamma analyses. In irradiations performed through the base of a dosimeter, gel response was found to increase with height in the dosimeter, even in areas of constant dose. After a correction was applied, gamma pass rates of 94.6% and 99.3% were observed in two planar gamma analyses. Absolute dose measurements were substantially higher (33%-100%) than the delivered doses for proton irradiations. Issues encountered while calibrating the LET-Meter gel restricted analysis of the LET measurement data to the SOBP of a proton beam. LET-Meter overresponse was found to increase linearly with track-average LET across the range that could be investigated (1.5-3.5 keV/μm).
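The gamma pass rates quoted above come from gamma analysis, which scores each point by its best combined dose-difference and distance-to-agreement match against the reference distribution. A minimal 1-D version on a toy profile, using common 3%/3 mm criteria (the profile and criteria here are illustrative, not the project's data):

```python
import math

def gamma_index(ref, meas, dx, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma analysis. ref/meas: dose profiles on a grid of
    spacing dx (mm); dose_tol is a fraction of the reference maximum."""
    d_max = max(ref)
    gammas = []
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_meas in enumerate(meas):
            dist = (j - i) * dx
            g2 = (dist / dist_tol) ** 2 + ((d_meas - d_ref) / (dose_tol * d_max)) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

# Toy Gaussian profile; the "measurement" is slightly shifted and scaled.
xs = [float(i) for i in range(81)]                      # 1 mm grid
ref = [math.exp(-((x - 40) / 12.0) ** 2) for x in xs]
meas = [1.02 * math.exp(-((x - 41) / 12.0) ** 2) for x in xs]

gammas = gamma_index(ref, meas, dx=1.0)
pass_rate = 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
print(f"gamma pass rate (3%/3 mm): {pass_rate:.1f}%")
```

A point passes when its gamma value is at most 1, i.e. some nearby measured point agrees within the combined tolerance ellipse; the pass rate is the fraction of passing points.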

Relevance:

100.00%

Publisher:

Abstract:

This is the seventeenth of a series of symposia devoted to talks by students about their biochemical engineering research. The first, third, fifth, ninth, twelfth, and sixteenth were at Kansas State University; the second and fourth were at the University of Nebraska-Lincoln; the sixth was in Kansas City and was hosted by Iowa State University; the seventh, tenth, thirteenth, and seventeenth were at Iowa State University; the eighth and fourteenth were at the University of Missouri-Columbia; and the eleventh and fifteenth were at Colorado State University. Next year's symposium will be at the University of Colorado. Symposium proceedings are edited by faculty of the host institution. Because final publication usually takes place elsewhere, papers here are brief and often cover work in progress.

Contents:
The Effect of Polymer Dosage Conditions on the Properties of Protein Polyelectrolyte Precipitates, K. H. Clark and C. E. Glatz, Iowa State University
An Immobilized Enzyme Reactor/Separator for the Hydrolysis of Casein by Subtilisin Carlsberg, A. J. Bream, R. A. Yoshisato, and G. R. Carmichael, University of Iowa
Cell Density Measurements in Hollow Fiber Bioreactors, Thomas Blute, Colorado State University
The Hydrodynamics in an Air-Lift Reactor, Peter Sohn, George Y. Preckshot, and Rakesh K. Bajpai, University of Missouri-Columbia
Local Liquid Velocity Measurements in a Split Cylinder Airlift Column, G. Travis Jones, Kansas State University
Fluidized Bed Solid Substrate Trichoderma reesei Fermentation, S. Adisasmito, M. N. Karim, and R. P. Tengerdy, Colorado State University
The Effect of 2,4-D Concentration on the Growth of Streptanthus tortuosis Cells in Shake Flask and Air-Lift Fermenter Culture, I. C. Kong, R. D. Sjolund, and R. A. Yoshisato, University of Iowa
Protein Engineering of Aspergillus niger Glucoamylase, Michael R. Sierks, Iowa State University
Structured Kinetic Modeling of Hybridoma Growth and Monoclonal Antibody Production in Suspension Cultures, Brian C. Batt and Dhinakar S. Kompala, University of Colorado
Modelling and Control of a Zymomonas mobilis Fermentation, John F. Kramer, M. N. Karim, and J. Linden, Colorado State University
Modeling of Brettanomyces clausenii Fermentation on Mixtures of Glucose and Cellobiose, Max T. Bynum and Dhinakar S. Kompala, University of Colorado, and Karel Grohmann and Charles E. Wyman, Solar Energy Research Institute
Master Equation Modeling and Monte Carlo Simulation of Predator-Prey Interactions, R. O. Fox, Y. Y. Huang, and L. T. Fan, Kansas State University
Kinetics and Equilibria of Condensation Reactions Between Two Different Monosaccharides Catalyzed by Aspergillus niger Glucoamylase, Sabine Pestlin, Iowa State University
Biodegradation of Metalworking Fluids, S. M. Lee, Ayush Gupta, L. E. Erickson, and L. T. Fan, Kansas State University
Redox Potential, Toxicity and Oscillations in Solvent Fermentations, Kim Joong, Rakesh Bajpai, and Eugene L. Iannotti, University of Missouri-Columbia
Using Structured Kinetic Models for Analyzing Instability in Recombinant Bacterial Cultures, William E. Bentley and Dhinakar S. Kompala, University of Colorado

Relevance:

100.00%

Publisher:

Abstract:

A stratigraphy-based chronology for the North Greenland Eemian Ice Drilling (NEEM) ice core has been derived by transferring the annual layer counted Greenland Ice Core Chronology 2005 (GICC05) and its model extension (GICC05modelext) from the NGRIP core to the NEEM core using 787 match points of mainly volcanic origin identified in the electrical conductivity measurement (ECM) and dielectric profiling (DEP) records. Tephra horizons found in both the NEEM and NGRIP ice cores are used to test the matching based on ECM and DEP and provide five additional horizons used for the timescale transfer. A thinning function reflecting the accumulated strain along the core has been determined using a Dansgaard-Johnsen flow model and an isotope-dependent accumulation rate parameterization. Flow parameters are determined from Monte Carlo analysis constrained by the observed depth-age horizons. In order to construct a chronology for the gas phase, the ice age-gas age difference (Delta age) has been reconstructed using a coupled firn densification-heat diffusion model. Temperature and accumulation inputs to the Delta age model, initially derived from the water isotope proxies, have been adjusted to optimize the fit to timing constraints from d15N of nitrogen and high-resolution methane data during the abrupt onset of Greenland interstadials. The ice and gas chronologies and the corresponding thinning function represent the first chronology for the NEEM core, named GICC05modelext-NEEM-1. Based on both the flow and firn modelling results, the accumulation history for the NEEM site has been reconstructed. Together, the timescale and accumulation reconstruction provide the necessary basis for further analysis of the records from NEEM.

Relevance:

100.00%

Publisher:

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs for mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Each of the 1,000 simulated error fields was added to the original DEM, and the result was reclassified using a hydrologically correct bathtub method. The probability of inundation in a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of times out of the 1,000 simulations that a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% probability of exceedance, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach.
The probabilistic approach provides visually intuitive maps that convey uncertainties inherent to spatial data and analysis.
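The counting procedure can be sketched as follows. For brevity, the sketch uses independent elevation errors and a toy DEM, so it illustrates the hydrologically connected bathtub and the probability-of-inundation counting rather than the study's regression-kriging/sequential-Gaussian error model, which simulates spatially correlated fields.

```python
import random

def flood_fill(dem, water_level):
    """Hydrologically connected bathtub: a cell floods only if it is below the
    water level AND connected to the sea (column 0) through flooded cells."""
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    stack = [(r, 0) for r in range(rows) if dem[r][0] <= water_level]
    while stack:
        r, c = stack.pop()
        if 0 <= r < rows and 0 <= c < cols and not flooded[r][c] and dem[r][c] <= water_level:
            flooded[r][c] = True
            stack.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return flooded

rng = random.Random(5)
rows, cols = 20, 30
# Toy DEM: terrain rises inland, with a protective berm; the sea is at column 0.
dem = [[0.1 * c + (0.8 if c == 10 else 0.0) for c in range(cols)] for _ in range(rows)]
water = 1.5        # storm event on top of SLR (illustrative level, metres)
sigma = 0.3        # DEM vertical error std; i.i.d. here for brevity
n_sim = 500
count = [[0] * cols for _ in range(rows)]
for _ in range(n_sim):
    noisy = [[dem[r][c] + rng.gauss(0.0, sigma) for c in range(cols)] for r in range(rows)]
    f = flood_fill(noisy, water)
    for r in range(rows):
        for c in range(cols):
            count[r][c] += f[r][c]

prob = [[count[r][c] / n_sim for c in range(cols)] for r in range(rows)]
area_det = sum(map(sum, flood_fill(dem, water)))
area_p01 = sum(prob[r][c] >= 0.01 for r in range(rows) for c in range(cols))
print(f"deterministic flooded cells: {area_det}, cells with P >= 1%: {area_p01}")
```

The 1%-exceedance footprint is larger than the deterministic one because occasional error realisations breach the berm, which is exactly the kind of connectivity effect that correlated error fields amplify.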

Relevance:

100.00%

Publisher:

Abstract:

We propose a new method for ranking alternatives in multicriteria decision-making problems when there is imprecision concerning the alternative performances, component utility functions and weights. We assume the decision maker's preferences are represented by an additive multiattribute utility function, in which weights can be modeled by independent normal variables, fuzzy numbers, value intervals or by an ordinal relation. The approaches are based on dominance measures or on exploring the weight space in order to describe which ratings would make each alternative the preferred one. On the one hand, the approaches based on dominance measures compute the minimum utility difference among pairs of alternatives and then derive a measure by which to rank the alternatives. On the other hand, the approaches based on exploring the weight space compute confidence factors describing the reliability of the analysis. These methods are compared using Monte Carlo simulation.
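Exploring the weight space can be sketched by drawing weights uniformly from the simplex and counting how often each alternative attains the highest additive utility, a rank-acceptability-style summary. The utility matrix below is invented for illustration and is not from the paper.

```python
import random

def normalised_weights(rng, k):
    # Uniform draw from the weight simplex via normalised exponentials.
    ws = [rng.expovariate(1.0) for _ in range(k)]
    s = sum(ws)
    return [w / s for w in ws]

# Component utilities of 3 alternatives on 3 criteria (rows: alternatives).
utilities = [
    [0.9, 0.4, 0.3],
    [0.5, 0.7, 0.5],
    [0.2, 0.6, 0.9],
]

rng = random.Random(17)
n = 20_000
wins = [0] * len(utilities)
for _ in range(n):
    w = normalised_weights(rng, 3)
    # Additive multiattribute utility for each alternative under this weight draw.
    scores = [sum(wi * ui for wi, ui in zip(w, row)) for row in utilities]
    wins[scores.index(max(scores))] += 1

for i, c in enumerate(wins):
    print(f"alternative {i}: preferred in {100 * c / n:.1f}% of weight draws")
```

Restricting the draws to weights satisfying an ordinal relation, or replacing the uniform draw with the normal or interval models mentioned above, changes only the sampling step; the counting logic is unchanged.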

Relevance:

100.00%

Publisher:

Abstract:

Monte-Carlo (MC) methods are a valuable tool for dosimetry in radiotherapy, including Intra-Operative Electron Radiotherapy (IOERT), since effects such as inhomogeneities or beam hardening may be realistically reproduced.