296 results for Kinetic Monte Carlo


Relevance: 80.00%

Abstract:

Purpose: The purpose of this investigation was to assess the angular dependence of a commercial optically stimulated luminescence dosimeter (OSLD) dosimetry system in MV x-ray beams at depths beyond d_max and to find ways to mitigate this dependence for measurements in phantoms. Methods: Two special holders were designed which allow a dosimeter to be rotated around the center of its sensitive volume. The dosimeter's sensitive volume is a disk, 5 mm in diameter and 0.2 mm thick. The first holder rotates the disk in the traditional way. It positions the disk perpendicular to the beam (gantry pointing to the floor) in the initial position (0°). When the holder is rotated, the angle of the disk towards the beam increases until the disk is parallel with the beam (“edge on,” 90°). This is referred to as Setup 1. The second holder offers a new, alternative measurement position. It positions the disk parallel to the beam for all angles while rotating around its center (Setup 2). Measurements with five to ten dosimeters per point were carried out for 6 MV at 3 and 10 cm depth. Monte Carlo simulations using GEANT4 were performed to simulate the response of the active detector material for several angles. Detector and housing were simulated in detail based on microCT data and communications with the manufacturer. Various material compositions and an all-water geometry were considered. Results: For the traditional Setup 1, the response of the OSLD dropped on average by 1.4% ± 0.7% (measurement) and 2.1% ± 0.3% (Monte Carlo simulation) for the 90° orientation compared to 0°. Monte Carlo simulations also showed a strong dependence of the effect on the composition of the sensitive layer. Assuming the layer to consist entirely of the active material (Al2O3) results in a 7% drop in response for 90° compared to 0°. Assuming the layer to be entirely water results in a flat response within the simulation uncertainty of about 1%.
For the new Setup 2, measurements and Monte Carlo simulations found the angular dependence of the dosimeter to be below 1% and within the measurement uncertainty. Conclusions: The dosimeter system exhibits a small angular dependence of approximately 2%, which needs to be considered for measurements involving beam angles other than normal incidence. This applies in particular to clinical in vivo measurements, where the orientation of the dosimeter is dictated by clinical circumstances and cannot be optimized as otherwise suggested here. When measuring in a phantom, the proposed new setup should be considered. It changes the orientation of the dosimeter so that a coplanar beam arrangement always hits the disk-shaped detector material from the thin side and thereby reduces the angular dependence of the response to within the measurement uncertainty of about 1%. This improvement makes the dosimeter more attractive for clinical measurements with multiple coplanar beams in phantoms, as the overall measurement uncertainty is reduced. Similarly, phantom-based postal audits can transition from the traditional TLD to the more accurate and convenient OSLD.
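A hedged illustration of how a percent drop in response and its uncertainty, as reported above, follow from repeated dosimeter readings; every reading value below is invented:

```python
import math

def response_drop(ref_readings, angled_readings):
    """Percent drop in mean response at an angle relative to the 0 deg
    reference, with a standard error from simple error propagation."""
    n_r, n_a = len(ref_readings), len(angled_readings)
    m_r = sum(ref_readings) / n_r
    m_a = sum(angled_readings) / n_a
    # standard errors of the two means (sample std / sqrt(n))
    se_r = math.sqrt(sum((x - m_r) ** 2 for x in ref_readings) / (n_r - 1) / n_r)
    se_a = math.sqrt(sum((x - m_a) ** 2 for x in angled_readings) / (n_a - 1) / n_a)
    ratio = m_a / m_r
    # relative uncertainties add in quadrature for a ratio
    se_ratio = ratio * math.sqrt((se_r / m_r) ** 2 + (se_a / m_a) ** 2)
    return 100 * (1 - ratio), 100 * se_ratio

# invented readings for five dosimeters at 0 deg and at 90 deg (Setup 1)
drop, err = response_drop([100.2, 99.8, 100.0, 100.1, 99.9],
                          [98.7, 98.5, 98.6, 98.8, 98.4])
print(f"drop = {drop:.1f}% +/- {err:.1f}%")  # ~1.4% drop for this toy data
```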

Relevance: 80.00%

Abstract:

This paper demonstrates the procedures for probabilistic assessment of a pesticide fate and transport model, PCPF-1, to elucidate the modeling uncertainty using the Monte Carlo technique. Sensitivity analyses are performed to investigate the influence of herbicide characteristics and related soil properties on model outputs using four popular rice herbicides: mefenacet, pretilachlor, bensulfuron-methyl and imazosulfuron. Uncertainty quantification showed that the simulated concentrations in paddy water varied more than those in paddy soil. This tendency decreased as the simulation proceeded to a later period but remained important for herbicides having either high solubility or a high first-order dissolution rate. The sensitivity analysis indicated that the PCPF-1 parameters requiring careful determination are primarily those involved with herbicide adsorption (the organic carbon content, the bulk density and the volumetric saturated water content), secondarily those related to herbicide mass distribution between paddy water and soil (first-order desorption and dissolution rates) and lastly, those involving herbicide degradation. © Pesticide Science Society of Japan.
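A minimal sketch of the kind of Monte Carlo uncertainty propagation and rank-correlation sensitivity analysis described, run on an invented two-compartment dissolution/degradation toy rather than PCPF-1 itself; all parameter names and values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 5000

# Toy stand-in for a paddy-water concentration model (NOT PCPF-1):
# mass dissolves from granules at rate k_diss and degrades at rate k_deg, so
# C(t) = M0 * k_diss / (k_deg - k_diss) * (exp(-k_diss t) - exp(-k_deg t)) / V
M0, V, t = 100.0, 2000.0, 3.0                  # g applied, m^3 water, days
k_diss = rng.lognormal(np.log(0.8), 0.4, N)    # 1/day, sampled input
k_deg  = rng.lognormal(np.log(0.1), 0.4, N)    # 1/day, sampled input

conc = M0 * k_diss / (k_deg - k_diss) * (np.exp(-k_diss * t)
                                         - np.exp(-k_deg * t)) / V

cv = conc.std() / conc.mean()                  # output coefficient of variation

def rank_corr(x, y):
    """Spearman rank correlation as a simple sensitivity index."""
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

print(f"mean C = {conc.mean():.4f} g/m^3, CV = {cv:.2f}")
print(f"sensitivity to k_diss: {rank_corr(k_diss, conc):+.2f}")
print(f"sensitivity to k_deg:  {rank_corr(k_deg, conc):+.2f}")
```

Higher degradation rates strictly lower the water concentration, so the rank correlation with k_deg comes out negative, which is the qualitative pattern a sensitivity screen of this kind is meant to expose.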

Relevance: 80.00%

Abstract:

Uncertainty assessments of herbicide losses from rice paddies in Japan associated with local meteorological conditions and water management practices were performed using a pesticide fate and transport model, PCPF-1, under the Monte Carlo (MC) simulation scheme. First, MC simulations were conducted for five different cities with a prescribed water management scenario and a 10-year meteorological dataset for each city. The effectiveness of water management in reducing pesticide runoff was observed. However, a greater potential for pesticide runoff remained in Western Japan. Secondly, an extended analysis was attempted to evaluate the effects of local water management and meteorological conditions between the Chikugo River basin and the Sakura River basin using uncertainty inputs processed from observed water management data. The results showed that, because of more severe rainfall events, significant pesticide runoff occurred in the Chikugo River basin even when appropriate irrigation practices were implemented. © Pesticide Science Society of Japan.

Relevance: 80.00%

Abstract:

BACKGROUND: Monitoring studies revealed high concentrations of pesticides in the drainage canal of paddy fields. It is important to have a way to predict these concentrations in different management scenarios as an assessment tool. A simulation model for predicting the pesticide concentration in a paddy block (PCPF-B) was evaluated and then used to assess the effect of water management practices for controlling pesticide runoff from paddy fields. RESULTS: The PCPF-B model achieved an acceptable performance. The model was applied to a constrained probabilistic approach using the Monte Carlo technique to evaluate the best management practices for reducing runoff of pretilachlor into the canal. The probabilistic model predictions using actual data of pesticide use and hydrological data in the canal showed that the water holding period (WHP) and the excess water storage depth (EWSD) effectively reduced the loss and concentration of pretilachlor from paddy fields to the drainage canal. The WHP also reduced the timespan of pesticide exposure in the drainage canal. CONCLUSIONS: It is recommended that: (1) the WHP be applied for as long as possible, but for at least 7 days, depending on the pesticide and field conditions; (2) an EWSD greater than 2 cm be maintained to store substantial rainfall in order to prevent paddy runoff, especially during the WHP.

Relevance: 80.00%

Abstract:

The contemporary methodology for growth models of organisms is based on continuous trajectories, which hinders the modelling of stepwise growth in crustacean populations. Growth models for fish are normally assumed to follow a continuous function, but a different type of model is needed for crustacean growth. Crustaceans must moult in order to grow. The growth of crustaceans is a discontinuous process due to the periodical shedding of the exoskeleton in moulting. The stepwise growth of crustaceans through the moulting process makes growth estimation more complex. Stochastic approaches can be used to model discontinuous growth, or what are commonly known as "jumps" (Figure 1). However, in a stochastic growth model we must ensure that only positive jumps occur. In view of this, we introduce a subordinator, a special case of a Lévy process. A subordinator is a non-decreasing Lévy process that will assist in modelling crustacean growth for a better understanding of the individual variability and stochasticity in moulting periods and increments. We develop the methods for parameter estimation and illustrate them with a dataset from laboratory experiments. The motivational dataset is from the ornate rock lobster, Panulirus ornatus, which can be found between Australia and Papua New Guinea. Due to the presence of sex effects on growth (Munday et al., 2004), we estimate the growth parameters separately for each sex. Since all hard parts are shed at each moult, exact age determination of a lobster can be challenging. However, the growth parameters for the moult process can be estimated from tank data through: (i) inter-moult periods, and (ii) moult increments. We derive a joint density made up of two functions: one for moult increments and the other for time intervals between moults.
We claim these functions are conditionally independent given pre-moult length and the inter-moult periods; by this Markov property, the parameters in each function can be estimated separately. Subsequently, we integrate both functions through a Monte Carlo method. We can therefore obtain a population mean for crustacean growth (e.g. the red curve in Figure 1).
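The stepwise growth process can be sketched as a compound-Poisson subordinator, a special case of the Lévy subordinator mentioned above, with exponential inter-moult periods and gamma-distributed positive increments; all parameter values are invented, not estimates from the lobster data:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_growth(n_lobsters=200, horizon=365.0,
                    mean_gap=40.0, incr_shape=2.0, incr_scale=1.5,
                    start_length=30.0):
    """Stepwise growth: inter-moult periods ~ Exponential(mean_gap) days,
    moult increments ~ Gamma(shape, scale) mm, i.e. strictly positive jumps
    as required of a subordinator (invented parameter values)."""
    finals = np.empty(n_lobsters)
    for i in range(n_lobsters):
        t, length = 0.0, start_length
        while True:
            t += rng.exponential(mean_gap)            # wait for next moult
            if t > horizon:
                break
            length += rng.gamma(incr_shape, incr_scale)  # positive jump only
        finals[i] = length
    return finals

finals = simulate_growth()
print(f"mean length after 1 year: {finals.mean():.1f} mm (started at 30.0 mm)")
```

Averaging over many simulated individuals gives the kind of population-mean growth curve referred to in the abstract.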

Relevance: 80.00%

Abstract:

A modeling paradigm is proposed for covariate, variance and working correlation structure selection for longitudinal data analysis. Appropriate selection of covariates is pertinent to correct variance modeling and selecting the appropriate covariates and variance function is vital to correlation structure selection. This leads to a stepwise model selection procedure that deploys a combination of different model selection criteria. Although these criteria find a common theoretical root based on approximating the Kullback-Leibler distance, they are designed to address different aspects of model selection and have different merits and limitations. For example, the extended quasi-likelihood information criterion (EQIC) with a covariance penalty performs well for covariate selection even when the working variance function is misspecified, but EQIC contains little information on correlation structures. The proposed model selection strategies are outlined and a Monte Carlo assessment of their finite sample properties is reported. Two longitudinal studies are used for illustration.
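The kind of Monte Carlo assessment of finite-sample selection behaviour described can be sketched with plain AIC in place of EQIC (a deliberate simplification; all data below are simulated, and the covariate names are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

def aic(y, X):
    """AIC for ordinary least squares: n*log(RSS/n) + 2*(p+1)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, p = X.shape
    return n * np.log(rss / n) + 2 * (p + 1)

def one_replicate(n=100):
    """Simulate data where x2 is spurious; return True if AIC prefers the
    true covariate set over the overfitted one."""
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 + 0.5 * x1 + rng.normal(size=n)
    ones = np.ones(n)
    m_true = np.column_stack([ones, x1])
    m_over = np.column_stack([ones, x1, x2])
    return aic(y, m_true) < aic(y, m_over)

rate = np.mean([one_replicate() for _ in range(500)])
print(f"AIC selected the true covariate set in {rate:.0%} of replicates")
```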

Relevance: 80.00%

Abstract:

This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov chain Monte Carlo (MCMC) sampling techniques, and the related label switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via the Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test-based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Zmix can accurately estimate the number of components, posterior parameter estimates and allocation probabilities given a sufficiently large sample size. The results reflect uncertainty in the final model and report the range of possible candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally lightweight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies are included to illustrate Zmix and Zswitch, as well as three case studies from the literature. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
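The prior-shrinkage idea that Zmix relies on (a small Dirichlet concentration parameter drives superfluous components' weights toward zero) can be illustrated directly; this is not the Zmix sampler itself, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def occupied_components(e0, K=10, n=200, reps=200):
    """Average number of mixture components that receive any observations
    when weights are drawn from a symmetric Dirichlet(e0) prior."""
    counts = []
    for _ in range(reps):
        w = rng.dirichlet(np.full(K, e0))   # component weights from the prior
        alloc = rng.multinomial(n, w)       # allocate n points to components
        counts.append(np.count_nonzero(alloc))
    return float(np.mean(counts))

# A small concentration parameter pushes extra components' weights toward
# zero, so in an overfitted mixture the superfluous components end up empty.
sparse = occupied_components(e0=0.05)
diffuse = occupied_components(e0=5.0)
print(f"mean occupied components: e0=0.05 -> {sparse:.1f}, e0=5.0 -> {diffuse:.1f}")
```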

Relevance: 80.00%

Abstract:

Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to perform Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GP) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility and a gene network model.
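A toy sketch of the pseudo-marginal idea behind GIMH, using an invented normal latent-variable model rather than the stochastic volatility or gene network models of the paper: the "intractable" likelihood is replaced by an unbiased importance-sampling estimate, and the current estimate is recycled rather than refreshed between iterations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model: y_i = theta + z_i + e_i with z_i, e_i ~ N(0,1).
theta_true = 1.0
y = theta_true + rng.normal(size=30) + rng.normal(size=30)

def log_lik_hat(theta, n_particles=50):
    """Unbiased importance-sampling estimate of the likelihood (log scale)."""
    z = rng.normal(size=(n_particles, 1))                   # latent draws
    log_w = -0.5 * (y - theta - z) ** 2 - 0.5 * np.log(2 * np.pi)
    m = log_w.max(axis=0)
    log_p = m + np.log(np.mean(np.exp(log_w - m), axis=0))  # stable log-mean-exp
    return log_p.sum()

def log_prior(theta):
    return -0.5 * theta ** 2 / 100.0                        # N(0, 10^2)

# Pseudo-marginal MH: the current likelihood estimate is RECYCLED, which is
# what makes the exact posterior the limiting distribution.
theta, ll = 0.0, log_lik_hat(0.0)
chain = []
for _ in range(3000):
    prop = theta + 0.3 * rng.normal()
    ll_prop = log_lik_hat(prop)
    if np.log(rng.uniform()) < ll_prop + log_prior(prop) - ll - log_prior(theta):
        theta, ll = prop, ll_prop
    chain.append(theta)

post_mean = np.mean(chain[1000:])
print(f"posterior mean {post_mean:.2f} (data mean {y.mean():.2f})")
```

With a weak prior the posterior mean should sit close to the data mean; lowering `n_particles` makes the likelihood estimate noisier and exposes the sticky mixing the abstract describes.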

Relevance: 80.00%

Abstract:

Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model consisting of multiple parameter sets calibrated against experimental data. However, finding such sets within a high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation via the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency via SMC and that it produces similar responses to LHS when making out-of-sample predictions in the presence of a simulated drug block.
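The LHS route to a population of models can be sketched as follows; the two-parameter "model" and the biomarker ranges are invented stand-ins, not the Beeler-Reuter equations:

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, d):
    """n samples in [0,1)^d with one sample per equal-width stratum per dim."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, d))) / n

def biomarkers(g1, g2):
    """Toy excitable-cell stand-in: two 'conductances' map to two outputs."""
    apd = 300.0 * g1 / (0.5 + g2)   # pseudo action-potential duration (ms)
    amp = 120.0 * g2 / (0.2 + g1)   # pseudo upstroke amplitude (mV)
    return apd, amp

n = 2000
u = latin_hypercube(n, 2)
g1 = 0.5 + 1.5 * u[:, 0]            # scale to invented parameter ranges
g2 = 0.5 + 1.5 * u[:, 1]
apd, amp = biomarkers(g1, g2)

# Keep parameter sets whose outputs fall inside invented "experimental"
# ranges: the accepted sets form the population of models.
keep = (200 < apd) & (apd < 320) & (60 < amp) & (amp < 110)
population = np.column_stack([g1[keep], g2[keep]])
print(f"accepted {population.shape[0]} of {n} parameter sets")
```

The inefficiency the paper targets is visible here: most LHS samples are rejected, which is what an SMC sampler that adapts toward the accepted region is designed to avoid.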

Relevance: 80.00%

Abstract:

Fatigue of the steel in rails continues to be of major concern to heavy haul track owners despite careful selection and maintenance of rails. The persistence of fatigue is due in part to the erroneous assumption that the maximum loads on, and stresses in, the rails are predictable. Recent analysis of extensive wheel impact detector data from a number of heavy haul tracks has shown that the most damaging forces are in fact randomly distributed with time and location and can be much greater than generally expected. Large-scale Monte Carlo simulations have been used to identify rail stresses caused by actual, measured distributions of wheel-rail forces on heavy haul tracks. The simulations show that fatigue failure of the rail foot can occur in situations which would be overlooked by traditional analyses. The most serious of these situations are those where track is accessed by multiple operators and in situations where there is a mix of heavy haul, general freight and/or passenger traffic. The least serious are those where the track is carrying single-operator-owned heavy haul unit trains. The paper shows how using the nominal maximum axle load of passing traffic, which is the key issue in traditional analyses, is insufficient and must be augmented with consideration of important operational factors. Ignoring such factors can be costly.
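A minimal sketch of the contrast the paper draws, with invented load, stress and fatigue numbers: a check against the nominal maximum wheel load alone passes, while a Monte Carlo draw from a long-tailed wheel-load distribution with a small fraction of impacting wheels reveals stress exceedances.

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented numbers throughout: nominal static wheel load 150 kN, rail-foot
# stress of 1.2 MPa per kN, fatigue limit 220 MPa.
STRESS_PER_KN = 1.2
FATIGUE_LIMIT = 220.0
nominal_load = 150.0
# A traditional check on the nominal load alone concludes the rail is safe.
assert nominal_load * STRESS_PER_KN < FATIGUE_LIMIT

# But wheel-impact-detector-style data have a long right tail: most wheels
# load near nominal, while a small fraction of defective wheels impact harder.
n = 500_000
good = rng.normal(150.0, 10.0, n)
impact = rng.normal(150.0, 10.0, n) * rng.lognormal(0.0, 0.25, n)
is_impact = rng.uniform(size=n) < 0.02          # 2% defective wheels
loads = np.where(is_impact, impact, good)
stresses = loads * STRESS_PER_KN

p_exceed = np.mean(stresses > FATIGUE_LIMIT)
print(f"P(stress > fatigue limit) = {p_exceed:.2e} per wheel pass")
```

Even a rare-event exceedance probability matters here because a rail sees millions of wheel passes, which is why the simulated tail, not the nominal maximum, drives the fatigue assessment.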

Relevance: 80.00%

Abstract:

Airborne particles, including both ultrafine and supermicrometric particles, contain various carcinogens. Exposure and risk-assessment studies regularly use particle mass concentration as the dosimetry parameter, thereby neglecting the potential impact of ultrafine particles due to their negligible mass compared to supermicrometric particles. The main purpose of this study was the characterization of lung cancer risk due to exposure to polycyclic aromatic hydrocarbons and some heavy metals associated with particle inhalation by Italian non-smokers. A risk-assessment scheme, modified from an existing risk model, was applied to estimate the cancer risk contribution from both ultrafine and supermicrometric particles. Exposure assessment was carried out on the basis of particle number distributions measured in 25 smoke-free microenvironments in Italy. The predicted lung cancer risk was then compared to the cancer incidence rate in Italy to assess the number of lung cancer cases attributed to airborne particle inhalation, which represents one of the main causes of lung cancer apart from smoking. Ultrafine particles are associated with a much higher risk than supermicrometric particles, and the modified risk-assessment scheme provided a more accurate estimate than the conventional scheme. Great attention has to be paid to indoor microenvironments and, in particular, to cooking and eating times, which represent the major contributors to lung cancer incidence in the Italian population. The modified risk-assessment scheme can serve as a tool for assessing environmental quality, as well as for setting exposure standards for particulate matter.
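The aggregation step of such a risk-assessment scheme can be sketched as below: lifetime excess risk from each carcinogen is a time-weighted exposure concentration multiplied by an inhalation unit risk, summed over carcinogens and microenvironments. Every concentration, time fraction and unit-risk value here is a placeholder, not a measured or regulatory number.

```python
# (carcinogen, inhalation unit risk per (ug/m^3)) -- PLACEHOLDER values
unit_risk = {"BaP": 1e-3, "Cr_VI": 1e-2, "Ni": 3e-4}

# time-weighted average concentration (ug/m^3) in two microenvironments,
# and the fraction of a lifetime spent in each -- PLACEHOLDER values
exposure = {
    "cooking": {"hours_frac": 0.06,
                "conc": {"BaP": 0.004, "Cr_VI": 0.0005, "Ni": 0.002}},
    "other":   {"hours_frac": 0.94,
                "conc": {"BaP": 0.0005, "Cr_VI": 0.00005, "Ni": 0.0003}},
}

# excess lifetime cancer risk = sum over microenvironments and carcinogens of
# (time fraction) * (concentration) * (unit risk)
risk = sum(
    env["hours_frac"] * env["conc"][c] * unit_risk[c]
    for env in exposure.values()
    for c in unit_risk
)
print(f"estimated excess lifetime cancer risk: {risk:.1e}")
```

The same structure explains the study's finding about cooking times: a microenvironment with a small time fraction can still dominate the sum if its concentrations are high enough.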