119 results for Monte-Carlo simulations


Relevance:

90.00%

Publisher:

Abstract:

Aim: A new method of penumbral analysis is implemented which allows an unambiguous determination of field size and penumbra size and quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium affect the size and shape of cross-axis profile penumbrae; each is examined in detail. Method: A new method of penumbral analysis is implemented in which the square of the derivative of the cross-axis profile is plotted. The resultant graph displays two peaks in place of the two penumbrae. This gives a clear visualisation of the quality of a field penumbra, as well as a mathematically consistent method of determining field size (the distance between the two peaks' maxima) and penumbra (the full-width-tenth-maximum of each peak). Cross-axis profiles were simulated in a water phantom at a depth of 5 cm using Monte Carlo modelling, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as the traditional definitions set out in IEC 976. The effects of source occlusion and lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively. Results: All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than that from the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes. Conclusion: Traditional methods of calculating field size and penumbra are shown to be mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust amongst other non-standard fields, such as flattening filter free beams. Source occlusion plays a bigger role than lateral electronic disequilibrium in small field penumbra size.
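
A minimal sketch of the proposed analysis, assuming a cross-axis profile sampled on a uniform grid; the illustrative profile, penumbra widths and tenth-maximum threshold below are not taken from the study:

```python
import numpy as np
from scipy.special import erf

def field_size_and_penumbra(x, dose):
    """Estimate field size and penumbra width from a cross-axis profile by
    analysing the square of the profile derivative, as described above."""
    g2 = np.gradient(dose, x) ** 2            # squared derivative: two peaks replace the two penumbrae
    mid = len(x) // 2
    i_left = np.argmax(g2[:mid])              # maximum of the left-hand peak
    i_right = mid + np.argmax(g2[mid:])       # maximum of the right-hand peak
    field_size = x[i_right] - x[i_left]       # field size = distance between the two peak maxima

    def fwtm(indices, peak_index):
        """Full width at tenth maximum of one peak of the squared derivative."""
        level = 0.1 * g2[peak_index]
        above = indices[g2[indices] >= level]
        return x[above[-1]] - x[above[0]]

    left_pen = fwtm(np.arange(0, mid), i_left)
    right_pen = fwtm(np.arange(mid, len(x)), i_right)
    return field_size, left_pen, right_pen

# Illustrative profile: an error-function-like 10 mm field with ~2 mm penumbrae.
x = np.linspace(-20, 20, 2001)                # mm
dose = 0.5 * (erf((x + 5.0) / 1.5) - erf((x - 5.0) / 1.5))
print(field_size_and_penumbra(x, dose))
```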

Relevance:

90.00%

Publisher:

Abstract:

Most of the existing algorithms for approximate Bayesian computation (ABC) assume that it is feasible to simulate pseudo-data from the model at each iteration. However, the computational cost of these simulations can be prohibitive for high dimensional data. An important example is the Potts model, which is commonly used in image analysis. Images encountered in real world applications can have millions of pixels, therefore scalability is a major concern. We apply ABC with a synthetic likelihood to the hidden Potts model with additive Gaussian noise. Using a pre-processing step, we fit a binding function to model the relationship between the model parameters and the synthetic likelihood parameters. Our numerical experiments demonstrate that the precomputed binding function dramatically improves the scalability of ABC, reducing the average runtime required for model fitting from 71 hours to only 7 minutes. We also illustrate the method by estimating the smoothing parameter for remotely sensed satellite imagery. Without precomputation, Bayesian inference is impractical for datasets of that scale.
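
A highly simplified sketch of the precomputation idea, assuming a one-dimensional parameter, a scalar summary statistic, a cubic polynomial binding function and a cheap toy simulator standing in for the expensive Potts-model simulation (all names and numbers are illustrative, not the paper's algorithm):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def simulate_summary(theta):
    """Toy stand-in for an expensive model simulation: returns one summary statistic."""
    return theta ** 2 + 0.1 * rng.standard_normal()

# --- Pre-processing: fit a binding function mapping theta -> (mean, sd) of the summary.
theta_grid = np.linspace(0.0, 2.0, 50)
reps = np.array([[simulate_summary(t) for _ in range(20)] for t in theta_grid])
mean_fit = np.polynomial.Polynomial.fit(theta_grid, reps.mean(axis=1), deg=3)
sd_fit = np.polynomial.Polynomial.fit(theta_grid, reps.std(axis=1), deg=3)

def synthetic_loglik(theta, observed_summary):
    """Gaussian synthetic likelihood evaluated through the binding function:
    no further model simulations are needed at this stage."""
    return norm.logpdf(observed_summary, loc=mean_fit(theta), scale=max(sd_fit(theta), 1e-6))

# --- Random-walk MCMC over theta using only the precomputed synthetic likelihood.
observed = 1.3
thetas, current = [], 1.0
for _ in range(5_000):
    proposal = current + 0.1 * rng.standard_normal()
    if np.log(rng.uniform()) < synthetic_loglik(proposal, observed) - synthetic_loglik(current, observed):
        current = proposal
    thetas.append(current)
print(np.mean(thetas), np.std(thetas))
```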

Relevance:

90.00%

Publisher:

Abstract:

E. coli performs chemotaxis by executing a biased random walk composed of alternating periods of swimming (runs) and reorientations (tumbles). Tumbles are typically modelled as complete directional randomisations, but it is known that in wild type E. coli, successive run directions are actually weakly correlated, with a mean directional difference of ∼63°. We recently presented a model of the evolution of chemotactic swimming strategies in bacteria which is able to quantitatively reproduce the emergence of this correlation. The agreement between model and experiments suggests that directional persistence may serve some function, a hypothesis supported by the results of an earlier model. Here we investigate the effect of persistence on chemotactic efficiency, using a spatial Monte Carlo model of bacterial swimming in a gradient, combined with simulations of natural selection based on chemotactic efficiency. A direct search of the parameter space reveals two attractant gradient regimes: (a) a low-gradient regime, in which efficiency is unaffected by directional persistence, and (b) a high-gradient regime, in which persistence can improve chemotactic efficiency. The value of the persistence parameter that maximises this effect corresponds very closely with the value observed experimentally. This result is matched by independent simulations of the evolution of directional memory in a population of model bacteria, which also predict the emergence of persistence in high-gradient conditions. The relationship between optimality and persistence in different environments may reflect a universal property of random-walk foraging algorithms, which must strike a compromise between two competing aims: exploration and exploitation. We also present a new graphical way to generally illustrate the evolution of a particular trait in a population, in terms of variations in an evolvable parameter.
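
A minimal sketch of a run-and-tumble walk with directional persistence; the speed, run time and turn-angle spread are illustrative, and the chemotactic bias (gradient-dependent run lengths) used in the study is omitted here for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def run_and_tumble(n_steps, persistence_deg=63.0, speed=20.0, run_time=1.0):
    """2-D run-and-tumble walk in which each tumble reorients the cell by an
    angle drawn around a mean directional difference (illustrative parameters)."""
    pos = np.zeros((n_steps + 1, 2))
    heading = rng.uniform(0.0, 2.0 * np.pi)
    for i in range(n_steps):
        pos[i + 1] = pos[i] + speed * run_time * np.array([np.cos(heading), np.sin(heading)])
        # Tumble: turn by roughly the observed mean angle, to the left or right at random.
        turn = np.deg2rad(persistence_deg) + 0.3 * rng.standard_normal()
        heading += rng.choice([-1.0, 1.0]) * turn
    return pos

track = run_and_tumble(200)
print("net displacement:", np.linalg.norm(track[-1] - track[0]))
```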

Relevance:

90.00%

Publisher:

Abstract:

This paper presents an extension to the Rapidly-exploring Random Tree (RRT) algorithm applied to autonomous, drifting underwater vehicles. The proposed algorithm is able to plan paths that guarantee convergence in the presence of time-varying ocean dynamics. The method utilizes four-dimensional ocean model prediction data as an evolving basis for expanding the tree from the start location to the goal. The performance of the proposed method is validated through Monte Carlo simulations. The results illustrate the importance of temporal variance in path execution and demonstrate the convergence guarantee of the proposed methods.
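
A minimal sketch of an RRT expansion in which every edge is propagated under a time-varying current, so each node carries an arrival time; the flow field, workspace bounds, step sizes and goal bias below are illustrative, and the sketch omits the convergence machinery of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def ocean_current(position, t):
    """Stand-in for 4-D ocean model prediction data: a drift that varies in time."""
    return np.array([0.2 * np.sin(0.1 * t), 0.1 * np.cos(0.1 * t)])

def rrt_with_drift(start, goal, n_iter=2000, step=1.0, dt=1.0, goal_tol=1.5):
    """Minimal RRT: nearest-node selection, then propagation of the commanded
    motion plus the drift predicted for that node's arrival time."""
    goal = np.asarray(goal, float)
    nodes = [(np.asarray(start, float), 0.0)]   # (position, arrival time)
    parents = [-1]
    for _ in range(n_iter):
        sample = goal if rng.uniform() < 0.1 else rng.uniform(-10, 10, size=2)
        dists = [np.linalg.norm(p - sample) for p, _ in nodes]
        i_near = int(np.argmin(dists))
        p_near, t_near = nodes[i_near]
        direction = (sample - p_near) / (np.linalg.norm(sample - p_near) + 1e-9)
        p_new = p_near + step * direction + dt * ocean_current(p_near, t_near)
        nodes.append((p_new, t_near + dt))
        parents.append(i_near)
        if np.linalg.norm(p_new - goal) < goal_tol:
            return nodes, parents, len(nodes) - 1
    return nodes, parents, None

nodes, parents, goal_idx = rrt_with_drift(start=[0.0, 0.0], goal=[8.0, 5.0])
print("goal reached:", goal_idx is not None)
```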

Relevance:

90.00%

Publisher:

Abstract:

Over the last decade there have been substantial advances in small field dosimetry techniques and technologies, which have dramatically improved the achievable accuracy of small field dose measurements. This educational note aims to help radiation oncology medical physicists apply some of these advances in clinical practice. The evaluation of a set of small field output factors (total scatter factors) is used to exemplify a detailed measurement and simulation procedure, and as a basis for discussing the possible effects of simplifying that procedure. Field output factors were measured with an unshielded diode and a micro-ionisation chamber, at the centre of a set of square fields defined by a micro-multileaf collimator. Nominal field sizes investigated ranged from 6 × 6 to 98 × 98 mm². Diode measurements in fields smaller than 30 mm across were corrected using response factors calculated from Monte Carlo simulations of the full diode geometry, and daisy-chained to match micro-chamber measurements at intermediate field sizes. Diode measurements in fields smaller than 15 mm across were repeated twelve times over three separate measurement sessions, to evaluate the reproducibility of the radiation field size and its correspondence with the nominal field size. The five readings that contributed to each measurement on each day varied by up to 0.26% for the “very small” fields smaller than 15 mm, and by up to 0.18% for the fields larger than 15 mm. The diode response factors calculated for the unshielded diode agreed with previously published results within 1.6%. The measured dimensions of the very small fields differed by up to 0.3 mm across the different measurement sessions, contributing an uncertainty of up to 1.2% to the very small field output factors. The overall uncertainties in the field output factors were 1.8% for the very small fields and 1.1% for the fields larger than 15 mm across. Recommended steps for acquiring small field output factor measurements for use in radiotherapy treatment planning system beam configuration data are provided.
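
A sketch of the arithmetic behind daisy-chaining diode readings to micro-chamber readings at an intermediate field size, with Monte Carlo response factors applied to the smallest fields; every number below is made up for illustration and does not come from the note:

```python
# Illustrative daisy-chaining of diode readings to a micro-chamber output factor.
intermediate = 30.0                                # mm, field where both detectors are trusted
chamber_of_at_intermediate = 0.874                 # micro-chamber output factor at 30 mm (illustrative)
diode_reading = {98.0: 1.000, 30.0: 0.901, 10.0: 0.792, 6.0: 0.712}   # normalised diode readings
mc_response_factor = {10.0: 0.985, 6.0: 0.962}     # Monte Carlo diode corrections for fields < 30 mm

def field_output_factor(field_mm):
    """Diode ratio re-normalised so it matches the chamber at the intermediate
    field, then corrected for the diode's small-field over-response."""
    of = chamber_of_at_intermediate * diode_reading[field_mm] / diode_reading[intermediate]
    return of * mc_response_factor.get(field_mm, 1.0)

for f in (10.0, 6.0):
    print(f"{f:.0f} mm field output factor ≈ {field_output_factor(f):.3f}")
```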

Relevance:

90.00%

Publisher:

Abstract:

Field emission (FE) electron gun sources provide new capabilities for high lateral resolution EPMA. The determination of analytical lateral resolution is not as straightforward as that for electron microscopy imaging. Results from two sets of experiments to determine the actual lateral resolution for accurate EPMA are presented for Kα X-ray lines of Si and Al and Lα of Fe at 5 and 7 keV in a silicate glass. These results are compared to theoretical predictions and Monte Carlo simulations of analytical lateral resolution. The experiments suggest little is gained in lateral resolution by dropping from 7 to 5 keV in EPMA of this silicate glass.

Relevance:

90.00%

Publisher:

Abstract:

This study compares Value-at-Risk (VaR) measures for Australian banks over a period that includes the Global Financial Crisis (GFC) to determine whether the methodology and parameter selection are important for the capital adequacy holdings that will ultimately support a bank in a crisis period. The VaR methodology promoted under Basel II was widely criticised during the GFC for its failure to capture downside risk. However, results from this study indicate that 1-year parametric and historical models produce better measures of VaR than models with longer time frames. VaR estimates produced using Monte Carlo simulations show a high percentage of violations, but with a lower average magnitude of violation when they occur. VaR estimates produced by the ARMA-GARCH model also show a relatively high percentage of violations; however, the average magnitude of a violation is quite low. Our findings support the design of the revised Basel II VaR methodology, which has also been adopted under Basel III.
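
A minimal sketch of the three basic VaR estimators being compared (historical, parametric and Monte Carlo) and of how a violation is counted; the simulated return series, confidence level and sample sizes are illustrative only:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
returns = 0.0002 + 0.012 * rng.standard_normal(250)   # stand-in for one year of daily returns

alpha = 0.99
# Historical VaR: empirical quantile of the loss distribution.
var_hist = -np.quantile(returns, 1 - alpha)
# Parametric (normal) VaR from the sample mean and standard deviation.
var_param = -(returns.mean() + norm.ppf(1 - alpha) * returns.std(ddof=1))
# Monte Carlo VaR: resimulate returns from the fitted normal model.
sims = returns.mean() + returns.std(ddof=1) * rng.standard_normal(100_000)
var_mc = -np.quantile(sims, 1 - alpha)

print(f"99% 1-day VaR  historical={var_hist:.4f}  parametric={var_param:.4f}  MC={var_mc:.4f}")

# A violation occurs when the realised loss exceeds the VaR forecast.
next_year = 0.0002 + 0.012 * rng.standard_normal(250)
violation_rate = (-next_year > var_hist).mean()
print(f"violation rate against the historical VaR: {violation_rate:.1%}")
```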

Relevance:

90.00%

Publisher:

Abstract:

In this paper it is demonstrated how the Bayesian parametric bootstrap can be adapted to models with intractable likelihoods. The approach is most appealing when the semi-automatic approximate Bayesian computation (ABC) summary statistics are selected. After a pilot run of ABC, the likelihood-free parametric bootstrap approach requires very few model simulations to produce an approximate posterior, which can be a useful approximation in its own right. An alternative is to use this approximation as a proposal distribution in ABC algorithms to make them more efficient. In this paper, the parametric bootstrap approximation is used to form the initial importance distribution for the sequential Monte Carlo and the ABC importance and rejection sampling algorithms. The new approach is illustrated through a simulation study of the univariate g-and-k quantile distribution, and is used to infer parameter values of a stochastic model describing expanding melanoma cell colonies.
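
A heavily simplified sketch of the idea, assuming a toy one-parameter model, a single summary statistic, and a Gaussian bootstrap approximation later used as the importance distribution in an ABC importance sampler; the tolerances, prior and estimator below are illustrative and do not reproduce the paper's exact algorithm:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

def simulate(theta, n=100):
    return theta + rng.standard_normal(n)      # toy model standing in for an intractable one

def summary(x):
    return x.mean()                            # (semi-automatic style) summary statistic

observed = simulate(theta=2.0)
s_obs = summary(observed)

# --- Pilot ABC rejection run (wide uniform prior, loose tolerance).
pilot = np.array([t for t in rng.uniform(-10, 10, 2_000)
                  if abs(summary(simulate(t)) - s_obs) < 0.5])
theta_hat = pilot.mean()

# --- Likelihood-free parametric bootstrap: a few simulations at the pilot estimate,
#     each re-estimated through the same summary-based point estimator.
boot = np.array([summary(simulate(theta_hat)) for _ in range(200)])
proposal_mean, proposal_sd = boot.mean(), 1.5 * boot.std(ddof=1)   # slightly inflated proposal

# --- ABC importance sampling with the bootstrap approximation as the proposal.
draws = proposal_mean + proposal_sd * rng.standard_normal(5_000)
accepted = np.array([abs(summary(simulate(t)) - s_obs) < 0.1 for t in draws])
prior_pdf = np.where(np.abs(draws) <= 10, 1.0 / 20.0, 0.0)
weights = accepted * prior_pdf / norm.pdf(draws, proposal_mean, proposal_sd)
print("approximate posterior mean:", np.sum(weights * draws) / np.sum(weights))
```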

Relevance:

90.00%

Publisher:

Abstract:

Change point estimation is recognized as an essential tool in root cause analyses within quality control programs, as it enables clinical experts to search for potential causes of change in hospital outcomes more effectively. In this paper, we consider estimation of the time when a linear trend disturbance has occurred in survival time following an in-control clinical intervention in the presence of variable patient mix. To model the process and change point, a linear trend in the survival time of patients who underwent cardiac surgery is formulated using hierarchical models in a Bayesian framework. The data are right censored since the monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to the surgery using a Weibull accelerated failure time regression model. We use Markov chain Monte Carlo to obtain posterior distributions of the change point parameters, including the location and slope of the trend, as well as the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time cumulative sum (CUSUM) control charts for different trend scenarios. In comparison with the alternatives, a step change point model and the built-in CUSUM estimator, the proposed Bayesian estimator provides more accurate and precise estimates over linear trends. These advantages are enhanced when the probability quantification, flexibility and generalizability of the Bayesian change point detection model are also considered.
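
A minimal sketch of Bayesian estimation of a linear-trend change point with right-censored survival times, using an exponential survival model and a grid posterior as a simplified stand-in for the paper's risk-adjusted Weibull AFT model and MCMC; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate monitored survival times (days) with a linear downward trend in log mean
# survival that starts at an unknown change point tau0, censored at end of follow-up.
n, tau0, slope0 = 300, 180, -0.004
t_surgery = np.sort(rng.uniform(0, 365, n))                  # time of each procedure
log_mean = 5.0 + slope0 * np.maximum(0.0, t_surgery - tau0)  # log mean survival time
surv = rng.exponential(np.exp(log_mean))
follow_up = 200.0
observed = np.minimum(surv, follow_up)                       # right censoring
event = surv <= follow_up

def log_lik(tau, slope, mu=5.0):
    mean = np.exp(mu + slope * np.maximum(0.0, t_surgery - tau))
    # Exponential log-likelihood with right censoring: events contribute the
    # density, censored cases contribute the survival function.
    return np.sum(event * (-np.log(mean) - observed / mean) + (~event) * (-observed / mean))

# Posterior over (tau, slope) on a grid, with flat priors (MCMC would replace this).
taus = np.linspace(30, 330, 61)
slopes = np.linspace(-0.01, 0.002, 61)
logpost = np.array([[log_lik(tau, s) for s in slopes] for tau in taus])
post = np.exp(logpost - logpost.max())
post /= post.sum()
tau_hat = taus[post.sum(axis=1).argmax()]
slope_hat = slopes[post.sum(axis=0).argmax()]
print(f"estimated change point ≈ day {tau_hat:.0f}, slope ≈ {slope_hat:.4f}")
```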

Relevance:

90.00%

Publisher:

Abstract:

Purpose: The purpose of this investigation was to assess the angular dependence of a commercial optically stimulated luminescence dosimeter (OSLD) dosimetry system in MV x-ray beams at depths beyond dmax, and to find ways to mitigate this dependence for measurements in phantoms. Methods: Two special holders were designed which allow a dosimeter to be rotated around the center of its sensitive volume. The dosimeter's sensitive volume is a disk, 5 mm in diameter and 0.2 mm thick. The first holder rotates the disk in the traditional way. It positions the disk perpendicular to the beam (gantry pointing to the floor) in the initial position (0°). When the holder is rotated, the angle of the disk towards the beam increases until the disk is parallel with the beam (“edge on,” 90°). This is referred to as Setup 1. The second holder offers a new, alternative measurement position. It positions the disk parallel to the beam for all angles while rotating around its center (Setup 2). Measurements with five to ten dosimeters per point were carried out for 6 MV at 3 and 10 cm depth. Monte Carlo simulations using GEANT4 were performed to simulate the response of the active detector material for several angles. The detector and housing were simulated in detail based on microCT data and communications with the manufacturer. Various material compositions and an all-water geometry were considered. Results: For the traditional Setup 1, the response of the OSLD dropped on average by 1.4% ± 0.7% (measurement) and 2.1% ± 0.3% (Monte Carlo simulation) for the 90° orientation compared to 0°. Monte Carlo simulations also showed a strong dependence of the effect on the composition of the sensitive layer. Assuming the layer to consist completely of the active material (Al2O3) results in a 7% drop in response at 90° compared to 0°. Assuming the layer to be completely water results in a flat response within the simulation uncertainty of about 1%. For the new Setup 2, measurements and Monte Carlo simulations found the angular dependence of the dosimeter to be below 1% and within the measurement uncertainty. Conclusions: The dosimeter system exhibits a small angular dependence of approximately 2% which needs to be considered for measurements involving beam angles other than normal incidence. This applies in particular to clinical in vivo measurements, where the orientation of the dosimeter is dictated by clinical circumstances and cannot be optimized as otherwise suggested here. When measuring in a phantom, the proposed new setup should be considered. It changes the orientation of the dosimeter so that a coplanar beam arrangement always hits the disk-shaped detector material from the thin side, thereby reducing the angular dependence of the response to within the measurement uncertainty of about 1%. This improvement makes the dosimeter more attractive for clinical measurements with multiple coplanar beams in phantoms, as the overall measurement uncertainty is reduced. Similarly, phantom-based postal audits can transition from the traditional TLD to the more accurate and convenient OSLD.

Relevance:

90.00%

Publisher:

Abstract:

Uncertainty assessments of herbicide losses from rice paddies in Japan, associated with local meteorological conditions and water management practices, were performed using a pesticide fate and transport model, PCPF-1, under a Monte Carlo (MC) simulation scheme. First, MC simulations were conducted for five different cities with a prescribed water management scenario and a 10-year meteorological dataset for each city. Water management was observed to be effective in reducing pesticide runoff; however, a greater potential for pesticide runoff remained in Western Japan. Secondly, an extended analysis was attempted to evaluate the effects of local water management and meteorological conditions in the Chikugo River basin and the Sakura River basin, using uncertainty inputs processed from observed water management data. The results showed that, because of more severe rainfall events, significant pesticide runoff occurred in the Chikugo River basin even when appropriate irrigation practices were implemented. © Pesticide Science Society of Japan.

Relevance:

90.00%

Publisher:

Abstract:

Fatigue of the steel in rails continues to be of major concern to heavy haul track owners, despite careful selection and maintenance of rails. The persistence of fatigue is due in part to the erroneous assumption that the maximum loads on, and stresses in, the rails are predictable. Recent analysis of extensive wheel impact detector data from a number of heavy haul tracks has shown that the most damaging forces are in fact randomly distributed in time and location and can be much greater than generally expected. Large-scale Monte Carlo simulations have been used to identify rail stresses caused by actual, measured distributions of wheel-rail forces on heavy haul tracks. The simulations show that fatigue failure of the rail foot can occur in situations that would be overlooked by traditional analyses. The most serious of these situations are those where track is accessed by multiple operators and where there is a mix of heavy haul, general freight and/or passenger traffic. The least serious are those where the track carries single-operator-owned heavy haul unit trains. The paper shows that using the nominal maximum axle load of passing traffic, the key input to traditional analyses, is insufficient and must be augmented with consideration of important operational factors. Ignoring such factors can be costly.
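
A minimal sketch of the kind of Monte Carlo check described above: sample wheel-rail forces from a distribution with a random impact tail, convert them to rail-foot stress and compare against a fatigue threshold. Every distribution, transfer factor and threshold below is invented for illustration and does not come from the paper or from any standard:

```python
import numpy as np

rng = np.random.default_rng(6)

n_axles = 1_000_000
static_load_kN = rng.normal(130.0, 15.0, n_axles)                         # quasi-static wheel loads
impact_factor = 1.0 + rng.lognormal(mean=-2.0, sigma=0.8, size=n_axles)   # random wheel impacts
force_kN = static_load_kN * impact_factor

stress_per_kN = 1.1          # MPa of rail-foot stress per kN of wheel-rail force (illustrative)
foot_stress = stress_per_kN * force_kN

fatigue_limit = 200.0        # MPa, illustrative fatigue threshold
p_damaging = (foot_stress > fatigue_limit).mean()
print(f"fraction of axle passes above the fatigue threshold: {p_damaging:.2e}")

# A check based only on the nominal maximum wheel load misses the random impact tail.
nominal_stress = stress_per_kN * 170.0   # stress implied by the nominal maximum wheel load alone
print(f"nominal-load stress {nominal_stress:.0f} MPa vs simulated 99.999th percentile "
      f"{np.percentile(foot_stress, 99.999):.0f} MPa")
```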

Relevance:

90.00%

Publisher:

Abstract:

Large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to introduce into the analysis the variability that is ignored by DLF. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks, such as computational burden (Monte Carlo, conventional convolution), accuracy that is sensitive to system complexity (point estimation method), the need for linearization (multi-linear simulation), and convergence problems (Gram–Charlier expansion, Cornish–Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify the over-voltage issues, with and without the voltage control algorithm, in a distribution network with active generation. The LHS technique is verified with a test network and a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared with Monte Carlo simulations.
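
A minimal sketch of one common LHS-CD variant, assuming correlated Gaussian inputs: stratified uniforms per dimension are mapped to standard normal scores, and the target correlation is imposed through the Cholesky factor of the covariance matrix (the paper's exact procedure may differ, e.g. it may use rank-correlation reordering). The two-bus injection statistics are illustrative:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

def lhs_correlated_normals(n_samples, mean, cov):
    """Latin Hypercube Sampling with Cholesky decomposition (LHS-CD sketch)."""
    d = len(mean)
    u = np.empty((n_samples, d))
    for j in range(d):
        # One stratified, randomly permuted uniform sample per dimension.
        u[:, j] = (rng.permutation(n_samples) + rng.uniform(size=n_samples)) / n_samples
    z = norm.ppf(u)                       # independent standard normal LHS scores
    L = np.linalg.cholesky(cov)           # Cholesky factor of the target covariance
    return mean + z @ L.T                 # correlated samples with the requested covariance

# Toy uncertainty at two buses: correlated PV injections (MW).
mean = np.array([1.0, 0.8])
cov = np.array([[0.04, 0.03],
                [0.03, 0.05]])
samples = lhs_correlated_normals(500, mean, cov)
print("sample covariance:\n", np.cov(samples, rowvar=False))
# Each row of `samples` would feed one deterministic load-flow run; the resulting
# voltage samples are then used to quantify the probability of over-voltage.
```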

Relevance:

90.00%

Publisher:

Abstract:

The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens, live high-train low (LHTL) and intermittent hypoxic exposure (IHE), on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to be able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a probability of 0.93 of a substantially greater improvement in running economy, and a probability greater than 0.96 that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using a ‘magnitude-based inference’ approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
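
A minimal sketch of how such probabilistic statements are read off MCMC output: given posterior draws of a treatment difference, the probability of a substantial benefit is simply the proportion of draws exceeding the smallest worthwhile change. The simulated draws and threshold below are illustrative stand-ins for the fitted model's output:

```python
import numpy as np

rng = np.random.default_rng(8)

# Stand-in for MCMC draws of the LHTL-minus-IHE difference in hemoglobin-mass change (%);
# in practice these would come from the fitted Bayesian hierarchical model.
posterior_diff = rng.normal(loc=3.0, scale=1.5, size=20_000)

smallest_worthwhile = 1.0   # illustrative smallest substantial change (%)
p_benefit = (posterior_diff > smallest_worthwhile).mean()
p_trivial = (np.abs(posterior_diff) <= smallest_worthwhile).mean()
p_harm = (posterior_diff < -smallest_worthwhile).mean()

print(f"P(substantial benefit) = {p_benefit:.2f}, "
      f"P(trivial) = {p_trivial:.2f}, P(substantial harm) = {p_harm:.2f}")
```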