940 results for "Monte Carlo, Method of"
Abstract:
In this research, the reliability and availability of a fiberboard pressing plant are assessed and a cost-based optimization of the system is performed using the Monte Carlo simulation method. The woodchip and pulp, or engineered wood, industry in Australia and around the world is a lucrative one; hardboard is one such product. The pressing system is the main system, as it converts the wet pulp to fiberboard. The assessment identified that the pressing system has the highest downtime in the plant and that it represents the bottleneck in the process. A survey in the late nineties revealed that there are over one thousand plants around the world, with the pressing system being a common system among them. No work has been done to assess or estimate the reliability of such a pressing system; this assessment can therefore be used for assessing any plant of this type.
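As a rough illustration of the cost-based Monte Carlo approach described above, the sketch below simulates random failure and repair cycles of a single pressing system and estimates its availability and expected downtime cost. The distributions, MTBF/MTTR values and cost rate are hypothetical placeholders, not figures from the study.

```python
# Minimal Monte Carlo availability/cost sketch (illustrative only; all
# parameter values below are hypothetical assumptions).
import random

def simulate_availability(mtbf_h=400.0, mttr_h=12.0, mission_h=8760.0,
                          downtime_cost_per_h=500.0, n_runs=10_000):
    total_avail, total_cost = 0.0, 0.0
    for _ in range(n_runs):
        t, downtime = 0.0, 0.0
        while t < mission_h:
            t += random.expovariate(1.0 / mtbf_h)      # time to next failure
            if t >= mission_h:
                break
            repair = random.expovariate(1.0 / mttr_h)  # repair duration
            downtime += min(repair, mission_h - t)
            t += repair
        total_avail += 1.0 - downtime / mission_h
        total_cost += downtime * downtime_cost_per_h
    return total_avail / n_runs, total_cost / n_runs

if __name__ == "__main__":
    availability, expected_cost = simulate_availability()
    print(f"Estimated availability: {availability:.4f}")
    print(f"Expected annual downtime cost: ${expected_cost:,.0f}")
```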
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that its precision (approximately twice the coefficient of variation) is poorer than that of the standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: (1) demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; (2) demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique; (3) demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and (4) provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
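To make the dual-energy decomposition concrete, the sketch below solves the two-component (DEXA) attenuation equations and the three-component extension obtained when a path-length measurement is added, in the spirit of the DPA(+) technique described above. The attenuation coefficients and densities are placeholder values chosen for illustration, not data from the thesis.

```python
# Illustrative sketch of dual-energy tissue decomposition; all coefficients
# below are placeholder assumptions, not values from the thesis.
import numpy as np

# Hypothetical mass attenuation coefficients (cm^2/g) at a low and a high energy
MU = {                      #  low E, high E
    "bone": np.array([0.50, 0.20]),
    "lean": np.array([0.25, 0.18]),
    "fat":  np.array([0.21, 0.17]),
}

def dexa_two_component(log_atten):
    """Solve log(I0/I) = mu_bone*x_bone + mu_lean*x_lean at two energies
    for the areal densities x (g/cm^2)."""
    A = np.column_stack([MU["bone"], MU["lean"]])
    return np.linalg.solve(A, log_atten)

def dpa_plus_three_component(log_atten, path_length_cm, densities=(1.85, 1.06, 0.95)):
    """Add the constraint sum(thickness_i) = measured path length to resolve
    bone, lean and fat thicknesses (cm) simultaneously."""
    rho = np.array(densities)                     # placeholder densities (g/cm^3)
    A = np.vstack([
        np.column_stack([MU["bone"] * rho[0], MU["lean"] * rho[1], MU["fat"] * rho[2]]),
        np.ones(3),
    ])
    b = np.append(log_atten, path_length_cm)
    return np.linalg.solve(A, b)

# Example: synthetic measurement through 1 cm bone, 6 cm lean and 3 cm fat
true_t = np.array([1.0, 6.0, 3.0])
rho = np.array([1.85, 1.06, 0.95])
log_atten = (np.column_stack([MU["bone"], MU["lean"], MU["fat"]]) * rho) @ true_t
print(dpa_plus_three_component(log_atten, true_t.sum()))   # recovers ~[1, 6, 3]
```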
Abstract:
The iPlan treatment planning system uses a pencil beam algorithm, with density corrections, to predict the doses delivered by very small (stereotactic) radiotherapy fields. This study tests the accuracy of dose predictions made by iPlan, for small-field treatments delivered to a planar solid water phantom and to heterogeneous human tissue using the BrainLAB m3 micro-multileaf collimator.
Abstract:
The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge at regular intervals during its motion. A method for automating the generation of this input file is presented, which will allow for more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm x 5 cm, 10 cm x 10 cm and 20 cm x 20 cm), four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. The results of this study show agreement between the measured and the MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable the MC verification of patient treatments involving dynamic wedges.
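A rough sketch of the kind of automation described above: sample the jaw trajectory of a dynamic wedge at regular intervals and write one sub-field per sample. The record layout used here is a simplified placeholder rather than the exact format DYNJAWS expects, which is defined in the BEAMnrc documentation.

```python
# NOTE: simplified placeholder record layout; consult the BEAMnrc/DYNJAWS
# manual for the real file format. The wedge trajectory is also hypothetical.
import numpy as np

def write_wedge_sequence(path, n_steps=50, field_size_cm=10.0):
    """Write jaw positions for a hypothetical dynamic wedge in which one jaw
    sweeps across the field while the opposite jaw stays fixed."""
    fixed_jaw = -field_size_cm / 2.0
    # the moving jaw sweeps from fully open towards the fixed jaw
    moving_positions = np.linspace(field_size_cm / 2.0, fixed_jaw + 0.5, n_steps)
    with open(path, "w") as f:
        f.write(f"{n_steps}\n")                      # number of sub-fields
        for i, pos in enumerate(moving_positions):
            mu_fraction = (i + 1) / n_steps          # cumulative monitor-unit fraction
            f.write(f"{mu_fraction:.4f} {fixed_jaw:.3f} {pos:.3f}\n")

write_wedge_sequence("dynjaws_wedge.sequence")
```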
Abstract:
Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use structured questionnaires in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic means, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify pests that are harmful and those that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values from these data for different scoring systems. The usefulness of our approach is illustrated in a case study in which several scoring systems were compared. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of the decisions about positive and negative introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
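A minimal sketch of the simulation idea described above: generate virtual pests with known outcomes from a stochastic introduction model, simulate ordinal expert scores, combine them by sum or by product, and estimate the sensitivity and specificity of each scoring system. All distributions and parameter values are illustrative assumptions, not those of the published model.

```python
# Illustrative comparison of sum-based and multiplication-based scoring systems
# on simulated virtual pests; all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_pests=10_000, n_items=5, scale_points=5):
    # latent probabilities for each step of the introduction process
    latent = rng.beta(2, 2, size=(n_pests, n_items))
    truth = rng.random(n_pests) < latent.prod(axis=1)   # "true" introduction outcome
    # experts map latent values onto an ordinal 1..scale_points scale, with noise
    noisy = np.clip(latent + rng.normal(0, 0.1, latent.shape), 0, 1)
    scores = np.ceil(noisy * scale_points).clip(1, scale_points)
    return truth, scores

def sens_spec(truth, risk, threshold):
    pred = risk >= threshold
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    return sens, spec

truth, scores = simulate()
for name, risk in [("sum", scores.sum(axis=1)), ("product", scores.prod(axis=1))]:
    thr = np.median(risk)                                # simple decision threshold
    print(name, "sensitivity/specificity:", sens_spec(truth, risk, thr))
```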
Abstract:
This study uses film dosimetry measurements and Monte Carlo simulations to investigate the accuracy of type-a (pencil-beam) dose calculations for predicting the radiation doses delivered during stereotactic radiotherapy treatments of the brain. It is shown that when evaluating doses in a water phantom, the type-a algorithm provides dose predictions which are accurate to within the clinically relevant gamma(3%, 3 mm) criterion, but these predictions are nonetheless subtly different from the results of evaluating doses from the same fields using radiochromic film and Monte Carlo simulations. An analysis of a clinical meningioma treatment suggests that when predicting stereotactic radiotherapy doses to the brain, the inaccuracies of the type-a algorithm can be exacerbated by inadequate evaluation of the effects of nearby bone or air, resulting in dose differences of up to 10% for individual fields. The results of this study indicate the possible advantage of using Monte Carlo calculations, as well as measurements with high-spatial-resolution media, to verify type-a predictions of doses delivered in cranial treatments.
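For reference, the gamma(3%, 3 mm) comparison referred to above combines a dose-difference and a distance-to-agreement criterion. The sketch below implements a simplified one-dimensional, globally normalised gamma evaluation; it illustrates the metric and is not the software used in the study.

```python
# Simplified 1D gamma-index evaluation (global normalisation); illustrative only.
import numpy as np

def gamma_1d(x_ref, dose_ref, x_eval, dose_eval, dose_crit=0.03, dist_crit_mm=3.0):
    """Return the gamma value at each reference point."""
    d_norm = dose_crit * dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (x, d) in enumerate(zip(x_ref, dose_ref)):
        dist2 = ((x_eval - x) / dist_crit_mm) ** 2      # distance-to-agreement term
        dose2 = ((dose_eval - d) / d_norm) ** 2         # dose-difference term
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma

# Toy example: a shifted, slightly rescaled profile still passes gamma <= 1
x = np.linspace(-50, 50, 201)                           # positions in mm
ref = np.exp(-(x / 20.0) ** 2)
ev = np.exp(-((x - 1.0) / 20.0) ** 2) * 1.01
print("pass rate:", (gamma_1d(x, ref, x, ev) <= 1).mean())
```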
Abstract:
Asset health inspections can produce two types of indicators: (1) direct indicators (e.g. the thickness of a brake pad, or the crack depth on a gear) which directly relate to a failure mechanism; and (2) indirect indicators (e.g. the indicators extracted from vibration signals and oil analysis data) which can only partially reveal a failure mechanism. While direct indicators enable a more precise assessment of asset health condition, they are often more difficult to obtain than indirect indicators. The state space model provides an efficient approach to estimating direct indicators from indirect indicators. However, existing state space models for estimating direct indicators largely depend on assumptions such as discrete time, discrete state, linearity, and Gaussianity. The discrete-time assumption requires fixed inspection intervals. The discrete-state assumption entails discretising continuous degradation indicators, which often introduces additional errors. The linear and Gaussian assumptions are not consistent with the nonlinear and irreversible degradation processes seen in most engineering assets. This paper proposes a state space model without these assumptions. Monte Carlo-based algorithms are developed to estimate the model parameters and the remaining useful life. These algorithms are evaluated for performance using numerical simulations in MATLAB. The results show that both the parameters and the remaining useful life are estimated accurately. Finally, the new state space model is used to process vibration and crack depth data from an accelerated test of a gearbox. In this application, the new state space model shows a better fit than a state space model with linear and Gaussian assumptions.
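As an illustration of the Monte Carlo estimation idea, the sketch below runs a bootstrap particle filter over a nonlinear, non-Gaussian crack-growth model observed through an indirect vibration indicator, then propagates the particles forward to estimate remaining useful life. The state equation, measurement model and parameter values are assumptions for illustration, not the model developed in the paper.

```python
# Bootstrap particle filter for a hypothetical crack-growth model; illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def propagate(crack, dt, growth_rate=0.05):
    """Irreversible, nonlinear crack growth with multiplicative lognormal noise."""
    noise = 0.05 * np.sqrt(dt) * rng.standard_normal(crack.shape)
    return crack * np.exp(growth_rate * dt + noise)

def likelihood(vib, crack, noise_sd=0.2):
    """Indirect vibration indicator assumed proportional to sqrt(crack depth)."""
    return np.exp(-0.5 * ((vib - np.sqrt(crack)) / noise_sd) ** 2)

def particle_filter(times, vib_obs, n_particles=5000, failure_depth=5.0):
    crack = rng.uniform(0.1, 1.0, n_particles)            # initial crack depth (mm)
    t_prev = 0.0
    for t, vib in zip(times, vib_obs):
        crack = propagate(crack, t - t_prev)              # supports irregular intervals
        w = likelihood(vib, crack)
        w /= w.sum()
        crack = crack[rng.choice(n_particles, n_particles, p=w)]   # resample
        t_prev = t
    # Remaining useful life: propagate particles forward until the failure threshold
    rul, c = np.zeros(n_particles), crack.copy()
    while (c < failure_depth).any():
        alive = c < failure_depth
        c[alive] = propagate(c[alive], 1.0)
        rul[alive] += 1.0
    return crack, rul

times = np.array([3.0, 7.0, 12.0, 18.0])                  # irregular inspection times
vib_obs = np.array([1.1, 1.3, 1.5, 1.8])                  # indirect indicator values
crack, rul = particle_filter(times, vib_obs)
print("median crack depth:", np.median(crack), "median RUL:", np.median(rul))
```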
Abstract:
Markov chain Monte Carlo (MCMC) estimation provides a solution to the complex integration problems that are faced in the Bayesian analysis of statistical problems. The implementation of MCMC algorithms is, however, code intensive and time consuming. We have developed a Python package, called PyMCMC, that aids in the construction of MCMC samplers, helps to substantially reduce the likelihood of coding error, and minimises repetitive code. PyMCMC contains classes for Gibbs, Metropolis-Hastings, independent Metropolis-Hastings, random walk Metropolis-Hastings, orientational bias Monte Carlo and slice samplers, as well as specific modules for common models, such as a module for Bayesian regression analysis. PyMCMC is straightforward to optimise, taking advantage of the Python libraries Numpy and Scipy, as well as being readily extensible with C or Fortran.
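For context, the sketch below shows a bare-bones random-walk Metropolis-Hastings sampler written directly in NumPy, the kind of repetitive boilerplate a package such as PyMCMC is designed to take care of. It is generic illustrative code, not the PyMCMC API.

```python
# Generic random-walk Metropolis-Hastings sampler on a toy target; illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def log_posterior(theta, y):
    """Toy target: normal likelihood for the mean with a standard normal prior."""
    return -0.5 * np.sum((y - theta) ** 2) - 0.5 * theta ** 2

def random_walk_mh(y, n_iter=10_000, step=0.3):
    theta = 0.0
    samples = np.empty(n_iter)
    log_p = log_posterior(theta, y)
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal()
        log_p_new = log_posterior(proposal, y)
        if np.log(rng.random()) < log_p_new - log_p:      # accept/reject step
            theta, log_p = proposal, log_p_new
        samples[i] = theta
    return samples

y = rng.normal(1.5, 1.0, size=50)                         # synthetic data
draws = random_walk_mh(y)
print("posterior mean ~", draws[2000:].mean())            # discard burn-in
```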
Abstract:
Here we present a sequential Monte Carlo approach that can be used to find optimal designs. Our focus is on the design of phase III clinical trials where the derivation of sampling windows is required, along with the optimal sampling schedule. The search is conducted via a particle filter which traverses a sequence of target distributions artificially constructed via an annealed utility. The algorithm derives a catalogue of highly efficient designs which not only contains the optimal design but can also be used to derive sampling windows. We demonstrate our approach by designing a hypothetical phase III clinical trial.
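A hedged sketch of the annealed sequential Monte Carlo idea: a population of candidate designs is moved through a sequence of targets proportional to exp(gamma * U(d)), with gamma increasing so that the particles progressively concentrate on high-utility designs. The utility function and parameter values below are toy placeholders, not the clinical trial utility used in the paper.

```python
# Annealed SMC over candidate designs with a toy utility; illustrative only.
import numpy as np

rng = np.random.default_rng(7)

def utility(designs):
    """Toy utility: prefer two sampling times near 2 h and 8 h."""
    return -((designs[:, 0] - 2.0) ** 2 + (designs[:, 1] - 8.0) ** 2)

def smc_designs(n_particles=2000, gammas=np.linspace(0.0, 5.0, 21), step=0.5):
    d = rng.uniform(0.0, 12.0, size=(n_particles, 2))     # initial candidate designs
    for g_prev, g in zip(gammas[:-1], gammas[1:]):
        w = np.exp((g - g_prev) * utility(d))             # incremental annealing weights
        w /= w.sum()
        d = d[rng.choice(n_particles, n_particles, p=w)]  # resample
        prop = d + step * rng.standard_normal(d.shape)    # random-walk MH move
        accept = np.log(rng.random(n_particles)) < g * (utility(prop) - utility(d))
        d[accept] = prop[accept]
    return d

designs = smc_designs()                                   # catalogue of efficient designs
print("median design (sampling times):", np.median(designs, axis=0))
```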
Abstract:
Radiotherapy is a cancer treatment modality in which a dose of ionising radiation is delivered to a tumour. The accurate calculation of the dose to the patient is very important in the design of an effective therapeutic strategy. This study aimed to systematically examine the accuracy of the radiotherapy dose calculations performed by clinical treatment planning systems by comparison against Monte Carlo simulations of the treatment delivery. A suite of software tools known as MCDTK (Monte Carlo DICOM ToolKit) was developed for this purpose, and is capable of:
• Importing DICOM-format radiotherapy treatment plans and producing Monte Carlo simulation input files (allowing simple simulation of complex treatments), and calibrating the results;
• Analysing the predicted doses of, and deviations between, the Monte Carlo simulation results and treatment planning system calculations in regions of interest (tumours and organs-at-risk), and generating dose-volume histograms so that conformity with dose prescriptions can be evaluated.
The code has been tested against various treatment planning systems, linear accelerator models and treatment complexities. Six clinical head and neck cancer treatments were simulated and the results analysed using this software. The deviations were greatest where the treatment volume encompassed tissues on both sides of an air cavity. This was likely due to the method the planning system used to model low-density media.
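A minimal sketch of the kind of dose-volume histogram comparison described above: given a dose grid from the planning system and one from a Monte Carlo simulation, compute cumulative DVHs inside a structure mask and report their largest deviation. The arrays here are synthetic placeholders, not DICOM data, and this is not the MCDTK code itself.

```python
# Cumulative DVH comparison on synthetic dose grids; illustrative only.
import numpy as np

def cumulative_dvh(dose, mask, edges):
    """Fraction of the structure volume receiving at least each dose level."""
    d = dose[mask]
    return np.array([(d >= e).mean() for e in edges])

rng = np.random.default_rng(3)
shape = (40, 40, 40)
tps_dose = rng.normal(60.0, 2.0, shape)            # planning-system dose grid (Gy), synthetic
mc_dose = tps_dose + rng.normal(0.0, 1.0, shape)   # "Monte Carlo" dose grid with extra noise
ptv_mask = np.zeros(shape, dtype=bool)
ptv_mask[15:25, 15:25, 15:25] = True               # toy target volume

edges = np.linspace(0.0, 70.0, 200)                # dose levels for the cumulative DVH
dvh_tps = cumulative_dvh(tps_dose, ptv_mask, edges)
dvh_mc = cumulative_dvh(mc_dose, ptv_mask, edges)
print("largest DVH deviation:", np.abs(dvh_tps - dvh_mc).max())
```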