17 results for Monte Carlo Simulation

in Aston University Research Archive


Relevance:

90.00%

Publisher:

Abstract:

An inherent weakness in the management of large-scale projects is the failure to achieve the scheduled completion date. When projects are planned with the objective of time achievement, the initial planning plays a vital role in meeting project deadlines. Cost and quality are additional priorities when such projects are being executed. This article proposes a methodology for achieving the planned duration of a project through risk analysis, applying a Monte Carlo simulation technique. The methodology is demonstrated with a case application to a cross-country petroleum pipeline construction project.
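A minimal sketch of the Monte Carlo schedule risk analysis idea described above: activity durations are sampled from assumed distributions and the probability of meeting a deadline is estimated from repeated trials. The activities, triangular duration ranges and the 300-day deadline are hypothetical illustrations, not data from the article.

```python
# Minimal sketch of Monte Carlo schedule risk analysis; the activities, their
# triangular (optimistic, most-likely, pessimistic) durations in days and the
# 300-day deadline are hypothetical illustrations, not data from the paper.
import random

activities = {            # serial activities on the pipeline's critical path
    "route survey":     (20, 30, 50),
    "pipe delivery":    (60, 75, 110),
    "trenching/laying": (90, 120, 180),
    "hydro-testing":    (15, 20, 35),
}
deadline = 300            # contractual completion time in days
n_trials = 100_000

completions = []
for _ in range(n_trials):
    total = sum(random.triangular(lo, hi, mode)   # argument order: low, high, mode
                for lo, mode, hi in activities.values())
    completions.append(total)

p_on_time = sum(t <= deadline for t in completions) / n_trials
print(f"P(completion <= {deadline} days) = {p_on_time:.2f}")
```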

Relevance:

90.00%

Publisher:

Abstract:

Implementation of a Monte Carlo simulation for the solution of population balance equations (PBEs) requires choice of initial sample number (N0), number of replicates (M), and number of bins for probability distribution reconstruction (n). It is found that the squared Hellinger distance, H², is a useful measure of the accuracy of Monte Carlo (MC) simulation, and can be related directly to N0, M, and n. Asymptotic approximations of H² are deduced and tested for both one-dimensional (1-D) and 2-D PBEs with coalescence. The central processing unit (CPU) cost, C, follows a power-law relationship, C = aM·N0^b, with the CPU cost index, b, indicating the weighting of N0 in the total CPU cost. n must be chosen to balance accuracy and resolution. For fixed n, the product M × N0 determines the accuracy of the MC prediction; if b > 1, the optimal solution strategy uses multiple replicates and a small sample size. Conversely, if 0 < b < 1, a single replicate and a large initial sample size are preferred. © 2015 American Institute of Chemical Engineers AIChE J, 61: 2394–2402, 2015
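A minimal sketch of how the squared Hellinger distance H² can be used to score the accuracy of an MC estimate against a reference distribution. The exponential "true" distribution and the values of N0, M and n are hypothetical; the paper's PBE solver is not reproduced.

```python
# Minimal sketch of using the squared Hellinger distance H^2 to score the
# accuracy of a Monte Carlo estimate of a size distribution. The exponential
# "true" distribution and the values of N0, M and n are illustrative only.
import numpy as np

def squared_hellinger(p, q):
    """H^2 = 0.5 * sum (sqrt(p_i) - sqrt(q_i))^2 for two discrete pmfs."""
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

rng = np.random.default_rng(0)
N0, M, n = 1_000, 20, 30          # sample size, replicates, histogram bins
edges = np.linspace(0.0, 8.0, n + 1)

# Reference pmf on the same bins (exact CDF differences of Exp(1)).
q = np.diff(1.0 - np.exp(-edges))
q /= q.sum()

h2 = []
for _ in range(M):                 # M independent replicates of size N0
    sample = rng.exponential(1.0, size=N0)
    p, _ = np.histogram(sample, bins=edges)
    p = p / p.sum()
    h2.append(squared_hellinger(p, q))

print(f"mean H^2 over {M} replicates: {np.mean(h2):.4f}")
```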

Relevance:

80.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to the groundwater resources incorporates both the quantification of the probability of the occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation methods whereby random pollution events are generated following the same distribution as historically occurring events, or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at the predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed in producing the input files and in presenting the results. The functionality of the method, as well as its sensitivity to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps. Varying the model grid sizes indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The contaminant plume also migrates faster with coarser grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases nonlinearly with the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
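A minimal sketch of the stochastic source-term generation step described above: at each active cell, a pollution event is triggered whenever a uniform random number falls below that cell's probability of occurrence. The grid size, daily probabilities and source mass are hypothetical placeholders, not values from the thesis.

```python
# Minimal sketch of the stochastic source-term generation step: a pollution
# event is triggered at a cell when a uniform random number falls below that
# cell's probability of occurrence. Grid size, probabilities and source mass
# are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
nrow, ncol = 50, 50
p_event = np.full((nrow, ncol), 0.002)   # daily probability of a pollution event per cell
source_mass = 500.0                      # mass loading (e.g. g of chloride) per event
n_days = 365

daily_sources = []
for day in range(n_days):
    triggered = rng.random((nrow, ncol)) < p_event
    mass_grid = np.where(triggered, source_mass, 0.0)   # input for the transport model
    daily_sources.append(mass_grid)

n_events = sum(int(g.astype(bool).sum()) for g in daily_sources)
total_mass = sum(g.sum() for g in daily_sources)
print(f"events: {n_events}, total mass: {total_mass:.0f}")
```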

Relevance:

80.00%

Publisher:

Abstract:

This paper introduces a new technique for the investigation of limited-dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with a modern method of classification or discretisation of data, can outperform the more standard approaches employed in economics, such as a probit model. These approaches, together with certain inductive decision tree methods, are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, particularly, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
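A minimal sketch of the Monte Carlo comparison design: small training samples are drawn repeatedly, two classifiers are fitted, and their out-of-sample accuracy is compared. A decision tree and logistic regression stand in for the inductive decision tree and probit models; VPRS itself has no scikit-learn implementation, so it is not reproduced here.

```python
# Minimal sketch of a Monte Carlo comparison of classifiers on repeated small
# samples. A decision tree and logistic regression stand in for the inductive
# decision tree and probit models; the VPRS model is not reproduced.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=2_000, n_features=6, random_state=1)
n_train, n_reps = 60, 200                 # small samples, many Monte Carlo replicates

scores = {"tree": [], "logit": []}
for _ in range(n_reps):
    idx = rng.choice(len(y), size=n_train, replace=False)
    test = np.setdiff1d(np.arange(len(y)), idx)
    for name, model in [("tree", DecisionTreeClassifier(max_depth=3)),
                        ("logit", LogisticRegression(max_iter=1_000))]:
        model.fit(X[idx], y[idx])
        scores[name].append(model.score(X[test], y[test]))  # out-of-sample accuracy

for name, s in scores.items():
    print(f"{name}: mean out-of-sample accuracy = {np.mean(s):.3f}")
```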

Relevance:

80.00%

Publisher:

Abstract:

Marketing scholars are increasingly recognizing the importance of investigating phenomena at multiple levels. However, the analysis methods currently dominant within marketing may not be appropriate for dealing with multilevel or nested data structures. We identify the state of contemporary multilevel marketing research, finding that typical empirical approaches within marketing research may be less effective at explicitly taking account of multilevel data structures than those in other organizational disciplines. A Monte Carlo simulation, based on results from a previously published marketing study, demonstrates that different approaches to analysis of the same data can produce very different results (in terms of both power and effect size). The implication is that marketing scholars should be cautious when analyzing multilevel or other grouped data, and we provide a discussion of and introduction to the use of hierarchical linear modeling for this purpose.
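A minimal sketch, on simulated grouped data, of the contrast between a pooled OLS fit and a random-intercept hierarchical linear model. The data-generating values are hypothetical; the point is only that ignoring the grouping changes the estimated standard errors.

```python
# Minimal sketch contrasting pooled OLS with a random-intercept (hierarchical
# linear) model on the same grouped data; data-generating values are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_groups, n_per = 30, 20
group = np.repeat(np.arange(n_groups), n_per)
u = rng.normal(0, 1.0, n_groups)[group]        # group-level (level-2) random intercepts
x = rng.normal(size=n_groups * n_per)
y = 0.3 * x + u + rng.normal(size=n_groups * n_per)
df = pd.DataFrame({"y": y, "x": x, "group": group})

ols = smf.ols("y ~ x", data=df).fit()
hlm = smf.mixedlm("y ~ x", data=df, groups=df["group"]).fit()
print("OLS slope:", round(ols.params["x"], 3), "SE", round(ols.bse["x"], 3))
print("HLM slope:", round(hlm.params["x"], 3), "SE", round(hlm.bse["x"], 3))
```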

Relevance:

80.00%

Publisher:

Abstract:

Visual detection performance (d') is usually an accelerating function of stimulus contrast, which could imply a smooth, threshold-like nonlinearity in the sensory response. Alternatively, Pelli (1985, Journal of the Optical Society of America A, 2, 1508-1532) developed the 'uncertainty model', in which responses are linear with contrast, but the observer is uncertain about which of many noisy channels contains the signal. Such internal uncertainty effectively adds noise to weak signals, and predicts the nonlinear psychometric function. We re-examined these ideas by plotting psychometric functions (as z-scores) for two observers (SAW, PRM) with high precision. The task was to detect a single, vertical, blurred line at the fixation point, or to identify its polarity (light vs dark). Detection of a known polarity was nearly linear for SAW but very nonlinear for PRM. Randomly interleaving light and dark trials reduced performance and rendered it nonlinear for SAW, but had little effect for PRM. This occurred for both single-interval and 2AFC procedures. The whole pattern of results was well predicted by our Monte Carlo simulation of Pelli's model, with only two free parameters. SAW (highly practised) had very low uncertainty. PRM (with little prior practice) had much greater uncertainty, resulting in lower contrast sensitivity, nonlinear performance, and no effect of external (polarity) uncertainty. For SAW, identification was about √2 better than detection, implying statistically independent channels for stimuli of opposite polarity, rather than an opponent (light minus dark) channel. These findings strongly suggest that noise and uncertainty, rather than sensory nonlinearity, limit visual detection.
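A minimal sketch of the uncertainty model's key mechanism: the response is linear with contrast in a single relevant channel, but the observer monitors M noisy channels and takes the maximum, which makes 2AFC performance a nonlinear function of contrast. The channel counts and contrast values are illustrative, not the fitted parameters from the study.

```python
# Minimal sketch of the uncertainty model: a linear response in one relevant
# channel, a max-of-M-channels decision rule, and the resulting nonlinear
# 2AFC psychometric function. Channel counts and contrasts are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def prop_correct_2afc(contrast, m_channels, n_trials=20_000):
    """2AFC proportion correct under the max-of-M-channels decision rule."""
    noise_int = rng.normal(size=(n_trials, m_channels)).max(axis=1)
    sig = rng.normal(size=(n_trials, m_channels))
    sig[:, 0] += contrast                   # linear response in the signal channel
    signal_int = sig.max(axis=1)
    return np.mean(signal_int > noise_int)

for m in (1, 100):                          # low vs high intrinsic uncertainty
    pcs = [prop_correct_2afc(c, m) for c in (0.5, 1.0, 2.0, 4.0)]
    print(f"M={m:4d}:", [round(p, 3) for p in pcs])
```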

Relevance:

80.00%

Publisher:

Abstract:

Fault tree analysis is used as a tool within hazard and operability (Hazop) studies. The present study proposes a new methodology for obtaining the exact TOP event probability of coherent fault trees. The technique uses a top-down approach similar to that of FATRAM. This new Fault Tree Disjoint Reduction Algorithm resolves all the intermediate events in the tree except OR gates with basic-event inputs, so that a near-minimal cut set expression is obtained. Bennetts' disjoint technique is then applied and the remaining OR gates are resolved. The technique has been found to be an appropriate alternative to Monte Carlo simulation methods when rare events are encountered and exact results are needed. The algorithm has been developed in FORTRAN 77 on the Perq workstation as an addition to the Aston Hazop package. The Perq graphical environment enabled a friendly user interface to be created. The total package takes as its input cause and symptom equations using Lihou's form of coding, and produces both drawings of fault trees and the Boolean sum-of-products expression into which reliability data can be substituted directly.
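A minimal sketch of computing an exact TOP event probability from minimal cut sets, here by inclusion-exclusion over independent basic events as a simpler stand-in for the disjoint-products reduction described above. The cut sets and probabilities are hypothetical.

```python
# Minimal sketch of an exact TOP event probability from minimal cut sets via
# inclusion-exclusion (a stand-in for the disjoint-products reduction).
# Cut sets and basic-event probabilities are hypothetical.
from itertools import combinations

cut_sets = [{"A", "B"}, {"A", "C"}, {"D"}]          # minimal cut sets of basic events
p = {"A": 0.01, "B": 0.05, "C": 0.02, "D": 0.001}   # basic event probabilities

def top_event_probability(cut_sets, p):
    """P(union of cut sets) by inclusion-exclusion, assuming independent events."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

print(f"TOP event probability: {top_event_probability(cut_sets, p):.6f}")
```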

Relevance:

80.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which had previously defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site conditions. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables, and a network of the interrelationships between services components, provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The program generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
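A minimal sketch of the simulation idea behind the cost model: each services element cost is drawn from its own (here scaled-beta) distribution and the totals accumulate into a cumulative frequency picture. The elements, cost ranges and distribution parameters are hypothetical; this is not the VERT network itself.

```python
# Minimal sketch of Monte Carlo pre-tender cost estimation: each element cost
# is drawn from a scaled-beta distribution and totals are summarised as
# percentiles. Elements, ranges and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
elements = {                      # (low, high, beta a, beta b) in pounds
    "heating/ventilating": (180_000, 260_000, 2.0, 3.0),
    "electrical":          (120_000, 200_000, 2.5, 2.5),
    "lifts":               (60_000,  110_000, 3.0, 2.0),
    "fire protection":     (30_000,   55_000, 2.0, 2.0),
}
n_runs = 10_000

totals = np.zeros(n_runs)
for low, high, a, b in elements.values():
    totals += low + (high - low) * rng.beta(a, b, size=n_runs)

for q in (0.1, 0.5, 0.9):
    print(f"{int(q*100)}th percentile total cost: £{np.quantile(totals, q):,.0f}")
```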

Relevance:

80.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of the occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods whereby synthetic contaminant source terms are generated following the same distribution as historically occurring pollution events, or an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. This capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.
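A minimal sketch of the risk quantification step: across repeated realisations, the risk at a monitoring point is the fraction of realisations in which a user-defined concentration threshold is exceeded. The concentrations below are random placeholders standing in for transport-model output.

```python
# Minimal sketch of quantifying risk as an exceedance frequency across
# repeated realisations. The concentrations are random placeholders standing
# in for transport-model output at the monitoring points.
import numpy as np

rng = np.random.default_rng(3)
n_realisations, n_points = 500, 4
threshold = 250.0                                     # e.g. mg/l chloride

# Placeholder for concentrations returned by the transport model per realisation.
conc = rng.lognormal(mean=5.0, sigma=0.6, size=(n_realisations, n_points))

risk = (conc > threshold).mean(axis=0)                # exceedance frequency per point
for i, r in enumerate(risk):
    print(f"monitoring point {i}: risk = {r:.2f}")
```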

Relevance:

80.00%

Publisher:

Abstract:

We have developed a new technique for extracting histological parameters from multi-spectral images of the ocular fundus. The new method uses a Monte Carlo simulation of the reflectance of the fundus to model how the spectral reflectance of the tissue varies with differing tissue histology. The model is parameterised by the concentrations of the five main absorbers found in the fundus: retinal haemoglobins, choroidal haemoglobins, choroidal melanin, RPE melanin and macular pigment. These parameters are shown to give rise to distinct variations in the tissue colouration. We use the results of the Monte Carlo simulations to construct an inverse model which maps tissue colouration onto the model parameters. This allows the concentration and distribution of the five main absorbers to be determined from suitable multi-spectral images. We propose the use of "image quotients" to allow this information to be extracted from uncalibrated image data. The filters used to acquire the images are selected to ensure a one-to-one mapping between model parameters and image quotients. To recover five model parameters uniquely, images must be acquired in six distinct spectral bands. Theoretical investigations suggest that retinal haemoglobins and macular pigment can be recovered with RMS errors of less than 10%. We present parametric maps showing the variation of these parameters across the posterior pole of the fundus. The results are in agreement with known tissue histology for normal healthy subjects. We also present an early result which suggests that, with further development, the technique could be used to successfully detect retinal haemorrhages.
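A minimal sketch of forming image quotients from an uncalibrated multi-spectral stack: ratios between spectral bands cancel an unknown common illumination or gain factor, leaving quantities that can be mapped to model parameters. The six-band data and the gain factor are synthetic placeholders.

```python
# Minimal sketch of forming "image quotients" from an uncalibrated
# multi-spectral stack: band ratios cancel an unknown common gain factor.
# The six-band stack and the gain are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
bands = rng.uniform(0.05, 0.6, size=(6, 256, 256))   # six spectral-band reflectance images
gain = 1.7                                           # unknown common scaling of the raw data
raw = gain * bands

# Five quotients from six bands: each band divided by a chosen reference band.
reference = raw[-1]
quotients = np.stack([raw[i] / reference for i in range(5)])

# The gain cancels, so the quotients match those of the un-scaled reflectances.
expected = np.stack([bands[i] / bands[-1] for i in range(5)])
print("max deviation after cancelling gain:", np.max(np.abs(quotients - expected)))
```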

Relevance:

80.00%

Publisher:

Abstract:

Purpose - To generate a reflectance model of the fundus that allows an accurate non-invasive quantification of blood and pigments. Methods - A Monte Carlo simulation was used to produce a mathematical model of light interaction with the fundus at different wavelengths. The model predictions were compared with fundus images from normal volunteers in several spectral bands (peaks at 507, 525, 552, 585, 596 and 611nm). The model was then used to calculate the concentration and distribution of the known absorbing components of the fundus. Results - The shape of the statistical distribution of the image data generally corresponded to that of the model data; the model, however, appears to overestimate the reflectance of the fundus in the longer wavelength region. As the absorption by xanthophyll has no significant effect on light transport above 534nm, its distribution in the fundus was quantified: the wavelengths where both shape and distribution of image and model data matched (<553nm) were used to train a neural network which was then applied to every point in the image data. The xanthophyll distribution thus found was in agreement with published literature data in normal subjects. Conclusion - We have developed a method for optimising multi-spectral imaging of the fundus and a computer image analysis capable of estimating information about the structure and properties of the fundus. The technique successfully calculates the distribution of xanthophyll in the fundus of healthy volunteers. Further improvement of the model is required to allow the deduction of other parameters from images; investigations in known pathology models are also necessary to establish whether this method is of clinical use in detecting early chorioretinopathies, hence providing a useful screening and diagnostic tool.

Relevance:

80.00%

Publisher:

Abstract:

This paper shows how the angular uncertainties can be determined for a rotary-laser automatic theodolite of the type used in iGPS (indoor GPS) networks. Initially, the fundamental physics of the rotating-head device is used to propagate uncertainties using Monte Carlo simulation. This theoretical element of the study shows how the angular uncertainty is affected by internal parameters, the actual values of which are estimated. Experiments are then carried out to determine the actual uncertainty in the azimuth angle. Results are presented showing that uncertainty decreases with sampling duration. Other significant findings are that the uncertainty is relatively constant throughout the working volume and that the uncertainty value does not depend on the size of the reference angle. © 2009 IMechE.
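A minimal sketch of propagating internal parameter uncertainty to the azimuth angle by Monte Carlo, using a deliberately simplified model in which the azimuth is proportional to the beam-crossing time within one rotation period. The rotation rate, timing jitter and speed jitter are illustrative assumptions, not the device parameters from the paper.

```python
# Minimal sketch of Monte Carlo propagation of internal timing uncertainties
# to the azimuth angle, under a simplified model (azimuth proportional to the
# crossing time within one rotation period). All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
rev_period = 1.0 / 50.0            # nominal rotation period, s (assumed 50 Hz head)
t_cross = 0.004                    # nominal beam-crossing time after reference pulse, s
timing_jitter = 20e-9              # 1-sigma timing noise, s
speed_jitter = 1e-6                # 1-sigma variation of the rotation period, s
n_trials = 50_000

for n_avg in (1, 10, 100):         # averaging over longer sampling durations
    t = t_cross + rng.normal(0, timing_jitter, (n_trials, n_avg))
    T = rev_period + rng.normal(0, speed_jitter, (n_trials, n_avg))
    azimuth = np.degrees(2 * np.pi * (t / T)).mean(axis=1)   # average repeated measurements
    print(f"n_avg={n_avg:3d}: azimuth std = {azimuth.std() * 3600:.2f} arcsec")
```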

Relevance:

80.00%

Publisher:

Abstract:

Purpose: To investigate the use of MRIA for quantitative characterisation of subretinal fibrosis secondary to nAMD. Methods: MRIA images of the posterior pole were acquired over 4 months from 20 eyes, including those with inactive subretinal fibrosis and those being treated with ranibizumab for nAMD. Changes in morphology of the macula affected by nAMD were modelled, and reflectance spectra at the MRIA acquisition wavelengths (507, 525, 552, 585, 596, 611 and 650nm) were computed using Monte Carlo simulation. Quantitative indicators of fibrosis were derived by matching image spectra to the model spectra of known morphological properties. Results: The model spectra were comparable to the image spectra, both normal and pathological. The key morphological changes that the model associated with nAMD were gliosis of the IS-OS junction, a decrease in retinal blood and a decrease in RPE melanin. However, these changes were not specific to fibrosis, and none of the quantitative indicators showed a unique association with the degree of fibrosis. Moderate correlations were found with the clinical assessment, but not with the treatment program. Conclusion: MRIA can distinguish subretinal fibrosis from healthy tissue. The methods used show high sensitivity but low specificity, being unable to distinguish scarring from other abnormalities such as atrophy. Quantification of scarring was not achieved with the wavelengths used, owing to the complex structural changes to retinal tissues in the process of nAMD. Further studies, incorporating other wavelengths, will establish whether MRIA has a role in the assessment of subretinal fibrosis in the context of retinal and choroidal pathology.
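A minimal sketch of deriving quantitative indicators by matching each measured spectrum to the nearest pre-computed model spectrum of known morphological properties. The model library and pixel spectra below are random placeholders for the Monte Carlo model output and MRIA image data.

```python
# Minimal sketch of spectral matching: each measured spectrum inherits the
# parameters of the nearest model spectrum in a pre-computed library. The
# library and pixel spectra are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_model, n_bands = 1_000, 7                            # library at the 7 MRIA wavelengths
model_spectra = rng.uniform(0.05, 0.6, (n_model, n_bands))
model_params = rng.uniform(0.0, 1.0, (n_model, 3))     # e.g. blood, melanin, gliosis indicators

pixels = rng.uniform(0.05, 0.6, (500, n_bands))        # measured spectra, one row per pixel

# Nearest model spectrum in the least-squares sense -> inherit its parameters.
d2 = ((pixels[:, None, :] - model_spectra[None, :, :]) ** 2).sum(axis=2)
best = d2.argmin(axis=1)
estimated = model_params[best]
print("estimated indicators for first 3 pixels:\n", np.round(estimated[:3], 3))
```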

Relevance:

80.00%

Publisher:

Abstract:

The mechanics-based analysis framework predicts top-down fatigue cracking initiation time in asphalt concrete pavements by utilising fracture mechanics and mixture morphology-based properties. To reduce the level of complexity involved, traffic data were characterised and incorporated into the framework using the equivalent single axle load (ESAL) approach. There is a concern that this kind of simplistic traffic characterisation might result in erroneous performance predictions and pavement structural designs. This paper integrates axle load spectra and other traffic characterisation parameters into the mechanics-based analysis framework and studies the impact these parameters have on predicted fatigue cracking performance. The traffic characterisation inputs studied are traffic growth rate, axle load spectra, lateral wheel wander and volume adjustment factors. For this purpose, a traffic integration approach incorporating Monte Carlo simulation and representative traffic characterisation inputs was developed. The significance of these traffic characterisation parameters was established by evaluating a number of field pavement sections. The results show that all the traffic characterisation parameters except truck wheel wander have a significant influence on predicted top-down fatigue cracking performance.
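A minimal sketch of the traffic-integration idea: axle loads are drawn from a load spectrum and lateral positions from a wheel-wander distribution, and per-axle damage contributions are accumulated by Monte Carlo over a growing traffic stream. The load bins, frequencies, wander width and damage rule are hypothetical, not the framework's own models.

```python
# Minimal sketch of Monte Carlo traffic integration: sample axle loads from a
# load spectrum and lateral offsets from a wheel-wander distribution, then
# accumulate a toy damage index over a growing traffic stream. All values and
# the damage rule are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
load_bins = np.array([40, 60, 80, 100, 120])          # single-axle loads, kN
load_freq = np.array([0.35, 0.30, 0.20, 0.10, 0.05])  # axle load spectrum (relative frequency)
wander_sigma = 0.25                                    # lateral wheel wander std dev, m
growth_rate, years = 0.03, 20
axles_year1 = 1_000_000

total_damage = 0.0
for year in range(years):
    n_axles = int(axles_year1 * (1 + growth_rate) ** year)
    n_sim = 10_000                                     # simulate a sample, then scale up
    loads = rng.choice(load_bins, size=n_sim, p=load_freq)
    offsets = rng.normal(0.0, wander_sigma, size=n_sim)
    # Toy damage rule: fourth power of load, reduced as the wheel wanders off the critical path.
    damage = (loads / 80.0) ** 4 * np.exp(-(offsets / wander_sigma) ** 2 / 2)
    total_damage += damage.mean() * n_axles

print(f"accumulated damage index over {years} years: {total_damage:,.0f}")
```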

Relevance:

30.00%

Publisher:

Abstract:

The following thesis describes the computer modelling of radio-frequency capacitively coupled methane/hydrogen plasmas and the consequences for the reactive ion etching of (100) GaAs surfaces. In addition, a range of etching experiments was undertaken over a matrix of pressure, power and methane concentration. The resulting surfaces were investigated using X-ray photoelectron spectroscopy, and the results were discussed in terms of physical and chemical models of particle/surface interactions, in addition to the predictions for the energies, angles and relative fluxes to the substrate of the various plasma species. The model consisted of a Monte Carlo code which followed electrons and ions through the plasma and sheath potentials while taking account of collisions with background neutral gas molecules. The ionisation profile output from the electron module was used as input for the ionic module. Momentum-scattering interactions of ions with gas molecules were investigated via different models and compared against results given by a quantum mechanical code. The interactions were treated as central-potential scattering events and the resulting neutral cascades were followed. The resulting predictions for ion energies at the cathode compared well with experimental ion energy distributions, and this verified the particular form of the electrical potentials used and their applicability to the particular plasma-cell geometry used in the etching experiments. The final code was used to investigate the effect of external plasma parameters on the mass distribution, energy and angles of all species impingent on the electrodes. Electron energies computed in the plasma also agreed favourably with measurements made using a Langmuir probe. The surface analysis showed all the surfaces to be depleted in arsenic due to its preferential removal, and the resultant Ga:As ratio in the surface was found to be directly linked to the etch rate. The etch rate was determined by the methane flux, which was predicted by the code.
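A minimal sketch of the Monte Carlo transport idea: an ion is accelerated across the sheath potential and its free flights between collisions are sampled from an exponential free-path distribution, with a simple charge-exchange-like collision that resets the ion energy. The sheath voltage, mean free path and ion mass are illustrative assumptions, not parameters from the thesis.

```python
# Minimal sketch of Monte Carlo ion transport across a collisional sheath:
# free flights are sampled from an exponential free-path distribution and a
# collision is modelled as a full loss of the ion's directed energy. The
# sheath voltage, mean free path and ion identity are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
e = 1.602e-19                      # elementary charge, C
sheath_v, sheath_d = 300.0, 2e-3   # sheath potential (V) and thickness (m)
mean_free_path = 5e-4              # m, set by the gas pressure
n_ions = 20_000

energies = []
for _ in range(n_ions):
    x, energy = 0.0, 0.0
    while x < sheath_d:
        step = min(rng.exponential(mean_free_path), sheath_d - x)
        energy += e * sheath_v * step / sheath_d       # energy gained over this flight
        x += step
        if x < sheath_d:                               # collision: directed energy is lost
            energy = 0.0
    energies.append(energy / e)                        # energy at the cathode, eV

print(f"mean ion energy at cathode: {np.mean(energies):.0f} eV")
```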